Using bia-bob with Azure-hosted OpenAI model #236
Hi @sebi06 , thanks for reaching out! I never tested an Azure endpoint that is not GitHub Models, so this might indeed be buggy. I presume we could make it work. git-bob comment on how to do this.
This message was generated by git-bob (version: 0.20.0, model: gpt-4o-2024-08-06, log), an experimental AI-based assistant. It can make mistakes and has limitations. Check its messages carefully.

To adapt bia-bob for using Azure-hosted OpenAI models, you might need to modify the `generate_response` function:

```python
def generate_response(chat_history, image, model, system_prompt, user_prompt, vision_system_prompt):
    from ._machinery import Context
    from .endpoints._azure import generate_response_from_azure

    if Context.endpoint == "azure" and "gpt-" in model:
        full_response = generate_response_from_azure(
            model, system_prompt, user_prompt, chat_history, image,
            base_url=Context.endpoint, api_key=Context.api_key,
            vision_model=Context.vision_model,
            vision_system_prompt=vision_system_prompt
        )
    # rest of your conditions...
    return full_response
```

This modification ensures that Azure-based …
Hi @sebi06 , ok, the AI gives wrong advice here. I checked the example here, which should actually work for you too. I just tested this on my machine and it works (using GitHub Models though):

Would you mind testing this with your URL and API key? If it does not work, could you please copy the entire error message here for debugging? Thanks! Best,
This step seems to work fine (using our endpoint, which also ends with *.azure.com); I see no errors:

```python
import os
from bia_bob import bob

bob.initialize(endpoint="https://xyz.azure.com",
               api_key=os.environ.get('AZURE_API_KEY'),
               model="gpt-4o")
```

But when I try:
I get the error:
Can you please carefully check if …

And may I ask what the value of … is?

Also, from your code snippet at the very top: there are import statements missing. What library are you using for this? Does this work too? (source)
I use … This always worked for me:

```python
import os
from openai import AzureOpenAI

client1 = AzureOpenAI(
    api_version=os.environ["AZURE_OPENAI_API_VERSION"],
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
)

print("\n".join([model.id for model in client1.models.list().data]))
```

And your code

```python
import os
from openai import OpenAI

client2 = OpenAI(base_url=os.environ["AZURE_OPENAI_ENDPOINT"],
                 api_key=os.environ["AZURE_OPENAI_API_KEY"])

print("\n".join([model.id for model in client2.models.list().data]))
```

does fail with:
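A likely reason the plain OpenAI client fails against an Azure endpoint (an assumption about Azure's request routing, not verified against this specific endpoint): the OpenAI client posts to flat `/v1`-style paths, while Azure OpenAI expects a deployment-scoped path plus a mandatory `api-version` query parameter, which the `AzureOpenAI` client adds for you. A toy sketch of the two URL shapes (the helper functions below are hypothetical illustrations, not part of the openai library):

```python
# Illustration only: how the two clients shape their request URLs.
# These helpers are hypothetical; they just mimic the URL patterns.

def openai_style_url(base_url: str, path: str = "chat/completions") -> str:
    # The plain OpenAI client posts to <base_url>/<path>.
    return f"{base_url.rstrip('/')}/{path}"

def azure_style_url(endpoint: str, deployment: str, api_version: str,
                    path: str = "chat/completions") -> str:
    # The AzureOpenAI client routes through a deployment-scoped path
    # and appends the required api-version query parameter.
    return (f"{endpoint.rstrip('/')}/openai/deployments/{deployment}"
            f"/{path}?api-version={api_version}")

endpoint = "https://xyz.openai.azure.com"  # placeholder endpoint
print(openai_style_url(endpoint))
print(azure_style_url(endpoint, "gpt-4o", "2024-06-01"))
```

If the plain client sends the first URL shape to an Azure host, a 404 "Resource not found" would be the expected response, since nothing is served at that flat path.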
Good point. I checked and it is:
Any news, or things I can try out?
Hi @haesleinhuepf
I tried to test your really cool "bia-bob" a bit internally, but so far I just cannot get it to work, and I think it might be due to our internal environment.
Later in my notebook I tried:

```python
bob.initialize(endpoint="azure", model="gpt-4o-2024-08-06")
```

and also using the endpoint directly, but I always get errors. I tried to modify "_utilities.py" because I think our internal endpoint confuses the code: … gets called, but we are using GPT models that are not hosted by OpenAI. It then leads to:

```
NotFoundError: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}
```
Our endpoint looks like:
https://xyz.openai.azure.com/
Do you have a hint what I need to modify to make this work?
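To illustrate what I mean (a hypothetical sketch of the routing idea, not bia-bob's actual code): whatever endpoint check bia-bob uses would need to treat a *.openai.azure.com URL as Azure rather than falling through to the default OpenAI code path:

```python
# Hypothetical illustration of an endpoint check; bia-bob's real
# internals may differ. The idea: route Azure-hosted GPT models to an
# Azure client instead of the default OpenAI code path.

def is_azure_endpoint(endpoint: str) -> bool:
    # Treat the literal "azure" keyword and *.azure.com URLs as Azure.
    return endpoint == "azure" or ".azure.com" in endpoint

print(is_azure_endpoint("https://xyz.openai.azure.com/"))  # True
print(is_azure_endpoint("https://api.openai.com/v1"))      # False
```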