.Net: Bug: SK can't call AzureOpenAI o1 model #10201
Comments
Relates to Azure/azure-sdk-for-net#47809
Hi @rwjdk, can you please make sure you are indeed sending the correct model version?
@moonbox3: As I'm using Azure OpenAI services I'm not sending a version at any point in the call (just the deployment name, which in my case is "o1"). I've checked the raw request to make sure, and no version is mentioned in there. In Azure AI Studio the model version is 2024-12-17 (the only option). But looking at the above image I noticed something odd: the target URI, which you do not choose yourself and never use in SK, says api-version=2024-12-01-preview, so it could be an Azure bug (I'm in the Sweden Central Azure region). Can one alter the target URI version in SK?
Did a quick test with the Azure.AI.OpenAI (v2.1.0) NuGet package directly and it gives the same error, so I guess the SK team depends on that dependency supporting it first :-/
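For reference, the direct test was along these lines (a minimal sketch, not the actual test code; the resource and deployment names are placeholders):

```csharp
using Azure.AI.OpenAI;
using Azure.Identity;
using OpenAI.Chat;

var client = new AzureOpenAIClient(
    new Uri("https://<resource>.openai.azure.com/"), // placeholder endpoint
    new AzureCliCredential());

ChatClient chat = client.GetChatClient("o1");

// Fails the same way: HTTP 400, "Model o1 is enabled only for api versions
// 2024-12-01-preview and later", because the SDK pins an older api-version.
ChatCompletion completion = await chat.CompleteChatAsync("ping");
Console.WriteLine(completion.Content[0].Text);
```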
Yes, you can use a different Azure API version. Tagging @SergeyMenshykh for a specific example of how to do it in .NET.
@moonbox3 I tried that, but as the underlying platform does not yet support the version it still fails with a not-implemented exception :-/ Assume this is what you mean:
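Presumably something along these lines (a sketch; it assumes the AddAzureOpenAIChatCompletion overload that accepts a preconfigured AzureOpenAIClient, and that V2024_10_21 is the newest ServiceVersion the installed SDK exposes):

```csharp
using Azure.AI.OpenAI;
using Azure.Identity;
using Microsoft.SemanticKernel;

// Pass a preconfigured AzureOpenAIClient into SK so its api-version is used.
// The newest version the SDK exposes is still older than 2024-12-01-preview,
// which is why the o1 call keeps failing.
var azureClient = new AzureOpenAIClient(
    new Uri("https://<resource>.openai.azure.com/"), // placeholder endpoint
    new AzureCliCredential(),
    new AzureOpenAIClientOptions(AzureOpenAIClientOptions.ServiceVersion.V2024_10_21));

var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion("o1", azureClient)
    .Build();
```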
It's not supported by the Azure.AI.OpenAI SDK yet - https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/openai/Azure.AI.OpenAI/src/Custom/AzureOpenAIClientOptions.cs
Apologies for misleading @rwjdk. The ability to specify a custom API version is allowed in SK Python, but per @SergeyMenshykh, this isn't supported in the Azure.AI.OpenAI SDK yet.
As of now, the SDK doesn't have an option for this. Blocking this issue, as it depends on the Azure SDK supporting the version. Tracking issue:
As a workaround you can add a handler to your connectors through a custom HttpClient.

Important: This is a break-glass scenario and should be dropped as soon as the Azure OpenAI SDK supports the newer api-version.

Usage:

```csharp
using Azure.Identity;
using Microsoft.SemanticKernel;

using var httpClient = new HttpClient(new AzureOverrideHandler(apiVersion));
var apiKey = config["AzureOpenAI:ApiKey"]!;
var endpoint = config["AzureOpenAI:Endpoint"]!;
var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion("o1-mini", endpoint, new AzureCliCredential(), httpClient: httpClient)
    .Build();
```

Http Handler (for version override and max token count fix):

```csharp
using System.Net.Http.Headers;
using System.Text.Json;
using System.Text.RegularExpressions;

public partial class AzureOverrideHandler : HttpClientHandler
{
    private string? _overrideApiVersion;

    public AzureOverrideHandler(string? overrideApiVersion = null)
    {
        this._overrideApiVersion = overrideApiVersion;
    }

    protected override async Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
    {
        // Only used by the commented-out debug dump below.
        var options = new JsonSerializerOptions()
        {
            WriteIndented = true
        };

        using var oldContent = request.Content;
        if (oldContent is not null && request.RequestUri is not null)
        {
            // o1 models reject "max_tokens"; rename it to "max_completion_tokens".
            var requestBody = await oldContent.ReadAsStringAsync(cancellationToken);
            if (requestBody.IndexOf("\"model\":\"o1") > 0 && requestBody.IndexOf("\"max_tokens\":") > 0)
            {
                requestBody = requestBody.Replace("\"max_tokens\":", "\"max_completion_tokens\":");
                request.Content = new StringContent(requestBody, new MediaTypeHeaderValue("application/json"));
            }

            // Console.WriteLine("Request body: " + JsonSerializer.Serialize(JsonSerializer.Deserialize<JsonElement>(requestBody), options));
        }

        if (this._overrideApiVersion is not null && request.RequestUri is not null)
        {
            // Swap the api-version the SDK put on the query string for the override,
            // or append one if the request had none.
            var requestUri = request.RequestUri.ToString();
            var currentVersion = CurrentApiVersionRegex().Match(requestUri).Value;
            if (!string.IsNullOrEmpty(currentVersion))
            {
                request.RequestUri = new Uri(requestUri.Replace(currentVersion, this._overrideApiVersion));
            }
            else
            {
                request.RequestUri = new Uri($"{requestUri}?api-version={this._overrideApiVersion}");
            }

            // Console.WriteLine(request.RequestUri);
        }

        return await base.SendAsync(request, cancellationToken);
    }

    // Matches a date-style api-version (e.g. 2024-10-01-preview) at the end of the request URI.
    [GeneratedRegex(@"\d{4}-\d{2}-\d{2}(-preview)?$")]
    public static partial Regex CurrentApiVersionRegex();
}
```
Cool. Thank you @RogerBarreto. Will try it out first thing tomorrow.
Describe the bug
When doing an agent.InvokeAsync call against the o1 model (version 2024-12-17) in the Azure OpenAI service, you get the error:
"HTTP 400 (BadRequest)\r\n\r\nModel o1 is enabled only for api versions 2024-12-01-preview and later"
To Reproduce
Steps to reproduce the behavior:
Expected behavior
A normal LLM response (in this case JSON, due to structured output)
Platform
Additional context
The issue is probably the transitive use of the Azure.AI.OpenAI beta2 package. I tried manually bumping this to the latest version, but it gave a feature-not-implemented exception.