
.Net: Removed Azure and OpenAI text generation services #4246

Closed

Conversation

dmytrostruk (Member)

Motivation and Context

Resolves: #3154

Removed Azure and OpenAI text generation services in favor of chat completion services.

Contribution Checklist

@dmytrostruk self-assigned this Dec 13, 2023
@dmytrostruk requested a review from a team as a code owner December 13, 2023 21:05
@shawncal added the .NET (Issue or Pull requests regarding .NET code) and kernel (Issues or pull requests impacting the core kernel) labels Dec 13, 2023
@dmytrostruk added the PR: ready for review (All feedback addressed, ready for reviews) label Dec 13, 2023
@dmytrostruk added this pull request to the merge queue Dec 14, 2023
@github-merge-queue bot removed this pull request from the merge queue due to failed status checks Dec 14, 2023
@markwallace-microsoft (Member) left a comment

Blocking until I've had a chance to review

@RogerBarreto (Member) left a comment

Suggestion

@@ -435,7 +438,7 @@ public async Task MultipleServiceLoadPromptConfigTestAsync()
@"{
""name"": ""FishMarket2"",
""execution_settings"": {
""azure-open-ai"": {
""azure-text-davinci-003"": {

Do we need this?

@@ -101,6 +101,7 @@ private void ConfigureAzureOpenAI(IKernelBuilder kernelBuilder)

kernelBuilder.AddAzureOpenAIChatCompletion(
deploymentName: azureOpenAIConfiguration.DeploymentName,
modelId: azureOpenAIConfiguration.ModelId,

Suggested change
modelId: azureOpenAIConfiguration.ModelId,
modelId: azureOpenAIConfiguration.ChatModelId,

@@ -12,16 +12,25 @@ internal sealed class AzureOpenAIConfiguration

public string DeploymentName { get; set; }

public string ModelId { get; set; }

Suggested change
public string ModelId { get; set; }

{
this.ServiceId = serviceId;
this.DeploymentName = deploymentName;
this.ModelId = modelId ?? deploymentName;

Suggested change
this.ModelId = modelId ?? deploymentName;

@@ -10,12 +10,14 @@ internal sealed class OpenAIConfiguration
{
public string ServiceId { get; set; }
public string ModelId { get; set; }

Suggested change
public string ModelId { get; set; }

@dmytrostruk (Member, Author)

@RogerBarreto this change won't work, because there is another settings section for embeddings which uses ModelId, and we use the same C# class for all the settings:

"OpenAIEmbeddings": {
  "ServiceId": "text-embedding-ada-002",
  "ModelId": "text-embedding-ada-002",
  "ApiKey": ""
},

That's why I removed the Chat prefix initially, then reverted it back and proposed keeping it as it is for now as an alternative, because the settings and testing infrastructure need to be revisited completely.
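
For context, a hypothetical sketch (not the actual test code) of the shared settings class being discussed; the diff fragments below belong to it, and ChatModelId is the addition under review:

    // Hypothetical sketch: one POCO backs the chat, text generation, and embedding
    // configuration sections, so ModelId cannot simply be renamed or removed
    // without breaking the embeddings settings shown above.
    internal sealed class OpenAIConfiguration
    {
        public string ServiceId { get; set; }
        public string ModelId { get; set; }      // e.g. "text-embedding-ada-002" for embeddings
        public string ChatModelId { get; set; }  // e.g. "gpt-3.5-turbo" for chat completion
        public string ApiKey { get; set; }

        public OpenAIConfiguration(string serviceId, string modelId, string apiKey, string chatModelId)
        {
            this.ServiceId = serviceId;
            this.ModelId = modelId;
            this.ApiKey = apiKey;
            this.ChatModelId = chatModelId;
        }
    }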

public string ApiKey { get; set; }

public OpenAIConfiguration(string serviceId, string modelId, string apiKey)
public OpenAIConfiguration(string serviceId, string modelId, string apiKey, string chatModelId)
{
this.ServiceId = serviceId;
this.ModelId = modelId;

Suggested change
this.ModelId = modelId;

@@ -1,12 +1,14 @@
{
"OpenAI": {
"ServiceId": "open-ai",
"ModelId": "gpt-3.5-turbo",
"ModelId": "text-davinci-003",

Suggested change
"ModelId": "text-davinci-003",

"ApiKey": ""
},
"AzureOpenAI": {
"ServiceId": "azure-open-ai",
"DeploymentName": "gpt-35-turbo",
"DeploymentName": "text-davinci-003",

Suggested change
"DeploymentName": "text-davinci-003",

@dmytrostruk (Member, Author)

Closing this PR, as text generation services can work with the gpt-3.5-turbo-instruct model, which is not going to be deprecated.

@stephentoub (Member)

> Closing this PR, as text generation services can work with the gpt-3.5-turbo-instruct model, which is not going to be deprecated.

The ITextGenerationService implementation in OpenAIChatCompletionService doesn't work with it? Isn't it identical to the implementation used by OpenAITextGenerationService?

@dmytrostruk (Member, Author)

> The ITextGenerationService implementation in OpenAIChatCompletionService doesn't work with it? Isn't it identical to the implementation used by OpenAITextGenerationService?

When you configure the kernel with the AddOpenAIChatCompletion extension method and use the gpt-3.5-turbo-instruct model, it will throw the following exception:

Microsoft.SemanticKernel.Http.HttpOperationException: 'This is not a chat model and thus not supported in the v1/chat/completions endpoint. Did you mean to use v1/completions?
Status: 404 (Not Found)
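
For context, a minimal repro sketch of the registration that produces this error (the API key is a placeholder):

    // Sketch only: registering an instruct model through the chat completion
    // extension sends prompt invocations to v1/chat/completions, which
    // gpt-3.5-turbo-instruct does not support, hence the 404 above.
    var kernel = Kernel.CreateBuilder()
        .AddOpenAIChatCompletion(modelId: "gpt-3.5-turbo-instruct", apiKey: "...")
        .Build();

    var result = await kernel.InvokePromptAsync("Write a haiku about the sea.");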

That's because OpenAIChatCompletionService implements both IChatCompletionService and ITextGenerationService, and we check for the chat interface first in our logic:

if (aiService is IChatCompletionService chatCompletion)
{
    var chatContent = await chatCompletion.GetChatMessageContentAsync(renderedPrompt, executionSettings, kernel, cancellationToken).ConfigureAwait(false);
    this.CaptureUsageDetails(chatContent.ModelId, chatContent.Metadata, this._logger);
    return new FunctionResult(this, chatContent, kernel.Culture, chatContent.Metadata);
}

if (aiService is ITextGenerationService textGeneration)
{
    var textContent = await textGeneration.GetTextContentWithDefaultParserAsync(renderedPrompt, executionSettings, kernel, cancellationToken).ConfigureAwait(false);
    this.CaptureUsageDetails(textContent.ModelId, textContent.Metadata, this._logger);
    return new FunctionResult(this, textContent, kernel.Culture, textContent.Metadata);
}

So, in order to use gpt-3.5-turbo-instruct, the service should implement just ITextGenerationService, as is the case with OpenAITextGenerationService and the AddOpenAITextGeneration extension method.

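A hedged sketch of the working alternative described above, using the text generation registration (the API key is a placeholder):

    // Sketch only: OpenAITextGenerationService implements just ITextGenerationService,
    // so the kernel routes the prompt to v1/completions, which the instruct model supports.
    var kernel = Kernel.CreateBuilder()
        .AddOpenAITextGeneration(modelId: "gpt-3.5-turbo-instruct", apiKey: "...")
        .Build();

    var result = await kernel.InvokePromptAsync("Write a haiku about the sea.");
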
But I think that ideally OpenAIChatCompletionService should implement just IChatCompletionService and OpenAITextGenerationService should implement just ITextGenerationService, for clear differentiation. Users shouldn't pass a text generation AI model to a chat completion service class or extension method.

@dmytrostruk (Member, Author)

The reason why OpenAIChatCompletionService implements both the IChatCompletionService and ITextGenerationService interfaces is that, previously, the kernel and functions worked only with the ITextGenerationService interface, and that was a limitation. Now that the core logic differentiates between the interfaces, I don't think OpenAIChatCompletionService still needs to implement ITextGenerationService.

@stephentoub (Member)

I see, thanks. Yeah, that's a fairly unfortunate combination, but I get it.

Labels
kernel (Issues or pull requests impacting the core kernel), .NET (Issue or Pull requests regarding .NET code), PR: ready for review (All feedback addressed, ready for reviews)
Development

Successfully merging this pull request may close these issues.

.Net: Why do AzureTextCompletion and OpenAITextCompletion exist?
6 participants