.Net: ADR with proposal for the multiple LLM support use cases #3040
Conversation
Could we also support a scenario where the kernel keys off of "Model ID" instead of "Service ID"? In most cases, the semantic function author doesn't know the names of all the services that the kernel invoker has added; they'll just know the standard model names that are out in the wild (e.g., gpt-4, gpt-3.5-turbo, etc.). Leveraging "Model ID" should likely be the main/predominant scenario.
Right now you could make the service id the same as the model id and that would work. Currently, matching specifically on model id is called out as out of scope because we have no way to store arbitrary attributes for a service. I'll add this as an additional task and address it in a follow-up PR.
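To illustrate that workaround, here is a minimal sketch (the `endpoint`, `apiKey`, and `openAIApiKey` variables are placeholders) where each service is registered with its service id set to the model name, so a selector keyed on service id effectively selects by model:

```csharp
// Minimal sketch of the workaround: use the model name as the service id.
// endpoint, apiKey and openAIApiKey are placeholder variables.
IKernel kernel = new KernelBuilder()
    .WithAzureChatCompletionService(serviceId: "gpt-4", deploymentName: "gpt-4", endpoint: endpoint, apiKey: apiKey)
    .WithOpenAIChatCompletionService(serviceId: "gpt-3.5-turbo", modelId: "gpt-3.5-turbo", apiKey: openAIApiKey)
    .Build();
```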
I'm not sure, but it seems we need both the Service ID and the Model ID to cover scenarios in which one connector provides multiple models and in which the same model is provided by multiple connectors. Otherwise, the connector would have to be registered many times (once per model it supports) in the former scenario, or it wouldn't be possible to register multiple connectors supporting the same model in the latter scenario.
So right now we can do this:

```csharp
IKernel kernel = new KernelBuilder()
    .WithAzureChatCompletionService(serviceId: "azure-gpt-4", deploymentName: "gpt-4", endpoint: endpoint, apiKey: apiKey)
    .WithAzureChatCompletionService(serviceId: "azure-gpt-3.5", deploymentName: "gpt-3.5", endpoint: endpoint, apiKey: apiKey)
    .WithOpenAIChatCompletionService(serviceId: "openai-gpt-4", modelId: "gpt-4", apiKey: openAIApiKey)
    .WithOpenAIChatCompletionService(serviceId: "openai-gpt-3.5", modelId: "gpt-3.5", apiKey: openAIApiKey)
    .Build();
```

I can write an `IAIServiceSelector` that picks the right service by service id. In the future we would like to be able to support this:

```csharp
IKernel kernel = new KernelBuilder()
    .WithAzureChatCompletionService(modelId: "gpt-4", deploymentName: "foo", endpoint: endpoint, apiKey: apiKey)
    .WithAzureChatCompletionService(modelId: "gpt-3.5", deploymentName: "bar", endpoint: endpoint, apiKey: apiKey)
    .WithOpenAIChatCompletionService(modelId: "gpt-4", apiKey: openAIApiKey)
    .WithOpenAIChatCompletionService(modelId: "gpt-3.5", apiKey: openAIApiKey)
    .Build();
```

We need to solve how to write an `IAIServiceSelector` that picks the right service by model id and service type.
This should be marked as a Breaking Change after making `ServiceProvider` internal.
…soft#3040)

### Motivation and Context

- ADR describing the design for multiple LLM support. This will be expanded in the future to allow additional use cases such as selecting an AI service by LLM model id.
- The `IAIServiceSelector` has changed so that it is added to the `IKernel` instance; see example 62 for an example of a custom `IAIServiceSelector` implementation.

### Contribution Checklist

- [x] The code builds clean without any errors or warnings
- [x] The PR follows the [SK Contribution Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md) and the [pre-submission formatting script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts) raises no violations
- [x] All unit tests pass, and I have added new tests where possible
- [x] I didn't break anyone 😄
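As an illustration of what wiring a custom selector into the kernel might look like, a sketch follows. `MyServiceSelector` and `WithAIServiceSelector` are assumptions made here for illustration (the actual registration API is whatever example 62 in the repository demonstrates), not confirmed by this PR text:

```csharp
// Sketch only: MyServiceSelector is a user-defined IAIServiceSelector implementation,
// and WithAIServiceSelector is an assumed builder method for attaching it to the kernel.
IKernel kernel = new KernelBuilder()
    .WithAzureChatCompletionService(serviceId: "azure-gpt-4", deploymentName: "gpt-4", endpoint: endpoint, apiKey: apiKey)
    .WithOpenAIChatCompletionService(serviceId: "openai-gpt-4", modelId: "gpt-4", apiKey: openAIApiKey)
    .WithAIServiceSelector(new MyServiceSelector())
    .Build();
```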