Merge branch 'main' into dependabot/nuget/dotnet/Roslynator.Formatting.Analyzers-4.6.1
SergeyMenshykh authored Oct 26, 2023
2 parents d60f73d + bcbdadd commit ba37719
Showing 48 changed files with 274 additions and 113 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -144,7 +144,7 @@ If you like Semantic Kernel, you may also be interested in other repos the Seman
| [Chat Copilot](https://github.com/microsoft/chat-copilot) | A reference application that demonstrates how to build a chatbot with Semantic Kernel. |
| [Semantic Kernel Docs](https://github.com/MicrosoftDocs/semantic-kernel-docs) | The home for Semantic Kernel documentation that appears on the Microsoft learn site. |
| [Semantic Kernel Starters](https://github.com/microsoft/semantic-kernel-starters) | Starter projects for Semantic Kernel to make it easier to get started. |
| [Semantic Memory](https://github.com/microsoft/semantic-memory) | A service that allows you to create pipelines for ingesting, storing, and querying knowledge. |
| [Kernel Memory](https://github.com/microsoft/kernel-memory) | A scalable Memory service to store information and ask questions using the RAG pattern. |

## Join the community

145 changes: 145 additions & 0 deletions docs/decisions/0014-chat-completion-roles-in-prompt.md
@@ -0,0 +1,145 @@
---
# These are optional elements. Feel free to remove any of them.
status: proposed
date: 2023-10-23
deciders: markwallace, mabolan
consulted:
informed:
---
# SK prompt syntax for chat completion roles

## Context and Problem Statement
Today, SK does not have the ability to mark a block of text in a prompt as a message with a specific role, such as assistant, system, or user. As a result, SK can't chunk the prompt into the list of messages required by chat completion connectors.

Additionally, prompts can be defined using a range of template syntaxes supported by various template engines, such as Handlebars, Jinja, and others. Each of these syntaxes may represent chat messages or roles in a distinct way. Consequently, the template engine syntax may leak into SK's domain if no proper abstraction is put in place, coupling SK with the template engines and making it impossible to support new ones.

<!-- This is an optional element. Feel free to remove. -->
## Decision Drivers
* It should be possible to mark a block of text in a prompt as a message with a role so that it can be converted into a list of chat messages for use by chat completion connectors.
* The syntax specific to the template engine message/role should be mapped to the SK message/role syntax to abstract SK from a specific template engine syntax.
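
The first driver, converting role-tagged blocks into a list of chat messages, can be illustrated with a small self-contained sketch. This is a stdlib-only approximation, not actual SK code; the `ParseMessages` helper and the tuple shape are hypothetical.

```csharp
using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;

// Split a rendered prompt into (role, content) pairs based on the
// <message> tags. A simplified stand-in, not the actual SK parser.
static List<(string Role, string Content)> ParseMessages(string renderedPrompt)
{
    var messages = new List<(string, string)>();
    var pattern = new Regex(
        "<message role=\"(?<role>[^\"]+)\">(?<content>.*?)</message>",
        RegexOptions.Singleline);
    foreach (Match match in pattern.Matches(renderedPrompt))
    {
        messages.Add((match.Groups["role"].Value, match.Groups["content"].Value.Trim()));
    }
    return messages;
}

string prompt =
    "<message role=\"system\">\nYou are a bank manager.\n</message>\n" +
    "<message role=\"user\">\nI want to buy a house.\n</message>";

foreach (var (role, content) in ParseMessages(prompt))
{
    Console.WriteLine($"{role}: {content}");
}
// system: You are a bank manager.
// user: I want to buy a house.
```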

## Considered Options
**1. Message/role tags are generated by functions specified in a prompt.** This option relies on the fact that many template engines can invoke functions specified in the template. An internal function can therefore be registered with a template engine, and the function will create a message/role tag based on the provided arguments. The prompt template engine executes the function and emits its result into the prompt, so the rendered prompt has a section for each message/role decorated with these tags. Here's an example of how this can be done using the SK basic template engine and Handlebars:

Function:
```csharp
internal class SystemFunctions
{
public string Message(string role)
{
return $"<message role=\"{role}\">";
}
}
```

Prompt:

```handlebars
{{message role="system"}}
You are a bank manager. Be helpful, respectful, appreciate diverse language styles.
{{message role="system"}}

{{message role="user"}}
I want to {{$input}}
{{message role="user"}}
```

Rendered prompt:

```xml
<message role="system">
You are a bank manager. Be helpful, respectful, appreciate diverse language styles.
</message>
<message role="user">
I want to buy a house.
</message>
```
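
To make this option concrete, here is a dependency-free sketch of how an engine might expand such function calls while rendering. Since the `Message` function above emits only an opening tag, this sketch assumes a stateful variant that emits a closing tag on the second call for a role; that pairing behavior is an assumption, not something the ADR specifies.

```csharp
using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;

// Hypothetical stateful message function: the first call for a role opens
// the tag, the second call closes it (an assumption; the ADR leaves this open).
var openRoles = new HashSet<string>();
string Message(string role)
{
    if (openRoles.Remove(role)) return "</message>";
    openRoles.Add(role);
    return $"<message role=\"{role}\">";
}

// Minimal stand-in for a template engine: replace each
// {{message role="..."}} occurrence with the function's output.
string Render(string template) =>
    Regex.Replace(template, "\\{\\{message role=\"(?<role>[^\"]+)\"\\}\\}",
        m => Message(m.Groups["role"].Value));

var template = "{{message role=\"user\"}}\nI want to buy a house.\n{{message role=\"user\"}}";
Console.WriteLine(Render(template));
// <message role="user">
// I want to buy a house.
// </message>
```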

**2. Message/role tags are generated by a prompt-specific mechanism.** This option uses template engine syntax constructs, helpers, and handlers other than functions to inject SK message/role tags into the final prompt.
In the example below, to handle a prompt that uses Handlebars syntax, we register block helpers (callbacks invoked when the Handlebars engine encounters the corresponding blocks) that emit the SK message/role tags into the resulting prompt.

Block helpers:
```csharp
this.handlebarsEngine.RegisterHelper("system", (EncodedTextWriter output, Context context, Arguments arguments) => {
//Emit the <message role="system"> tags
});
this.handlebarsEngine.RegisterHelper("user", (EncodedTextWriter output, Context context, Arguments arguments) => {
//Emit the <message role="user"> tags
});
```

Prompt:
```handlebars
{{#system~}}
You are a bank manager. Be helpful, respectful, appreciate diverse language styles.
{{~/system}}
{{#user~}}
I want to {{$input}}
{{~/user}}
```

Rendered prompt:
```xml
<message role="system">
You are a bank manager. Be helpful, respectful, appreciate diverse language styles.
</message>
<message role="user">
I want to buy a house.
</message>
```
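
As a dependency-free illustration of what those block helpers effectively do, the following sketch rewrites the `{{#role~}}...{{~/role}}` blocks into SK message tags with a regular expression. This only approximates the transformation; a real Handlebars engine parses the template rather than regex-rewriting it.

```csharp
using System;
using System.Text.RegularExpressions;

// Rewrite {{#role~}} ... {{~/role}} blocks into SK <message> tags,
// approximating what the registered block helpers would emit.
string RewriteRoleBlocks(string template) =>
    Regex.Replace(
        template,
        "\\{\\{#(?<role>\\w+)~\\}\\}(?<body>.*?)\\{\\{~/\\k<role>\\}\\}",
        m => $"<message role=\"{m.Groups["role"].Value}\">{m.Groups["body"].Value.Trim()}</message>",
        RegexOptions.Singleline);

var template = "{{#system~}}\nYou are a bank manager.\n{{~/system}}\n{{#user~}}\nI want to buy a house.\n{{~/user}}";
Console.WriteLine(RewriteRoleBlocks(template));
```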

**3. Message/role tags are applied on top of the prompt template engine**. This option presumes specifying the SK message/role tags directly in a prompt to denote message/role blocks in such a way that the template engine does not parse or handle them and treats them as regular text.
In the example below, the `<message role="*">` tags mark the boundaries of the system and user messages, and the SK basic template engine treats them as regular text, passing them through unprocessed.

Prompt:
```xml
<message role="system">
You are a bank manager. Be helpful, respectful, appreciate diverse language styles.
</message>
<message role="user">
I want to {{$input}}
</message>
```

Rendered prompt:
```xml
<message role="system">
You are a bank manager. Be helpful, respectful, appreciate diverse language styles.
</message>
<message role="user">
I want to buy a house.
</message>
```
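
The key property of this option is that the engine substitutes only its own constructs, such as `{{$input}}`, and passes the tags through as plain text. A toy variable-substitution renderer (not the actual `BasicPromptTemplateEngine`) demonstrates this:

```csharp
using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;

// Toy variable substitution: replaces {{$name}} with a value and leaves
// everything else, including the <message> tags, untouched.
string RenderVariables(string template, IDictionary<string, string> variables) =>
    Regex.Replace(template, "\\{\\{\\$(?<name>\\w+)\\}\\}",
        m => variables.TryGetValue(m.Groups["name"].Value, out var v) ? v : m.Value);

var prompt = "<message role=\"user\">\nI want to {{$input}}\n</message>";
var rendered = RenderVariables(prompt, new Dictionary<string, string> { ["input"] = "buy a house." });
Console.WriteLine(rendered);
// <message role="user">
// I want to buy a house.
// </message>
```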

## Pros and Cons
**1. Message/role tags are generated by functions specified in a prompt**

Pros:
* Functions can be defined once and reused in prompt templates that support function calling.

Cons:
* Functions might not be supported by some template engines.
* The system/internal functions should be pre-registered by SK so users don't need to import them.
* Each prompt template engine will have to know how to discover and call the system/internal functions.

**2. Message/role tags are generated by a prompt-specific mechanism**

Pros:
* Enables message/role representation using the template engine's most natural syntax constructs, aligned with the engine's other constructs.

Cons:
* Each prompt template engine will have to register callbacks/handlers that render its syntax constructs into SK message/role tags.

**3. Message/role tags are applied on top of the prompt template engine**

Pros:
* No changes are required to prompt template engines.

Cons:
* The message/role tag syntax may not align with other syntax constructions for that template engine.
* Syntax errors in message/role tags will be detected by components parsing the prompt and not by prompt template engines.

## Decision Outcome
It was agreed not to limit ourselves to only one possible option because it may not be feasible to apply that option to new template engines we might need to support in the future. Instead, each time a new template engine is added, every option should be considered, and the optimal one should be preferred for that particular template engine.

It was also agreed that, for the moment, we will go with option "3. Message/role tags are applied on top of the prompt template engine" to support the message/role prompt syntax in SK, which currently uses the `BasicPromptTemplateEngine`.
@@ -28,7 +28,7 @@ public static async Task RunAsync()
return;
}

IKernel kernel = Kernel.Builder
IKernel kernel = new KernelBuilder()
.WithLoggerFactory(ConsoleLogger.LoggerFactory)
.WithOpenAIChatCompletionService(
modelId: openAIModelId,
@@ -75,7 +75,7 @@ private static async Task CustomHandlerAsync()

private static KernelBuilder InitializeKernelBuilder()
{
return Kernel.Builder
return new KernelBuilder()
.WithLoggerFactory(InfoLogger.LoggerFactory)
// OpenAI settings - you can set the OpenAI.ApiKey to an invalid value to see the retry policy in play
.WithOpenAIChatCompletionService(TestConfiguration.OpenAI.ChatModelId, "BAD_KEY");
@@ -15,7 +15,7 @@ public static async Task RunAsync()
{
Console.WriteLine("======== Native function types ========");

var kernel = Kernel.Builder
var kernel = new KernelBuilder()
.WithLoggerFactory(ConsoleLogger.LoggerFactory)
.WithOpenAIChatCompletionService(TestConfiguration.OpenAI.ChatModelId, TestConfiguration.OpenAI.ApiKey)
.Build();
@@ -20,7 +20,7 @@ public static Task RunAsync()
{
Console.WriteLine("======== Describe all plugins and functions ========");

var kernel = Kernel.Builder
var kernel = new KernelBuilder()
.WithOpenAIChatCompletionService(
modelId: TestConfiguration.OpenAI.ChatModelId,
apiKey: TestConfiguration.OpenAI.ApiKey)
@@ -13,7 +13,7 @@ public static async Task RunAsync()
{
Console.WriteLine("======== WebSearchQueries ========");

IKernel kernel = Kernel.Builder.WithLoggerFactory(ConsoleLogger.LoggerFactory).Build();
IKernel kernel = new KernelBuilder().WithLoggerFactory(ConsoleLogger.LoggerFactory).Build();

// Load native plugins
var plugin = new SearchUrlPlugin();
@@ -177,7 +177,7 @@ private static async Task GetConversationTopicsAsync()

private static IKernel InitializeKernel()
{
IKernel kernel = Kernel.Builder
IKernel kernel = new KernelBuilder()
.WithLoggerFactory(ConsoleLogger.LoggerFactory)
.WithAzureChatCompletionService(
TestConfiguration.AzureOpenAI.ChatDeploymentName,
@@ -141,7 +141,7 @@ private static IMemoryStore CreateSampleKustoMemoryStore()

private static async Task RunWithStoreAsync(IMemoryStore memoryStore, CancellationToken cancellationToken)
{
var kernel = Kernel.Builder
var kernel = new KernelBuilder()
.WithLoggerFactory(ConsoleLogger.LoggerFactory)
.WithOpenAIChatCompletionService(TestConfiguration.OpenAI.ChatModelId, TestConfiguration.OpenAI.ApiKey)
.WithOpenAITextEmbeddingGenerationService(TestConfiguration.OpenAI.EmbeddingModelId, TestConfiguration.OpenAI.ApiKey)
@@ -44,7 +44,7 @@ private static async Task UseKernelInDIPowerAppAsync()
//Registering Kernel
collection.AddTransient<IKernel>((serviceProvider) =>
{
return Kernel.Builder
return new KernelBuilder()
.WithLoggerFactory(serviceProvider.GetRequiredService<ILoggerFactory>())
.WithOpenAITextCompletionService(TestConfiguration.OpenAI.ModelId, TestConfiguration.OpenAI.ApiKey)
.Build();
@@ -32,7 +32,7 @@ public static Task RunAsync()
/// </summary>
private static void UseDefaultHttpClient()
{
var kernel = Kernel.Builder
var kernel = new KernelBuilder()
.WithOpenAIChatCompletionService(
modelId: TestConfiguration.OpenAI.ChatModelId,
apiKey: TestConfiguration.OpenAI.ApiKey) // If you need to use the default HttpClient from the SK SDK, simply omit the argument for the httpMessageInvoker parameter.
@@ -47,7 +47,7 @@ private static void UseCustomHttpClient()
using var httpClient = new HttpClient();

// If you need to use a custom HttpClient, simply pass it as an argument for the httpClient parameter.
var kernel = Kernel.Builder
var kernel = new KernelBuilder()
.WithOpenAIChatCompletionService(
modelId: TestConfiguration.OpenAI.ModelId,
apiKey: TestConfiguration.OpenAI.ApiKey,
@@ -68,7 +68,7 @@ private static void UseBasicRegistrationWithHttpClientFactory()
{
var factory = sp.GetRequiredService<IHttpClientFactory>();

var kernel = Kernel.Builder
var kernel = new KernelBuilder()
.WithOpenAIChatCompletionService(
modelId: TestConfiguration.OpenAI.ChatModelId,
apiKey: TestConfiguration.OpenAI.ApiKey,
@@ -99,7 +99,7 @@ private static void UseNamedRegistrationWitHttpClientFactory()
{
var factory = sp.GetRequiredService<IHttpClientFactory>();

var kernel = Kernel.Builder
var kernel = new KernelBuilder()
.WithOpenAIChatCompletionService(
modelId: TestConfiguration.OpenAI.ChatModelId,
apiKey: TestConfiguration.OpenAI.ApiKey,
22 changes: 11 additions & 11 deletions dotnet/samples/KernelSyntaxExamples/Example42_KernelBuilder.cs
@@ -2,7 +2,7 @@

// ==========================================================================================================
// The easier way to instantiate the Semantic Kernel is to use KernelBuilder.
// You can access the builder using either Kernel.Builder or KernelBuilder.
// You can access the builder using new KernelBuilder().

#pragma warning disable CA1852

@@ -39,20 +39,20 @@ public static Task RunAsync()
string azureOpenAIEmbeddingDeployment = TestConfiguration.AzureOpenAIEmbeddings.DeploymentName;

#pragma warning disable CA1852 // Seal internal types
IKernel kernel1 = Kernel.Builder.Build();
IKernel kernel1 = new KernelBuilder().Build();
#pragma warning restore CA1852 // Seal internal types

IKernel kernel2 = Kernel.Builder.Build();
IKernel kernel2 = new KernelBuilder().Build();

// ==========================================================================================================
// Kernel.Builder returns a new builder instance, in case you want to configure the builder differently.
// new KernelBuilder() returns a new builder instance, in case you want to configure the builder differently.
// The following are 3 distinct builder instances.

var builder1 = new KernelBuilder();

var builder2 = Kernel.Builder;
var builder2 = new KernelBuilder();

var builder3 = Kernel.Builder;
var builder3 = new KernelBuilder();

// ==========================================================================================================
// A builder instance can create multiple kernel instances, e.g. in case you need
@@ -108,7 +108,7 @@ public static Task RunAsync()
// ==========================================================================================================
// The AI services are defined with the builder

var kernel7 = Kernel.Builder
var kernel7 = new KernelBuilder()
.WithAzureChatCompletionService(
deploymentName: azureOpenAIChatCompletionDeployment,
endpoint: azureOpenAIEndpoint,
@@ -121,7 +121,7 @@ public static Task RunAsync()
// The default behavior can be configured or a custom retry handler can be injected that will apply to all
// AI requests (when using the kernel).

var kernel8 = Kernel.Builder.WithRetryBasic(
var kernel8 = new KernelBuilder().WithRetryBasic(
new BasicRetryConfig
{
MaxRetryCount = 3,
@@ -147,11 +147,11 @@ public static Task RunAsync()
(ex, timespan, retryCount, _)
=> logger?.LogWarning(ex, "Error executing action [attempt {RetryCount} of 3], pausing {PausingMilliseconds}ms", retryCount, timespan.TotalMilliseconds));

var kernel9 = Kernel.Builder.WithHttpHandlerFactory(new PollyHttpRetryHandlerFactory(retryThreeTimesPolicy)).Build();
var kernel9 = new KernelBuilder().WithHttpHandlerFactory(new PollyHttpRetryHandlerFactory(retryThreeTimesPolicy)).Build();

var kernel10 = Kernel.Builder.WithHttpHandlerFactory(new PollyRetryThreeTimesFactory()).Build();
var kernel10 = new KernelBuilder().WithHttpHandlerFactory(new PollyRetryThreeTimesFactory()).Build();

var kernel11 = Kernel.Builder.WithHttpHandlerFactory(new MyCustomHandlerFactory()).Build();
var kernel11 = new KernelBuilder().WithHttpHandlerFactory(new MyCustomHandlerFactory()).Build();

return Task.CompletedTask;
}
2 changes: 1 addition & 1 deletion dotnet/samples/KernelSyntaxExamples/Example52_ApimAuth.cs
@@ -59,7 +59,7 @@ public static async Task RunAsync()
.AddConsole();
});

var kernel = Kernel.Builder
var kernel = new KernelBuilder()
.WithLoggerFactory(loggerFactory)
.WithAIService<IChatCompletion>(TestConfiguration.AzureOpenAI.ChatDeploymentName, (loggerFactory) =>
new AzureChatCompletion(TestConfiguration.AzureOpenAI.ChatDeploymentName, openAIClient, loggerFactory))
@@ -30,7 +30,7 @@ public static async Task RunAsync()
return;
}

IKernel kernel = Kernel.Builder
IKernel kernel = new KernelBuilder()
.WithLoggerFactory(ConsoleLogger.LoggerFactory)
.WithAzureChatCompletionService(
deploymentName: deploymentName,
@@ -29,7 +29,7 @@ public static async Task RunAsync()
return;
}

IKernel kernel = Kernel.Builder
IKernel kernel = new KernelBuilder()
.WithLoggerFactory(ConsoleLogger.LoggerFactory)
.WithAzureChatCompletionService(
deploymentName: chatDeploymentName,
2 changes: 1 addition & 1 deletion dotnet/src/Connectors/Connectors.Memory.Chroma/README.md
@@ -25,7 +25,7 @@ const string endpoint = "http://localhost:8000";

ChromaMemoryStore memoryStore = new(endpoint);

IKernel kernel = Kernel.Builder
IKernel kernel = new KernelBuilder()
.WithLogger(logger)
.WithOpenAITextEmbeddingGenerationService("text-embedding-ada-002", "OPENAI_API_KEY")
.WithMemoryStorage(memoryStore)
2 changes: 1 addition & 1 deletion dotnet/src/Connectors/Connectors.Memory.Kusto/README.md
@@ -14,7 +14,7 @@ using Kusto.Data;
var connectionString = new KustoConnectionStringBuilder("https://kvc123.eastus.kusto.windows.net").WithAadUserPromptAuthentication();
KustoMemoryStore memoryStore = new(connectionString, "MyDatabase");

IKernel kernel = Kernel.Builder
IKernel kernel = new KernelBuilder()
.WithLogger(ConsoleLogger.Log)
.WithOpenAITextCompletionService(modelId: TestConfiguration.OpenAI.ModelId, apiKey: TestConfiguration.OpenAI.ApiKey)
.WithOpenAITextEmbeddingGenerationService(modelId: TestConfiguration.OpenAI.EmbeddingModelId,apiKey: TestConfiguration.OpenAI.ApiKey)
2 changes: 1 addition & 1 deletion dotnet/src/Connectors/Connectors.Memory.Milvus/README.md
@@ -23,7 +23,7 @@ docker-compose up -d
```csharp
using MilvusMemoryStore memoryStore = new("localhost");

IKernel kernel = Kernel.Builder
IKernel kernel = new KernelBuilder()
.WithLogger(logger)
.WithOpenAITextEmbeddingGenerationService("text-embedding-ada-002", "OPENAI_API_KEY")
.WithMemoryStorage(memoryStore)
2 changes: 1 addition & 1 deletion dotnet/src/Connectors/Connectors.Memory.Postgres/README.md
@@ -41,7 +41,7 @@ NpgsqlDataSource dataSource = dataSourceBuilder.Build();

PostgresMemoryStore memoryStore = new PostgresMemoryStore(dataSource, vectorSize: 1536/*, schema: "public" */);

IKernel kernel = Kernel.Builder
IKernel kernel = new KernelBuilder()
.WithLogger(ConsoleLogger.Logger)
.WithOpenAITextEmbeddingGenerationService("text-embedding-ada-002", Env.Var("OPENAI_API_KEY"))
.WithMemoryStorage(memoryStore)