-See the [Feature Matrix](https://learn.microsoft.com/en-us/semantic-kernel/get-started/supported-languages) to see a breakdown of feature parity between our currently supported languages.
-
The quickest way to get started with the basics is to get an API key
-(OpenAI or Azure OpenAI)
-and to run one of the C# or Python console applications/scripts:
+from either OpenAI or Azure OpenAI and to run one of the C#, Python, or Java console applications/scripts below.
### For C#:
@@ -85,48 +78,75 @@ and to run one of the C# or Python console applications/scripts:
4. Copy the code from [here](python/README.md) into the `hello-world.py` script.
5. Run the Python script.
-## Sample apps ⚡
+### For Java:
+
+1. Clone and checkout the experimental Java branch: `git clone -b experimental-java https://github.com/microsoft/semantic-kernel.git`
+2. Follow the instructions [here](https://github.com/microsoft/semantic-kernel/blob/experimental-java/java/samples/sample-code/README.md)
+
+## Learning how to use Semantic Kernel
+
+The fastest way to learn how to use Semantic Kernel is with our C# and Python Jupyter notebooks. These notebooks
+demonstrate how to use Semantic Kernel with code snippets that you can run with a push of a button.
+
+- [Getting Started with C# notebook](dotnet/notebooks/00-getting-started.ipynb)
+- [Getting Started with Python notebook](python/notebooks/00-getting-started.ipynb)
+
+Once you've finished the getting started notebooks, you can then check out the main walkthroughs
+on our Learn site. Each sample comes with a completed C# and Python project that you can run locally.
+
+1. 📖 [Overview of the kernel](https://learn.microsoft.com/en-us/semantic-kernel/ai-orchestration/)
+1. 🔌 [Understanding AI plugins](https://learn.microsoft.com/en-us/semantic-kernel/ai-orchestration/plugins)
+1. 👄 [Creating semantic functions](https://learn.microsoft.com/en-us/semantic-kernel/ai-orchestration/semantic-functions)
+1. 💽 [Creating native functions](https://learn.microsoft.com/en-us/semantic-kernel/ai-orchestration/native-functions)
+1. ⛓️ [Chaining functions together](https://learn.microsoft.com/en-us/semantic-kernel/ai-orchestration/chaining-functions)
+1. 🤖 [Auto create plans with planner](https://learn.microsoft.com/en-us/semantic-kernel/ai-orchestration/planner)
+1. 💡 [Create and run a ChatGPT plugin](https://learn.microsoft.com/en-us/semantic-kernel/ai-orchestration/chatgpt-plugins)
+
+Finally, refer to our API references for more details on the C# and Python APIs:
+
+- [C# API reference](https://learn.microsoft.com/en-us/dotnet/api/microsoft.semantickernel?view=semantic-kernel-dotnet)
+- Python API reference (coming soon)
+
+## Chat Copilot: see what's possible with Semantic Kernel
-The repository includes some sample applications, with a React frontend and
-a backend web service using Semantic Kernel.
+If you're interested in seeing a full end-to-end example of how to use Semantic Kernel, check out
+our [Chat Copilot](https://github.com/microsoft/chat-copilot) reference application. Chat Copilot
+is a chatbot that demonstrates the power of Semantic Kernel. By combining plugins, planners, and personas,
+we demonstrate how you can build a chatbot that can maintain long-running conversations with users while
+also leveraging plugins to integrate with other services.
-Follow the links for more information and instructions about running these apps.
+![Chat Copilot answering a question](https://learn.microsoft.com/en-us/semantic-kernel/media/chat-copilot-in-action.gif)
-| | |
-| ----------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------- |
-| [Simple chat summary](samples/apps/chat-summary-webapp-react/README.md) | Use ready-to-use plugins and get plugins into your app easily. |
-| [Book creator](samples/apps/book-creator-webapp-react/README.md) | Use planner to deconstruct a complex goal and envision using the planner in your app. |
-| [Authentication and APIs](samples/apps/auth-api-webapp-react/README.md) | Use a basic connector pattern to authenticate and connect to an API and imagine integrating external data into your app's LLM AI. |
-| [GitHub repository Q&A](samples/apps/github-qna-webapp-react/README.md) | Use embeddings and memory to store recent data and allow you to query against it. |
-| [Copilot Chat Sample App](samples/apps/copilot-chat-app/README.md) | Build your own chat experience based on Semantic Kernel. |
+You can run the app yourself by downloading it from its [GitHub repo](https://github.com/microsoft/chat-copilot).
-**Requirements:**
+## Visual Studio Code extension: design semantic functions with ease
-- You will need an
- [Open AI API Key](https://openai.com/api/) or
- [Azure Open AI service key](https://learn.microsoft.com/azure/cognitive-services/openai/quickstart?pivots=rest-api)
- to get started.
-- [Azure Functions Core Tools](https://learn.microsoft.com/azure/azure-functions/functions-run-local)
- are required to run the kernel as a local web service, used by the sample web apps.
-- [.NET 6 SDK](https://dotnet.microsoft.com/download/dotnet/6.0) or [.NET 7 SDK](https://dotnet.microsoft.com/download/dotnet/7.0)
-- [Yarn](https://yarnpkg.com/getting-started/install) is used for installing web apps' dependencies.
+The [Semantic Kernel extension for Visual Studio Code](https://learn.microsoft.com/en-us/semantic-kernel/vs-code-tools/)
+makes it easy to design and test semantic functions. The extension provides an interface for
+designing semantic functions and allows you to test them with a push of a button with your
+existing models and data.
-## Deploy Semantic Kernel to Azure in a web app service ☁️
+![Semantic Kernel extension for Visual Studio Code](https://learn.microsoft.com/en-us/semantic-kernel/media/vs-code-extension.png)
-Getting Semantic Kernel deployed to Azure as web app service is easy with one-click deployments. Click [here](https://aka.ms/sk-docs-azuredeploy) to learn more on how to deploy to Azure.
+In the above screenshot, you can see the extension in action:
-## Jupyter Notebooks ⚡
+- Syntax highlighting for semantic functions
+- Code completion for semantic functions
+- LLM model picker
+- Run button to test the semantic function with your input data
-For a more hands-on overview, you can also check out the C# and Python Jupyter notebooks, starting
-from here:
+## Check out our other repos!
-- [Getting Started with C# notebook](samples/notebooks/dotnet/00-getting-started.ipynb)
-- [Getting Started with Python notebook](samples/notebooks/python/00-getting-started.ipynb)
+If you like Semantic Kernel, you may also be interested in other repos the Semantic Kernel team supports:
-**Requirements:** C# notebooks require [.NET 7](https://dotnet.microsoft.com/download)
-and the VS Code [Polyglot extension](https://marketplace.visualstudio.com/items?itemName=ms-dotnettools.dotnet-interactive-vscode).
+| Repo | Description |
+| --------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------- |
+| [Chat Copilot](https://github.com/microsoft/chat-copilot) | A reference application that demonstrates how to build a chatbot with Semantic Kernel. |
+| [Semantic Kernel Docs](https://github.com/MicrosoftDocs/semantic-kernel-docs) | The home for Semantic Kernel documentation that appears on the Microsoft learn site. |
+| [Semantic Kernel Starters](https://github.com/microsoft/semantic-kernel-starters) | Starter projects for Semantic Kernel to make it easier to get started. |
+| [Semantic Memory](https://github.com/microsoft/semantic-memory) | A service that allows you to create pipelines for ingesting, storing, and querying knowledge. |
-## Contributing and Community
+## Join the community
We welcome your contributions and suggestions to the SK community! One of the easiest
ways to participate is to engage in discussions in the GitHub repository.
@@ -139,7 +159,7 @@ in a different direction, but also to consider the impact on the larger ecosyste
To learn more and get started:
- Read the [documentation](https://aka.ms/sk/learn)
-- Learn how to [contribute](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md) to the project
+- Learn how to [contribute](https://learn.microsoft.com/en-us/semantic-kernel/get-started/contributing) to the project
- Join the [Discord community](https://aka.ms/SKDiscord)
- Attend [regular office hours and SK community events](COMMUNITY.md)
- Follow the team on our [blog](https://aka.ms/sk/blog)
diff --git a/docs/GLOSSARY.md b/docs/GLOSSARY.md
index 93f4a7c5aa29..53ff34978910 100644
--- a/docs/GLOSSARY.md
+++ b/docs/GLOSSARY.md
@@ -3,19 +3,19 @@
To wrap your mind around the concepts we present throughout the kernel, here is a glossary of
commonly used terms.
-**Semantic Kernel (SK)** - The orchestrator that fulfills a user's ASK with SK's available [SKILLS](SKILLS.md).
+**Semantic Kernel (SK)** - The orchestrator that fulfills a user's ASK with SK's available [PLUGINS](PLUGINS.md).
**Ask** - What a user requests to the Semantic Kernel to help achieve the user's goal.
- "We make ASKs to the SK"
-**Skill** - A domain-specific collection made available to the SK as a group of finely-tuned functions.
+**Plugin** - A domain-specific collection made available to the SK as a group of finely-tuned functions.
-- "We have a SKILL for using Office better"
+- "We have a PLUGIN for using Office better"
-**Function** - A computational machine comprised of Semantic AI and/or native code that's available in a [SKILL](SKILLS.md).
+**Function** - A computational machine comprised of Semantic AI and/or native code that's available in a [PLUGIN](PLUGINS.md).
-- "The Office SKILL has many FUNCTIONS"
+- "The Office PLUGIN has many FUNCTIONS"
**Native Function** - expressed with a traditional computing language (C#, Python, TypeScript)
and easily integrates with SK
diff --git a/docs/SKILLS.md b/docs/PLUGINS.md
similarity index 100%
rename from docs/SKILLS.md
rename to docs/PLUGINS.md
diff --git a/docs/decisions/0001-madr-architecture-decisions.md b/docs/decisions/0001-madr-architecture-decisions.md
index f9e3b7125438..faa92433ed8a 100644
--- a/docs/decisions/0001-madr-architecture-decisions.md
+++ b/docs/decisions/0001-madr-architecture-decisions.md
@@ -15,7 +15,7 @@ We need a way to keep the implementations aligned with regard to key architectur
semantic function configuration (config.json) and when this change is agreed it must be reflected in all of the Semantic Kernel implementations.
MADR is a lean template to capture any decisions in a structured way. The template originated from capturing architectural decisions and developed to a template allowing to capture any decisions taken.
-For more information [see](https://adr.github.io/madr/)
+For more information [see](https://adr.github.io/)
## Decision Drivers
diff --git a/docs/decisions/0005-function-execution-handlers.md b/docs/decisions/0005-function-execution-handlers.md
new file mode 100644
index 000000000000..b3b4e2939dab
--- /dev/null
+++ b/docs/decisions/0005-function-execution-handlers.md
@@ -0,0 +1,133 @@
+---
+# These are optional elements. Feel free to remove any of them.
+status: proposed
+date: 2023-05-29
+deciders: @rogerbarreto, @shawncal, @stephentoub
+consulted:
+informed:
+---
+
+# Kernel/Function Handlers
+
+## Context and Problem Statement
+
+A Kernel function caller needs to be able to handle/intercept any function execution in the Kernel before and after it is attempted, allowing it to modify the prompt, abort the execution, or modify the output, among many other scenarios, as follows:
+
+- Pre-Execution / Running
+
+ - Get: Original prompt template
+ - Get: Prompt generated by the current Kernel `TemplateEngine` before calling the LLM
+ - Get: Current settings used
+ - Get: Parameters used
+ - Get: SKContext
+ - Set: Modify a prompt content before sending it to LLM
+ - Set: Abort/Cancel function execution
+
+- In-Execution / Stream Processing
+
+ - Get: Text Prompt generated per stream block `IAsyncEnumerable`
+ - Set: Filter/Change a prompt stream block
+ - Set: Skip a prompt stream block
+
+- Post-Execution / Ran
+
+ - Get: LLM Model Result (Tokens Usage, Stop Sequence, ...)
+ - Get: Generated Prompt
+ - Get: SKContext
+ - Get: Output parameters
+
+ - Set: Modify a prompt content after getting it from LLM
+ - Set: Modify output parameters content (before returning the output)
+
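The pre/post interception scenarios above can be sketched, in a language-agnostic form, as a wrapper that runs registered hooks around a function invocation. This is a hypothetical illustration, not the SK API; all names are invented:

```python
class InvocationContext:
    """Mutable state shared with handlers (hypothetical shape)."""
    def __init__(self, prompt):
        self.prompt = prompt      # Pre: handlers may rewrite the prompt
        self.output = None        # Post: handlers may rewrite the output
        self.cancel = False       # Pre: handlers may abort the execution

def invoke_with_handlers(func, prompt, pre_hooks=(), post_hooks=()):
    ctx = InvocationContext(prompt)
    for hook in pre_hooks:        # Pre-Execution / Running
        hook(ctx)
    if ctx.cancel:                # a pre-hook aborted the call
        return None
    ctx.output = func(ctx.prompt) # the actual LLM / native call
    for hook in post_hooks:       # Post-Execution / Ran
        hook(ctx)
    return ctx.output
```

For example, a pre-hook could redact sensitive tokens from the prompt before it ever reaches the model, and a post-hook could normalize the output.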
+## Decision Drivers
+
+- Architecture changes and the associated decision making process should be transparent to the community.
+- Decision records are stored in the repository and are easily discoverable for teams involved in the various language ports.
+
+## Considered Options
+
+- Callback Registration + Recursive
+- Single Callback
+- Event Based Registration
+- Middleware
+
+## Pros and Cons of the Options
+
+### Callback Registration Recursive Delegate (Kernel, Plan, Function)
+
+- Specified at the plan and function level: as part of configuration, be able to specify which callback handlers will be triggered.
+
+Pros:
+
+- Common pattern for observing, and also changing, data exposed as a parameter in the delegate signature for get/set scenarios
+- Registering a callback gives back the registration object that can be used to cancel the execution of the function in the future.
+- The recursive approach allows registering multiple callbacks for the same event, as well as registering callbacks on top of pre-existing callbacks.
+
+Cons:
+
+- In the recursive approach, registrations may use more memory and might not be garbage collected until the function or the plan is disposed.
+
+### Single Callback Delegate (Kernel, Plan, Function)
+
+- Specified at the kernel level: as part of configuration, be able to specify which callback handlers will be triggered.
+  - Specified on function creation: as part of the function constructor, be able to specify which callback handlers will be triggered.
+  - Specified on function invocation: as part of the function invoke call, be able to specify, as a parameter, which callback handlers will be triggered.
+
+Pros:
+
+- Common pattern for observing, and also changing, data exposed as a parameter in the delegate signature for get/set scenarios
+
+Cons:
+
+- Limited to only one method observing a specific event (pre-, post-, and in-execution).
+- When specified on function invocation, three new parameters would be needed as part of the function signature.
+
+### Event Base Registration (Kernel only)
+
+Expose events on both `IKernel` and `ISKFunction` that the caller can subscribe to in order to observe and interact with function execution.
+
+Pros:
+
+- Multiple listeners can be registered for the same event
+- Listeners can be registered and unregistered at will
+- Common pattern (EventArgs) for observing, and also changing, data exposed as a parameter in the event signature for get/set scenarios
+
+Cons:
+
+- Event handlers are void, making the EventArgs by reference the only way to modify the data.
+- Not clear how well this approach supports asynchronous patterns/multithreading
+- Won't support `ISKFunction.InvokeAsync`
+
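As a rough illustration of this option (hypothetical names, not the .NET event machinery), multiple listeners subscribe to the same event; since handlers return nothing, they communicate by mutating the shared args object:

```python
class FunctionInvokingEventArgs:
    """Mutable event args (hypothetical), mirroring the EventArgs-by-reference pattern."""
    def __init__(self, prompt):
        self.prompt = prompt
        self.cancel = False

class Event:
    """Minimal pub/sub event: listeners register and unregister at will."""
    def __init__(self):
        self._handlers = []
    def subscribe(self, handler):
        self._handlers.append(handler)
    def unsubscribe(self, handler):
        self._handlers.remove(handler)
    def raise_event(self, args):
        for handler in list(self._handlers):  # handlers are void; they mutate args
            handler(args)

invoking = Event()
audit = lambda a: setattr(a, "prompt", a.prompt + " [audited]")
invoking.subscribe(audit)
args = FunctionInvokingEventArgs("Hello")
invoking.raise_event(args)
```

After the event is raised, `args.prompt` carries the modifications made by every subscribed listener, which is exactly the "EventArgs by reference" con noted above.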
+### Middleware (Kernel Only)
+
+Specified at the kernel level and used only by the `IKernel.RunAsync` operation, this pattern would be similar to ASP.NET Core middleware: the pipeline runs with a context and a `next` request delegate for controlling pre/post conditions.
+
+Pros:
+
+- Common pattern for handling Pre/Post Setting/Filtering data
+
+Cons:
+
+- Functions can run on their own; middleware implies more complexity and the existence of an external container/manager (the kernel) to intercept/observe function calls.
+
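A minimal sketch of the middleware shape (illustrative, not an SK API): each middleware receives the context plus a `next` delegate and decides whether and when to continue the pipeline:

```python
def build_pipeline(middlewares, terminal):
    """Compose middleware around a terminal handler, outermost first (ASP.NET Core style)."""
    pipeline = terminal
    for mw in reversed(middlewares):
        def wrapped(ctx, _mw=mw, _next=pipeline):
            return _mw(ctx, _next)   # middleware controls if/when `next` runs
        pipeline = wrapped
    return pipeline

calls = []
def logging(ctx, nxt):
    calls.append("pre")              # pre-condition
    result = nxt(ctx)                # continue the pipeline
    calls.append("post")             # post-condition
    return result

run = build_pipeline([logging], lambda ctx: ctx * 2)
```

A middleware can also short-circuit by simply not calling `nxt`, which is how the cancellation scenario would be expressed in this option.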
+## Main Questions
+
+- Q: Should post-execution handlers execute right after the LLM result, or before the end of the function execution itself?
+  A: Currently, post-execution handlers are executed after function execution.
+
+- Q: Should Pre/Post Handlers be many (pub/sub) allowing registration/deregistration?
+ A: By using the standard .NET event implementation, this already supports multiple registrations as well as deregistrations managed by the caller.
+
+- Q: Should setting handlers on top of pre-existing handlers be allowed, or throw an error?
+  A: By using the standard .NET event implementation, the standard behavior is to not throw an error and to execute all the registered handlers.
+
+- Q: Should setting handlers on plans automatically cascade these handlers to all the inner steps, overriding existing ones in the process?
+  A: Handlers will be triggered before and after each step is executed, the same way the kernel `RunAsync` pipeline works.
+
+- Q: When a pre-execution handler intends to cancel the execution, should further handlers in the chain be called or not?
+  A: Currently, the standard .NET behavior is to call all the registered handlers. Function execution then depends solely on the final boolean state of the `Cancel` property after all handlers have been called.
+
+## Decision Outcome
+
+TBD.
diff --git a/docs/decisions/0005-open-api-dynamic-payload-and-namespaces.md b/docs/decisions/0005-open-api-dynamic-payload-and-namespaces.md
new file mode 100644
index 000000000000..ffe87013c573
--- /dev/null
+++ b/docs/decisions/0005-open-api-dynamic-payload-and-namespaces.md
@@ -0,0 +1,79 @@
+---
+# These are optional elements. Feel free to remove any of them.
+status: proposed
+date: 2023-08-15
+deciders: shawncal
+consulted:
+informed:
+---
+# Dynamic payload building for PUT and POST RestAPI operations and parameter namespacing
+
+## Context and Problem Statement
+Currently, the SK OpenAPI functionality does not allow the dynamic creation of the payload/body for PUT and POST RestAPI operations, even though all the required metadata is available. One reason the functionality was not fully developed originally, and was eventually removed, is that the JSON payload/body content of PUT and POST RestAPI operations might contain properties with identical names at various levels, and it was not clear how to unambiguously resolve their values from the flat list of context variables. Another reason the functionality has not been added yet is that the `payload` context variable, along with the RestAPI operation data contract schema (OpenAPI, JSON schema, Typings?), should have been sufficient for the LLM to provide fully fleshed-out JSON payload/body content without the need to build it dynamically.
+
+
+## Decision Drivers
+* Create a mechanism that enables the dynamic construction of the payload/body for PUT and POST RestAPI operations.
+* Develop a mechanism (namespacing) that allows differentiation of payload properties with identical names at various levels for PUT and POST RestAPI operations.
+* Aim to minimize breaking changes and maintain backward compatibility of the code as much as possible.
+
+## Considered Options
+* Enable the dynamic creation of payload and/or namespacing by default.
+* Enable the dynamic creation of payload and/or namespacing based on configuration.
+
+## Decision Outcome
+Chosen option: "Enable the dynamic creation of payload and/or namespacing based on configuration". This option keeps things compatible, so the change won't affect any SK consumer code. Additionally, it lets SK consumer code easily control both mechanisms, turning them on or off based on the scenario.
+
+## Additional details
+
+### Enabling dynamic creation of payload
+In order to enable the dynamic creation of payloads/bodies for PUT and POST RestAPI operations, please set the `EnableDynamicPayload` property of the `OpenApiSkillExecutionParameters` execution parameters to `true` when importing the AI plugin:
+
+```csharp
+var plugin = await kernel.ImportPluginFunctionsAsync("", new Uri(""), new OpenApiSkillExecutionParameters(httpClient) { EnableDynamicPayload = true });
+```
+
+To dynamically construct a payload for a RestAPI operation that requires payload like this:
+```json
+{
+ "value": "secret-value",
+ "attributes": {
+ "enabled": true
+ }
+}
+```
+
+Please register the following arguments in context variables collection:
+
+```csharp
+var contextVariables = new ContextVariables();
+contextVariables.Set("value", "secret-value");
+contextVariables.Set("enabled", true);
+```
+
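Internally, dynamic payload building can be imagined as walking the operation's payload schema and resolving each leaf property by name from the flat context-variable collection. This is an illustrative sketch, not the SK implementation:

```python
def build_payload(schema, variables):
    """schema maps property name -> nested schema dict, or None for a leaf."""
    payload = {}
    for name, sub in schema.items():
        if isinstance(sub, dict):                 # nested object: recurse
            payload[name] = build_payload(sub, variables)
        else:                                     # leaf: resolve by its flat name
            payload[name] = variables[name]
    return payload
```

With a schema of `{"value": None, "attributes": {"enabled": None}}` and the two variables registered above, this produces exactly the JSON payload shown earlier.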
+### Enabling namespacing
+To enable namespacing, set the `EnablePayloadNamespacing` property of the `OpenApiSkillExecutionParameters` execution parameters to `true` when importing the AI plugin:
+
+```csharp
+var plugin = await kernel.ImportPluginFunctionsAsync("", new Uri(""), new OpenApiSkillExecutionParameters(httpClient) { EnablePayloadNamespacing = true });
+```
+Remember that the namespacing mechanism depends on prefixing parameter names with their parent parameter name, separated by dots. So, use the 'namespaced' parameter names when adding arguments for them to the context variables. Let's consider this JSON:
+
+```json
+{
+ "upn": "",
+ "receiver": {
+ "upn": ""
+ },
+ "cc": {
+ "upn": ""
+ }
+}
+```
+It contains `upn` properties at different levels. The argument registration for the parameters (property values) will look like:
+```csharp
+var contextVariables = new ContextVariables();
+contextVariables.Set("upn", "");
+contextVariables.Set("receiver.upn", "");
+contextVariables.Set("cc.upn", "");
+```
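The namespaced lookup can be sketched as the same schema walk, except that leaves are resolved by their dot-separated path instead of the bare name. Again, this is illustrative, not the SK code:

```python
def build_payload_namespaced(schema, variables, prefix=""):
    """Resolve each leaf by its dot-separated path, e.g. 'receiver.upn'."""
    payload = {}
    for name, sub in schema.items():
        key = prefix + name
        if isinstance(sub, dict):                 # descend, extending the prefix
            payload[name] = build_payload_namespaced(sub, variables, key + ".")
        else:                                     # leaf: look up the namespaced key
            payload[name] = variables[key]
    return payload
```

This is how the three `upn` values above can be disambiguated even though the property names collide.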
diff --git a/docs/decisions/0006-prompt-extract-template-engine.md b/docs/decisions/0006-prompt-extract-template-engine.md
new file mode 100644
index 000000000000..a72f5adc1413
--- /dev/null
+++ b/docs/decisions/0006-prompt-extract-template-engine.md
@@ -0,0 +1,32 @@
+---
+# These are optional elements. Feel free to remove any of them.
+status: accepted
+date: 2023-08-25
+deciders: shawncal
+consulted:
+informed:
+---
+# Extract the Prompt Template Engine from Semantic Kernel core
+
+## Context and Problem Statement
+
+The Semantic Kernel includes a default prompt template engine which is used to render Semantic Kernel prompts, i.e., `skprompt.txt` files. The prompt template is rendered before being sent to the AI, allowing the prompt to be generated dynamically, e.g., to include input parameters or the result of a native or semantic function execution.
+To reduce the complexity and API surface of the Semantic Kernel, the prompt template engine is going to be extracted and added to its own package.
+
+The long term goal is to enable the following scenarios:
+
+1. Implement a custom template engine, e.g., using Handlebars templates. This is supported now, but we want to simplify the API that must be implemented.
+2. Support using zero or many template engines.
+
+## Decision Drivers
+
+* Reduce API surface and complexity of the Semantic Kernel core.
+* Simplify the `IPromptTemplateEngine` interface to make it easier to implement a custom template engine.
+* Make the change without breaking existing clients.
+
+## Decision Outcome
+
+* Create a new package called `Microsoft.SemanticKernel.TemplateEngine`.
+* Maintain the existing namespace for all prompt template engine code.
+* Simplify the `IPromptTemplateEngine` interface to just require implementation of `RenderAsync`.
+* Dynamically load the existing `PromptTemplateEngine` if the `Microsoft.SemanticKernel.TemplateEngine` assembly is available.
diff --git a/docs/decisions/0007-support-multiple-named-args-in-template-function-calls.md b/docs/decisions/0007-support-multiple-named-args-in-template-function-calls.md
new file mode 100644
index 000000000000..412d0901a406
--- /dev/null
+++ b/docs/decisions/0007-support-multiple-named-args-in-template-function-calls.md
@@ -0,0 +1,110 @@
+---
+# These are optional elements. Feel free to remove any of them.
+status: proposed
+date: 2023-06-16
+deciders: shawncal, hario90
+consulted: dmytrostruk, matthewbolanos
+informed: lemillermicrosoft
+---
+# Add support for multiple named arguments in template function calls
+
+## Context and Problem Statement
+
+Native functions now support multiple parameters, populated from context values with the same name. Semantic functions currently only support calling native functions with no more than 1 argument. The purpose of these changes is to add support for calling native functions within semantic functions with multiple named arguments.
+
+## Decision Drivers
+
+* Parity with Guidance
+* Readability
+* Similarity to languages familiar to SK developers
+* YAML compatibility
+
+## Considered Options
+
+### Syntax idea 1: Using commas
+
+```handlebars
+{{Skill.MyFunction street: "123 Main St", zip: "98123", city:"Seattle", age: 25}}
+```
+
+Pros:
+
+* Commas could make longer function calls easier to read, especially if spaces before and after the arg separator (a colon in this case) are allowed.
+
+Cons:
+
+* Guidance doesn't use commas
+* Spaces are already used as delimiters elsewhere so the added complexity of supporting commas isn't necessary
+
+### Syntax idea 2: JavaScript/C#-Style delimiter (colon)
+
+```handlebars
+
+{{MyFunction street:"123 Main St" zip:"98123" city:"Seattle" age: "25"}}
+
+```
+
+Pros:
+
+* Resembles JavaScript Object syntax and C# named argument syntax
+
+Cons:
+
+* Doesn't align with Guidance syntax which uses equal signs as arg part delimiters
+* Too similar to YAML key/value pairs if we support YAML prompts in the future. It's likely possible to support colons as delimiters but would be better to have a separator that is distinct from normal YAML syntax.
+
+### Syntax idea 3: Python/Guidance-Style delimiter
+
+```handlebars
+{{MyFunction street="123 Main St" zip="98123" city="Seattle"}}
+```
+
+Pros:
+
+* Resembles Python's keyword argument syntax
+* Resembles Guidance's named argument syntax
+* Not too similar to YAML key/value pairs if we support YAML prompts in the future.
+
+Cons:
+
+* Doesn't align with C# syntax
+
+### Syntax idea 4: Allow whitespace between arg name/value delimiter
+
+```handlebars
+{{MyFunction street = "123 Main St" zip = "98123" city = "Seattle"}}
+```
+
+Pros:
+
+* Follows the convention followed by many programming languages of whitespace flexibility where spaces, tabs, and newlines within code don't impact a program's functionality
+
+Cons:
+
+* Promotes code that is harder to read unless commas can be used (see [Using Commas](#syntax-idea-1-using-commas))
+* More complexity to support
+* Doesn't align with Guidance which doesn't support spaces before and after the = sign.
+
+## Decision Outcome
+
+Chosen options: "Syntax idea 3: Python/Guidance-Style delimiter", because it aligns well with Guidance's syntax and is the most compatible with YAML, and "Syntax idea 4: Allow whitespace between arg name/value delimiter", for a more flexible developer experience.
+
+Additional decisions:
+
+* Continue supporting up to 1 positional argument for backward compatibility. Currently, the argument passed to a function is assumed to be the `$input` context variable.
+
+Example
+
+```handlebars
+
+{{MyFunction "inputVal" street="123 Main St" zip="98123" city="Seattle"}}
+
+```
+
+* Allow arg values to be defined as strings or variables ONLY, e.g.
+
+```handlebars
+{{MyFunction street=$street zip="98123" city='Seattle'}}
+```
+
+If a function expects a value other than a string for an argument, the SDK will use the corresponding `TypeConverter` to parse the string provided when evaluating the expression.
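The chosen syntax can be parsed with a fairly small tokenizer. The sketch below is hypothetical (it is not the SK tokenizer) and accepts `name="string"`, `name='string'`, or `name=$variable`, with optional whitespace around `=`:

```python
import re

# name = "double-quoted" | 'single-quoted' | $variable, whitespace around "=" allowed
ARG_RE = re.compile(r'(\w+)\s*=\s*("([^"]*)"|\'([^\']*)\'|\$(\w+))')

def parse_named_args(call, variables):
    """Return a dict of argument name -> resolved value for a template function call."""
    args = {}
    for m in ARG_RE.finditer(call):
        name = m.group(1)
        if m.group(5) is not None:        # $variable reference: resolve from context
            args[name] = variables[m.group(5)]
        else:                             # quoted string literal (double or single)
            args[name] = m.group(3) if m.group(3) is not None else m.group(4)
    return args
```

Note that the function name itself never matches, because only `name=value` pairs are captured; the leading positional argument would be handled separately for backward compatibility.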
diff --git a/docs/decisions/0008-support-generic-llm-request-settings.md b/docs/decisions/0008-support-generic-llm-request-settings.md
new file mode 100644
index 000000000000..98f11afff66c
--- /dev/null
+++ b/docs/decisions/0008-support-generic-llm-request-settings.md
@@ -0,0 +1,231 @@
+---
+# These are optional elements. Feel free to remove any of them.
+status: proposed
+date: 2023-09-15
+deciders: shawncal
+consulted: stoub, lemiller, dmytrostruk
+informed:
+---
+# Refactor to support generic LLM request settings
+
+## Context and Problem Statement
+
+The Semantic Kernel abstractions package includes a number of classes ([CompleteRequestSettings](https://github.com/microsoft/semantic-kernel/blob/main/dotnet/src/SemanticKernel.Abstractions/AI/TextCompletion/CompleteRequestSettings.cs), [ChatRequestSettings](https://github.com/microsoft/semantic-kernel/blob/main/dotnet/src/SemanticKernel.Abstractions/AI/ChatCompletion/ChatRequestSettings.cs), and [PromptTemplateConfig.CompletionConfig](https://github.com/microsoft/semantic-kernel/blob/main/dotnet/src/SemanticKernel.Abstractions/SemanticFunctions/PromptTemplateConfig.cs#L18C1-L82C6)) which are used to support:
+
+1. Passing LLM request settings when invoking an AI service
+2. Deserialization of LLM request settings when loading the `config.json` associated with a Semantic Function
+
+The problem with these classes is that they include OpenAI-specific properties only. A developer can only pass OpenAI-specific request settings, which means:
+
+1. Settings may be passed that have no effect e.g., passing `MaxTokens` to Huggingface
+2. Settings that do not overlap with the OpenAI properties cannot be sent, e.g., Oobabooga supports additional parameters such as `do_sample`, `typical_p`, ...
+
+Link to issue raised by the implementer of the Oobabooga AI service:
+
+## Decision Drivers
+
+* Semantic Kernel abstractions must be AI Service agnostic i.e., remove OpenAI specific properties.
+* Solution must continue to support loading Semantic Function configuration (which includes AI request settings) from `config.json`.
+* Provide good experience for developers e.g., must be able to program with type safety, intellisense, etc.
+* Provide a good experience for implementors of AI services i.e., should be clear how to define the appropriate AI Request Settings abstraction for the service they are supporting.
+* Semantic Kernel implementation and sample code should avoid specifying OpenAI specific request settings in code that is intended to be used with multiple AI services.
+* Semantic Kernel implementation and sample code must be clear if an implementation is intended to be OpenAI specific.
+
+## Considered Options
+
+* Use `dynamic` to pass request settings
+* Use `object` to pass request settings
+* Define a base class for AI request settings which all implementations must extend
+
+Note: Using generics was discounted during an earlier investigation which Dmytro conducted.
+
+## Decision Outcome
+
+**Proposed:** Define a base class for AI request settings which all implementations must extend.
+
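The proposed option can be sketched roughly as follows (illustrative names, not the final SK types): a service-agnostic base class with a catch-all for unknown settings, which each connector extends with its own strongly-typed properties:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AIRequestSettings:
    """Service-agnostic base: no OpenAI-specific properties."""
    service_id: Optional[str] = None
    extension_data: dict = field(default_factory=dict)  # holds unrecognized keys

@dataclass
class OpenAIRequestSettings(AIRequestSettings):
    """Connector-specific settings extend the base with typed properties."""
    max_tokens: int = 256
    temperature: float = 0.0
```

Code intended to work with any AI service accepts the base type, while OpenAI-specific code constructs the derived type and keeps type safety and IntelliSense.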
+## Pros and Cons of the Options
+
+### Use `dynamic` to pass request settings
+
+The `IChatCompletion` interface would look like this:
+
+```csharp
+public interface IChatCompletion : IAIService
+{
+ ChatHistory CreateNewChat(string? instructions = null);
+
+    Task<IReadOnlyList<IChatResult>> GetChatCompletionsAsync(
+ ChatHistory chat,
+ dynamic? requestSettings = null,
+ CancellationToken cancellationToken = default);
+
+    IAsyncEnumerable<IChatStreamingResult> GetStreamingChatCompletionsAsync(
+ ChatHistory chat,
+ dynamic? requestSettings = null,
+ CancellationToken cancellationToken = default);
+}
+```
+
+Developers would have the following options to specify the requesting settings for a semantic function:
+
+```csharp
+// Option 1: Use an anonymous type
+await kernel.InvokeSemanticFunctionAsync("Hello AI, what can you do for me?", requestSettings: new { MaxTokens = 256, Temperature = 0.7 });
+
+// Option 2: Use an OpenAI specific class
+await kernel.InvokeSemanticFunctionAsync(prompt, requestSettings: new OpenAIRequestSettings() { MaxTokens = 256, Temperature = 0.7 });
+
+// Option 3: Load prompt template configuration from a JSON payload
+string configPayload = @"{
+ ""schema"": 1,
+ ""description"": ""Say hello to an AI"",
+ ""type"": ""completion"",
+ ""completion"": {
+ ""max_tokens"": 60,
+ ""temperature"": 0.5,
+ ""top_p"": 0.0,
+ ""presence_penalty"": 0.0,
+ ""frequency_penalty"": 0.0
+ }
+}";
+var templateConfig = JsonSerializer.Deserialize<PromptTemplateConfig>(configPayload);
+var func = kernel.CreateSemanticFunction(prompt, config: templateConfig!, "HelloAI");
+await kernel.RunAsync(func);
+```
+
+Pros and cons:
+
+* Good, SK abstractions contain no references to OpenAI specific request settings
+* Neutral, because anonymous types can be used, which allows a developer to pass in properties supported by multiple AI services (e.g., `temperature`) or to combine properties for different AI services (e.g., `max_tokens` (OpenAI) and `max_new_tokens` (Oobabooga)).
+* Bad, because it's not clear to developers what they should pass when creating a semantic function
+* Bad, because it's not clear to implementors of a chat/text completion service what they should accept or how to add service specific properties.
+* Bad, because there is no compile-time type checking for code paths where the dynamic argument has not been resolved, which will impact code quality. Type issues manifest as `RuntimeBinderException`s and may be difficult to troubleshoot. Special care is needed with return types, e.g., it may be necessary to specify an explicit type rather than just `var` to avoid errors such as `Microsoft.CSharp.RuntimeBinder.RuntimeBinderException : Cannot apply indexing with [] to an expression of type 'object'`.
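+
+The runtime-only binding failure described above can be reproduced with a few lines of standalone C#. This is illustrative only; `ReadMaxTokens` is a hypothetical connector helper, not SK code:
+
+```csharp
+using System;
+using Microsoft.CSharp.RuntimeBinder;
+
+public static class DynamicSettingsDemo
+{
+    // Mimics a connector that expects a MaxTokens property on whatever settings
+    // object it receives. This compiles regardless of the argument's actual shape;
+    // member binding only happens at runtime.
+    public static int ReadMaxTokens(dynamic requestSettings) => requestSettings.MaxTokens;
+
+    public static void Demo()
+    {
+        // Works: the anonymous type happens to have the property.
+        Console.WriteLine(ReadMaxTokens(new { MaxTokens = 256 }));
+
+        try
+        {
+            // No compile-time warning here...
+            ReadMaxTokens(new { Temperature = 0.7 });
+        }
+        catch (RuntimeBinderException e)
+        {
+            // ...the mistake only surfaces at runtime.
+            Console.WriteLine(e.Message);
+        }
+    }
+}
+```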
+
+### Use `object` to pass request settings
+
+The `IChatCompletion` interface would look like this:
+
+```csharp
+public interface IChatCompletion : IAIService
+{
+    ChatHistory CreateNewChat(string? instructions = null);
+
+    Task<IReadOnlyList<IChatResult>> GetChatCompletionsAsync(
+        ChatHistory chat,
+        object? requestSettings = null,
+        CancellationToken cancellationToken = default);
+
+    IAsyncEnumerable<IChatStreamingResult> GetStreamingChatCompletionsAsync(
+        ChatHistory chat,
+        object? requestSettings = null,
+        CancellationToken cancellationToken = default);
+}
+```
+
+The calling pattern is the same as for the `dynamic` case, i.e., use either an anonymous type, an AI-service-specific class (e.g., `OpenAIRequestSettings`), or load from JSON.
+
+Pros and cons:
+
+* Good, SK abstractions contain no references to OpenAI specific request settings
+* Neutral, because anonymous types can be used, which allows a developer to pass in properties supported by multiple AI services (e.g., `temperature`) or to combine properties for different AI services (e.g., `max_tokens` (OpenAI) and `max_new_tokens` (Oobabooga)).
+* Bad, because it's not clear to developers what they should pass when creating a semantic function
+* Bad, because it's not clear to implementors of a chat/text completion service what they should accept or how to add service specific properties.
+* Bad, because code is needed to perform type checks and explicit casts. The situation is slightly better than for the `dynamic` case.
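+
+With `object`, the same failure mode moves from the runtime binder to manual type checks written by the service implementor. A standalone sketch (`FakeServiceSettings` and `ReadMaxTokens` are hypothetical, not SK code):
+
+```csharp
+using System;
+
+public static class ObjectSettingsDemo
+{
+    // Stand-in for an AI-service-specific settings class.
+    public sealed class FakeServiceSettings
+    {
+        public int MaxTokens { get; set; }
+    }
+
+    // A connector receiving object? must probe the runtime type itself.
+    public static int ReadMaxTokens(object? requestSettings)
+    {
+        // Explicit type check and cast for the known settings class.
+        if (requestSettings is FakeServiceSettings known)
+        {
+            return known.MaxTokens;
+        }
+
+        // Anonymous types force a fall-back to reflection.
+        var property = requestSettings?.GetType().GetProperty("MaxTokens");
+        if (property?.GetValue(requestSettings) is int maxTokens)
+        {
+            return maxTokens;
+        }
+
+        return 256; // service default when nothing usable was passed
+    }
+}
+```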
+
+### Define a base class for AI request settings which all implementations must extend
+
+The `IChatCompletion` interface would look like this:
+
+```csharp
+public interface IChatCompletion : IAIService
+{
+    ChatHistory CreateNewChat(string? instructions = null);
+
+    Task<IReadOnlyList<IChatResult>> GetChatCompletionsAsync(
+        ChatHistory chat,
+        AIRequestSettings? requestSettings = null,
+        CancellationToken cancellationToken = default);
+
+    IAsyncEnumerable<IChatStreamingResult> GetStreamingChatCompletionsAsync(
+        ChatHistory chat,
+        AIRequestSettings? requestSettings = null,
+        CancellationToken cancellationToken = default);
+}
+```
+
+`AIRequestSettings` is defined as follows:
+
+```csharp
+public class AIRequestSettings
+{
+    /// <summary>
+    /// Service identifier.
+    /// </summary>
+    [JsonPropertyName("service_id")]
+    [JsonPropertyOrder(1)]
+    public string? ServiceId { get; set; } = null;
+
+    /// <summary>
+    /// Extra properties
+    /// </summary>
+    [JsonExtensionData]
+    public Dictionary<string, object>? ExtensionData { get; set; }
+}
+```
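+
+The `[JsonExtensionData]` attribute is what lets service-specific properties flow through the base class without being declared on it. A minimal standalone sketch of that behavior using only `System.Text.Json` (the `Parse` helper is illustrative, and `JsonPropertyOrder` is omitted for brevity):
+
+```csharp
+using System.Collections.Generic;
+using System.Text.Json;
+using System.Text.Json.Serialization;
+
+public class AIRequestSettings
+{
+    [JsonPropertyName("service_id")]
+    public string? ServiceId { get; set; }
+
+    // Any JSON property that is not "service_id" lands here during deserialization.
+    [JsonExtensionData]
+    public Dictionary<string, object>? ExtensionData { get; set; }
+}
+
+public static class ExtensionDataDemo
+{
+    public static AIRequestSettings Parse(string json) =>
+        JsonSerializer.Deserialize<AIRequestSettings>(json)!;
+}
+```
+
+Deserializing `{"service_id":"svc","max_tokens":256}` yields `ServiceId == "svc"` plus an `ExtensionData` entry `max_tokens` holding a `JsonElement`; a service-specific subclass can later pick such entries up.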
+
+Developers would have the following options to specify the request settings for a semantic function:
+
+```csharp
+// Option 1: Invoke the semantic function and pass an OpenAI specific instance
+var result = await kernel.InvokeSemanticFunctionAsync(prompt, requestSettings: new OpenAIRequestSettings() { MaxTokens = 256, Temperature = 0.7 });
+Console.WriteLine(result.Result);
+
+// Option 2: Load prompt template configuration from a JSON payload
+string configPayload = @"{
+ ""schema"": 1,
+ ""description"": ""Say hello to an AI"",
+ ""type"": ""completion"",
+ ""completion"": {
+ ""max_tokens"": 60,
+ ""temperature"": 0.5,
+ ""top_p"": 0.0,
+ ""presence_penalty"": 0.0,
+ ""frequency_penalty"": 0.0
+ }
+}";
+var templateConfig = JsonSerializer.Deserialize<PromptTemplateConfig>(configPayload);
+var func = kernel.CreateSemanticFunction(prompt, config: templateConfig!, functionName: "HelloAI");
+
+await kernel.RunAsync(func);
+```
+
+It would also be possible to use the following pattern:
+
+```csharp
+this._summarizeConversationFunction = kernel.CreateSemanticFunction(
+    SemanticFunctionConstants.SummarizeConversationDefinition,
+    skillName: nameof(ConversationSummarySkill),
+    description: "Given a section of a conversation, summarize conversation.",
+    requestSettings: new AIRequestSettings()
+    {
+        ExtensionData = new Dictionary<string, object>()
+        {
+            { "Temperature", 0.1 },
+            { "TopP", 0.5 },
+            { "MaxTokens", MaxTokens }
+        }
+    });
+```
+
+The caveat with this pattern is that, assuming a more specific implementation of `AIRequestSettings` uses JSON serialization/deserialization to hydrate an instance from the base `AIRequestSettings`, it will only work if all properties are supported by the default `JsonConverter`. For example:
+
+* If `MyAIRequestSettings` includes a `Uri` property, the implementation of `MyAIRequestSettings` would make sure to load a URI converter so that it can serialize/deserialize the settings correctly.
+* If the settings for `MyAIRequestSettings` are sent to an AI service which relies on the default `JsonConverter`, then a `NotSupportedException` will be thrown.
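+
+The hydration pattern behind this caveat can be sketched as follows. `BaseRequestSettings` mirrors the `AIRequestSettings` shape above, and `FakeOpenAIRequestSettings` is a made-up name for illustration; a real implementation would also register any non-default converters its properties need:
+
+```csharp
+using System.Collections.Generic;
+using System.Text.Json;
+using System.Text.Json.Serialization;
+
+public class BaseRequestSettings
+{
+    [JsonPropertyName("service_id")]
+    public string? ServiceId { get; set; }
+
+    [JsonExtensionData]
+    public Dictionary<string, object>? ExtensionData { get; set; }
+}
+
+public class FakeOpenAIRequestSettings : BaseRequestSettings
+{
+    public double Temperature { get; set; }
+    public int MaxTokens { get; set; }
+
+    // Hydrate the specific settings from the generic base via a JSON round-trip:
+    // ExtensionData entries whose keys match property names become typed properties.
+    public static FakeOpenAIRequestSettings FromRequestSettings(BaseRequestSettings settings) =>
+        JsonSerializer.Deserialize<FakeOpenAIRequestSettings>(JsonSerializer.Serialize(settings))!;
+}
+```
+
+This only succeeds as long as every `ExtensionData` value can be handled by the default converters, which is exactly where the `NotSupportedException` caveat above comes from.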
+
+Pros and cons:
+
+* Good, SK abstractions contain no references to OpenAI specific request settings
+* Good, because it is clear to developers what they should pass when creating a semantic function and it is easy to discover what service specific request setting implementations exist.
+* Good, because it is clear to implementors of a chat/text completion service what they should accept and how to extend the base abstraction to add service specific properties.
+* Neutral, because `ExtensionData` can be used, which allows a developer to pass in properties supported by multiple AI services (e.g., `temperature`) or to combine properties for different AI services (e.g., `max_tokens` (OpenAI) and `max_new_tokens` (Oobabooga)).
diff --git a/docs/decisions/0009-dotnet-project-structure.md b/docs/decisions/0009-dotnet-project-structure.md
new file mode 100644
index 000000000000..42a929266f23
--- /dev/null
+++ b/docs/decisions/0009-dotnet-project-structure.md
@@ -0,0 +1,291 @@
+---
+
+# These are optional elements. Feel free to remove any of them
+
+status: accepted
+date: 2023-09-29
+deciders: semenshi, dmytrostruk, rbarreto
+consulted: shawncal, stoub, lemiller
+informed:
+---
+
+# DotNet Project Structure for 1.0 Release
+
+## Context and Problem Statement
+
+- Provide a cohesive, well-defined set of assemblies that developers can easily combine based on their needs.
+ - Semantic Kernel core should only contain functionality related to AI orchestration
+ - Remove prompt template engine and semantic functions
+ - Semantic Kernel abstractions should contain only interfaces, abstract classes, and minimal classes to support these
+- Remove `Skills` naming from NuGet packages and replace with `Plugins`
+ - Clearly distinguish between plugin implementations (`Skills.MsGraph`) and plugin integration (`Skills.OpenAPI`)
+- Have consistent naming for assemblies and their root namespaces
+ - See [Naming Patterns](#naming-patterns) section for examples of current patterns
+
+## Decision Drivers
+
+- Avoid having too many assemblies, because of the impact of signing them, and to reduce complexity
+- Follow the .NET naming guidelines
+ - [Names of Assemblies and DLLs](https://learn.microsoft.com/en-us/dotnet/standard/design-guidelines/names-of-assemblies-and-dlls)
+ - [Names of Namespaces](https://learn.microsoft.com/en-us/dotnet/standard/design-guidelines/names-of-namespaces)
+
+## Considered Options
+
+- Option #1: New `planning`, `functions` and `plugins` project areas
+- Option #2: Folder naming matches assembly name
+
+In all cases the following changes will be made:
+
+- Move non core Connectors to a separate repository
+- Merge prompt template engine and semantic functions into a single package
+
+## Decision Outcome
+
+Chosen option: Option #2: Folder naming matches assembly name, because:
+
+1. It provides a way for developers to easily discover where code for a particular assembly is located
+1. It is consistent with other repositories, e.g., [azure-sdk-for-net](https://github.com/Azure/azure-sdk-for-net)
+
+Main categories for the projects will be:
+
+1. `Connectors`: ***A connector project allows the Semantic Kernel to connect to AI and Memory services***. Some of the existing connector projects may move to other repositories.
+1. `Planners`: ***A planner project provides one or more planner implementations which take an ask and convert it into an executable plan to achieve that ask***. This category will include the current action, sequential and stepwise planners (these could be merged into a single project). Additional planning implementations e.g., planners that generate Powershell or Python code can be added as separate projects.
+1. `Functions`: ***A function project enables the Semantic Kernel to access the functions it will orchestrate***. This category will include:
+ 1. Semantic functions i.e., prompts executed against an LLM
+ 1. GRPC remote procedures i.e., procedures executed remotely using the GRPC framework
+ 1. Open API endpoints i.e., REST endpoints that have Open API definitions executed remotely using the HTTP protocol
+1. `Plugins`: ***A plugin project contains the implementation(s) of a Semantic Kernel plugin***. A Semantic Kernel plugin contains a concrete implementation of a function, e.g., a plugin may include code for basic text operations.
+
+### Option #1: New `planning`, `functions` and `plugins` project areas
+
+```text
+SK-dotnet
+├── samples/
+└── src/
+ ├── connectors/
+ │ ├── Connectors.AI.OpenAI*
+ │ ├── Connectors.AI.HuggingFace
+ │ ├── Connectors.Memory.AzureCognitiveSearch
+ │ ├── Connectors.Memory.Qdrant
+ │ ├── ...
+ │ └── Connectors.UnitTests
+ ├── planners/
+ │ ├── Planners.Action*
+ │ ├── Planners.Sequential*
+ │ └── Planners.Stepwise*
+ ├── functions/
+ │ ├── Functions.Native*
+ │ ├── Functions.Semantic*
+ │ ├── Functions.Planning*
+ │ ├── Functions.Grpc
+ │ ├── Functions.OpenAPI
+ │ └── Functions.UnitTests
+ ├── plugins/
+ │ ├── Plugins.Core*
+ │ ├── Plugins.Document
+ │ ├── Plugins.MsGraph
+ │ ├── Plugins.WebSearch
+ │ └── Plugins.UnitTests
+ ├── InternalUtilities/
+ ├── IntegrationTests
+ ├── SemanticKernel*
+ ├── SemanticKernel.Abstractions*
+ ├── SemanticKernel.MetaPackage
+ └── SemanticKernel.UnitTests
+```
+
+### Changes
+
+| Project | Description |
+|-------------------------------------|-------------|
+| `Functions.Native` | Extract native functions from Semantic Kernel core and abstractions. |
+| `Functions.Semantic` | Extract semantic functions from Semantic Kernel core and abstractions. Include the prompt template engine. |
+| `Functions.Planning` | Extract planning from Semantic Kernel core and abstractions. |
+| `Functions.Grpc` | Old `Skills.Grpc` project |
+| `Functions.OpenAPI` | Old `Skills.OpenAPI` project |
+| `Plugins.Core` | Old `Skills.Core` project |
+| `Plugins.Document` | Old `Skills.Document` project |
+| `Plugins.MsGraph` | Old `Skills.MsGraph` project |
+| `Plugins.WebSearch` | Old `Skills.WebSearch` project |
+
+### Semantic Kernel Skills and Functions
+
+This diagram shows how functions and plugins would be integrated with the Semantic Kernel core.
+
+
+
+### Option #2: Folder naming matches assembly name
+
+```text
+SK-dotnet
+├── samples/
+└── libraries/
+ ├── SK-dotnet.sln
+ │
+ ├── Microsoft.SemanticKernel.Connectors.AI.OpenAI*
+ │ ├── src
+ │ └── tests
+ │ (Not shown but all projects will have src and tests subfolders)
+ ├── Microsoft.SemanticKernel.Connectors.AI.HuggingFace
+ ├── Microsoft.SemanticKernel.Connectors.Memory.AzureCognitiveSearch
+ ├── Microsoft.SemanticKernel.Connectors.Memory.Qdrant
+ │
+ ├── Microsoft.SemanticKernel.Planners*
+ │
+ ├── Microsoft.SemanticKernel.Reliability.Basic*
+ ├── Microsoft.SemanticKernel.Reliability.Polly
+ │
+ ├── Microsoft.SemanticKernel.TemplateEngines.Basic*
+ │
+ ├── Microsoft.SemanticKernel.Functions.Semantic*
+ ├── Microsoft.SemanticKernel.Functions.Grpc
+ ├── Microsoft.SemanticKernel.Functions.OpenAPI
+ │
+ ├── Microsoft.SemanticKernel.Plugins.Core*
+ ├── Microsoft.SemanticKernel.Plugins.Document
+ ├── Microsoft.SemanticKernel.Plugins.MsGraph
+ ├── Microsoft.SemanticKernel.Plugins.Web
+ │
+ ├── InternalUtilities
+ │
+ ├── IntegrationTests
+ │
+ ├── Microsoft.SemanticKernel.Core*
+ ├── Microsoft.SemanticKernel.Abstractions*
+ └── Microsoft.SemanticKernel.MetaPackage
+```
+
+***Notes:***
+
+- There will only be a single solution file (initially).
+- Projects will be grouped in the solution i.e., connectors, planners, plugins, functions, extensions, ...
+- Each project folder contains a `src` and `tests` folder.
+- There will be a gradual process to move existing unit tests to the correct location as some projects will need to be broken up.
+
+## More Information
+
+### Current Project Structure
+
+```text
+SK-dotnet
+├── samples/
+└── src/
+ ├── connectors/
+ │ ├── Connectors.AI.OpenAI*
+ │ ├── Connectors...
+ │ └── Connectors.UnitTests
+ ├── extensions/
+ │ ├── Planner.ActionPlanner*
+ │ ├── Planner.SequentialPlanner*
+ │ ├── Planner.StepwisePlanner
+ │ ├── TemplateEngine.PromptTemplateEngine*
+ │ └── Extensions.UnitTests
+ ├── InternalUtilities/
+ ├── skills/
+ │ ├── Skills.Core
+ │ ├── Skills.Document
+ │ ├── Skills.Grpc
+ │ ├── Skills.MsGraph
+ │ ├── Skills.OpenAPI
+ │ ├── Skills.Web
+ │ └── Skills.UnitTests
+ ├── IntegrationTests
+ ├── SemanticKernel*
+ ├── SemanticKernel.Abstractions*
+ ├── SemanticKernel.MetaPackage
+ └── SemanticKernel.UnitTests
+```
+
+\* - Means the project is part of the Semantic Kernel meta package
+
+### Project Descriptions
+
+| Project | Description |
+|-------------------------------------|-------------|
+| Connectors.AI.OpenAI | Azure OpenAI and OpenAI service connectors |
+| Connectors... | Collection of other AI service connectors, some of which will move to another repository |
+| Connectors.UnitTests | Connector unit tests |
+| Planner.ActionPlanner | Semantic Kernel implementation of an action planner |
+| Planner.SequentialPlanner | Semantic Kernel implementation of a sequential planner |
+| Planner.StepwisePlanner | Semantic Kernel implementation of a stepwise planner |
+| TemplateEngine.Basic | Prompt template engine basic implementations which are used by Semantic Functions only |
+| Extensions.UnitTests | Extensions unit tests |
+| InternalUtilities | Internal utilities which are reused by multiple NuGet packages (all internal) |
+| Skills.Core | Core set of native functions which are provided to support Semantic Functions |
+| Skills.Document | Native functions for interacting with Microsoft documents |
+| Skills.Grpc | Semantic Kernel integration for GRPC based endpoints |
+| Skills.MsGraph | Native functions for interacting with Microsoft Graph endpoints |
+| Skills.OpenAPI | Semantic Kernel integration for OpenAI endpoints and reference Azure Key Vault implementation |
+| Skills.Web | Native functions for interacting with Web endpoints e.g., Bing, Google, File download |
+| Skills.UnitTests | Skills unit tests |
+| IntegrationTests | Semantic Kernel integration tests |
+| SemanticKernel | Semantic Kernel core implementation |
+| SemanticKernel.Abstractions | Semantic Kernel abstractions i.e., interface, abstract classes, supporting classes, ... |
+| SemanticKernel.MetaPackage | Semantic Kernel meta package i.e., a NuGet package that references other required Semantic Kernel NuGet packages |
+| SemanticKernel.UnitTests | Semantic Kernel unit tests |
+
+### Naming Patterns
+
+Below are some different examples of Assembly and root namespace naming that are used in the projects.
+
+```xml
+  <AssemblyName>Microsoft.SemanticKernel.Abstractions</AssemblyName>
+  <RootNamespace>Microsoft.SemanticKernel</RootNamespace>
+
+  <AssemblyName>Microsoft.SemanticKernel.Core</AssemblyName>
+  <RootNamespace>Microsoft.SemanticKernel</RootNamespace>
+
+  <AssemblyName>Microsoft.SemanticKernel.Planning.ActionPlanner</AssemblyName>
+  <RootNamespace>Microsoft.SemanticKernel.Planning.Action</RootNamespace>
+
+  <AssemblyName>Microsoft.SemanticKernel.Skills.Core</AssemblyName>
+  <RootNamespace>$(AssemblyName)</RootNamespace>
+```
+
+### Current Folder Structure
+
+```text
+dotnet/
+├── samples/
+│ ├── ApplicationInsightsExample/
+│ ├── KernelSyntaxExamples/
+│ └── NCalcSkills/
+└── src/
+ ├── Connectors/
+ │ ├── Connectors.AI.OpenAI*
+ │ ├── Connectors...
+ │ └── Connectors.UnitTests
+ ├── Extensions/
+ │ ├── Planner.ActionPlanner
+ │ ├── Planner.SequentialPlanner
+ │ ├── Planner.StepwisePlanner
+ │ ├── TemplateEngine.PromptTemplateEngine
+ │ └── Extensions.UnitTests
+ ├── InternalUtilities/
+ ├── Skills/
+ │ ├── Skills.Core
+ │ ├── Skills.Document
+ │ ├── Skills.Grpc
+ │ ├── Skills.MsGraph
+ │ ├── Skills.OpenAPI
+ │ ├── Skills.Web
+ │ └── Skills.UnitTests
+ ├── IntegrationTests/
+ ├── SemanticKernel/
+ ├── SemanticKernel.Abstractions/
+ ├── SemanticKernel.MetaPackage/
+ └── SemanticKernel.UnitTests/
+
+```
+
+### Semantic Kernel Skills and Functions
+
+This diagram shows how the current skills are integrated with the Semantic Kernel core.
+
+***Note:***
+
+- This is not a true class hierarchy diagram. It shows some class relationships and dependencies.
+- Namespaces are abbreviated to remove the Microsoft.SemanticKernel prefix. Namespaces use `_` rather than `.`.
+
+
+
diff --git a/docs/decisions/0009-function-and-kernel-result-types.md b/docs/decisions/0009-function-and-kernel-result-types.md
new file mode 100644
index 000000000000..bf5d6552837b
--- /dev/null
+++ b/docs/decisions/0009-function-and-kernel-result-types.md
@@ -0,0 +1,84 @@
+---
+# These are optional elements. Feel free to remove any of them.
+status: accepted
+date: 2023-09-21
+deciders: shawncal, dmytrostruk
+consulted:
+informed:
+---
+# Replace SKContext as Function/Kernel result type with FunctionResult and KernelResult models
+
+## Context and Problem Statement
+
+The methods `function.InvokeAsync` and `kernel.RunAsync` return `SKContext` as the result type. This has several problems:
+
+1. `SKContext` contains the property `Result`, which is a `string`. Because of that, it's not possible to return a complex type or implement streaming capability in the Kernel.
+2. `SKContext` contains the property `ModelResults`, which is coupled to LLM-specific logic, so it's only applicable to semantic functions in specific cases.
+3. `SKContext`, as a mechanism for passing information between functions in a pipeline, should be an internal implementation detail. The caller of the Kernel should provide an input/request and receive a result, but not an `SKContext`.
+4. `SKContext` contains information related to the last executed function, without a way to access information about a specific function in the pipeline.
+
+## Decision Drivers
+
+1. The Kernel should be able to return a complex type, as well as support streaming capability.
+2. The Kernel should be able to return data related to function execution (e.g., the amount of tokens used) in a way that is not coupled to AI logic.
+3. `SKContext` should work as an internal mechanism for passing information between functions.
+4. There should be a way to differentiate a function result from a kernel result, since these entities are different by nature and may contain different sets of properties in the future.
+5. The ability to access a specific function result in the middle of a pipeline will give users more insight into how their functions performed.
+
+## Considered Options
+
+1. Use `dynamic` as the return type - this option provides some flexibility, but on the other hand removes strong typing, which is the preferred option in the .NET world. Also, there would be no way to differentiate a function result from a kernel result.
+2. Define new types - `FunctionResult` and `KernelResult` - the chosen approach.
+
+## Decision Outcome
+
+The new `FunctionResult` and `KernelResult` return types should cover scenarios like returning complex types from functions, supporting streaming, and accessing the result of each function separately.
+
+### Complex Types and Streaming
+
+For complex types and streaming, a property `object Value` will be defined in `FunctionResult` to store a single function result, and in `KernelResult` to store the result of the last function in the execution pipeline. For better usability, a generic method `GetValue<T>()` will allow casting `object Value` to a specific type.
+
+Examples:
+
+```csharp
+// string
+var text = (await kernel.RunAsync(function)).GetValue<string>();
+
+// complex type
+var myComplexType = (await kernel.RunAsync(function)).GetValue<MyComplexType>();
+
+// streaming
+var results = (await kernel.RunAsync(function)).GetValue<IAsyncEnumerable<int>>();
+
+await foreach (var result in results)
+{
+    Console.WriteLine(result);
+}
+```
+
+When `FunctionResult`/`KernelResult` stores `TypeA` and the caller tries to cast it to `TypeB`, an `InvalidCastException` will be thrown with details about both types. This gives the caller some information about which type should be used for casting.
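+
+A minimal sketch of how such a `GetValue<T>` could surface the cast failure (hypothetical code illustrating the contract, not the actual SK implementation):
+
+```csharp
+using System;
+
+public class FunctionResult
+{
+    public object? Value { get; }
+
+    public FunctionResult(object? value) => this.Value = value;
+
+    public T? GetValue<T>()
+    {
+        if (this.Value is null)
+        {
+            return default;
+        }
+
+        if (this.Value is T typedValue)
+        {
+            return typedValue;
+        }
+
+        // Include both types in the message so the caller knows what to cast to.
+        throw new InvalidCastException(
+            $"Cannot cast result of type {this.Value.GetType()} to {typeof(T)}.");
+    }
+}
+```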
+
+### Metadata
+
+To return additional information related to function execution, a property `Dictionary<string, object> Metadata` will be added to `FunctionResult`. This will allow passing any kind of information to the caller that provides insight into how the function performed (e.g., the amount of tokens used, the AI model response, etc.).
+
+Examples:
+
+```csharp
+var functionResult = await function.InvokeAsync(context);
+Console.WriteLine(functionResult.Metadata["MyInfo"]);
+```
+
+### Multiple function results
+
+`KernelResult` will contain a collection of function results - `IReadOnlyCollection<FunctionResult> FunctionResults`. This will allow getting a specific function result from `KernelResult`. The `FunctionName` and `PluginName` properties of `FunctionResult` will help to get a specific function from the collection.
+
+Example:
+
+```csharp
+var kernelResult = await kernel.RunAsync(function1, function2, function3);
+
+var functionResult2 = kernelResult.FunctionResults.First(l => l.FunctionName == "Function2" && l.PluginName == "MyPlugin");
+
+Assert.Equal("Result2", functionResult2.GetValue<string>());
+```
diff --git a/docs/decisions/0010-openai-function-calling.md b/docs/decisions/0010-openai-function-calling.md
new file mode 100644
index 000000000000..06844ca2afa1
--- /dev/null
+++ b/docs/decisions/0010-openai-function-calling.md
@@ -0,0 +1,68 @@
+---
+status: proposed
+date: 2023-09-21
+deciders: gitri-ms, shawncal
+consulted: lemillermicrosoft, awharrison-28, dmytrostruk, nacharya1
+informed: eavanvalkenburg, kevdome3000
+---
+# OpenAI Function Calling Support
+
+## Context and Problem Statement
+
+The [function calling](https://platform.openai.com/docs/guides/gpt/function-calling) capability of OpenAI's Chat Completions API allows developers to describe functions to the model, and have the model decide whether to output a JSON object specifying a function and appropriate arguments to call in response to the given prompt. This capability is enabled by two new API parameters to the `/v1/chat/completions` endpoint:
+- `function_call` - auto (default), none, or a specific function to call
+- `functions` - JSON descriptions of the functions available to the model
+
+Functions provided to the model are injected as part of the system message and are billed/counted as input tokens.
+
+We have received several community requests to provide support for this capability when using SK with the OpenAI chat completion models that support it.
+
+## Decision Drivers
+
+* Minimize changes to the core kernel for OpenAI-specific functionality
+* Cost concerns with including a long list of function descriptions in the request
+* Security and cost concerns with automatically executing functions returned by the model
+
+## Considered Options
+
+* Support sending/receiving functions via chat completions endpoint _with_ modifications to interfaces
+* Support sending/receiving functions via chat completions endpoint _without_ modifications to interfaces
+* Implement a planner around the function calling capability
+
+## Decision Outcome
+
+Chosen option: "Support sending/receiving functions via chat completions endpoint _without_ modifications to interfaces"
+
+With this option, we utilize the existing request settings object to send functions to the model. The app developer controls what functions are included and is responsible for validating and executing the function result.
+
+### Consequences
+
+* Good, because avoids breaking changes to the core kernel
+* Good, because OpenAI-specific functionality is contained to the OpenAI connector package
+* Good, because allows app to control what functions are available to the model (including non-SK functions)
+* Good, because keeps the option open for integrating with planners in the future
+* Neutral, because requires app developer to validate and execute resulting function
+* Bad, because it is not as obvious how to use this capability and access the function results
+
+## Pros and Cons of the Options
+
+### Support sending/receiving functions _with_ modifications to chat completions interfaces
+
+This option would update the `IChatCompletion` and `IChatResult` interfaces to expose parameters/methods for providing and accessing function information.
+
+* Good, because provides a clear path for using the function calling capability
+* Good, because allows app to control what functions are available to the model (including non-SK functions)
+* Neutral, because requires app developer to validate and execute resulting function
+* Bad, because introduces breaking changes to core kernel abstractions
+* Bad, because OpenAI-specific functionality would be included in core kernel abstractions and would need to be ignored by other model providers
+
+### Implement a planner around the function calling capability
+
+Orchestrating external function calls fits within SK's concept of planning. With this approach, we would implement a planner that would take the function calling result and produce a plan that the app developer could execute (similar to SK's ActionPlanner).
+
+* Good, because producing a plan result makes it easy for the app developer to execute the chosen function
+* Bad, because functions would need to be registered with the kernel in order to be executed
+* Bad, because would create confusion about when to use which planner
+
+## Additional notes
+
+There has been much discussion and debate over the pros and cons of automatically invoking a function returned by the OpenAI model, if it is registered with the kernel. As there are still many open questions around this behavior and its implications, we have decided not to include this capability in the initial implementation. We will continue to explore this option and may include it in a future update.
\ No newline at end of file
diff --git a/docs/decisions/0011-memory-as-plugin.md b/docs/decisions/0011-memory-as-plugin.md
new file mode 100644
index 000000000000..99d8e0955c43
--- /dev/null
+++ b/docs/decisions/0011-memory-as-plugin.md
@@ -0,0 +1,45 @@
+---
+# These are optional elements. Feel free to remove any of them.
+status: proposed
+date: 2023-09-21
+deciders: shawncal, dmytrostruk
+consulted:
+informed:
+---
+# Move all Memory-related logic to separate Plugin
+
+## Context and Problem Statement
+
+Memory-related logic is located across different C# projects:
+
+- `SemanticKernel.Abstractions`
+ - `IMemoryStore`
+ - `ISemanticTextMemory`
+ - `MemoryRecord`
+ - `NullMemory`
+- `SemanticKernel.Core`
+ - `MemoryConfiguration`
+ - `SemanticTextMemory`
+ - `VolatileMemoryStore`
+- `Plugins.Core`
+ - `TextMemoryPlugin`
+
+The `ISemanticTextMemory Memory` property is also part of the `Kernel` type, but the kernel itself doesn't use it. This property exists to inject Memory capabilities into Plugins. At the moment, the `ISemanticTextMemory` interface is the main dependency of `TextMemoryPlugin`, and in some examples `TextMemoryPlugin` is initialized as `new TextMemoryPlugin(kernel.Memory)`.
+
+While this approach works for Memory, there is currently no way to inject, say, `MathPlugin` into another Plugin. Following the same approach and adding a `Math` property to the `Kernel` type is not a scalable solution, as it's not possible to define a separate property for each available Plugin.
+
+## Decision Drivers
+
+1. Memory should not be a property of the `Kernel` type if it's not used by the kernel.
+2. Memory should be treated in the same way as other plugins or services that may be required by specific Plugins.
+3. There should be a way to register the Memory capability with an attached vector DB and inject that capability into the Plugins that require it.
+
+## Decision Outcome
+
+Move all Memory-related logic to a separate project called `Plugins.Memory`. This will simplify the Kernel logic and allow Memory to be used where it's needed (i.e., in other Plugins).
+
+High-level tasks:
+
+1. Move Memory-related code to separate project.
+2. Implement a way to inject Memory into Plugins that require it.
+3. Remove `Memory` property from `Kernel` type.
diff --git a/docs/decisions/0012-kernel-service-registration.md b/docs/decisions/0012-kernel-service-registration.md
new file mode 100644
index 000000000000..050bd13b50b5
--- /dev/null
+++ b/docs/decisions/0012-kernel-service-registration.md
@@ -0,0 +1,180 @@
+---
+# These are optional elements. Feel free to remove any of them.
+status: proposed
+date: 2023-10-03
+deciders: dmytrostruk
+consulted: semenshi, rbarreto, markwallace
+informed:
+---
+# Kernel Service Registration
+
+## Context and Problem Statement
+
+Plugins may have dependencies to support complex scenarios. For example, there is `TextMemoryPlugin`, which supports functions like `retrieve`, `recall`, `save` and `remove`. Its constructor is implemented in the following way:
+
+```csharp
+public TextMemoryPlugin(ISemanticTextMemory memory)
+{
+ this._memory = memory;
+}
+```
+
+`TextMemoryPlugin` depends on the `ISemanticTextMemory` interface. In a similar way, other Plugins may have multiple dependencies, and there should be a way to resolve the required dependencies manually or automatically.
+
+At the moment, `ISemanticTextMemory` is a property of the `IKernel` interface, which allows injecting `ISemanticTextMemory` into `TextMemoryPlugin` during Plugin initialization:
+
+```csharp
+kernel.ImportFunctions(new TextMemoryPlugin(kernel.Memory));
+```
+
+There should be a way to support not only the Memory-related interface but any kind of service that can be used in a Plugin - `ISemanticTextMemory`, `IPromptTemplateEngine`, `IDelegatingHandlerFactory`, or any other service.
+
+## Considered Options
+
+### Solution #1.1 (available by default)
+
+The user is responsible for all Plugin initialization and dependency resolution using a **manual** approach.
+
+```csharp
+var memoryStore = new VolatileMemoryStore();
+var embeddingGeneration = new OpenAITextEmbeddingGeneration(modelId, apiKey);
+var semanticTextMemory = new SemanticTextMemory(memoryStore, embeddingGeneration);
+
+var memoryPlugin = new TextMemoryPlugin(semanticTextMemory);
+
+var kernel = Kernel.Builder.Build();
+
+kernel.ImportFunctions(memoryPlugin);
+```
+
+Note: this is the native .NET approach to resolving service dependencies manually, and it should always be available by default. Any other solution that improves dependency resolution can be added on top of this approach.
+
+### Solution #1.2 (available by default)
+
+The user is responsible for all Plugin initialization and dependency resolution using a **dependency injection** approach.
+
+```csharp
+var serviceCollection = new ServiceCollection();
+
+serviceCollection.AddTransient<IMemoryStore, VolatileMemoryStore>();
+serviceCollection.AddTransient<ITextEmbeddingGeneration>(
+    (serviceProvider) => new OpenAITextEmbeddingGeneration(modelId, apiKey));
+
+serviceCollection.AddTransient<ISemanticTextMemory, SemanticTextMemory>();
+
+var services = serviceCollection.BuildServiceProvider();
+
+// In theory, TextMemoryPlugin can be also registered in DI container.
+var memoryPlugin = new TextMemoryPlugin(services.GetService());
+
+var kernel = Kernel.Builder.Build();
+
+kernel.ImportFunctions(memoryPlugin);
+```
+
+Note: as with Solution #1.1, this approach should be supported out of the box. Users can always handle all dependencies on their side and simply provide the required Plugins to the Kernel.
+
+### Solution #2.1
+
+A custom service collection and service provider at the Kernel level to simplify the dependency resolution process, in addition to Solution #1.1 and Solution #1.2.
+
+The `IKernel` interface will have its own service provider, `KernelServiceProvider`, with minimal functionality for retrieving a required service.
+
+```csharp
+public interface IKernelServiceProvider
+{
+    T? GetService<T>(string? name = null);
+}
+
+public interface IKernel
+{
+    IKernelServiceProvider Services { get; }
+}
+```
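+
+A minimal in-memory sketch of such a provider could look like the following. This is illustrative only: the class name, the `Register` method, and the dictionary-based storage are assumptions, not part of the proposal.
+
+```csharp
+using System;
+using System.Collections.Generic;
+
+// Hypothetical sketch: services are keyed by (type, optional name),
+// which is what enables registering the same interface multiple times by name.
+public sealed class KernelServiceProvider : IKernelServiceProvider
+{
+    private readonly Dictionary<(Type, string?), object> _services = new();
+
+    public void Register<T>(T service, string? name = null) where T : class
+        => this._services[(typeof(T), name)] = service;
+
+    public T? GetService<T>(string? name = null)
+        => this._services.TryGetValue((typeof(T), name), out var service)
+            ? (T)service
+            : default;
+}
+```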
+
+```csharp
+var kernel = Kernel.Builder
+    .WithLoggerFactory(ConsoleLogger.LoggerFactory)
+    .WithOpenAITextEmbeddingGenerationService(modelId, apiKey)
+    .WithService<IMemoryStore, VolatileMemoryStore>()
+    .WithService<ISemanticTextMemory, SemanticTextMemory>()
+    .Build();
+
+var semanticTextMemory = kernel.Services.GetService<ISemanticTextMemory>();
+var memoryPlugin = new TextMemoryPlugin(semanticTextMemory);
+
+kernel.ImportFunctions(memoryPlugin);
+```
+
+Pros:
+
+- No dependency on a specific DI container library.
+- Lightweight implementation.
+- Possibility to register only those services that Plugins can use (isolation from the host application).
+- Possibility to register the same interface multiple times, distinguished by **name**.
+
+Cons:
+
+- A custom DI container must be implemented and maintained, instead of using existing libraries.
+- To import a Plugin, it still needs to be initialized manually to inject a specific service.
+
+### Solution #2.2
+
+This solution addresses the last disadvantage of Solution #2.1 - the case where a Plugin instance must be initialized manually. It requires a new way to import a Plugin into the Kernel - not with an object **instance**, but with an object **type**. In this case, the Kernel is responsible for initializing `TextMemoryPlugin` and injecting all required dependencies from the custom service collection.
+
+```csharp
+// Instead of this
+var semanticTextMemory = kernel.Services.GetService<ISemanticTextMemory>();
+var memoryPlugin = new TextMemoryPlugin(semanticTextMemory);
+
+kernel.ImportFunctions(memoryPlugin);
+
+// Use this
+kernel.ImportFunctions<TextMemoryPlugin>();
+```
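+
+Internally, such a type-based import could resolve constructor dependencies via reflection. The sketch below is a hypothetical illustration, not part of the proposal: it assumes the standard `System.IServiceProvider` (which offers a non-generic `GetService(Type)`), and the `PluginFactory`/`CreatePlugin` names are invented for this example.
+
+```csharp
+using System;
+using System.Linq;
+
+public static class PluginFactory
+{
+    // Hypothetical sketch of how ImportFunctions<T>() could create a plugin
+    // instance, resolving each constructor parameter from a service provider.
+    public static T CreatePlugin<T>(IServiceProvider services)
+    {
+        // Pick the first public constructor of the plugin type.
+        var ctor = typeof(T).GetConstructors().First();
+
+        // Resolve every constructor parameter from the service provider.
+        var arguments = ctor.GetParameters()
+            .Select(p => services.GetService(p.ParameterType)
+                ?? throw new InvalidOperationException(
+                    $"No service registered for {p.ParameterType}"))
+            .ToArray();
+
+        return (T)ctor.Invoke(arguments);
+    }
+}
+```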
+
+### Solution #3
+
+Instead of a custom service collection and service provider in the Kernel, use an existing DI library - `Microsoft.Extensions.DependencyInjection`.
+
+```csharp
+var serviceCollection = new ServiceCollection();
+
+serviceCollection.AddTransient<IMemoryStore, VolatileMemoryStore>();
+serviceCollection.AddTransient<ITextEmbeddingGeneration>(
+    (serviceProvider) => new OpenAITextEmbeddingGeneration(modelId, apiKey));
+
+serviceCollection.AddTransient<ISemanticTextMemory, SemanticTextMemory>();
+
+var services = serviceCollection.BuildServiceProvider();
+
+var kernel = Kernel.Builder
+    .WithLoggerFactory(ConsoleLogger.LoggerFactory)
+    .WithOpenAITextEmbeddingGenerationService(modelId, apiKey)
+    .WithServices(services) // Pass all registered services from the host application to the Kernel
+    .Build();
+
+// Plugin Import - option #1
+var semanticTextMemory = kernel.Services.GetService<ISemanticTextMemory>();
+var memoryPlugin = new TextMemoryPlugin(semanticTextMemory);
+
+kernel.ImportFunctions(memoryPlugin);
+
+// Plugin Import - option #2
+kernel.ImportFunctions<TextMemoryPlugin>();
+```
+
+Pros:
+
+- No custom implementation is required for dependency resolution - an existing .NET library is reused.
+- Possibility to inject all registered services of an existing application at once and use them as Plugin dependencies.
+
+Cons:
+
+- Additional dependency for the Semantic Kernel package - `Microsoft.Extensions.DependencyInjection`.
+- No possibility to include only a specific list of services (lack of isolation from the host application).
+- Possibility of `Microsoft.Extensions.DependencyInjection` version mismatches and runtime errors (e.g. users have `Microsoft.Extensions.DependencyInjection` `--version 2.0` while Semantic Kernel uses `--version 6.0`).
+
+## Decision Outcome
+
+For now, support Solution #1.1 and Solution #1.2 only, to keep the Kernel a unit of single responsibility. Plugin dependencies should be resolved before a Plugin instance is passed to the Kernel.
diff --git a/docs/decisions/README.md b/docs/decisions/README.md
index fe06013125ee..0de7ecc1a989 100644
--- a/docs/decisions/README.md
+++ b/docs/decisions/README.md
@@ -1,7 +1,8 @@
-# Markdown Any Decision Records
+# Architectural Decision Records (ADRs)
-MADR is a lean template to capture any decisions in a structured way. The template originated from capturing architectural decisions and developed to a template allowing to capture any decisions taken.
-For more information [see](https://adr.github.io/madr/)
+An Architectural Decision (AD) is a justified software design choice that addresses a functional or non-functional requirement that is architecturally significant. An Architectural Decision Record (ADR) captures a single AD and its rationale.
+
+For more information [see](https://adr.github.io/)
## How are we using ADRs to track technical decisions?
diff --git a/docs/decisions/diagrams/skfunctions-preview.mmd b/docs/decisions/diagrams/skfunctions-preview.mmd
new file mode 100644
index 000000000000..e0642cdb9b6a
--- /dev/null
+++ b/docs/decisions/diagrams/skfunctions-preview.mmd
@@ -0,0 +1,68 @@
+---
+title: Semantic Kernel Functions (preview)
+---
+classDiagram
+ %% Use https://mermaid.live/ to preview this diagram. The VS Code extension does not handle namespaces.
+ direction RL
+ namespace SkillDefinition {
+ class ISKFunction {
+ <<interface>>
+ String : Name
+ String : SkillName
+ String : Description
+ CompleteRequestSettings : RequestSettings
+ Describe(...)
+ InvokeAsync(...)
+ SetDefaultSkillCollection(...)
+ SetAIService(...)
+ SetAIConfiguration(...)
+ }
+ class NativeFunction
+ class SemanticFunction
+ }
+
+ namespace Skills_Grpc {
+ class KernelGrpcExtensions
+ }
+
+ namespace Skills_OpenApi {
+ class KernelOpenApiExtensions
+ }
+
+ namespace Skills_MsGraph {
+ class CalendarSkill
+ }
+
+ namespace Skills_Web {
+ class SearchUrlSkill
+ }
+
+ namespace Skills_Document {
+ class DocumentSkill
+ }
+
+ namespace Skills_Core {
+ class TextSkill
+ class ConversationSummarySkill
+ }
+
+ namespace Planning {
+ class Plan
+ class ActionPlanner
+ class SequentialPlanner
+ class StepwisePlanner
+ }
+
+ ISKFunction <|.. NativeFunction
+ ISKFunction <|.. SemanticFunction
+ ISKFunction <|.. Plan
+ NativeFunction <.. KernelGrpcExtensions
+ NativeFunction <.. KernelOpenApiExtensions
+ NativeFunction <.. CalendarSkill
+ NativeFunction <.. SearchUrlSkill
+ NativeFunction <.. DocumentSkill
+ NativeFunction <.. TextSkill
+ SemanticFunction <.. ConversationSummarySkill
+ Plan <.. ActionPlanner
+ Plan <.. SequentialPlanner
+ Plan <.. StepwisePlanner
diff --git a/docs/decisions/diagrams/skfunctions-preview.png b/docs/decisions/diagrams/skfunctions-preview.png
new file mode 100644
index 000000000000..49f2189c1302
Binary files /dev/null and b/docs/decisions/diagrams/skfunctions-preview.png differ
diff --git a/docs/decisions/diagrams/skfunctions-v1.mmd b/docs/decisions/diagrams/skfunctions-v1.mmd
new file mode 100644
index 000000000000..e64cc9db91fe
--- /dev/null
+++ b/docs/decisions/diagrams/skfunctions-v1.mmd
@@ -0,0 +1,77 @@
+---
+title: Semantic Kernel Functions (v1.0)
+---
+classDiagram
+ %% Use https://mermaid.live/ to preview this diagram. The VS Code extension does not handle namespaces.
+ direction RL
+ namespace Function {
+ class ISKFunction {
+ <<interface>>
+ String : Name
+ String : SkillName
+ String : Description
+ JsonObject : Configuration
+ Describe(...)
+ InvokeAsync(...)
+ SetPluginProvider(...)
+ SetAIServiceProvider(...)
+ SetConfiguration(...)
+ }
+ }
+
+ namespace Functions_Native {
+ class NativeFunction
+ }
+
+ namespace Functions_Semantic {
+ class SemanticFunction
+ }
+
+ namespace Functions_Planning {
+ class Plan
+ }
+
+ namespace Functions_Grpc {
+ class KernelGrpcExtensions
+ }
+
+ namespace Functions_OpenApi {
+ class KernelOpenApiExtensions
+ }
+
+ namespace Plugins_MsGraph {
+ class CalendarPlugin
+ }
+
+ namespace Plugins_Web {
+ class SearchUrlPlugin
+ }
+
+ namespace Plugins_Document {
+ class DocumentPlugin
+ }
+
+ namespace Plugins_Core {
+ class TextPlugin
+ class ConversationSummaryPlugin
+ }
+
+ namespace Planners {
+ class ActionPlanner
+ class SequentialPlanner
+ class StepwisePlanner
+ }
+
+ ISKFunction <|.. NativeFunction
+ ISKFunction <|.. SemanticFunction
+ ISKFunction <|.. Plan
+ NativeFunction .. KernelGrpcExtensions
+ NativeFunction .. KernelOpenApiExtensions
+ NativeFunction .. CalendarPlugin
+ NativeFunction .. SearchUrlPlugin
+ NativeFunction .. DocumentPlugin
+ NativeFunction .. TextPlugin
+ SemanticFunction .. ConversationSummaryPlugin
+ Plan <.. ActionPlanner
+ Plan <.. SequentialPlanner
+ Plan <.. StepwisePlanner
diff --git a/docs/decisions/diagrams/skfunctions-v1.png b/docs/decisions/diagrams/skfunctions-v1.png
new file mode 100644
index 000000000000..e7f569d0c63f
Binary files /dev/null and b/docs/decisions/diagrams/skfunctions-v1.png differ
diff --git a/dotnet/Directory.Build.props b/dotnet/Directory.Build.props
index 504a41201560..66b7b6667062 100644
--- a/dotnet/Directory.Build.props
+++ b/dotnet/Directory.Build.props
@@ -16,13 +16,8 @@
disable
-
- true
- full
-
-
-
- portable
+
+ True
diff --git a/dotnet/Directory.Packages.props b/dotnet/Directory.Packages.props
index 4fa0edafc6bb..cfdaca52cec4 100644
--- a/dotnet/Directory.Packages.props
+++ b/dotnet/Directory.Packages.props
@@ -5,19 +5,24 @@
true
-
-
-
+
+
+
+
+
-
-
-
-
+
+
+
+
+
+
+
@@ -29,18 +34,20 @@
-
-
+
+
-
-
+
+
+
-
+
+
-
-
-
+
+
+
@@ -48,26 +55,17 @@
-
+
+
+
+
+
-
all
@@ -78,12 +76,12 @@
allruntime; build; native; contentfiles; analyzers; buildtransitive
-
+ allruntime; build; native; contentfiles; analyzers; buildtransitive
-
+ allruntime; build; native; contentfiles; analyzers; buildtransitive
diff --git a/dotnet/README.md b/dotnet/README.md
index 7329454fd18f..28ca20228fb7 100644
--- a/dotnet/README.md
+++ b/dotnet/README.md
@@ -4,8 +4,8 @@
To run the LLM prompts and semantic functions in the examples below, make sure
you have an
-[Open AI API Key](https://openai.com/api/) or
-[Azure Open AI service key](https://learn.microsoft.com/azure/cognitive-services/openai/quickstart?pivots=rest-api).
+[OpenAI API Key](https://openai.com/product/) or
+[Azure OpenAI Service Key](https://learn.microsoft.com/azure/cognitive-services/openai/quickstart?pivots=rest-api).
## Nuget package
@@ -22,17 +22,18 @@ Copy and paste the following code into your project, with your Azure OpenAI key
```csharp
using Microsoft.SemanticKernel;
+using Microsoft.SemanticKernel.Connectors.AI.OpenAI;
var builder = new KernelBuilder();
-builder.WithAzureTextCompletionService(
- "text-davinci-003", // Azure OpenAI Deployment Name
+builder.WithAzureChatCompletionService(
+ "gpt-35-turbo", // Azure OpenAI Deployment Name
"https://contoso.openai.azure.com/", // Azure OpenAI Endpoint
"...your Azure OpenAI Key..."); // Azure OpenAI Key
// Alternative using OpenAI
-//builder.WithOpenAITextCompletionService(
-// "text-davinci-003", // OpenAI Model name
+//builder.WithOpenAIChatCompletionService(
+// "gpt-3.5-turbo", // OpenAI Model name
// "...your OpenAI API Key..."); // OpenAI API Key
var kernel = builder.Build();
@@ -41,7 +42,7 @@ var prompt = @"{{$input}}
One line TLDR with the fewest words.";
-var summarize = kernel.CreateSemanticFunction(prompt);
+var summarize = kernel.CreateSemanticFunction(prompt, requestSettings: new OpenAIRequestSettings { MaxTokens = 100 });
string text1 = @"
1st Law of Thermodynamics - Energy cannot be created or destroyed.
@@ -53,9 +54,9 @@ string text2 = @"
2. The acceleration of an object depends on the mass of the object and the amount of force applied.
3. Whenever one object exerts a force on another object, the second object exerts an equal and opposite on the first.";
-Console.WriteLine(await summarize.InvokeAsync(text1));
+Console.WriteLine(await kernel.RunAsync(text1, summarize));
-Console.WriteLine(await summarize.InvokeAsync(text2));
+Console.WriteLine(await kernel.RunAsync(text2, summarize));
// Output:
// Energy conserved, entropy increases, zero entropy at 0K.
@@ -80,8 +81,8 @@ string summarizePrompt = @"{{$input}}
Give me a TLDR with the fewest words.";
-var translator = kernel.CreateSemanticFunction(translationPrompt);
-var summarize = kernel.CreateSemanticFunction(summarizePrompt);
+var translator = kernel.CreateSemanticFunction(translationPrompt, requestSettings: new OpenAIRequestSettings { MaxTokens = 200 });
+var summarize = kernel.CreateSemanticFunction(summarizePrompt, requestSettings: new OpenAIRequestSettings { MaxTokens = 100 });
string inputText = @"
1st Law of Thermodynamics - Energy cannot be created or destroyed.
@@ -101,27 +102,27 @@ Console.WriteLine(output);
The repository contains also a few C# Jupyter notebooks that demonstrates
how to get started with the Semantic Kernel.
-See [here](../samples/notebooks/dotnet/README.md) for the full list, with
+See [here](./notebooks/README.md) for the full list, with
requirements and setup instructions.
-1. [Getting started](../samples/notebooks//dotnet/00-getting-started.ipynb)
-2. [Loading and configuring Semantic Kernel](../samples/notebooks//dotnet/01-basic-loading-the-kernel.ipynb)
-3. [Running AI prompts from file](../samples/notebooks//dotnet/02-running-prompts-from-file.ipynb)
-4. [Creating Semantic Functions at runtime (i.e. inline functions)](../samples/notebooks//dotnet/03-semantic-function-inline.ipynb)
-5. [Using Context Variables to Build a Chat Experience](../samples/notebooks//dotnet/04-context-variables-chat.ipynb)
-6. [Creating and Executing Plans](../samples/notebooks//dotnet/05-using-the-planner.ipynb)
-7. [Building Memory with Embeddings](../samples/notebooks//dotnet/06-memory-and-embeddings.ipynb)
-8. [Creating images with DALL-E 2](../samples/notebooks//dotnet/07-DALL-E-2.ipynb)
-9. [Chatting with ChatGPT and Images](../samples/notebooks//dotnet/08-chatGPT-with-DALL-E-2.ipynb)
+1. [Getting started](./notebooks/00-getting-started.ipynb)
+2. [Loading and configuring Semantic Kernel](./notebooks/01-basic-loading-the-kernel.ipynb)
+3. [Running AI prompts from file](./notebooks/02-running-prompts-from-file.ipynb)
+4. [Creating Semantic Functions at runtime (i.e. inline functions)](./notebooks/03-semantic-function-inline.ipynb)
+5. [Using Context Variables to Build a Chat Experience](./notebooks/04-context-variables-chat.ipynb)
+6. [Creating and Executing Plans](./notebooks/05-using-the-planner.ipynb)
+7. [Building Memory with Embeddings](./notebooks/06-memory-and-embeddings.ipynb)
+8. [Creating images with DALL-E 2](./notebooks/07-DALL-E-2.ipynb)
+9. [Chatting with ChatGPT and Images](./notebooks/08-chatGPT-with-DALL-E-2.ipynb)
# Nuget packages
Semantic Kernel provides a set of nuget packages to allow extending the core with
-more features, such as connectors to services and Skills to perform specific actions.
+more features, such as connectors to services and plugins to perform specific actions.
Unless you need to optimize which packages to include in your app, you will usually
start by installing this meta-package first:
-* **Microsoft.SemanticKernel**
+- **Microsoft.SemanticKernel**
This meta package includes core packages and OpenAI connectors, allowing to run
most samples and build apps with OpenAI and Azure OpenAI.
@@ -129,23 +130,24 @@ most samples and build apps with OpenAI and Azure OpenAI.
Packages included in **Microsoft.SemanticKernel**:
1. **Microsoft.SemanticKernel.Abstractions**: contains common interfaces and classes
- used by the core and other SK components.
+ used by the core and other SK components.
1. **Microsoft.SemanticKernel.Core**: contains the core logic of SK, such as prompt
- engineering, semantic memory and semantic functions definition and orchestration.
+ engineering, semantic memory and semantic functions definition and orchestration.
1. **Microsoft.SemanticKernel.Connectors.AI.OpenAI**: connectors to OpenAI and Azure
- OpenAI, allowing to run semantic functions, chats, image generation with GPT3,
- GPT3.5, GPT4, DALL-E2. Includes also GPT tokenizers.
+ OpenAI, allowing to run semantic functions, chats, image generation with GPT3,
+ GPT3.5, GPT4, DALL-E2.
Other SK packages available at nuget.org:
1. **Microsoft.SemanticKernel.Connectors.Memory.Qdrant**: Qdrant connector for
- skills and semantic memory.
+ plugins and semantic memory.
2. **Microsoft.SemanticKernel.Connectors.Memory.Sqlite**: SQLite connector for
- skills and semantic memory
-3. **Microsoft.SemanticKernel.Skills.Document**: Document Skill: Word processing,
+ plugins and semantic memory
+3. **Microsoft.SemanticKernel.Plugins.Document**: Document Plugin: Word processing,
OpenXML, etc.
-4. **Microsoft.SemanticKernel.Skills.MsGraph**: Microsoft Graph Skill: access your
+4. **Microsoft.SemanticKernel.Plugins.MsGraph**: Microsoft Graph Plugin: access your
tenant data, schedule meetings, send emails, etc.
-5. **Microsoft.SemanticKernel.Skills.OpenAPI**: OpenAPI skill.
-6. **Microsoft.SemanticKernel.Skills.Web**: Web Skill: search the web, download
- files, etc.
\ No newline at end of file
+5. **Microsoft.SemanticKernel.Plugins.OpenAPI**: OpenAPI Plugin.
+6. **Microsoft.SemanticKernel.Plugins.Web**: Web Plugin: search the web, download
+ files, etc.
+7. **Microsoft.SemanticKernel.Reliability.Polly**: Extension for http resiliency.
diff --git a/dotnet/SK-dotnet.sln b/dotnet/SK-dotnet.sln
index b6cc713e2268..45273e9ad18c 100644
--- a/dotnet/SK-dotnet.sln
+++ b/dotnet/SK-dotnet.sln
@@ -2,7 +2,7 @@ Microsoft Visual Studio Solution File, Format Version 12.00
# Visual Studio Version 17
VisualStudioVersion = 17.4.33213.308
MinimumVisualStudioVersion = 10.0.40219.1
-Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "SemanticKernel", "src\SemanticKernel\SemanticKernel.csproj", "{A284C7EB-2248-4A75-B112-F5DCDE65410D}"
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "SemanticKernel.Core", "src\SemanticKernel.Core\SemanticKernel.Core.csproj", "{A284C7EB-2248-4A75-B112-F5DCDE65410D}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "src", "src", "{831DDCA2-7D2C-4C31-80DB-6BDB3E1F7AE0}"
EndProject
@@ -15,17 +15,17 @@ Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "samples", "samples", "{FA37
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "KernelSyntaxExamples", "samples\KernelSyntaxExamples\KernelSyntaxExamples.csproj", "{47C6F821-5103-431F-B3B8-A2868A68BB78}"
EndProject
-Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "MsGraphSkillsExample", "..\samples\dotnet\graph-api-skills\MsGraphSkillsExample.csproj", "{3EB61E99-C39B-4620-9482-F8DA18E48525}"
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "MsGraphPluginsExample", "..\samples\dotnet\MsGraphPluginsExample\MsGraphPluginsExample.csproj", "{3EB61E99-C39B-4620-9482-F8DA18E48525}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "KernelHttpServer", "..\samples\dotnet\KernelHttpServer\KernelHttpServer.csproj", "{34A7F1EF-D243-4160-A413-D713FEABCD94}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "IntegrationTests", "src\IntegrationTests\IntegrationTests.csproj", "{E4B777A1-28E1-41BE-96AE-7F3EC61FD5D4}"
EndProject
-Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Skills.Document", "src\Skills\Skills.Document\Skills.Document.csproj", "{F94D1938-9DB7-4B24-9FF3-166DDFD96330}"
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Plugins.Document", "src\Plugins\Plugins.Document\Plugins.Document.csproj", "{F94D1938-9DB7-4B24-9FF3-166DDFD96330}"
EndProject
-Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Skills.MsGraph", "src\Skills\Skills.MsGraph\Skills.MsGraph.csproj", "{689A5041-BAE7-448F-9BDC-4672E96249AA}"
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Plugins.MsGraph", "src\Plugins\Plugins.MsGraph\Plugins.MsGraph.csproj", "{689A5041-BAE7-448F-9BDC-4672E96249AA}"
EndProject
-Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Skills.Web", "src\Skills\Skills.Web\Skills.Web.csproj", "{EEA87FBC-4ED5-458C-ABD3-BEAEEB535BAF}"
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Plugins.Web", "src\Plugins\Plugins.Web\Plugins.Web.csproj", "{EEA87FBC-4ED5-458C-ABD3-BEAEEB535BAF}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Solution Items", "Solution Items", "{158A4E5E-AEE0-4D60-83C7-8E089B2D881D}"
ProjectSection(SolutionItems) = preProject
@@ -40,9 +40,9 @@ Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Solution Items", "Solution
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "SemanticKernel.UnitTests", "src\SemanticKernel.UnitTests\SemanticKernel.UnitTests.csproj", "{37E39C68-5A40-4E63-9D3C-0C66AD98DFCB}"
EndProject
-Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "skills", "skills", "{9ECD1AA0-75B3-4E25-B0B5-9F0945B64974}"
+Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "functions", "functions", "{9ECD1AA0-75B3-4E25-B0B5-9F0945B64974}"
EndProject
-Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Skills.UnitTests", "src\Skills\Skills.UnitTests\Skills.UnitTests.csproj", "{107156B4-5A8B-45C7-97A2-4544D7FA19DE}"
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Functions.UnitTests", "src\Functions\Functions.UnitTests\Functions.UnitTests.csproj", "{107156B4-5A8B-45C7-97A2-4544D7FA19DE}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "nuget", "nuget", "{F4243136-252A-4459-A7C4-EE8C056D6B0B}"
ProjectSection(SolutionItems) = preProject
@@ -51,7 +51,7 @@ Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "nuget", "nuget", "{F4243136
nuget\NUGET.md = nuget\NUGET.md
EndProjectSection
EndProject
-Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Skills.OpenAPI", "src\Skills\Skills.OpenAPI\Skills.OpenAPI.csproj", "{F2A1F81E-700E-4C0E-B021-B9EF29AA20BD}"
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Functions.OpenAPI", "src\Functions\Functions.OpenAPI\Functions.OpenAPI.csproj", "{F2A1F81E-700E-4C0E-B021-B9EF29AA20BD}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "connectors", "connectors", "{0247C2C9-86C3-45BA-8873-28B0948EDC0C}"
EndProject
@@ -61,8 +61,6 @@ Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Connectors.Memory.Qdrant",
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Connectors.Memory.Sqlite", "src\Connectors\Connectors.Memory.Sqlite\Connectors.Memory.Sqlite.csproj", "{EC004F12-2F60-4EDD-B3CD-3A504900D929}"
EndProject
-Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Connectors.Memory.CosmosDB", "src\Connectors\Connectors.Memory.CosmosDB\Connectors.Memory.CosmosDB.csproj", "{EA61C289-7928-4B78-A9C1-7AAD61F907CD}"
-EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Connectors.Memory.Postgres", "src\Connectors\Connectors.Memory.Postgres\Connectors.Memory.Postgres.csproj", "{C9F957FA-A70F-4A6D-8F95-23FCD7F4FB87}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Connectors.Memory.Redis", "src\Connectors\Connectors.Memory.Redis\Connectors.Memory.Redis.csproj", "{3720F5ED-FB4D-485E-8A93-CDE60DEF0805}"
@@ -75,19 +73,15 @@ Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "SemanticKernel.Abstractions
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "SemanticKernel.MetaPackage", "src\SemanticKernel.MetaPackage\SemanticKernel.MetaPackage.csproj", "{E3299033-EB81-4C4C-BCD9-E8DC40937969}"
EndProject
-Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Planning.ActionPlanner", "src\Extensions\Planning.ActionPlanner\Planning.ActionPlanner.csproj", "{994BEF0B-E277-4D10-BB13-FE670D26620D}"
-EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "extensions", "extensions", "{078F96B4-09E1-4E0E-B214-F71A4F4BF633}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Extensions.UnitTests", "src\Extensions\Extensions.UnitTests\Extensions.UnitTests.csproj", "{F51017A9-15C8-472D-893C-080046D710A6}"
EndProject
-Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Planning.SequentialPlanner", "src\Extensions\Planning.SequentialPlanner\Planning.SequentialPlanner.csproj", "{A350933D-F9D5-4AD3-8C4F-B856B5020297}"
-EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Connectors.Memory.AzureCognitiveSearch", "src\Connectors\Connectors.Memory.AzureCognitiveSearch\Connectors.Memory.AzureCognitiveSearch.csproj", "{EC3BB6D1-2FB2-4702-84C6-F791DE533ED4}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Connectors.Memory.Pinecone", "src\Connectors\Connectors.Memory.Pinecone\Connectors.Memory.Pinecone.csproj", "{4D226C2F-AE9F-4EFB-AF2D-45C8FE5CB34E}"
EndProject
-Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Skills.Grpc", "src\Skills\Skills.Grpc\Skills.Grpc.csproj", "{E52F805C-794A-4CA9-B684-DFF358B18820}"
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Functions.Grpc", "src\Functions\Functions.Grpc\Functions.Grpc.csproj", "{E52F805C-794A-4CA9-B684-DFF358B18820}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Connectors.AI.HuggingFace", "src\Connectors\Connectors.AI.HuggingFace\Connectors.AI.HuggingFace.csproj", "{136823BE-8665-4D57-87E0-EF41535539E2}"
EndProject
@@ -95,7 +89,7 @@ Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "InternalUtilities", "Intern
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Connectors.Memory.Weaviate", "src\Connectors\Connectors.Memory.Weaviate\Connectors.Memory.Weaviate.csproj", "{6AAB0620-33A1-4A98-A63B-6560B9BA47A4}"
EndProject
-Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "OpenApiSkillsExample", "..\samples\dotnet\openapi-skills\OpenApiSkillsExample.csproj", "{4D91A3E0-C404-495B-AD4A-411C4E83CF54}"
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "OpenApiPluginsExample", "..\samples\dotnet\OpenApiPluginsExample\OpenApiPluginsExample.csproj", "{4D91A3E0-C404-495B-AD4A-411C4E83CF54}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Connectors.Memory.DuckDB", "src\Connectors\Connectors.Memory.DuckDB\Connectors.Memory.DuckDB.csproj", "{50FAE231-6F24-4779-9D02-12ABBC9A49E2}"
EndProject
@@ -137,16 +131,41 @@ Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Http", "Http", "{1C19D805-3
src\InternalUtilities\src\Http\NonDisposableHttpClientHandler.cs = src\InternalUtilities\src\Http\NonDisposableHttpClientHandler.cs
EndProjectSection
EndProject
-Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Skills.Core", "src\Skills\Skills.Core\Skills.Core.csproj", "{0D0C4DAD-E6BC-4504-AE3A-EEA4E35920C1}"
+Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "System", "System", "{3CDE10B2-AE8F-4FC4-8D55-92D4AD32E144}"
+ ProjectSection(SolutionItems) = preProject
+ src\InternalUtilities\src\System\EnvExtensions.cs = src\InternalUtilities\src\System\EnvExtensions.cs
+ EndProjectSection
EndProject
-Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "NCalcSkills", "samples\NCalcSkills\NCalcSkills.csproj", "{E6EDAB8F-3406-4DBF-9AAB-DF40DC2CA0FA}"
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Plugins.Core", "src\Plugins\Plugins.Core\Plugins.Core.csproj", "{0D0C4DAD-E6BC-4504-AE3A-EEA4E35920C1}"
EndProject
-Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Connectors.AI.Oobabooga", "src\Connectors\Connectors.AI.Oobabooga\Connectors.AI.Oobabooga.csproj", "{677F1381-7830-4115-9C1A-58B282629DC6}"
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "NCalcPlugins", "samples\NCalcPlugins\NCalcPlugins.csproj", "{E6EDAB8F-3406-4DBF-9AAB-DF40DC2CA0FA}"
EndProject
-Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Planning.StepwisePlanner", "src\Extensions\Planning.StepwisePlanner\Planning.StepwisePlanner.csproj", "{4762BCAF-E1C5-4714-B88D-E50FA333C50E}"
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Connectors.AI.Oobabooga", "src\Connectors\Connectors.AI.Oobabooga\Connectors.AI.Oobabooga.csproj", "{677F1381-7830-4115-9C1A-58B282629DC6}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "ApplicationInsightsExample", "samples\ApplicationInsightsExample\ApplicationInsightsExample.csproj", "{C754950A-E16C-4F96-9CC7-9328E361B5AF}"
EndProject
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Connectors.Memory.Kusto", "src\Connectors\Connectors.Memory.Kusto\Connectors.Memory.Kusto.csproj", "{E07608CC-D710-4655-BB9E-D22CF3CDD193}"
+EndProject
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "TemplateEngine.Basic", "src\Extensions\TemplateEngine.Basic\TemplateEngine.Basic.csproj", "{10E4B697-D4E8-468D-872D-49670FD150FB}"
+EndProject
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Reliability.Polly", "src\Extensions\Reliability.Polly\Reliability.Polly.csproj", "{D4540A0F-98E3-4E70-9093-1948AE5B2AAD}"
+EndProject
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Reliability.Basic", "src\Extensions\Reliability.Basic\Reliability.Basic.csproj", "{3DC4DBD8-20A5-4937-B4F5-BB5E24E7A567}"
+EndProject
+Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "plugins", "plugins", "{D6D598DF-C17C-46F4-B2B9-CDE82E2DE132}"
+EndProject
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Plugins.UnitTests", "src\Plugins\Plugins.UnitTests\Plugins.UnitTests.csproj", "{5CB78CE4-895B-4A14-98AA-716A37DEEBB1}"
+EndProject
+Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "planners", "planners", "{A21FAC7C-0C09-4EAD-843B-926ACEF73C80}"
+EndProject
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Planners.Core", "src\Planners\Planners.Core\Planners.Core.csproj", "{F224B869-FA0E-4EEE-A6BF-C2D61FF8E731}"
+EndProject
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Planners.Core.UnitTests", "src\Planners\Planners.Core.UnitTests\Planners.Core.UnitTests.csproj", "{CC77DCFA-A419-4202-A98A-868CDF457792}"
+EndProject
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Connectors.Memory.Milvus", "src\Connectors\Connectors.Memory.Milvus\Connectors.Memory.Milvus.csproj", "{8B754E80-7A97-4585-8D7E-1D588FA5F727}"
+EndProject
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Plugins.Memory", "src\Plugins\Plugins.Memory\Plugins.Memory.csproj", "{E91365A1-8B01-4AB8-825F-67E3515EADCD}"
+EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
Debug|Any CPU = Debug|Any CPU
@@ -235,11 +254,6 @@ Global
{EC004F12-2F60-4EDD-B3CD-3A504900D929}.Publish|Any CPU.Build.0 = Publish|Any CPU
{EC004F12-2F60-4EDD-B3CD-3A504900D929}.Release|Any CPU.ActiveCfg = Release|Any CPU
{EC004F12-2F60-4EDD-B3CD-3A504900D929}.Release|Any CPU.Build.0 = Release|Any CPU
- {EA61C289-7928-4B78-A9C1-7AAD61F907CD}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
- {EA61C289-7928-4B78-A9C1-7AAD61F907CD}.Debug|Any CPU.Build.0 = Debug|Any CPU
- {EA61C289-7928-4B78-A9C1-7AAD61F907CD}.Publish|Any CPU.ActiveCfg = Release|Any CPU
- {EA61C289-7928-4B78-A9C1-7AAD61F907CD}.Release|Any CPU.ActiveCfg = Release|Any CPU
- {EA61C289-7928-4B78-A9C1-7AAD61F907CD}.Release|Any CPU.Build.0 = Release|Any CPU
{C9F957FA-A70F-4A6D-8F95-23FCD7F4FB87}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{C9F957FA-A70F-4A6D-8F95-23FCD7F4FB87}.Debug|Any CPU.Build.0 = Debug|Any CPU
{C9F957FA-A70F-4A6D-8F95-23FCD7F4FB87}.Publish|Any CPU.ActiveCfg = Publish|Any CPU
@@ -276,24 +290,12 @@ Global
{E3299033-EB81-4C4C-BCD9-E8DC40937969}.Publish|Any CPU.Build.0 = Publish|Any CPU
{E3299033-EB81-4C4C-BCD9-E8DC40937969}.Release|Any CPU.ActiveCfg = Release|Any CPU
{E3299033-EB81-4C4C-BCD9-E8DC40937969}.Release|Any CPU.Build.0 = Release|Any CPU
- {994BEF0B-E277-4D10-BB13-FE670D26620D}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
- {994BEF0B-E277-4D10-BB13-FE670D26620D}.Debug|Any CPU.Build.0 = Debug|Any CPU
- {994BEF0B-E277-4D10-BB13-FE670D26620D}.Publish|Any CPU.ActiveCfg = Publish|Any CPU
- {994BEF0B-E277-4D10-BB13-FE670D26620D}.Publish|Any CPU.Build.0 = Publish|Any CPU
- {994BEF0B-E277-4D10-BB13-FE670D26620D}.Release|Any CPU.ActiveCfg = Release|Any CPU
- {994BEF0B-E277-4D10-BB13-FE670D26620D}.Release|Any CPU.Build.0 = Release|Any CPU
{F51017A9-15C8-472D-893C-080046D710A6}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{F51017A9-15C8-472D-893C-080046D710A6}.Debug|Any CPU.Build.0 = Debug|Any CPU
{F51017A9-15C8-472D-893C-080046D710A6}.Publish|Any CPU.ActiveCfg = Release|Any CPU
{F51017A9-15C8-472D-893C-080046D710A6}.Publish|Any CPU.Build.0 = Release|Any CPU
{F51017A9-15C8-472D-893C-080046D710A6}.Release|Any CPU.ActiveCfg = Release|Any CPU
{F51017A9-15C8-472D-893C-080046D710A6}.Release|Any CPU.Build.0 = Release|Any CPU
- {A350933D-F9D5-4AD3-8C4F-B856B5020297}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
- {A350933D-F9D5-4AD3-8C4F-B856B5020297}.Debug|Any CPU.Build.0 = Debug|Any CPU
- {A350933D-F9D5-4AD3-8C4F-B856B5020297}.Publish|Any CPU.ActiveCfg = Publish|Any CPU
- {A350933D-F9D5-4AD3-8C4F-B856B5020297}.Publish|Any CPU.Build.0 = Publish|Any CPU
- {A350933D-F9D5-4AD3-8C4F-B856B5020297}.Release|Any CPU.ActiveCfg = Release|Any CPU
- {A350933D-F9D5-4AD3-8C4F-B856B5020297}.Release|Any CPU.Build.0 = Release|Any CPU
{EC3BB6D1-2FB2-4702-84C6-F791DE533ED4}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{EC3BB6D1-2FB2-4702-84C6-F791DE533ED4}.Debug|Any CPU.Build.0 = Debug|Any CPU
{EC3BB6D1-2FB2-4702-84C6-F791DE533ED4}.Publish|Any CPU.ActiveCfg = Publish|Any CPU
@@ -352,17 +354,65 @@ Global
{677F1381-7830-4115-9C1A-58B282629DC6}.Publish|Any CPU.Build.0 = Publish|Any CPU
{677F1381-7830-4115-9C1A-58B282629DC6}.Release|Any CPU.ActiveCfg = Release|Any CPU
{677F1381-7830-4115-9C1A-58B282629DC6}.Release|Any CPU.Build.0 = Release|Any CPU
- {4762BCAF-E1C5-4714-B88D-E50FA333C50E}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
- {4762BCAF-E1C5-4714-B88D-E50FA333C50E}.Debug|Any CPU.Build.0 = Debug|Any CPU
- {4762BCAF-E1C5-4714-B88D-E50FA333C50E}.Publish|Any CPU.ActiveCfg = Publish|Any CPU
- {4762BCAF-E1C5-4714-B88D-E50FA333C50E}.Publish|Any CPU.Build.0 = Publish|Any CPU
- {4762BCAF-E1C5-4714-B88D-E50FA333C50E}.Release|Any CPU.ActiveCfg = Release|Any CPU
- {4762BCAF-E1C5-4714-B88D-E50FA333C50E}.Release|Any CPU.Build.0 = Release|Any CPU
{C754950A-E16C-4F96-9CC7-9328E361B5AF}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{C754950A-E16C-4F96-9CC7-9328E361B5AF}.Debug|Any CPU.Build.0 = Debug|Any CPU
{C754950A-E16C-4F96-9CC7-9328E361B5AF}.Publish|Any CPU.ActiveCfg = Release|Any CPU
{C754950A-E16C-4F96-9CC7-9328E361B5AF}.Release|Any CPU.ActiveCfg = Release|Any CPU
{C754950A-E16C-4F96-9CC7-9328E361B5AF}.Release|Any CPU.Build.0 = Release|Any CPU
+ {E07608CC-D710-4655-BB9E-D22CF3CDD193}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+ {E07608CC-D710-4655-BB9E-D22CF3CDD193}.Debug|Any CPU.Build.0 = Debug|Any CPU
+ {E07608CC-D710-4655-BB9E-D22CF3CDD193}.Publish|Any CPU.ActiveCfg = Publish|Any CPU
+ {E07608CC-D710-4655-BB9E-D22CF3CDD193}.Publish|Any CPU.Build.0 = Publish|Any CPU
+ {E07608CC-D710-4655-BB9E-D22CF3CDD193}.Release|Any CPU.ActiveCfg = Release|Any CPU
+ {E07608CC-D710-4655-BB9E-D22CF3CDD193}.Release|Any CPU.Build.0 = Release|Any CPU
+ {10E4B697-D4E8-468D-872D-49670FD150FB}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+ {10E4B697-D4E8-468D-872D-49670FD150FB}.Debug|Any CPU.Build.0 = Debug|Any CPU
+ {10E4B697-D4E8-468D-872D-49670FD150FB}.Publish|Any CPU.ActiveCfg = Publish|Any CPU
+ {10E4B697-D4E8-468D-872D-49670FD150FB}.Publish|Any CPU.Build.0 = Publish|Any CPU
+ {10E4B697-D4E8-468D-872D-49670FD150FB}.Release|Any CPU.ActiveCfg = Release|Any CPU
+ {10E4B697-D4E8-468D-872D-49670FD150FB}.Release|Any CPU.Build.0 = Release|Any CPU
+ {D4540A0F-98E3-4E70-9093-1948AE5B2AAD}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+ {D4540A0F-98E3-4E70-9093-1948AE5B2AAD}.Debug|Any CPU.Build.0 = Debug|Any CPU
+ {D4540A0F-98E3-4E70-9093-1948AE5B2AAD}.Publish|Any CPU.ActiveCfg = Publish|Any CPU
+ {D4540A0F-98E3-4E70-9093-1948AE5B2AAD}.Publish|Any CPU.Build.0 = Publish|Any CPU
+ {D4540A0F-98E3-4E70-9093-1948AE5B2AAD}.Release|Any CPU.ActiveCfg = Release|Any CPU
+ {D4540A0F-98E3-4E70-9093-1948AE5B2AAD}.Release|Any CPU.Build.0 = Release|Any CPU
+ {3DC4DBD8-20A5-4937-B4F5-BB5E24E7A567}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+ {3DC4DBD8-20A5-4937-B4F5-BB5E24E7A567}.Debug|Any CPU.Build.0 = Debug|Any CPU
+ {3DC4DBD8-20A5-4937-B4F5-BB5E24E7A567}.Publish|Any CPU.ActiveCfg = Publish|Any CPU
+ {3DC4DBD8-20A5-4937-B4F5-BB5E24E7A567}.Publish|Any CPU.Build.0 = Publish|Any CPU
+ {3DC4DBD8-20A5-4937-B4F5-BB5E24E7A567}.Release|Any CPU.ActiveCfg = Release|Any CPU
+ {3DC4DBD8-20A5-4937-B4F5-BB5E24E7A567}.Release|Any CPU.Build.0 = Release|Any CPU
+ {5CB78CE4-895B-4A14-98AA-716A37DEEBB1}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+ {5CB78CE4-895B-4A14-98AA-716A37DEEBB1}.Debug|Any CPU.Build.0 = Debug|Any CPU
+ {5CB78CE4-895B-4A14-98AA-716A37DEEBB1}.Publish|Any CPU.ActiveCfg = Debug|Any CPU
+ {5CB78CE4-895B-4A14-98AA-716A37DEEBB1}.Publish|Any CPU.Build.0 = Debug|Any CPU
+ {5CB78CE4-895B-4A14-98AA-716A37DEEBB1}.Release|Any CPU.ActiveCfg = Release|Any CPU
+ {5CB78CE4-895B-4A14-98AA-716A37DEEBB1}.Release|Any CPU.Build.0 = Release|Any CPU
+ {F224B869-FA0E-4EEE-A6BF-C2D61FF8E731}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+ {F224B869-FA0E-4EEE-A6BF-C2D61FF8E731}.Debug|Any CPU.Build.0 = Debug|Any CPU
+ {F224B869-FA0E-4EEE-A6BF-C2D61FF8E731}.Publish|Any CPU.ActiveCfg = Publish|Any CPU
+ {F224B869-FA0E-4EEE-A6BF-C2D61FF8E731}.Publish|Any CPU.Build.0 = Publish|Any CPU
+ {F224B869-FA0E-4EEE-A6BF-C2D61FF8E731}.Release|Any CPU.ActiveCfg = Release|Any CPU
+ {F224B869-FA0E-4EEE-A6BF-C2D61FF8E731}.Release|Any CPU.Build.0 = Release|Any CPU
+ {CC77DCFA-A419-4202-A98A-868CDF457792}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+ {CC77DCFA-A419-4202-A98A-868CDF457792}.Debug|Any CPU.Build.0 = Debug|Any CPU
+ {CC77DCFA-A419-4202-A98A-868CDF457792}.Publish|Any CPU.ActiveCfg = Release|Any CPU
+ {CC77DCFA-A419-4202-A98A-868CDF457792}.Publish|Any CPU.Build.0 = Release|Any CPU
+ {CC77DCFA-A419-4202-A98A-868CDF457792}.Release|Any CPU.ActiveCfg = Release|Any CPU
+ {CC77DCFA-A419-4202-A98A-868CDF457792}.Release|Any CPU.Build.0 = Release|Any CPU
+ {8B754E80-7A97-4585-8D7E-1D588FA5F727}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+ {8B754E80-7A97-4585-8D7E-1D588FA5F727}.Debug|Any CPU.Build.0 = Debug|Any CPU
+ {8B754E80-7A97-4585-8D7E-1D588FA5F727}.Publish|Any CPU.ActiveCfg = Debug|Any CPU
+ {8B754E80-7A97-4585-8D7E-1D588FA5F727}.Publish|Any CPU.Build.0 = Debug|Any CPU
+ {8B754E80-7A97-4585-8D7E-1D588FA5F727}.Release|Any CPU.ActiveCfg = Release|Any CPU
+ {8B754E80-7A97-4585-8D7E-1D588FA5F727}.Release|Any CPU.Build.0 = Release|Any CPU
+ {E91365A1-8B01-4AB8-825F-67E3515EADCD}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+ {E91365A1-8B01-4AB8-825F-67E3515EADCD}.Debug|Any CPU.Build.0 = Debug|Any CPU
+ {E91365A1-8B01-4AB8-825F-67E3515EADCD}.Publish|Any CPU.ActiveCfg = Debug|Any CPU
+ {E91365A1-8B01-4AB8-825F-67E3515EADCD}.Publish|Any CPU.Build.0 = Debug|Any CPU
+ {E91365A1-8B01-4AB8-825F-67E3515EADCD}.Release|Any CPU.ActiveCfg = Release|Any CPU
+ {E91365A1-8B01-4AB8-825F-67E3515EADCD}.Release|Any CPU.Build.0 = Release|Any CPU
EndGlobalSection
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
@@ -373,9 +423,9 @@ Global
{3EB61E99-C39B-4620-9482-F8DA18E48525} = {FA3720F1-C99A-49B2-9577-A940257098BF}
{34A7F1EF-D243-4160-A413-D713FEABCD94} = {FA3720F1-C99A-49B2-9577-A940257098BF}
{E4B777A1-28E1-41BE-96AE-7F3EC61FD5D4} = {831DDCA2-7D2C-4C31-80DB-6BDB3E1F7AE0}
- {F94D1938-9DB7-4B24-9FF3-166DDFD96330} = {9ECD1AA0-75B3-4E25-B0B5-9F0945B64974}
- {689A5041-BAE7-448F-9BDC-4672E96249AA} = {9ECD1AA0-75B3-4E25-B0B5-9F0945B64974}
- {EEA87FBC-4ED5-458C-ABD3-BEAEEB535BAF} = {9ECD1AA0-75B3-4E25-B0B5-9F0945B64974}
+ {F94D1938-9DB7-4B24-9FF3-166DDFD96330} = {D6D598DF-C17C-46F4-B2B9-CDE82E2DE132}
+ {689A5041-BAE7-448F-9BDC-4672E96249AA} = {D6D598DF-C17C-46F4-B2B9-CDE82E2DE132}
+ {EEA87FBC-4ED5-458C-ABD3-BEAEEB535BAF} = {D6D598DF-C17C-46F4-B2B9-CDE82E2DE132}
{37E39C68-5A40-4E63-9D3C-0C66AD98DFCB} = {831DDCA2-7D2C-4C31-80DB-6BDB3E1F7AE0}
{9ECD1AA0-75B3-4E25-B0B5-9F0945B64974} = {831DDCA2-7D2C-4C31-80DB-6BDB3E1F7AE0}
{107156B4-5A8B-45C7-97A2-4544D7FA19DE} = {9ECD1AA0-75B3-4E25-B0B5-9F0945B64974}
@@ -385,17 +435,14 @@ Global
{EB3FC57F-E591-4C88-BCD5-B6A1BC635168} = {0247C2C9-86C3-45BA-8873-28B0948EDC0C}
{5DEBAA62-F117-496A-8778-FED3604B70E2} = {0247C2C9-86C3-45BA-8873-28B0948EDC0C}
{EC004F12-2F60-4EDD-B3CD-3A504900D929} = {0247C2C9-86C3-45BA-8873-28B0948EDC0C}
- {EA61C289-7928-4B78-A9C1-7AAD61F907CD} = {0247C2C9-86C3-45BA-8873-28B0948EDC0C}
{C9F957FA-A70F-4A6D-8F95-23FCD7F4FB87} = {0247C2C9-86C3-45BA-8873-28B0948EDC0C}
{3720F5ED-FB4D-485E-8A93-CDE60DEF0805} = {0247C2C9-86C3-45BA-8873-28B0948EDC0C}
{185E0CE8-C2DA-4E4C-A491-E8EB40316315} = {0247C2C9-86C3-45BA-8873-28B0948EDC0C}
{AFA81EB7-F869-467D-8A90-744305D80AAC} = {0247C2C9-86C3-45BA-8873-28B0948EDC0C}
{627742DB-1E52-468A-99BD-6FF1A542D25B} = {831DDCA2-7D2C-4C31-80DB-6BDB3E1F7AE0}
{E3299033-EB81-4C4C-BCD9-E8DC40937969} = {831DDCA2-7D2C-4C31-80DB-6BDB3E1F7AE0}
- {994BEF0B-E277-4D10-BB13-FE670D26620D} = {078F96B4-09E1-4E0E-B214-F71A4F4BF633}
{078F96B4-09E1-4E0E-B214-F71A4F4BF633} = {831DDCA2-7D2C-4C31-80DB-6BDB3E1F7AE0}
{F51017A9-15C8-472D-893C-080046D710A6} = {078F96B4-09E1-4E0E-B214-F71A4F4BF633}
- {A350933D-F9D5-4AD3-8C4F-B856B5020297} = {078F96B4-09E1-4E0E-B214-F71A4F4BF633}
{EC3BB6D1-2FB2-4702-84C6-F791DE533ED4} = {0247C2C9-86C3-45BA-8873-28B0948EDC0C}
{4D226C2F-AE9F-4EFB-AF2D-45C8FE5CB34E} = {0247C2C9-86C3-45BA-8873-28B0948EDC0C}
{E52F805C-794A-4CA9-B684-DFF358B18820} = {9ECD1AA0-75B3-4E25-B0B5-9F0945B64974}
@@ -410,11 +457,22 @@ Global
{B00AD427-0047-4850-BEF9-BA8237EA9D8B} = {958AD708-F048-4FAF-94ED-D2F2B92748B9}
{DB950192-30F1-48B1-88D7-F43FECCA1A1C} = {958AD708-F048-4FAF-94ED-D2F2B92748B9}
{1C19D805-3573-4477-BF07-40180FCDE1BD} = {958AD708-F048-4FAF-94ED-D2F2B92748B9}
- {0D0C4DAD-E6BC-4504-AE3A-EEA4E35920C1} = {9ECD1AA0-75B3-4E25-B0B5-9F0945B64974}
+ {3CDE10B2-AE8F-4FC4-8D55-92D4AD32E144} = {958AD708-F048-4FAF-94ED-D2F2B92748B9}
+ {0D0C4DAD-E6BC-4504-AE3A-EEA4E35920C1} = {D6D598DF-C17C-46F4-B2B9-CDE82E2DE132}
{E6EDAB8F-3406-4DBF-9AAB-DF40DC2CA0FA} = {FA3720F1-C99A-49B2-9577-A940257098BF}
{677F1381-7830-4115-9C1A-58B282629DC6} = {0247C2C9-86C3-45BA-8873-28B0948EDC0C}
- {4762BCAF-E1C5-4714-B88D-E50FA333C50E} = {078F96B4-09E1-4E0E-B214-F71A4F4BF633}
{C754950A-E16C-4F96-9CC7-9328E361B5AF} = {FA3720F1-C99A-49B2-9577-A940257098BF}
+ {E07608CC-D710-4655-BB9E-D22CF3CDD193} = {0247C2C9-86C3-45BA-8873-28B0948EDC0C}
+ {10E4B697-D4E8-468D-872D-49670FD150FB} = {078F96B4-09E1-4E0E-B214-F71A4F4BF633}
+ {D4540A0F-98E3-4E70-9093-1948AE5B2AAD} = {078F96B4-09E1-4E0E-B214-F71A4F4BF633}
+ {3DC4DBD8-20A5-4937-B4F5-BB5E24E7A567} = {078F96B4-09E1-4E0E-B214-F71A4F4BF633}
+ {D6D598DF-C17C-46F4-B2B9-CDE82E2DE132} = {831DDCA2-7D2C-4C31-80DB-6BDB3E1F7AE0}
+ {5CB78CE4-895B-4A14-98AA-716A37DEEBB1} = {D6D598DF-C17C-46F4-B2B9-CDE82E2DE132}
+ {A21FAC7C-0C09-4EAD-843B-926ACEF73C80} = {831DDCA2-7D2C-4C31-80DB-6BDB3E1F7AE0}
+ {F224B869-FA0E-4EEE-A6BF-C2D61FF8E731} = {A21FAC7C-0C09-4EAD-843B-926ACEF73C80}
+ {CC77DCFA-A419-4202-A98A-868CDF457792} = {A21FAC7C-0C09-4EAD-843B-926ACEF73C80}
+ {8B754E80-7A97-4585-8D7E-1D588FA5F727} = {0247C2C9-86C3-45BA-8873-28B0948EDC0C}
+ {E91365A1-8B01-4AB8-825F-67E3515EADCD} = {D6D598DF-C17C-46F4-B2B9-CDE82E2DE132}
EndGlobalSection
GlobalSection(ExtensibilityGlobals) = postSolution
SolutionGuid = {FBDC56A3-86AD-4323-AA0F-201E59123B83}
diff --git a/dotnet/SK-dotnet.sln.DotSettings b/dotnet/SK-dotnet.sln.DotSettings
index 4d5e6137e95a..78893c5aab58 100644
--- a/dotnet/SK-dotnet.sln.DotSettings
+++ b/dotnet/SK-dotnet.sln.DotSettings
@@ -183,6 +183,7 @@ public void It$SOMENAME$()
copy// Copyright (c) Microsoft. All rights reserved.
+ TrueTrueTrueTrue
@@ -202,6 +203,7 @@ public void It$SOMENAME$()
TrueTrueTrue
+ TrueTrueTrueTrue
@@ -209,6 +211,7 @@ public void It$SOMENAME$()
TrueTrueTrue
+ TrueTrueTrueTrue
@@ -221,6 +224,7 @@ public void It$SOMENAME$()
TrueTrueTrue
+ TrueTrueTrueTrue
diff --git a/dotnet/build.cmd b/dotnet/build.cmd
index 8d80cf630de1..ba30c180d8ae 100644
--- a/dotnet/build.cmd
+++ b/dotnet/build.cmd
@@ -1,7 +1,5 @@
@echo off
-
-cd dotnet
-
-dotnet build --configuration Release --interactive
-
-dotnet test --configuration Release --no-build --no-restore --interactive
+setlocal
+cd "%~dp0"
+dotnet build --configuration Release --interactive ^
+ && dotnet test --configuration Release --no-build --no-restore --interactive
diff --git a/dotnet/docs/TELEMETRY.md b/dotnet/docs/TELEMETRY.md
new file mode 100644
index 000000000000..a031ffb26a1f
--- /dev/null
+++ b/dotnet/docs/TELEMETRY.md
@@ -0,0 +1,156 @@
+# Telemetry
+
+Telemetry in the Semantic Kernel (SK) .NET implementation includes _logging_, _metering_, and _tracing_.
+The code is instrumented using native .NET instrumentation tools, which means that it's possible to use different monitoring platforms (e.g. Application Insights, Prometheus, Grafana, etc.).
+
+A code example using Application Insights can be found [here](https://github.com/microsoft/semantic-kernel/blob/main/dotnet/samples/ApplicationInsightsExample/Program.cs).
+
+## Logging
+
+The logging mechanism in this project relies on the `ILogger` interface from the `Microsoft.Extensions.Logging` namespace. Recent updates have enhanced the logger creation process: instead of constructing `ILogger` instances directly, it is now recommended to create them through an `ILoggerFactory` provided to components via the `WithLoggerFactory` method.
+
+By employing the `WithLoggerFactory` approach, logger instances are generated with precise type information, facilitating more accurate logging and streamlined control over log filtering across various classes.
+
+Log levels used in SK:
+
+- Trace - logs at this level **should not be enabled in production environments**, since they may contain sensitive data. They can be useful in test environments for better observability. Logged information includes:
+ - Goal/Ask to create a plan
+ - Prompt (template and rendered version) for AI to create a plan
+ - Created plan with function arguments (arguments may contain sensitive data)
+ - Prompt (template and rendered version) for AI to execute a function
+- Debug - contains more detailed messages without sensitive data. Can be enabled in production environments.
+- Information (default) - the log level that is enabled by default and provides information about the general flow of the application. Contains the following data:
+ - AI model used to create a plan
+ - Plan creation status (Success/Failed)
+ - Plan creation execution time (in milliseconds)
+ - Created plan without function arguments
+ - AI model used to execute a function
+ - Function execution status (Success/Failed)
+ - Function execution time (in milliseconds)
+- Warning - includes information about unusual events that don't cause the application to fail.
+- Error - used for logging exception details.
+
+### Examples
+
+Enable logging for Kernel instance:
+
+```csharp
+var kernel = new KernelBuilder().WithLoggerFactory(loggerFactory).Build();
+```
+
+Enable logging for Planner instance (_metering_ and _tracing_ will be enabled as well):
+
+```csharp
+var planner = new SequentialPlanner(kernel, plannerConfig).WithInstrumentation(loggerFactory);
+```
+
+### Log Filtering Configuration
+
+Log filtering configuration has been refined to strike a balance between visibility and relevance:
+
+```csharp
+builder.AddFilter("Microsoft", LogLevel.Warning);
+builder.AddFilter("Microsoft.SemanticKernel", LogLevel.Critical);
+builder.AddFilter("Microsoft.SemanticKernel.Reliability", LogLevel.Information);
+```
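+The filters above only take effect once they are wired into an `ILoggerFactory` that is passed to the kernel. A minimal sketch, assuming the console provider from the `Microsoft.Extensions.Logging.Console` package is available:
+
+```csharp
+using Microsoft.Extensions.Logging;
+
+using var loggerFactory = LoggerFactory.Create(builder =>
+{
+    // Same filters as above: warnings for Microsoft.*, critical-only for SK,
+    // but informational logs for the reliability components.
+    builder.AddFilter("Microsoft", LogLevel.Warning);
+    builder.AddFilter("Microsoft.SemanticKernel", LogLevel.Critical);
+    builder.AddFilter("Microsoft.SemanticKernel.Reliability", LogLevel.Information);
+    builder.AddConsole();
+});
+
+var kernel = new KernelBuilder().WithLoggerFactory(loggerFactory).Build();
+```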
+
+## Metering
+
+Metering is implemented with the `Meter` class from the `System.Diagnostics.Metrics` namespace.
+
+Available meters:
+
+- _Microsoft.SemanticKernel.Planning.Action.InstrumentedActionPlanner_ - captures metrics for `ActionPlanner`. List of metrics:
+ - `SK.ActionPlanner.CreatePlan.ExecutionTime` - execution time of plan creation (in milliseconds)
+- _Microsoft.SemanticKernel.Planning.Sequential.InstrumentedSequentialPlanner_ - captures metrics for `SequentialPlanner`. List of metrics:
+ - `SK.SequentialPlanner.CreatePlan.ExecutionTime` - execution time of plan creation (in milliseconds)
+- _Microsoft.SemanticKernel.Planning.Stepwise.StepwisePlanner_ - captures metrics for `StepwisePlanner`. List of metrics:
+ - `SK.StepwisePlanner.CreatePlan.ExecutionTime` - execution time of plan creation (in milliseconds)
+- _Microsoft.SemanticKernel.Planning.Plan_ - captures metrics for `Plan`. List of metrics:
+ - `SK.Plan.Execution.ExecutionTime` - plan execution time (in milliseconds)
+ - `SK.Plan.Execution.ExecutionTotal` - total number of plan executions
+ - `SK.Plan.Execution.ExecutionSuccess` - number of successful plan executions
+ - `SK.Plan.Execution.ExecutionFailure` - number of failed plan executions
+- _Microsoft.SemanticKernel.SKFunction_ - captures metrics for `SKFunction`. List of metrics:
+ - `SK.<FunctionName>.ExecutionTime` - function execution time (in milliseconds)
+ - `SK.<FunctionName>.ExecutionTotal` - total number of function executions
+ - `SK.<FunctionName>.ExecutionSuccess` - number of successful function executions
+ - `SK.<FunctionName>.ExecutionFailure` - number of failed function executions
+- _Microsoft.SemanticKernel.Connectors.AI.OpenAI_ - captures metrics for OpenAI functionality. List of metrics:
+ - `SK.Connectors.OpenAI.PromptTokens` - number of prompt tokens used.
+ - `SK.Connectors.OpenAI.CompletionTokens` - number of completion tokens used.
+ - `SK.Connectors.OpenAI.TotalTokens` - total number of tokens used.
+
+### Examples
+
+Depending on the monitoring tool, there are different ways to subscribe to the available meters. The following example shows how to subscribe to them and export metrics to Application Insights using `MeterListener`:
+
+```csharp
+var meterListener = new MeterListener();
+
+meterListener.InstrumentPublished = (instrument, listener) =>
+{
+ if (instrument.Meter.Name.StartsWith("Microsoft.SemanticKernel", StringComparison.Ordinal))
+ {
+ listener.EnableMeasurementEvents(instrument);
+ }
+};
+
+// Set callback to specific numeric type - double.
+meterListener.SetMeasurementEventCallback<double>((instrument, measurement, tags, state) =>
+{
+ // Export to Application Insights using telemetry client instance
+ telemetryClient.GetMetric(instrument.Name).TrackValue(measurement);
+});
+
+meterListener.Start();
+```
+
+It's possible to control which meters to subscribe to. For example, the following condition subscribes to all meters in Semantic Kernel:
+
+```csharp
+instrument.Meter.Name.StartsWith("Microsoft.SemanticKernel", StringComparison.Ordinal)
+```
+
+It's also possible to subscribe to a specific meter. The following condition subscribes to the meter for `SKFunction` only:
+
+```csharp
+instrument.Meter.Name.Equals("Microsoft.SemanticKernel.SKFunction", StringComparison.Ordinal)
+```
+
+## Tracing
+
+Tracing is implemented with the `Activity` class from the `System.Diagnostics` namespace.
+
+Available activity sources:
+
+- _Microsoft.SemanticKernel.Planning.Action.InstrumentedActionPlanner_ - creates activities for `ActionPlanner`.
+- _Microsoft.SemanticKernel.Planning.Sequential.InstrumentedSequentialPlanner_ - creates activities for `SequentialPlanner`.
+- _Microsoft.SemanticKernel.Planning.Stepwise.StepwisePlanner_ - creates activities for `StepwisePlanner`.
+- _Microsoft.SemanticKernel.Planning.Plan_ - creates activities for `Plan`.
+- _Microsoft.SemanticKernel.SKFunction_ - creates activities for `SKFunction`.
+
+### Examples
+
+Subscribe to available activity sources using `ActivityListener`:
+
+```csharp
+var activityListener = new ActivityListener();
+
+activityListener.ShouldListenTo =
+ activitySource => activitySource.Name.StartsWith("Microsoft.SemanticKernel", StringComparison.Ordinal);
+
+ActivitySource.AddActivityListener(activityListener);
+```
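+Note that `ShouldListenTo` alone does not record any data: the listener also needs a sampling decision and, typically, start/stop callbacks, all configured before calling `AddActivityListener`. A minimal sketch using the `System.Diagnostics` callbacks:
+
+```csharp
+// Record all data for the sources selected by ShouldListenTo
+activityListener.Sample = (ref ActivityCreationOptions<ActivityContext> _) =>
+    ActivitySamplingResult.AllData;
+
+// Inspect completed activities, e.g. to log their duration
+activityListener.ActivityStopped = activity =>
+    Console.WriteLine($"{activity.Source.Name}/{activity.DisplayName}: {activity.Duration}");
+```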
+
+The following condition subscribes to all activity sources in Semantic Kernel:
+
+```csharp
+activitySource.Name.StartsWith("Microsoft.SemanticKernel", StringComparison.Ordinal)
+```
+
+It's also possible to subscribe to a specific activity source. The following condition subscribes to the activity source for `SKFunction` only:
+
+```csharp
+activitySource.Name.Equals("Microsoft.SemanticKernel.SKFunction", StringComparison.Ordinal)
+```
diff --git a/samples/notebooks/dotnet/0-AI-settings.ipynb b/dotnet/notebooks/0-AI-settings.ipynb
similarity index 100%
rename from samples/notebooks/dotnet/0-AI-settings.ipynb
rename to dotnet/notebooks/0-AI-settings.ipynb
diff --git a/dotnet/notebooks/00-getting-started.ipynb b/dotnet/notebooks/00-getting-started.ipynb
new file mode 100644
index 000000000000..597e68c3a3ed
--- /dev/null
+++ b/dotnet/notebooks/00-getting-started.ipynb
@@ -0,0 +1,192 @@
+{
+ "cells": [
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "#### Watch the Getting Started Quick Start [Video](https://aka.ms/SK-Getting-Started-Notebook)\n",
+ "\n",
+ "> [!IMPORTANT]\n",
+ "> You will need the [.NET 7 SDK](https://dotnet.microsoft.com/en-us/download) and [Polyglot](https://marketplace.visualstudio.com/items?itemName=ms-dotnettools.dotnet-interactive-vscode) to get started with this notebook using .NET Interactive"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "**Step 1**: Configure your AI service credentials\n",
+ "\n",
+ "Use [this notebook](0-AI-settings.ipynb) first, to choose whether to run these notebooks with OpenAI or Azure OpenAI,\n",
+ "and to save your credentials in the configuration file."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "// Load some helper functions, e.g. to load values from settings.json\n",
+ "#!import config/Settings.cs "
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "**Step 2**: Import Semantic Kernel SDK from NuGet"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "// Import Semantic Kernel\n",
+ "#r \"nuget: Microsoft.SemanticKernel, 1.0.0-beta1\""
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "**Step 3**: Instantiate the Kernel"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "using Microsoft.SemanticKernel;\n",
+ "\n",
+ "//Create Kernel builder\n",
+ "var builder = new KernelBuilder();"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "// Configure AI service credentials used by the kernel\n",
+ "var (useAzureOpenAI, model, azureEndpoint, apiKey, orgId) = Settings.LoadFromFile();\n",
+ "\n",
+ "if (useAzureOpenAI)\n",
+ " builder.WithAzureChatCompletionService(model, azureEndpoint, apiKey);\n",
+ "else\n",
+ " builder.WithOpenAIChatCompletionService(model, apiKey, orgId);\n",
+ "\n",
+ "IKernel kernel = builder.Build();"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "**Step 4**: Load and Run a Plugin"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "// Load the Plugins Directory\n",
+ "var pluginsDirectory = Path.Combine(System.IO.Directory.GetCurrentDirectory(), \"..\", \"..\", \"samples\", \"plugins\");\n",
+ "\n",
+ "// Load the FunPlugin from the Plugins Directory\n",
+ "var funPluginFunctions = kernel.ImportSemanticFunctionsFromDirectory(pluginsDirectory, \"FunPlugin\");\n",
+ "\n",
+ "// Run the Function called Joke\n",
+ "var result = await kernel.RunAsync(\"time travel to dinosaur age\", funPluginFunctions[\"Joke\"]);\n",
+ "var resultString = result.GetValue<string>();\n",
+ "\n",
+ "// Return the result to the Notebook\n",
+ "Console.WriteLine(resultString);"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "**Next Steps**: You know the basics, let's try this in a sample app so you can learn the core concepts!\n",
+ "\n",
+ "Sample app learning examples:\n",
+ "- [Simple chat summary](../../samples/apps/chat-summary-webapp-react/README.md) (**Recommended**) – learn how basic semantic functions can be added to an app\n",
+ "- [Book creator](../../samples/apps/book-creator-webapp-react/README.md) – learn how Planner and chaining of semantic functions can be used in your app\n",
+ "- [Authentication and APIs](../../samples/dotnet/MsGraphPluginsExample/README.md) – learn how to connect to external API's with authentication while using Semantic Kernel\n",
+ "- [GitHub repository Q&A](../../samples/apps/github-qna-webapp-react/README.md) - Use embeddings and memory to store and query your data\n",
+ "- [Copilot Chat](../../samples/apps/copilot-chat-app/README.md) – Build your own chatbot based on Semantic Kernel"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": ".NET (C#)",
+ "language": "C#",
+ "name": ".net-csharp"
+ },
+ "language_info": {
+ "name": "polyglot-notebook"
+ },
+ "polyglot_notebook": {
+ "kernelInfo": {
+ "defaultKernelName": "csharp",
+ "items": [
+ {
+ "aliases": [],
+ "name": "csharp"
+ }
+ ]
+ }
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
diff --git a/dotnet/notebooks/01-basic-loading-the-kernel.ipynb b/dotnet/notebooks/01-basic-loading-the-kernel.ipynb
new file mode 100644
index 000000000000..01b143634017
--- /dev/null
+++ b/dotnet/notebooks/01-basic-loading-the-kernel.ipynb
@@ -0,0 +1,214 @@
+{
+ "cells": [
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Basic Loading of the Kernel"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The Semantic Kernel SDK can be imported from the following NuGet feed:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "#r \"nuget: Microsoft.SemanticKernel, 1.0.0-beta1\""
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "After adding the NuGet package, you can instantiate the kernel in a few ways, depending on your use case.\n"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "using Microsoft.SemanticKernel;\n",
+ "\n",
+ "// Simple instance\n",
+ "IKernel kernel_1 = KernelBuilder.Create();"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "using Microsoft.Extensions.Logging;\n",
+ "using Microsoft.Extensions.Logging.Abstractions;\n",
+ "\n",
+ "// Inject your logger \n",
+ "// see Microsoft.Extensions.Logging.ILogger @ https://learn.microsoft.com/dotnet/core/extensions/logging\n",
+ "ILoggerFactory myLoggerFactory = NullLoggerFactory.Instance;\n",
+ "IKernel kernel_2 = new KernelBuilder()\n",
+ " .WithLoggerFactory(myLoggerFactory)\n",
+ " .Build();"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "When using the kernel for AI requests, it needs settings such as the endpoint URL and credentials for the AI models.\n",
+ "\n",
+ "The SDK currently supports OpenAI, Azure OpenAI, and Hugging Face. It's also possible to create your own connector and use the AI provider of your choice.\n",
+ "\n",
+ "If you need an Azure OpenAI key, go [here](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/quickstart?pivots=rest-api)."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "Kernel.Builder\n",
+ ".WithAzureChatCompletionService(\n",
+ " \"my-finetuned-model\", // Azure OpenAI *Deployment Name*\n",
+ " \"https://contoso.openai.azure.com/\", // Azure OpenAI *Endpoint*\n",
+ " \"...your Azure OpenAI Key...\", // Azure OpenAI *Key*\n",
+ " serviceId: \"Azure_curie\" // alias used in the prompt templates' config.json\n",
+ ")\n",
+ ".WithOpenAIChatCompletionService(\n",
+ " \"gpt-3.5-turbo\", // OpenAI Model Name\n",
+ " \"...your OpenAI API Key...\", // OpenAI API key\n",
+ " \"...your OpenAI Org ID...\", // *optional* OpenAI Organization ID\n",
+ " serviceId: \"OpenAI_davinci\" // alias used in the prompt templates' config.json\n",
+ ");"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "When working with multiple backends and multiple models, the **first backend** defined\n",
+ "is also the \"**default**\" used in these scenarios:\n",
+ "\n",
+ "* a prompt configuration doesn't specify which AI backend to use\n",
+ "* a prompt configuration requires a backend unknown to the kernel\n",
+ "\n",
+ "The default can be set programmatically:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "Kernel.Builder\n",
+ ".WithOpenAIChatCompletionService(\n",
+ " \"gpt-3.5-turbo\", // OpenAI Model Name\n",
+ " \"...your OpenAI API Key...\", // OpenAI API key\n",
+ " \"...your OpenAI Org ID...\", // *optional* OpenAI Organization ID\n",
+ " \"OpenAI_davinci\", // alias used in the prompt templates' config.json\n",
+ " true // This flag specifies that this service is the default one.\n",
+ ");"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Great, now that you're familiar with setting up the Semantic Kernel, let's see [how we can use it to run prompts](02-running-prompts-from-file.ipynb)."
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": ".NET (C#)",
+ "language": "C#",
+ "name": ".net-csharp"
+ },
+ "language_info": {
+ "file_extension": ".cs",
+ "mimetype": "text/x-csharp",
+ "name": "C#",
+ "pygments_lexer": "csharp",
+ "version": "11.0"
+ },
+ "polyglot_notebook": {
+ "kernelInfo": {
+ "defaultKernelName": "csharp",
+ "items": [
+ {
+ "aliases": [],
+ "name": "csharp"
+ }
+ ]
+ }
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
diff --git a/dotnet/notebooks/02-running-prompts-from-file.ipynb b/dotnet/notebooks/02-running-prompts-from-file.ipynb
new file mode 100644
index 000000000000..dadfe2466a53
--- /dev/null
+++ b/dotnet/notebooks/02-running-prompts-from-file.ipynb
@@ -0,0 +1,198 @@
+{
+ "cells": [
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# How to run semantic plugins from file\n",
+ "Now that you're familiar with Kernel basics, let's see how the kernel allows you to run Semantic Plugins and Semantic Functions stored on disk. \n",
+ "\n",
+ "A Semantic Plugin is a collection of Semantic Functions, where each function is defined in natural language and can be provided as a text file.\n",
+ "\n",
+ "Refer to our [glossary](../../docs/GLOSSARY.md) for an in-depth guide to the terms.\n",
+ "\n",
+ "The repository includes some examples under the [samples](https://github.com/microsoft/semantic-kernel/tree/main/samples) folder.\n",
+ "\n",
+ "For instance, [this](../../samples/plugins/FunPlugin/Joke/skprompt.txt) is the **Joke function** part of the **FunPlugin plugin**:"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "```\n",
+ "WRITE EXACTLY ONE JOKE or HUMOROUS STORY ABOUT THE TOPIC BELOW.\n",
+ "JOKE MUST BE:\n",
+ "- G RATED\n",
+ "- WORKPLACE/FAMILY SAFE\n",
+ "NO SEXISM, RACISM OR OTHER BIAS/BIGOTRY.\n",
+ "BE CREATIVE AND FUNNY. I WANT TO LAUGH.\n",
+ "+++++\n",
+ "{{$input}}\n",
+ "+++++\n",
+ "```"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Note the special **`{{$input}}`** token, which is a variable that is automatically passed when invoking the function, commonly referred to as a \"function parameter\". \n",
+ "\n",
+ "We'll explore later how functions can accept multiple variables, as well as invoke other functions."
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "\n",
+ "In the same folder you'll notice a second [config.json](../../samples/plugins/FunPlugin/Joke/config.json) file. The file is optional, and is used to set some parameters for large language models like Temperature, TopP, Stop Sequences, etc.\n",
+ "\n",
+ "```\n",
+ "{\n",
+ " \"schema\": 1,\n",
+ " \"description\": \"Generate a funny joke\",\n",
+ " \"models\": [\n",
+ " {\n",
+ " \"max_tokens\": 500,\n",
+ " \"temperature\": 0.5,\n",
+ " \"top_p\": 0.5\n",
+ " }\n",
+ " ]\n",
+ "}\n",
+ "```"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Given a semantic function defined by these files, this is how to load and use a file-based semantic function.\n",
+ "\n",
+ "Configure and create the kernel, as usual, also loading the AI backend settings defined in the [Setup notebook](0-AI-settings.ipynb):"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "#r \"nuget: Microsoft.SemanticKernel, 1.0.0-beta1\"\n",
+ "\n",
+ "#!import config/Settings.cs\n",
+ "\n",
+ "using Microsoft.SemanticKernel;\n",
+ "\n",
+ "var builder = new KernelBuilder();\n",
+ "\n",
+ "// Configure AI backend used by the kernel\n",
+ "var (useAzureOpenAI, model, azureEndpoint, apiKey, orgId) = Settings.LoadFromFile();\n",
+ "if (useAzureOpenAI)\n",
+ " builder.WithAzureChatCompletionService(model, azureEndpoint, apiKey);\n",
+ "else\n",
+ " builder.WithOpenAIChatCompletionService(model, apiKey, orgId);\n",
+ "\n",
+ "IKernel kernel = builder.Build();"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Import the plugin and all its functions:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "// note: using plugins from the repo\n",
+ "var pluginsDirectory = Path.Combine(System.IO.Directory.GetCurrentDirectory(), \"..\", \"..\", \"samples\", \"plugins\");\n",
+ "\n",
+ "var funPluginFunctions = kernel.ImportSemanticFunctionsFromDirectory(pluginsDirectory, \"FunPlugin\");"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Use the plugin's functions, e.g. to generate a joke about \"*time travel to dinosaur age*\":"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var result = await kernel.RunAsync(\"time travel to dinosaur age\", funPluginFunctions[\"Joke\"]);\n",
+ "var resultString = result.GetValue<string>();\n",
+ "\n",
+ "Console.WriteLine(resultString);"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Great, now that you know how to load a plugin from disk, let's show how you can [create and run a semantic function inline.](./03-semantic-function-inline.ipynb)"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": ".NET (C#)",
+ "language": "C#",
+ "name": ".net-csharp"
+ },
+ "language_info": {
+ "name": "polyglot-notebook"
+ },
+ "polyglot_notebook": {
+ "kernelInfo": {
+ "defaultKernelName": "csharp",
+ "items": [
+ {
+ "aliases": [],
+ "name": "csharp"
+ }
+ ]
+ }
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
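The on-disk layout this notebook relies on — one folder per function, holding an `skprompt.txt` and an optional `config.json` — can be sketched with a minimal loader. This is a conceptual illustration of the layout only, not SK's `ImportSemanticFunctionsFromDirectory`:

```python
import json
import tempfile
from pathlib import Path

def load_plugin(plugins_dir, plugin_name):
    """Read each function folder's skprompt.txt plus its optional config.json."""
    plugin = {}
    for fn_dir in (Path(plugins_dir) / plugin_name).iterdir():
        prompt_file = fn_dir / "skprompt.txt"
        if not prompt_file.is_file():
            continue  # not a function folder
        config_file = fn_dir / "config.json"
        plugin[fn_dir.name] = {
            "prompt": prompt_file.read_text(),
            "config": json.loads(config_file.read_text()) if config_file.is_file() else {},
        }
    return plugin

# Build a throwaway FunPlugin/Joke folder to demonstrate the layout.
root = Path(tempfile.mkdtemp())
joke_dir = root / "FunPlugin" / "Joke"
joke_dir.mkdir(parents=True)
(joke_dir / "skprompt.txt").write_text("WRITE A JOKE ABOUT {{$input}}")

fun_plugin = load_plugin(root, "FunPlugin")
```

Note how a missing `config.json` simply yields an empty config, matching the notebook's statement that the file is optional.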
diff --git a/dotnet/notebooks/03-semantic-function-inline.ipynb b/dotnet/notebooks/03-semantic-function-inline.ipynb
new file mode 100644
index 000000000000..d5326783be7b
--- /dev/null
+++ b/dotnet/notebooks/03-semantic-function-inline.ipynb
@@ -0,0 +1,374 @@
+{
+ "cells": [
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Running Semantic Functions Inline"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The [previous notebook](./02-running-prompts-from-file.ipynb)\n",
+ "showed how to define a semantic function using a prompt template stored on a file.\n",
+ "\n",
+ "In this notebook, we'll show how to use the Semantic Kernel to define functions inline with your C# code. This can be useful in a few scenarios:\n",
+ "\n",
+ "* Dynamically generating the prompt using complex rules at runtime\n",
+ "* Writing prompts by editing C# code instead of text files\n",
+ "* Easily creating demos, like this document\n",
+ "\n",
+ "Prompt templates are defined using the SK template language, which allows you to reference variables and functions. Read [this doc](https://aka.ms/sk/howto/configurefunction) to learn more about the design decisions for prompt templating. \n",
+ "\n",
+ "For now we'll use only the `{{$input}}` variable, and see more complex templates later.\n",
+ "\n",
+ "Almost all semantic function prompts have a reference to `{{$input}}`, which is the default way\n",
+ "a user can import content from the context variables."
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Prepare a semantic kernel instance first, also loading the AI backend settings defined in the [Setup notebook](0-AI-settings.ipynb):"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "#r \"nuget: Microsoft.SemanticKernel, 1.0.0-beta1\"\n",
+ "\n",
+ "#!import config/Settings.cs\n",
+ "\n",
+ "using Microsoft.SemanticKernel;\n",
+ "using Microsoft.SemanticKernel.SemanticFunctions;\n",
+ "using Microsoft.SemanticKernel.Connectors.AI.OpenAI;\n",
+ "\n",
+ "var builder = new KernelBuilder();\n",
+ "\n",
+ "// Configure AI backend used by the kernel\n",
+ "var (useAzureOpenAI, model, azureEndpoint, apiKey, orgId) = Settings.LoadFromFile();\n",
+ "if (useAzureOpenAI)\n",
+ " builder.WithAzureChatCompletionService(model, azureEndpoint, apiKey);\n",
+ "else\n",
+ " builder.WithOpenAIChatCompletionService(model, apiKey, orgId);\n",
+ "\n",
+ "IKernel kernel = builder.Build();"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Let's create a semantic function used to summarize content:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "string skPrompt = \"\"\"\n",
+ "{{$input}}\n",
+ "\n",
+ "Summarize the content above.\n",
+ "\"\"\";"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Let's configure the prompt, e.g. allowing for some creativity and a sufficient number of tokens."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var aiRequestSettings = new OpenAIRequestSettings \n",
+ "{\n",
+ " MaxTokens = 2000,\n",
+ " Temperature = 0.2,\n",
+ " TopP = 0.5\n",
+ "};\n",
+ "\n",
+ "var promptConfig = new PromptTemplateConfig();\n",
+ "promptConfig.ModelSettings.Add(aiRequestSettings);"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The following code prepares an instance of the template, passing in the prompt text and configuration above, \n",
+ "and a couple of other parameters (how to render the text and how the template can access other functions)."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var promptTemplate = new PromptTemplate(\n",
+ " skPrompt, // Prompt template defined in natural language\n",
+ " promptConfig, // Prompt configuration\n",
+ " kernel // SK instance\n",
+ ");"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Let's transform the prompt template into a function that the kernel can execute:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var functionConfig = new SemanticFunctionConfig(promptConfig, promptTemplate);\n",
+ "\n",
+ "var summaryFunction = kernel.RegisterSemanticFunction(\"MyPlugin\", \"Summary\", functionConfig);"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Set up some content to summarize. Here's an extract about Demo, an ancient Greek poet, taken from [Wikipedia](https://en.wikipedia.org/wiki/Demo_(ancient_Greek_poet))."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var input = \"\"\"\n",
+ "Demo (ancient Greek poet)\n",
+ "From Wikipedia, the free encyclopedia\n",
+ "Demo or Damo (Greek: Δεμώ, Δαμώ; fl. c. AD 200) was a Greek woman of the Roman period, known for a single epigram, engraved upon the Colossus of Memnon, which bears her name. She speaks of herself therein as a lyric poetess dedicated to the Muses, but nothing is known of her life.[1]\n",
+ "Identity\n",
+ "Demo was evidently Greek, as her name, a traditional epithet of Demeter, signifies. The name was relatively common in the Hellenistic world, in Egypt and elsewhere, and she cannot be further identified. The date of her visit to the Colossus of Memnon cannot be established with certainty, but internal evidence on the left leg suggests her poem was inscribed there at some point in or after AD 196.[2]\n",
+ "Epigram\n",
+ "There are a number of graffiti inscriptions on the Colossus of Memnon. Following three epigrams by Julia Balbilla, a fourth epigram, in elegiac couplets, entitled and presumably authored by \"Demo\" or \"Damo\" (the Greek inscription is difficult to read), is a dedication to the Muses.[2] The poem is traditionally published with the works of Balbilla, though the internal evidence suggests a different author.[1]\n",
+ "In the poem, Demo explains that Memnon has shown her special respect. In return, Demo offers the gift for poetry, as a gift to the hero. At the end of this epigram, she addresses Memnon, highlighting his divine status by recalling his strength and holiness.[2]\n",
+ "Demo, like Julia Balbilla, writes in the artificial and poetic Aeolic dialect. The language indicates she was knowledgeable in Homeric poetry—'bearing a pleasant gift', for example, alludes to the use of that phrase throughout the Iliad and Odyssey.[a][2] \n",
+ "\"\"\";"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "...and run the summary function:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var summaryResult = await kernel.RunAsync(input, summaryFunction);\n",
+ "var summary = summaryResult.GetValue<string>();\n",
+ "\n",
+ "Console.WriteLine(summary);"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The code above shows all the steps, so you can see how the function is composed. However, the kernel\n",
+ "also includes helpers to achieve the same result more concisely.\n",
+ "\n",
+ "The same function above can be created with less code:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "string skPrompt = \"\"\"\n",
+ "{{$input}}\n",
+ "\n",
+ "Summarize the content above.\n",
+ "\"\"\";\n",
+ "\n",
+ "var summaryFunction = kernel.CreateSemanticFunction(skPrompt, requestSettings: new OpenAIRequestSettings { MaxTokens = 2000, Temperature = 0.2, TopP = 0.5 });\n",
+ "\n",
+ "var summaryResult = await kernel.RunAsync(input, summaryFunction);\n",
+ "var summary = summaryResult.GetValue<string>();\n",
+ "\n",
+ "Console.WriteLine(summary);"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Here's one more example of how to write an inline Semantic Function that gives a TLDR for a piece of text.\n",
+ "\n",
+ "\n"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "tags": []
+ },
+ "outputs": [],
+ "source": [
+ "var builder = new KernelBuilder();\n",
+ "\n",
+ "var (useAzureOpenAI, model, azureEndpoint, apiKey, orgId) = Settings.LoadFromFile();\n",
+ "\n",
+ "if (useAzureOpenAI)\n",
+ " builder.WithAzureChatCompletionService(model, azureEndpoint, apiKey);\n",
+ "else\n",
+ " builder.WithOpenAIChatCompletionService(model, apiKey, orgId);\n",
+ "\n",
+ "var kernel = builder.Build();\n",
+ "\n",
+ "string skPrompt = @\"\n",
+ "{{$input}}\n",
+ "\n",
+ "Give me the TLDR in 5 words.\n",
+ "\";\n",
+ "\n",
+ "var textToSummarize = @\"\n",
+ " 1) A robot may not injure a human being or, through inaction,\n",
+ " allow a human being to come to harm.\n",
+ "\n",
+ " 2) A robot must obey orders given it by human beings except where\n",
+ " such orders would conflict with the First Law.\n",
+ "\n",
+ " 3) A robot must protect its own existence as long as such protection\n",
+ " does not conflict with the First or Second Law.\n",
+ "\";\n",
+ "\n",
+ "var tldrFunction = kernel.CreateSemanticFunction(skPrompt, requestSettings: new OpenAIRequestSettings { MaxTokens = 2000, Temperature = 0.2, TopP = 0.5 });\n",
+ "\n",
+ "var summaryResult = await kernel.RunAsync(textToSummarize, tldrFunction);\n",
+ "var summary = summaryResult.GetValue<string>();\n",
+ "\n",
+ "Console.WriteLine(summary);\n",
+ "\n",
+ "// Output => Robots must not harm humans."
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": ".NET (C#)",
+ "language": "C#",
+ "name": ".net-csharp"
+ },
+ "language_info": {
+ "name": "polyglot-notebook"
+ },
+ "polyglot_notebook": {
+ "kernelInfo": {
+ "defaultKernelName": "csharp",
+ "items": [
+ {
+ "aliases": [],
+ "name": "csharp"
+ }
+ ]
+ }
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
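The `{{$input}}` substitution that inline semantic functions rely on can be illustrated with a minimal template renderer. This is a conceptual sketch of variable substitution only, not SK's template engine (which also supports function calls):

```python
import re

def render(template, variables):
    """Replace each {{$name}} token with the matching variable's value."""
    return re.sub(r"\{\{\$(\w+)\}\}",
                  lambda m: variables.get(m.group(1), ""),
                  template)

# Render the TLDR-style prompt from the notebook with a sample input.
prompt = render("{{$input}}\n\nGive me the TLDR in 5 words.",
                {"input": "1) A robot may not injure a human being..."})
```

Unknown variables render as empty strings here; whatever the template engine's actual policy, the key point is that `{{$input}}` is resolved from the context at invocation time, not at definition time.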
diff --git a/dotnet/notebooks/04-context-variables-chat.ipynb b/dotnet/notebooks/04-context-variables-chat.ipynb
new file mode 100644
index 000000000000..f5c33f746a9d
--- /dev/null
+++ b/dotnet/notebooks/04-context-variables-chat.ipynb
@@ -0,0 +1,390 @@
+{
+ "cells": [
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Creating a basic chat experience with context variables\n",
+ "\n",
+ "In this example, we show how you can build a simple chat bot by sending and updating context with your requests. \n",
+ "\n",
+ "We introduce the Context Variables object, which in this demo functions similarly to a key-value store that you can use when running the kernel.\n",
+ "\n",
+ "The context is local (i.e. in your computer's RAM) and not persisted anywhere beyond the life of this notebook session.\n",
+ "\n",
+ "In future examples, we will show how to persist the context on disk so that you can bring it into your applications. \n",
+ "\n",
+ "In this chat scenario, as the user talks back and forth with the bot, the context gets populated with the history of the conversation. During each new run of the kernel, the context can provide the AI with its variables' content. "
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "#r \"nuget: Microsoft.SemanticKernel, 1.0.0-beta1\"\n",
+ "#!import config/Settings.cs\n",
+ "\n",
+ "using Microsoft.SemanticKernel;\n",
+ "using Microsoft.SemanticKernel.SemanticFunctions;\n",
+ "using Microsoft.SemanticKernel.Orchestration;\n",
+ "using Microsoft.SemanticKernel.Connectors.AI.OpenAI;\n",
+ "\n",
+ "var builder = new KernelBuilder();\n",
+ "\n",
+ "// Configure AI backend used by the kernel\n",
+ "var (useAzureOpenAI, model, azureEndpoint, apiKey, orgId) = Settings.LoadFromFile();\n",
+ "if (useAzureOpenAI)\n",
+ " builder.WithAzureChatCompletionService(model, azureEndpoint, apiKey);\n",
+ "else\n",
+ " builder.WithOpenAIChatCompletionService(model, apiKey, orgId);\n",
+ "\n",
+ "IKernel kernel = builder.Build();"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Let's define a prompt outlining a dialogue chat bot."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "const string skPrompt = @\"\n",
+ "ChatBot can have a conversation with you about any topic.\n",
+ "It can give explicit instructions or say 'I don't know' if it does not have an answer.\n",
+ "\n",
+ "{{$history}}\n",
+ "User: {{$userInput}}\n",
+ "ChatBot:\";\n",
+ "\n",
+ "var aiRequestSettings = new OpenAIRequestSettings \n",
+ "{\n",
+ " MaxTokens = 2000,\n",
+ " Temperature = 0.7,\n",
+ " TopP = 0.5\n",
+ "};\n",
+ "\n",
+ "var promptConfig = new PromptTemplateConfig();\n",
+ "promptConfig.ModelSettings.Add(aiRequestSettings);"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Register your semantic function"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var promptTemplate = new PromptTemplate(skPrompt, promptConfig, kernel);\n",
+ "var functionConfig = new SemanticFunctionConfig(promptConfig, promptTemplate);\n",
+ "var chatFunction = kernel.RegisterSemanticFunction(\"ChatBot\", \"Chat\", functionConfig);"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Initialize your context"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var context = kernel.CreateNewContext();\n",
+ "\n",
+ "var history = \"\";\n",
+ "context.Variables[\"history\"] = history;"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Chat with the Bot"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var userInput = \"Hi, I'm looking for book suggestions\";\n",
+ "context.Variables[\"userInput\"] = userInput;\n",
+ "\n",
+ "var bot_answer = await chatFunction.InvokeAsync(context);"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Update the history with the output and set this as the new input value for the next request"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "history += $\"\\nUser: {userInput}\\nChatBot: {bot_answer.GetValue<string>()}\\n\";\n",
+ "context.Variables.Update(history);\n",
+ "\n",
+ "Console.WriteLine(context);"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Keep Chatting!"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "Func<string, Task> Chat = async (string input) => {\n",
+ " // Save new message in the context variables\n",
+ " context.Variables[\"userInput\"] = input;\n",
+ "\n",
+ " // Process the user message and get an answer\n",
+ " var answer = await chatFunction.InvokeAsync(context);\n",
+ "\n",
+ " // Append the new interaction to the chat history\n",
+ " history += $\"\\nUser: {input}\\nChatBot: {answer.GetValue<string>()}\\n\"; \n",
+ " context.Variables[\"history\"] = history;\n",
+ " \n",
+ " // Show the response\n",
+ " Console.WriteLine(context);\n",
+ "};"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "await Chat(\"I would like a non-fiction book suggestion about Greece history. Please only list one book.\");"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "await Chat(\"that sounds interesting, what are some of the topics I will learn about?\");"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "await Chat(\"Which topic from the ones you listed do you think most people find interesting?\");"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "await Chat(\"could you list some more books I could read about the topic(s) you mentioned?\");"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "After chatting for a while, we have built a growing history, which we are attaching to each prompt and which contains the full conversation. Let's take a look!"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "Console.WriteLine(context.Variables[\"history\"]);"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": ".NET (C#)",
+ "language": "C#",
+ "name": ".net-csharp"
+ },
+ "language_info": {
+ "file_extension": ".cs",
+ "mimetype": "text/x-csharp",
+ "name": "C#",
+ "pygments_lexer": "csharp",
+ "version": "11.0"
+ },
+ "polyglot_notebook": {
+ "kernelInfo": {
+ "defaultKernelName": "csharp",
+ "items": [
+ {
+ "aliases": [],
+ "name": "csharp"
+ }
+ ]
+ }
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
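The chat pattern above — append each turn to a growing history and re-render the prompt — can be sketched independently of SK. Here `complete` is a stand-in for the model call, and the names are illustrative only:

```python
def make_chat(complete):
    """Return a chat function that threads a growing history through each call."""
    history = []

    def chat(user_input):
        # Render the prompt: full history so far, then the new user turn.
        prompt = "".join(history) + f"User: {user_input}\nChatBot:"
        answer = complete(prompt)
        # Append the new interaction so the next turn sees it.
        history.append(f"User: {user_input}\nChatBot: {answer}\n")
        return answer

    return chat

chat = make_chat(lambda prompt: "Try 'A History of Greece'.")  # stand-in model
reply = chat("Hi, I'm looking for book suggestions")
```

Because every turn is appended verbatim, the prompt grows without bound — which is exactly the token-limit problem the memory notebook that follows sets out to address.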
diff --git a/dotnet/notebooks/05-using-the-planner.ipynb b/dotnet/notebooks/05-using-the-planner.ipynb
new file mode 100644
index 000000000000..652884a33773
--- /dev/null
+++ b/dotnet/notebooks/05-using-the-planner.ipynb
@@ -0,0 +1,286 @@
+{
+ "cells": [
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Introduction to the Planner\n",
+ "\n",
+ "The Planner is one of the fundamental concepts of the Semantic Kernel. It makes use of the collection of plugins that have been registered to the kernel and, using AI, formulates a plan to execute a given ask.\n",
+ "\n",
+ "Read more about it [here](https://aka.ms/sk/concepts/planner)."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "#r \"nuget: Microsoft.SemanticKernel, 1.0.0-beta1\"\n",
+ "\n",
+ "#!import config/Settings.cs\n",
+ "#!import config/Utils.cs\n",
+ "\n",
+ "using Microsoft.SemanticKernel;\n",
+ "using Microsoft.SemanticKernel.Plugins.Core;\n",
+ "using Microsoft.SemanticKernel.Orchestration;\n",
+ "using Microsoft.SemanticKernel.Planning;\n",
+ "using Microsoft.SemanticKernel.Planners;\n",
+ "using Microsoft.SemanticKernel.Connectors.AI.OpenAI;\n",
+ "using System.Text.Json;\n",
+ "\n",
+ "var builder = new KernelBuilder();\n",
+ "\n",
+ "// Configure AI backend used by the kernel\n",
+ "var (useAzureOpenAI, model, azureEndpoint, apiKey, orgId) = Settings.LoadFromFile();\n",
+ "\n",
+ "if (useAzureOpenAI)\n",
+ " builder.WithAzureChatCompletionService(model, azureEndpoint, apiKey);\n",
+ "else\n",
+ " builder.WithOpenAIChatCompletionService(model, apiKey, orgId);\n",
+ "\n",
+ "var kernel = builder.Build();"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Setting Up the Planner\n",
+ "The planner is located in the `Microsoft.SemanticKernel.Planners.Core` package and requires Orchestration."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "// Create a planner that uses the functions registered with the kernel\n",
+ "var planner = new SequentialPlanner(kernel);"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Providing plugins to the planner\n",
+ "The planner needs to know what plugins are available to it. Here we'll give it access to the `SummarizePlugin` and `WriterPlugin` we have defined on disk."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var pluginsDirectory = Path.Combine(System.IO.Directory.GetCurrentDirectory(), \"..\", \"..\", \"samples\", \"plugins\");\n",
+ "kernel.ImportSemanticFunctionsFromDirectory(pluginsDirectory, \"SummarizePlugin\");\n",
+ "kernel.ImportSemanticFunctionsFromDirectory(pluginsDirectory, \"WriterPlugin\");"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Define your ASK. What do you want the Kernel to do?"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var ask = \"Tomorrow is Valentine's day. I need to come up with a few date ideas. My significant other likes poems so write them in the form of a poem.\";\n",
+ "var originalPlan = await planner.CreatePlanAsync(ask);\n",
+ "\n",
+ "Console.WriteLine(\"Original plan:\\n\");\n",
+ "Console.WriteLine(JsonSerializer.Serialize(originalPlan, new JsonSerializerOptions { WriteIndented = true }));"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "As you can see in the above plan, the Planner has taken the user's ask and converted it into a Plan object detailing how the AI would go about solving this task.\n",
+ "\n",
+ "It makes use of the plugins that the Kernel has available to it and determines which functions to call in order to fulfill the user's ask.\n",
+ "\n",
+ "The output of each step of the plan gets set via `setContextVariable`, which makes it available as `input` to the next plugin."
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Let's also define an inline plugin and have it be available to the Planner.\n",
+ "Be sure to give it a function name and plugin name."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "string skPrompt = \"\"\"\n",
+ "{{$input}}\n",
+ "\n",
+ "Rewrite the above in the style of Shakespeare.\n",
+ "\"\"\";\n",
+ "var shakespeareFunction = kernel.CreateSemanticFunction(skPrompt, \"Shakespeare\", \"ShakespearePlugin\", requestSettings: new OpenAIRequestSettings { MaxTokens = 2000, Temperature = 0.2, TopP = 0.5 });"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Let's update our ask using this new plugin."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var ask = @\"Tomorrow is Valentine's day. I need to come up with a few date ideas.\n",
+ "She likes Shakespeare so write using his style. Write them in the form of a poem.\";\n",
+ "\n",
+ "var newPlan = await planner.CreatePlanAsync(ask);\n",
+ "\n",
+ "Console.WriteLine(\"Updated plan:\\n\");\n",
+ "Console.WriteLine(JsonSerializer.Serialize(newPlan, new JsonSerializerOptions { WriteIndented = true }));"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Executing the plans\n",
+ "\n",
+ "Now that we have different plans, let's try to execute them! The Kernel can execute the plan using `RunAsync`."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var originalPlanResult = await kernel.RunAsync(originalPlan);\n",
+ "\n",
+ "Console.WriteLine(\"Original Plan results:\\n\");\n",
+ "Console.WriteLine(Utils.WordWrap(originalPlanResult.GetValue<string>(), 100));"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now let's execute and print the new plan:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var newPlanResult = await kernel.RunAsync(newPlan);\n",
+ "\n",
+ "Console.WriteLine(\"New Plan results:\\n\");\n",
+ "Console.WriteLine(newPlanResult.GetValue<string>());"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": ".NET (C#)",
+ "language": "C#",
+ "name": ".net-csharp"
+ },
+ "language_info": {
+ "name": "polyglot-notebook"
+ },
+ "polyglot_notebook": {
+ "kernelInfo": {
+ "defaultKernelName": "csharp",
+ "items": [
+ {
+ "aliases": [],
+ "name": "csharp"
+ }
+ ]
+ }
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
diff --git a/dotnet/notebooks/06-memory-and-embeddings.ipynb b/dotnet/notebooks/06-memory-and-embeddings.ipynb
new file mode 100644
index 000000000000..69be621eb614
--- /dev/null
+++ b/dotnet/notebooks/06-memory-and-embeddings.ipynb
@@ -0,0 +1,566 @@
+{
+ "cells": [
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Building Semantic Memory with Embeddings\n",
+ "\n",
+ "So far, we've mostly been treating the kernel as a stateless orchestration engine.\n",
+ "We send text into a model API and receive text out. \n",
+ "\n",
+ "In a [previous notebook](04-context-variables-chat.ipynb), we used `context variables` to pass additional\n",
+ "text into prompts to enrich them with more context. This allowed us to create a basic chat experience. \n",
+ "\n",
+ "However, if you relied solely on context variables, you would quickly find that your prompt\n",
+ "grows so large that you run into the model's token limit. What we need is a way to persist state\n",
+ "and build both short-term and long-term memory to empower even more intelligent applications. \n",
+ "\n",
+ "To do this, we dive into the key concept of `Semantic Memory` in the Semantic Kernel. "
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "#r \"nuget: Microsoft.SemanticKernel, 1.0.0-beta1\"\n",
+ "#r \"nuget: System.Linq.Async, 6.0.1\"\n",
+ "\n",
+ "#!import config/Settings.cs\n",
+ "\n",
+ "using Microsoft.SemanticKernel;\n",
+ "using Microsoft.SemanticKernel.SemanticFunctions;\n",
+ "using Microsoft.SemanticKernel.Orchestration;\n",
+ "\n",
+ "var kernelBuilder = new KernelBuilder();\n",
+ "\n",
+ "// Configure AI backend used by the kernel\n",
+ "var (useAzureOpenAI, model, azureEndpoint, apiKey, orgId) = Settings.LoadFromFile();\n",
+ "\n",
+ "if (useAzureOpenAI)\n",
+ " kernelBuilder.WithAzureChatCompletionService(model, azureEndpoint, apiKey);\n",
+ "else\n",
+ " kernelBuilder.WithOpenAIChatCompletionService(model, apiKey, orgId);\n",
+ "\n",
+ "var kernel = kernelBuilder.Build();"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "In order to use memory, we need to instantiate the Memory Plugin with a Memory Storage\n",
+ "and an Embedding backend. In this example, we make use of the `VolatileMemoryStore`\n",
+ "which can be thought of as a temporary in-memory storage (not to be confused with Semantic Memory).\n",
+ "\n",
+ "This memory is not written to disk and is only available during the app session.\n",
+ "\n",
+ "When developing your app you will have the option to plug in persistent storage\n",
+ "like Azure Cosmos DB, PostgreSQL, SQLite, etc. Semantic Memory also lets you index\n",
+ "external data sources without duplicating all the information; more on that later."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "using Microsoft.SemanticKernel.Plugins.Memory;\n",
+ "using Microsoft.SemanticKernel.Connectors.AI.OpenAI;\n",
+ "\n",
+ "var (useAzureOpenAI, model, azureEndpoint, apiKey, orgId) = Settings.LoadFromFile();\n",
+ "\n",
+ "var memoryBuilder = new MemoryBuilder();\n",
+ "\n",
+ "if (useAzureOpenAI)\n",
+ "{\n",
+ " memoryBuilder.WithAzureTextEmbeddingGenerationService(\"text-embedding-ada-002\", azureEndpoint, apiKey);\n",
+ "}\n",
+ "else\n",
+ "{\n",
+ " memoryBuilder.WithOpenAITextEmbeddingGenerationService(\"text-embedding-ada-002\", apiKey);\n",
+ "}\n",
+ "\n",
+ "memoryBuilder.WithMemoryStore(new VolatileMemoryStore());\n",
+ "\n",
+ "var memory = memoryBuilder.Build();"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "At its core, Semantic Memory is a set of data structures that allow you to store\n",
+ "the meaning of texts that come from different data sources, and optionally to store\n",
+ "the source text too.\n",
+ "\n",
+ "These texts can be from the web, e-mail providers, chats, a database, or from your\n",
+ "local directory, and are hooked up to the Semantic Kernel through data source connectors.\n",
+ "\n",
+ "The texts are embedded, i.e. compressed into a vector of floats that mathematically\n",
+ "represents the texts' content and meaning.\n",
+ "\n",
+ "You can read more about embeddings [here](https://aka.ms/sk/embeddings)."
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Manually adding memories\n",
+ "Let's create some initial memories \"About Me\". We can add memories to our `VolatileMemoryStore` by using `SaveInformationAsync`."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "const string MemoryCollectionName = \"aboutMe\";\n",
+ "\n",
+ "await memory.SaveInformationAsync(MemoryCollectionName, id: \"info1\", text: \"My name is Andrea\");\n",
+ "await memory.SaveInformationAsync(MemoryCollectionName, id: \"info2\", text: \"I currently work as a tourist operator\");\n",
+ "await memory.SaveInformationAsync(MemoryCollectionName, id: \"info3\", text: \"I currently live in Seattle and have been living there since 2005\");\n",
+ "await memory.SaveInformationAsync(MemoryCollectionName, id: \"info4\", text: \"I visited France and Italy five times since 2015\");\n",
+ "await memory.SaveInformationAsync(MemoryCollectionName, id: \"info5\", text: \"My family is from New York\");"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Let's try searching the memory:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var questions = new[]\n",
+ "{\n",
+ " \"what is my name?\",\n",
+ " \"where do I live?\",\n",
+ " \"where is my family from?\",\n",
+ " \"where have I travelled?\",\n",
+ " \"what do I do for work?\",\n",
+ "};\n",
+ "\n",
+ "foreach (var q in questions)\n",
+ "{\n",
+ " var response = await memory.SearchAsync(MemoryCollectionName, q).FirstOrDefaultAsync();\n",
+ " Console.WriteLine(q + \" \" + response?.Metadata.Text);\n",
+ "}"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Let's now revisit our chat sample from the [previous notebook](04-context-variables-chat.ipynb).\n",
+ "If you remember, we used context variables to fill the prompt with a `history` that was continuously populated as we chatted with the bot. Let's also add memory to it!"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "This is done by using the `TextMemoryPlugin` which exposes the `recall` native function.\n",
+ "\n",
+ "`recall` takes a query as input and performs a similarity search on the content that has\n",
+ "been embedded in the memory store. By default, `recall` returns the most relevant memory."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "// TextMemoryPlugin provides the \"recall\" function\n",
+ "kernel.ImportFunctions(new TextMemoryPlugin(memory));"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "const string skPrompt = @\"\n",
+ "ChatBot can have a conversation with you about any topic.\n",
+ "It can give explicit instructions or say 'I don't know' if it does not have an answer.\n",
+ "\n",
+ "Information about me, from previous conversations:\n",
+ "- {{$fact1}} {{recall $fact1}}\n",
+ "- {{$fact2}} {{recall $fact2}}\n",
+ "- {{$fact3}} {{recall $fact3}}\n",
+ "- {{$fact4}} {{recall $fact4}}\n",
+ "- {{$fact5}} {{recall $fact5}}\n",
+ "\n",
+ "Chat:\n",
+ "{{$history}}\n",
+ "User: {{$userInput}}\n",
+ "ChatBot: \";\n",
+ "\n",
+ "var chatFunction = kernel.CreateSemanticFunction(skPrompt, requestSettings: new OpenAIRequestSettings { MaxTokens = 200, Temperature = 0.8 });"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The `RelevanceParam` is used in memory search: it is the minimum relevance score, from 0.0 to 1.0, where 1.0 means a perfect match. We encourage you to experiment with different values."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var context = kernel.CreateNewContext();\n",
+ "\n",
+ "context.Variables[\"fact1\"] = \"what is my name?\";\n",
+ "context.Variables[\"fact2\"] = \"where do I live?\";\n",
+ "context.Variables[\"fact3\"] = \"where is my family from?\";\n",
+ "context.Variables[\"fact4\"] = \"where have I travelled?\";\n",
+ "context.Variables[\"fact5\"] = \"what do I do for work?\";\n",
+ "\n",
+ "context.Variables[TextMemoryPlugin.CollectionParam] = MemoryCollectionName;\n",
+ "context.Variables[TextMemoryPlugin.RelevanceParam] = \"0.8\";"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now that we've included our memories, let's chat!"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var history = \"\";\n",
+ "context.Variables[\"history\"] = history;\n",
+ "Func<string, Task> Chat = async (string input) => {\n",
+ " // Save new message in the context variables\n",
+ " context.Variables[\"userInput\"] = input;\n",
+ "\n",
+ " // Process the user message and get an answer\n",
+ " var answer = await chatFunction.InvokeAsync(context);\n",
+ "\n",
+ " // Append the new interaction to the chat history\n",
+ " history += $\"\\nUser: {input}\\nChatBot: {answer.GetValue<string>()}\\n\";\n",
+ " context.Variables[\"history\"] = history;\n",
+ " \n",
+ " // Show the bot response\n",
+ " Console.WriteLine(\"ChatBot: \" + answer.GetValue<string>());\n",
+ "};"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "await Chat(\"Hello, I think we've met before, remember? my name is...\");"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "await Chat(\"I want to plan a trip and visit my family. Do you know where that is?\");"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "await Chat(\"Great! What are some fun things to do there?\");"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Adding documents to your memory\n",
+ "\n",
+ "Many times in your applications you'll want to bring external documents into your memory. Let's see how we can do this using our `VolatileMemoryStore`.\n",
+ "\n",
+ "Let's first get some data using some of the links in the Semantic Kernel repo."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "const string memoryCollectionName = \"SKGitHub\";\n",
+ "\n",
+ "var githubFiles = new Dictionary<string, string>()\n",
+ "{\n",
+ " [\"https://github.com/microsoft/semantic-kernel/blob/main/README.md\"]\n",
+ " = \"README: Installation, getting started, and how to contribute\",\n",
+ " [\"https://github.com/microsoft/semantic-kernel/blob/main/dotnet/notebooks/02-running-prompts-from-file.ipynb\"]\n",
+ " = \"Jupyter notebook describing how to pass prompts from a file to a semantic plugin or function\",\n",
+ " [\"https://github.com/microsoft/semantic-kernel/blob/main/dotnet/notebooks/00-getting-started.ipynb\"]\n",
+ " = \"Jupyter notebook describing how to get started with the Semantic Kernel\",\n",
+ " [\"https://github.com/microsoft/semantic-kernel/tree/main/samples/plugins/ChatPlugin/ChatGPT\"]\n",
+ " = \"Sample demonstrating how to create a chat plugin interfacing with ChatGPT\",\n",
+ " [\"https://github.com/microsoft/semantic-kernel/blob/main/dotnet/src/Plugins/Plugins.Memory/VolatileMemoryStore.cs\"]\n",
+ " = \"C# class that defines a volatile embedding store\",\n",
+ " [\"https://github.com/microsoft/semantic-kernel/tree/main/samples/dotnet/KernelHttpServer/README.md\"]\n",
+ " = \"README: How to set up a Semantic Kernel Service API using Azure Function Runtime v4\",\n",
+ " [\"https://github.com/microsoft/semantic-kernel/tree/main/samples/apps/chat-summary-webapp-react/README.md\"]\n",
+ " = \"README: README associated with a sample starter react-based chat summary webapp\",\n",
+ "};"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Let's build a new Memory."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var memoryBuilder = new MemoryBuilder();\n",
+ "\n",
+ "if (useAzureOpenAI)\n",
+ "{\n",
+ " memoryBuilder.WithAzureTextEmbeddingGenerationService(\"text-embedding-ada-002\", azureEndpoint, apiKey);\n",
+ "}\n",
+ "else\n",
+ "{\n",
+ " memoryBuilder.WithOpenAITextEmbeddingGenerationService(\"text-embedding-ada-002\", apiKey);\n",
+ "}\n",
+ "\n",
+ "memoryBuilder.WithMemoryStore(new VolatileMemoryStore());\n",
+ "\n",
+ "var memory = memoryBuilder.Build();"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now let's add these files to our VolatileMemoryStore using `SaveReferenceAsync`."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "Console.WriteLine(\"Adding some GitHub file URLs and their descriptions to a volatile Semantic Memory.\");\n",
+ "var i = 0;\n",
+ "foreach (var entry in githubFiles)\n",
+ "{\n",
+ " await memory.SaveReferenceAsync(\n",
+ " collection: memoryCollectionName,\n",
+ " description: entry.Value,\n",
+ " text: entry.Value,\n",
+ " externalId: entry.Key,\n",
+ " externalSourceName: \"GitHub\"\n",
+ " );\n",
+ " Console.WriteLine($\" URL {++i} saved\");\n",
+ "}"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "string ask = \"I love Jupyter notebooks, how should I get started?\";\n",
+ "Console.WriteLine(\"===========================\\n\" +\n",
+ " \"Query: \" + ask + \"\\n\");\n",
+ "\n",
+ "var memories = memory.SearchAsync(memoryCollectionName, ask, limit: 5, minRelevanceScore: 0.77);\n",
+ "\n",
+ "i = 0;\n",
+ "await foreach (var memory in memories)\n",
+ "{\n",
+ " Console.WriteLine($\"Result {++i}:\");\n",
+ " Console.WriteLine(\" URL : \" + memory.Metadata.Id);\n",
+ " Console.WriteLine(\" Title : \" + memory.Metadata.Description);\n",
+ " Console.WriteLine(\" Relevance: \" + memory.Relevance);\n",
+ " Console.WriteLine();\n",
+ "}"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now you might be wondering: what happens if you have so much data that it doesn't fit into RAM? That's where you want to make use of an external vector database made specifically for storing and retrieving embeddings.\n",
+ "\n",
+ "Stay tuned for that!"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": ".NET (C#)",
+ "language": "C#",
+ "name": ".net-csharp"
+ },
+ "language_info": {
+ "name": "polyglot-notebook"
+ },
+ "polyglot_notebook": {
+ "kernelInfo": {
+ "defaultKernelName": "csharp",
+ "items": [
+ {
+ "aliases": [],
+ "name": "csharp"
+ }
+ ]
+ }
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
diff --git a/dotnet/notebooks/07-DALL-E-2.ipynb b/dotnet/notebooks/07-DALL-E-2.ipynb
new file mode 100644
index 000000000000..8bdd42abde6e
--- /dev/null
+++ b/dotnet/notebooks/07-DALL-E-2.ipynb
@@ -0,0 +1,228 @@
+{
+ "cells": [
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Generating images with AI\n",
+ "\n",
+ "This notebook demonstrates how to use OpenAI DALL-E 2 to generate images, in combination with other LLM features like text and embedding generation.\n",
+ "\n",
+ "Here, we use Chat Completion to generate a random image description and DALL-E 2 to create an image from that description, showing the image inline.\n",
+ "\n",
+ "Lastly, the notebook asks you to describe the image. The embedding of your description is compared to the original description using cosine similarity, returning a score from 0 to 1, where 1 means an exact match."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "tags": [],
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "// Usual setup: importing Semantic Kernel SDK and SkiaSharp, used to display images inline.\n",
+ "\n",
+ "#r \"nuget: Microsoft.SemanticKernel, 1.0.0-beta1\"\n",
+ "#r \"nuget: SkiaSharp, 2.88.3\"\n",
+ "\n",
+ "#!import config/Settings.cs\n",
+ "#!import config/Utils.cs\n",
+ "#!import config/SkiaUtils.cs\n",
+ "\n",
+ "using Microsoft.SemanticKernel;\n",
+ "using Microsoft.SemanticKernel.AI.ImageGeneration; \n",
+ "using Microsoft.SemanticKernel.AI.Embeddings;\n",
+ "using Microsoft.SemanticKernel.AI.Embeddings.VectorOperations;\n",
+ "using Microsoft.SemanticKernel.Connectors.AI.OpenAI;"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Setup, using three AI services: images, text, embedding\n",
+ "\n",
+ "The notebook uses:\n",
+ "\n",
+ "* **Chat Completion** to generate a random image description\n",
+ "* **OpenAI DALL-E 2** to transform the image description into an image\n",
+ "* **text-embedding-ada-002** to compare your guess against the real image description\n",
+ "\n",
+ "**Note**: For Azure OpenAI, your endpoint must have the DALL-E API enabled."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "// Load OpenAI credentials from config/settings.json\n",
+ "var (useAzureOpenAI, model, azureEndpoint, apiKey, orgId) = Settings.LoadFromFile();\n",
+ "\n",
+ "// Configure the three AI features: text embedding (text-embedding-ada-002), chat completion, and image generation (DALL-E 2)\n",
+ "var builder = new KernelBuilder();\n",
+ "\n",
+ "if(useAzureOpenAI)\n",
+ "{\n",
+ " builder.WithAzureTextEmbeddingGenerationService(\"text-embedding-ada-002\", azureEndpoint, apiKey);\n",
+ " builder.WithAzureChatCompletionService(model, azureEndpoint, apiKey);\n",
+ " builder.WithAzureOpenAIImageGenerationService(azureEndpoint, apiKey);\n",
+ "}\n",
+ "else\n",
+ "{\n",
+ " builder.WithOpenAITextEmbeddingGenerationService(\"text-embedding-ada-002\", apiKey, orgId);\n",
+ " builder.WithOpenAIChatCompletionService(model, apiKey, orgId);\n",
+ " builder.WithOpenAIImageGenerationService(apiKey, orgId);\n",
+ "}\n",
+ " \n",
+ "var kernel = builder.Build();\n",
+ "\n",
+ "// Get AI service instance used to generate images\n",
+ "var dallE = kernel.GetService<IImageGeneration>();\n",
+ "\n",
+ "// Get AI service instance used to extract embedding from a text\n",
+ "var textEmbedding = kernel.GetService<ITextEmbeddingGeneration>();"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Generate a (random) image with DALL-E 2\n",
+ "\n",
+ "**genImgDescription** is a semantic function used to generate a random image description. \n",
+ "It takes a random number as input to increase the diversity of its output.\n",
+ "\n",
+ "The random image description is then given to **DALL-E 2** with a request to create an image."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "tags": [],
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "// Create a semantic function that generates a random image description.\n",
+ "var genImgDescription = kernel.CreateSemanticFunction(\n",
+ " \"Think about an artificial object correlated to number {{$input}}. \" +\n",
+ " \"Describe the image with one detailed sentence. The description cannot contain numbers.\", \n",
+ " requestSettings: new OpenAIRequestSettings { MaxTokens = 256, Temperature = 1 });\n",
+ "\n",
+ "var random = new Random().Next(0, 200);\n",
+ "var imageDescriptionResult = await kernel.RunAsync($\"{random}\", genImgDescription);\n",
+ "var imageDescription = imageDescriptionResult.GetValue<string>();\n",
+ "\n",
+ "// Use DALL-E 2 to generate an image. OpenAI in this case returns a URL (though you can ask to return a base64 image)\n",
+ "var imageUrl = await dallE.GenerateImageAsync(imageDescription.Trim(), 512, 512);\n",
+ "\n",
+ "await SkiaUtils.ShowImage(imageUrl, 512, 512);"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Let's play a guessing game\n",
+ "\n",
+ "Try to guess what the image is about, describing the content.\n",
+ "\n",
+ "You'll get a score at the end 😉"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "tags": [],
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "using InteractiveKernel = Microsoft.DotNet.Interactive.Kernel;\n",
+ "\n",
+ "// Prompt the user to guess what the image is\n",
+ "var guess = await InteractiveKernel.GetInputAsync(\"Describe the image in your words\");\n",
+ "\n",
+ "// Compare user guess with real description and calculate score\n",
+ "var origEmbedding = await textEmbedding.GenerateEmbeddingsAsync(new List<string> { imageDescription });\n",
+ "var guessEmbedding = await textEmbedding.GenerateEmbeddingsAsync(new List<string> { guess });\n",
+ "var similarity = origEmbedding.First().Span.CosineSimilarity(guessEmbedding.First().Span);\n",
+ "\n",
+ "Console.WriteLine($\"Your description:\\n{Utils.WordWrap(guess, 90)}\\n\");\n",
+ "Console.WriteLine($\"Real description:\\n{Utils.WordWrap(imageDescription.Trim(), 90)}\\n\");\n",
+ "Console.WriteLine($\"Score: {similarity:0.00}\\n\\n\");\n",
+ "\n",
+ "//Uncomment this line to see the URL provided by OpenAI\n",
+ "//Console.WriteLine(imageUrl);"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": ".NET (C#)",
+ "language": "C#",
+ "name": ".net-csharp"
+ },
+ "language_info": {
+ "file_extension": ".cs",
+ "mimetype": "text/x-csharp",
+ "name": "C#",
+ "pygments_lexer": "csharp",
+ "version": "11.0"
+ },
+ "polyglot_notebook": {
+ "kernelInfo": {
+ "defaultKernelName": "csharp",
+ "items": [
+ {
+ "aliases": [],
+ "name": "csharp"
+ }
+ ]
+ }
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 4
+}
diff --git a/dotnet/notebooks/08-chatGPT-with-DALL-E-2.ipynb b/dotnet/notebooks/08-chatGPT-with-DALL-E-2.ipynb
new file mode 100644
index 000000000000..532a7b640f89
--- /dev/null
+++ b/dotnet/notebooks/08-chatGPT-with-DALL-E-2.ipynb
@@ -0,0 +1,243 @@
+{
+ "cells": [
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Using ChatGPT with the Semantic Kernel featuring DALL-E 2\n",
+ "\n",
+ "This notebook shows how to use the ChatCompletion API from OpenAI, popularized by ChatGPT. This API uses a chat message schema that is different from the TextCompletion API: while the text completion API takes a prompt as input and returns a simple string, the chat completion API takes a chat history as input and returns a new message:\n",
+ "\n",
+ "```\n",
+ "messages=[\n",
+ " { \"role\": \"system\", \"content\": \"You are a helpful assistant.\"},\n",
+ " { \"role\": \"user\", \"content\": \"Who won the world series in 2020?\"},\n",
+ " { \"role\": \"assistant\", \"content\": \"The Los Angeles Dodgers won the World Series in 2020.\"},\n",
+ " { \"role\": \"user\", \"content\": \"Where was it played?\"}\n",
+ "]\n",
+ "```\n",
+ "\n",
+ "Note that there are three message types:\n",
+ "\n",
+ "1. A system message gives instructions to the chat model, e.g. setting the context and the kind of conversation your app offers.\n",
+ "2. User messages store the data received from the user of your app.\n",
+ "3. Assistant messages store the replies generated by the LLM. \n",
+ "\n",
+ "Your app is responsible for adding messages to the chat history and maintaining this object. The Chat Completion API is stateless and returns only the new message, which your app can use, e.g. to execute commands, generate images, or simply continue the conversation."
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "When deciding which one to use, know that ChatGPT models (e.g. gpt-3.5-turbo) are optimized for chat applications and have been fine-tuned for instruction following and dialogue. As such, when creating semantic plugins with the Semantic Kernel, you may still find the TextCompletion model better suited for certain use cases.\n",
+ "\n",
+ "The code below shows how to set up SK with ChatGPT and how to manage the chat history object; to make things a little more interesting, it asks ChatGPT to reply with image descriptions so we can have a dialog using images, leveraging the DALL-E 2 integration."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "tags": [],
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "// Usual setup: importing Semantic Kernel SDK and SkiaSharp, used to display images inline.\n",
+ "\n",
+ "#r \"nuget: Microsoft.SemanticKernel, 1.0.0-beta1\"\n",
+ "#r \"nuget: SkiaSharp, 2.88.3\"\n",
+ "\n",
+ "#!import config/Settings.cs\n",
+ "#!import config/Utils.cs\n",
+ "#!import config/SkiaUtils.cs\n",
+ "\n",
+ "using Microsoft.SemanticKernel;\n",
+ "using Microsoft.SemanticKernel.AI.ImageGeneration;\n",
+ "using Microsoft.SemanticKernel.AI.ChatCompletion;\n",
+ "using Microsoft.SemanticKernel.Connectors.AI.OpenAI;"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The notebook uses:\n",
+ "\n",
+ "* **OpenAI ChatGPT** to chat with the user\n",
+ "* **OpenAI DALL-E 2** to transform messages into images"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "// Load OpenAI credentials from config/settings.json\n",
+ "var (useAzureOpenAI, model, azureEndpoint, apiKey, orgId) = Settings.LoadFromFile();\n",
+ "\n",
+ "// Configure the two AI features: OpenAI Chat and DALL-E 2 for image generation\n",
+ "var builder = new KernelBuilder();\n",
+ "\n",
+ "if(useAzureOpenAI)\n",
+ "{\n",
+ " builder.WithAzureChatCompletionService(\"gpt-35-turbo\", azureEndpoint, apiKey);\n",
+ " builder.WithAzureOpenAIImageGenerationService(azureEndpoint, apiKey);\n",
+ "}\n",
+ "else\n",
+ "{\n",
+ " builder.WithOpenAIChatCompletionService(\"gpt-3.5-turbo\", apiKey, orgId);\n",
+ " builder.WithOpenAIImageGenerationService(apiKey, orgId);\n",
+ "}\n",
+ "\n",
+ "var kernel = builder.Build();\n",
+ "\n",
+ "// Get AI service instance used to generate images\n",
+ "var dallE = kernel.GetService<IImageGeneration>();\n",
+ "\n",
+ "// Get AI service instance used to manage the user chat\n",
+ "var chatGPT = kernel.GetService<IChatCompletion>();"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Chat configuration\n",
+ "\n",
+ "Before starting the chat, we create a new chat object with some instructions, which are included in the chat history. \n",
+ "\n",
+ "The instructions tell OpenAI what kind of chat we want to have; in this case we ask it to reply with \"image descriptions\", so that we can chain ChatGPT with DALL-E 2."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "tags": [],
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "using Microsoft.SemanticKernel.Connectors.AI.OpenAI.ChatCompletion;\n",
+ "\n",
+ "var systemMessage = \"You're chatting with a user. Instead of replying directly to the user\"\n",
+ " + \" provide a description of a cartoonish image that expresses what you want to say.\"\n",
+ " + \" The user won't see your message, they will see only the image.\"\n",
+ " + \" Describe the image with details in one sentence.\";\n",
+ "\n",
+ "var chat = (OpenAIChatHistory)chatGPT.CreateNewChat(systemMessage);"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Let's chat\n",
+ "\n",
+ "Run the following code to start the chat. The chat consists of a loop with these main steps:\n",
+ "\n",
+ "1. Ask the user (you) for a message and add it to the chat history object.\n",
+ "2. Send the chat object to the AI to generate a response, and add the bot's message to the chat history object.\n",
+ "3. Show the answer to the user; in our case, before showing the answer, generate an image from it and show that too.\n",
+ "\n",
+ "*Note: to stop the chat in VS Code, press ESC on the keyboard or the \"stop\" button on the left side.*"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ },
+ "vscode": {
+ "languageId": "polyglot-notebook"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "using InteractiveKernel = Microsoft.DotNet.Interactive.Kernel;\n",
+ "\n",
+ "while (true)\n",
+ "{\n",
+ " // 1. Ask the user for a message. The user enters a message. Add the user message into the Chat History object.\n",
+ " var userMessage = await InteractiveKernel.GetInputAsync(\"Your message\");\n",
+ " Console.WriteLine($\"User: {userMessage}\");\n",
+ " chat.AddUserMessage(userMessage);\n",
+ "\n",
+ " // 2. Send the chat object to AI asking to generate a response. Add the bot message into the Chat History object.\n",
+ " string assistantReply = await chatGPT.GenerateMessageAsync(chat, new OpenAIRequestSettings());\n",
+ " chat.AddAssistantMessage(assistantReply);\n",
+ "\n",
+ " // 3. Show the reply as an image\n",
+ " Console.WriteLine($\"\\nBot:\");\n",
+ " var imageUrl = await dallE.GenerateImageAsync(assistantReply, 256, 256);\n",
+ " await SkiaUtils.ShowImage(imageUrl, 256, 256);\n",
+ " Console.WriteLine($\"[{assistantReply}]\\n\");\n",
+ "}"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": ".NET (C#)",
+ "language": "C#",
+ "name": ".net-csharp"
+ },
+ "language_info": {
+ "file_extension": ".cs",
+ "mimetype": "text/x-csharp",
+ "name": "C#",
+ "pygments_lexer": "csharp",
+ "version": "11.0"
+ },
+ "polyglot_notebook": {
+ "kernelInfo": {
+ "defaultKernelName": "csharp",
+ "items": [
+ {
+ "aliases": [],
+ "name": "csharp"
+ }
+ ]
+ }
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 4
+}
diff --git a/dotnet/notebooks/09-memory-with-chroma.ipynb b/dotnet/notebooks/09-memory-with-chroma.ipynb
new file mode 100644
index 000000000000..9e1257fd3121
--- /dev/null
+++ b/dotnet/notebooks/09-memory-with-chroma.ipynb
@@ -0,0 +1,576 @@
+{
+ "cells": [
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Building Semantic Memory with Embeddings\n",
+ "\n",
+ "In this notebook, we show how to use [Chroma](https://www.trychroma.com/) with Semantic Kernel to create even more\n",
+ "intelligent applications. We assume that you are already familiar with the concepts of Semantic Kernel\n",
+ "and memory. [Previously](04-context-variables-chat.ipynb), we have used `context variables` to pass\n",
+ "additional text into prompts, enriching them with more context for a basic chat experience.\n",
+ "\n",
+ "However, relying solely on context variables has its limitations, such as the model's token limit.\n",
+ "To overcome these limitations, we will use **SK Semantic Memory**, leveraging Chroma as a persistent\n",
+ "Semantic Memory Storage.\n",
+ "\n",
+ "**Chroma** is an open-source embedding database designed to make it easy to build Language Model\n",
+ "applications by making knowledge, facts, and plugins pluggable for LLMs. It allows us to store and\n",
+ "retrieve information in a way that can be easily utilized by the models, enabling both short-term\n",
+ "and long-term memory for more advanced applications. In this notebook, we will showcase how to\n",
+ "effectively use Chroma with the Semantic Kernel for a powerful application experience.\n",
+ "\n",
+ "**Note:** This example is verified against Chroma version **0.4.10**. Higher versions may introduce incompatibilities."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "#r \"nuget: Microsoft.SemanticKernel, 1.0.0-beta1\"\n",
+ "#r \"nuget: Microsoft.SemanticKernel.Connectors.Memory.Chroma, 1.0.0-beta1\"\n",
+ "#r \"nuget: System.Linq.Async, 6.0.1\"\n",
+ "\n",
+ "#!import config/Settings.cs\n",
+ "\n",
+ "using System;\n",
+ "using System.Collections.Generic;\n",
+ "using System.Linq;\n",
+ "using System.Threading.Tasks;\n",
+ "using Microsoft.SemanticKernel;\n",
+ "using Microsoft.SemanticKernel.Connectors.Memory.Chroma;\n",
+ "using Microsoft.SemanticKernel.Memory;\n",
+ "using Microsoft.SemanticKernel.Plugins.Memory;\n",
+ "\n",
+ "var kernelBuilder = new KernelBuilder();\n",
+ "\n",
+ "// Configure AI backend used by the kernel\n",
+ "var (useAzureOpenAI, model, azureEndpoint, apiKey, orgId) = Settings.LoadFromFile();\n",
+ "\n",
+ "if (useAzureOpenAI)\n",
+ " kernelBuilder.WithAzureChatCompletionService(model, azureEndpoint, apiKey);\n",
+ "else\n",
+ " kernelBuilder.WithOpenAIChatCompletionService(model, apiKey, orgId);\n",
+ "\n",
+ "var kernel = kernelBuilder.Build();"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "In order to use memory, we need to instantiate the Memory Plugin with a Memory Storage\n",
+ "and an Embedding backend. In this example, we make use of the `ChromaMemoryStore`,\n",
+ "leveraging [Chroma](https://www.trychroma.com/), an open-source embedding database\n",
+ "that you can run locally or in the cloud.\n",
+ "\n",
+ "To run Chroma locally, here's a quick script to download Chroma source and run it using Docker:\n",
+ "\n",
+ "```shell\n",
+ "git clone https://github.com/chroma-core/chroma.git\n",
+ "cd chroma\n",
+ "docker-compose up --build\n",
+ "```"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "using Microsoft.SemanticKernel.Connectors.AI.OpenAI;\n",
+ "\n",
+ "var (useAzureOpenAI, model, azureEndpoint, apiKey, orgId) = Settings.LoadFromFile();\n",
+ "\n",
+ "var memoryBuilder = new MemoryBuilder();\n",
+ "\n",
+ "if (useAzureOpenAI)\n",
+ "{\n",
+ " memoryBuilder.WithAzureTextEmbeddingGenerationService(\"text-embedding-ada-002\", azureEndpoint, apiKey);\n",
+ "}\n",
+ "else\n",
+ "{\n",
+ " memoryBuilder.WithOpenAITextEmbeddingGenerationService(\"text-embedding-ada-002\", apiKey);\n",
+ "}\n",
+ "\n",
+ "var chromaMemoryStore = new ChromaMemoryStore(\"http://127.0.0.1:8000\");\n",
+ "\n",
+ "memoryBuilder.WithMemoryStore(chromaMemoryStore);\n",
+ "\n",
+ "var memory = memoryBuilder.Build();"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "At its core, Semantic Memory is a set of data structures that allows you to store\n",
+ "the meaning of text coming from different data sources, and optionally to\n",
+ "store the source text and other metadata.\n",
+ "\n",
+ "The text can come from the web, e-mail providers, chats, a database, or your\n",
+ "local directory, and is hooked up to the Semantic Kernel through memory connectors.\n",
+ "\n",
+ "The texts are embedded, sort of \"compressed\", into a vector of floats that\n",
+ "mathematically represents the text's content and meaning.\n",
+ "\n",
+ "You can read more about embeddings [here](https://aka.ms/sk/embeddings)."
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Manually adding memories\n",
+ "\n",
+ "Let's create some initial memories \"About Me\". We can add memories to `ChromaMemoryStore` by using `SaveInformationAsync`."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "const string MemoryCollectionName = \"aboutMe\";\n",
+ "\n",
+ "await memory.SaveInformationAsync(MemoryCollectionName, id: \"info1\", text: \"My name is Andrea\");\n",
+ "await memory.SaveInformationAsync(MemoryCollectionName, id: \"info2\", text: \"I currently work as a tourist operator\");\n",
+ "await memory.SaveInformationAsync(MemoryCollectionName, id: \"info3\", text: \"I currently live in Seattle and have been living there since 2005\");\n",
+ "await memory.SaveInformationAsync(MemoryCollectionName, id: \"info4\", text: \"I visited France and Italy five times since 2015\");\n",
+ "await memory.SaveInformationAsync(MemoryCollectionName, id: \"info5\", text: \"My family is from New York\");"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Let's try searching the memory:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var questions = new[]\n",
+ "{\n",
+ " \"what is my name?\",\n",
+ " \"where do I live?\",\n",
+ " \"where is my family from?\",\n",
+ " \"where have I travelled?\",\n",
+ " \"what do I do for work?\",\n",
+ "};\n",
+ "\n",
+ "foreach (var q in questions)\n",
+ "{\n",
+ " var response = await memory.SearchAsync(MemoryCollectionName, q, limit: 1, minRelevanceScore: 0.5).FirstOrDefaultAsync();\n",
+ " Console.WriteLine(q + \" \" + response?.Metadata.Text);\n",
+ "}"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Let's now revisit our chat sample from the [previous notebook](04-context-variables-chat.ipynb).\n",
+ "If you remember, we used context variables to fill the prompt with a `history` that continuously got populated as we chatted with the bot. Let's also add memory to it!"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "This is done by using the `TextMemoryPlugin` which exposes the `recall` native function.\n",
+ "\n",
+ "`recall` takes an input ask and performs a similarity search on the contents that have\n",
+ "been embedded in the Memory Store. By default, `recall` returns the most relevant memory."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "using Microsoft.SemanticKernel.Plugins.Memory;\n",
+ "\n",
+ "// TextMemoryPlugin provides the \"recall\" function\n",
+ "kernel.ImportFunctions(new TextMemoryPlugin(memory));"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "const string skPrompt = @\"\n",
+ "ChatBot can have a conversation with you about any topic.\n",
+ "It can give explicit instructions or say 'I don't know' if it does not have an answer.\n",
+ "\n",
+ "Information about me, from previous conversations:\n",
+ "- {{$fact1}} {{recall $fact1}}\n",
+ "- {{$fact2}} {{recall $fact2}}\n",
+ "- {{$fact3}} {{recall $fact3}}\n",
+ "- {{$fact4}} {{recall $fact4}}\n",
+ "- {{$fact5}} {{recall $fact5}}\n",
+ "\n",
+ "Chat:\n",
+ "{{$history}}\n",
+ "User: {{$userInput}}\n",
+ "ChatBot: \";\n",
+ "\n",
+ "var chatFunction = kernel.CreateSemanticFunction(skPrompt, requestSettings: new OpenAIRequestSettings { MaxTokens = 200, Temperature = 0.8 });"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The `RelevanceParam` is used in memory search and is a measure of the relevance score from 0.0 to 1.0, where 1.0 means a perfect match. We encourage users to experiment with different values."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var context = kernel.CreateNewContext();\n",
+ "\n",
+ "context.Variables[\"fact1\"] = \"what is my name?\";\n",
+ "context.Variables[\"fact2\"] = \"where do I live?\";\n",
+ "context.Variables[\"fact3\"] = \"where is my family from?\";\n",
+ "context.Variables[\"fact4\"] = \"where have I travelled?\";\n",
+ "context.Variables[\"fact5\"] = \"what do I do for work?\";\n",
+ "\n",
+ "context.Variables[TextMemoryPlugin.CollectionParam] = MemoryCollectionName;\n",
+ "context.Variables[TextMemoryPlugin.RelevanceParam] = \"0.6\";"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now that we've included our memories, let's chat!"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var history = \"\";\n",
+ "context.Variables[\"history\"] = history;\n",
+ "Func<string, Task> Chat = async (string input) => {\n",
+ " // Save new message in the context variables\n",
+ " context.Variables[\"userInput\"] = input;\n",
+ "\n",
+ " // Process the user message and get an answer\n",
+ " var answer = await chatFunction.InvokeAsync(context);\n",
+ "\n",
+ " // Append the new interaction to the chat history\n",
+ " history += $\"\\nUser: {input}\\nChatBot: {answer.GetValue<string>()}\\n\";\n",
+ " context.Variables[\"history\"] = history;\n",
+ " \n",
+ " // Show the bot response\n",
+ " Console.WriteLine(\"ChatBot: \" + context);\n",
+ "};"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "await Chat(\"Hello, I think we've met before, remember? my name is...\");"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "await Chat(\"I want to plan a trip and visit my family. Do you know where that is?\");"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "await Chat(\"Great! What are some fun things to do there?\");"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Adding documents to your memory\n",
+ "\n",
+ "Often in your applications you'll want to bring external documents into your memory. Let's see how we can do this using ChromaMemoryStore.\n",
+ "\n",
+ "Let's first get some data using some of the links in the Semantic Kernel repo."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "const string memoryCollectionName = \"SKGitHub\";\n",
+ "\n",
+ "var githubFiles = new Dictionary<string, string>()\n",
+ "{\n",
+ " [\"https://github.com/microsoft/semantic-kernel/blob/main/README.md\"]\n",
+ " = \"README: Installation, getting started, and how to contribute\",\n",
+ " [\"https://github.com/microsoft/semantic-kernel/blob/main/dotnet/notebooks/02-running-prompts-from-file.ipynb\"]\n",
+ " = \"Jupyter notebook describing how to pass prompts from a file to a semantic plugin or function\",\n",
+ " [\"https://github.com/microsoft/semantic-kernel/blob/main/dotnet/notebooks/00-getting-started.ipynb\"]\n",
+ " = \"Jupyter notebook describing how to get started with the Semantic Kernel\",\n",
+ " [\"https://github.com/microsoft/semantic-kernel/tree/main/samples/plugins/ChatPlugin/ChatGPT\"]\n",
+ " = \"Sample demonstrating how to create a chat plugin interfacing with ChatGPT\",\n",
+ " [\"https://github.com/microsoft/semantic-kernel/blob/main/dotnet/src/Plugins/Plugins.Memory/VolatileMemoryStore.cs\"]\n",
+ " = \"C# class that defines a volatile embedding store\",\n",
+ " [\"https://github.com/microsoft/semantic-kernel/tree/main/samples/dotnet/KernelHttpServer/README.md\"]\n",
+ " = \"README: How to set up a Semantic Kernel Service API using Azure Function Runtime v4\",\n",
+ " [\"https://github.com/microsoft/semantic-kernel/tree/main/samples/apps/chat-summary-webapp-react/README.md\"]\n",
+ " = \"README: README associated with a sample starter react-based chat summary webapp\",\n",
+ "};"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Let's build a new Memory."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "var memoryBuilder = new MemoryBuilder();\n",
+ "\n",
+ "if (useAzureOpenAI)\n",
+ "{\n",
+ " memoryBuilder.WithAzureTextEmbeddingGenerationService(\"text-embedding-ada-002\", azureEndpoint, apiKey);\n",
+ "}\n",
+ "else\n",
+ "{\n",
+ " memoryBuilder.WithOpenAITextEmbeddingGenerationService(\"text-embedding-ada-002\", apiKey);\n",
+ "}\n",
+ "\n",
+ "var chromaMemoryStore = new ChromaMemoryStore(\"http://127.0.0.1:8000\");\n",
+ "\n",
+ "memoryBuilder.WithMemoryStore(chromaMemoryStore);\n",
+ "\n",
+ "var memory = memoryBuilder.Build();"
+ ]
+ },
+ {
+ "attachments": {},
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now let's add these files to ChromaMemoryStore using `SaveReferenceAsync`."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "Console.WriteLine(\"Adding some GitHub file URLs and their descriptions to Chroma Semantic Memory.\");\n",
+ "var i = 0;\n",
+ "foreach (var entry in githubFiles)\n",
+ "{\n",
+ " await memory.SaveReferenceAsync(\n",
+ " collection: memoryCollectionName,\n",
+ " description: entry.Value,\n",
+ " text: entry.Value,\n",
+ " externalId: entry.Key,\n",
+ " externalSourceName: \"GitHub\"\n",
+ " );\n",
+ " Console.WriteLine($\" URL {++i} saved\");\n",
+ "}"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "dotnet_interactive": {
+ "language": "csharp"
+ },
+ "polyglot_notebook": {
+ "kernelName": "csharp"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "string ask = \"I love Jupyter notebooks, how should I get started?\";\n",
+ "Console.WriteLine(\"===========================\\n\" +\n",
+ " \"Query: \" + ask + \"\\n\");\n",
+ "\n",
+ "var memories = memory.SearchAsync(memoryCollectionName, ask, limit: 5, minRelevanceScore: 0.6);\n",
+ "\n",
+ "i = 0;\n",
+ "await foreach (var memory in memories)\n",
+ "{\n",
+ " Console.WriteLine($\"Result {++i}:\");\n",
+ " Console.WriteLine(\" URL : \" + memory.Metadata.Id);\n",
+ " Console.WriteLine(\" Title : \" + memory.Metadata.Description);\n",
+ " Console.WriteLine(\" Relevance: \" + memory.Relevance);\n",
+ " Console.WriteLine();\n",
+ "}"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": ".NET (C#)",
+ "language": "C#",
+ "name": ".net-csharp"
+ },
+ "language_info": {
+ "name": "polyglot-notebook"
+ },
+ "polyglot_notebook": {
+ "kernelInfo": {
+ "defaultKernelName": "csharp",
+ "items": [
+ {
+ "aliases": [],
+ "name": "csharp"
+ }
+ ]
+ }
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
diff --git a/samples/notebooks/dotnet/10-BingSearch-using-kernel.ipynb b/dotnet/notebooks/10-BingSearch-using-kernel.ipynb
similarity index 77%
rename from samples/notebooks/dotnet/10-BingSearch-using-kernel.ipynb
rename to dotnet/notebooks/10-BingSearch-using-kernel.ipynb
index 1d9108a51edf..17dbafa49f86 100644
--- a/samples/notebooks/dotnet/10-BingSearch-using-kernel.ipynb
+++ b/dotnet/notebooks/10-BingSearch-using-kernel.ipynb
@@ -11,7 +11,7 @@
"\n",
"To use Bing Search you simply need a Bing Search API key. You can get the API key by creating a [Bing Search resource](https://learn.microsoft.com/en-us/bing/search-apis/bing-web-search/create-bing-search-service-resource) in Azure. \n",
"\n",
- "To learn more read the following [documentation](https://learn.microsoft.com/en-us/bing/search-apis/bing-web-search/overview)\n"
+ "To learn more read the following [documentation](https://learn.microsoft.com/en-us/bing/search-apis/bing-web-search/overview).\n"
]
},
{
@@ -35,29 +35,29 @@
},
"outputs": [],
"source": [
- "#r \"nuget: Microsoft.SemanticKernel, 0.17.230626.1-preview\"\n",
- "#r \"nuget: Microsoft.SemanticKernel.Skills.Web, 0.17.230626.1-preview\"\n",
+ "#r \"nuget: Microsoft.SemanticKernel, 1.0.0-beta1\"\n",
+ "#r \"nuget: Microsoft.SemanticKernel.Plugins.Web, 1.0.0-beta1\"\n",
"\n",
"#!import config/Settings.cs\n",
"#!import config/Utils.cs\n",
"\n",
"using Microsoft.SemanticKernel;\n",
- "using Microsoft.SemanticKernel.Skills.Core;\n",
- "using Microsoft.SemanticKernel.SkillDefinition;\n",
+ "using Microsoft.SemanticKernel.Plugins.Core;\n",
"using Microsoft.SemanticKernel.Orchestration;\n",
"using Microsoft.SemanticKernel.Planning;\n",
- "using Microsoft.SemanticKernel.Planning.Sequential;\n",
+ "using Microsoft.SemanticKernel.Planners;\n",
"using Microsoft.SemanticKernel.TemplateEngine;\n",
"using InteractiveKernel = Microsoft.DotNet.Interactive.Kernel;\n",
+ "\n",
"var builder = new KernelBuilder();\n",
"\n",
"// Configure AI backend used by the kernel\n",
"var (useAzureOpenAI, model, azureEndpoint, apiKey, orgId) = Settings.LoadFromFile();\n",
"\n",
"if (useAzureOpenAI)\n",
- " builder.WithAzureTextCompletionService(model, azureEndpoint, apiKey);\n",
+ " builder.WithAzureChatCompletionService(model, azureEndpoint, apiKey);\n",
"else\n",
- " builder.WithOpenAITextCompletionService(model, apiKey, orgId);\n",
+ " builder.WithOpenAIChatCompletionService(model, apiKey, orgId);\n",
"\n",
"var kernel = builder.Build();"
]
@@ -83,8 +83,8 @@
},
"outputs": [],
"source": [
- "using Microsoft.SemanticKernel.Skills.Web;\n",
- "using Microsoft.SemanticKernel.Skills.Web.Bing;"
+ "using Microsoft.SemanticKernel.Plugins.Web;\n",
+ "using Microsoft.SemanticKernel.Plugins.Web.Bing;"
]
},
{
@@ -110,7 +110,7 @@
"source": [
"using InteractiveKernel = Microsoft.DotNet.Interactive.Kernel;\n",
"\n",
- "string BING_KEY = await InteractiveKernel.GetPasswordAsync(\"Please enter your Bing Search Key\");"
+ "string BING_KEY = (await InteractiveKernel.GetPasswordAsync(\"Please enter your Bing Search Key\")).GetClearTextPassword();"
]
},
{
@@ -118,7 +118,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "Below are some examples on how [`WebSearchEngineSkill`](../../../dotnet/src/Skills/Skills.Web/WebSearchEngineSkill.cs) can be used in SK. "
+ "Below are some examples on how [`WebSearchEnginePlugin`](../src/Plugins/Plugins.Web/WebSearchEnginePlugin.cs) can be used in SK. "
]
},
{
@@ -140,11 +140,13 @@
"\n",
" // Run \n",
" var question = \"What is quantum tunnelling\";\n",
- " var bingResult = await kernel.Func(\"bing\", \"search\").InvokeAsync(question);\n",
+ " var function = kernel.Functions.GetFunction(\"bing\", \"search\");\n",
+ " var bingResult = await kernel.RunAsync(question, function);\n",
+ " var bingResultString = bingResult.GetValue<string>();\n",
"\n",
" Console.WriteLine(question);\n",
" Console.WriteLine(\"----\");\n",
- " Console.WriteLine(bingResult);\n",
+ " Console.WriteLine(bingResultString);\n",
" Console.WriteLine();\n",
"\n",
" /* OUTPUT:\n",
@@ -180,9 +182,11 @@
" //The following function only works in interactive notebooks\n",
" string question = await InteractiveKernel.GetInputAsync(\"Please ask your question\"); \n",
"\n",
- " var bingResult = await kernel.Func(\"bing\", \"search\").InvokeAsync(question);\n",
+ " var function = kernel.Functions.GetFunction(\"bing\", \"search\");\n",
+ " var bingResult = await kernel.RunAsync(question, function);\n",
+ " var bingResultString = bingResult.GetValue<string>();\n",
"\n",
- " Console.WriteLine(bingResult);\n",
+ " Console.WriteLine(bingResultString);\n",
"}"
]
},
@@ -207,14 +211,13 @@
},
"outputs": [],
"source": [
- "// Load Bing skill\n",
- "using (var bingConnector = new BingConnector(BING_KEY))\n",
- "{\n",
- " kernel.ImportSkill(new WebSearchEngineSkill(bingConnector), \"bing\");\n",
+ "// Load Bing plugin\n",
+ "var bingConnector = new BingConnector(BING_KEY);\n",
"\n",
- " //await Example1Async(kernel);\n",
- " //await Example2Async(kernel);\n",
- "}"
+ "kernel.ImportFunctions(new WebSearchEnginePlugin(bingConnector), \"bing\");\n",
+ "\n",
+ "//await Example1Async(kernel);\n",
+ "//await Example2Async(kernel);"
]
}
],
diff --git a/dotnet/notebooks/README.md b/dotnet/notebooks/README.md
new file mode 100644
index 000000000000..b7df44d5b309
--- /dev/null
+++ b/dotnet/notebooks/README.md
@@ -0,0 +1,119 @@
+# Semantic Kernel C# Notebooks
+
+This folder contains a few C# Jupyter notebooks that demonstrate how to get started with
+the Semantic Kernel. The notebooks are organized in order of increasing complexity.
+
+To run the notebooks, we recommend the following steps:
+
+- [Install .NET 7](https://dotnet.microsoft.com/download/dotnet/7.0)
+- [Install Visual Studio Code (VS Code)](https://code.visualstudio.com)
+- Launch VS Code and [install the "Polyglot" extension](https://marketplace.visualstudio.com/items?itemName=ms-dotnettools.dotnet-interactive-vscode).
+ Minimum version required: v1.0.4102020 (Feb 2022).
+
+The steps above should be sufficient; you can now **open all the C# notebooks in VS Code**.
+
+VS Code screenshot example:
+
+![image](https://user-images.githubusercontent.com/371009/216761942-1861635c-b4b7-4059-8ecf-590d93fe6300.png)
+
+## Set your OpenAI API key
+
+To start using these notebooks, be sure to add the appropriate API keys to `config/settings.json`.
+
+You can create the file manually or run [the Setup notebook](0-AI-settings.ipynb).
+
+For Azure OpenAI:
+
+```json
+{
+ "type": "azure",
+ "model": "...", // Azure OpenAI Deployment Name
+ "endpoint": "...", // Azure OpenAI endpoint
+ "apikey": "..." // Azure OpenAI key
+}
+```
+
+For OpenAI:
+
+```json
+{
+ "type": "openai",
+ "model": "gpt-3.5-turbo", // OpenAI model name
+ "apikey": "...", // OpenAI API Key
+ "org": "" // only for OpenAI accounts with multiple orgs
+}
+```
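Both shapes share the same required keys. As a quick sanity check before launching the notebooks, you can validate the file with a short script. This is only a sketch: the `/tmp` path and placeholder API key below are illustrative values, not real credentials.

```shell
# Sketch: write a placeholder settings file and verify it is valid JSON
# with the keys the notebooks expect. The path and API key are dummies.
mkdir -p /tmp/sk-config
cat > /tmp/sk-config/settings.json <<'EOF'
{
  "type": "openai",
  "model": "gpt-3.5-turbo",
  "apikey": "sk-placeholder",
  "org": ""
}
EOF

# Fail loudly if the file is malformed or a required key is missing.
python3 - <<'EOF'
import json
cfg = json.load(open("/tmp/sk-config/settings.json"))
missing = [k for k in ("type", "model", "apikey") if k not in cfg]
assert not missing, f"missing keys: {missing}"
print("settings OK:", cfg["model"])
EOF
```

For a real run, point the check at `config/settings.json` in this folder instead of the temporary file.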
+
+If you need an Azure OpenAI key, go [here](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/quickstart?pivots=rest-api).
+If you need an OpenAI key, go [here](https://platform.openai.com/account/api-keys).
+
+# Topics
+
+Before starting, make sure you have configured `config/settings.json`
+as described in the previous section.
+
+For a quick dive, look at the [getting started notebook](00-getting-started.ipynb).
+
+1. [Loading and configuring Semantic Kernel](01-basic-loading-the-kernel.ipynb)
+2. [Running AI prompts from file](02-running-prompts-from-file.ipynb)
+3. [Creating Semantic Functions at runtime (i.e. inline functions)](03-semantic-function-inline.ipynb)
+4. [Using Context Variables to Build a Chat Experience](04-context-variables-chat.ipynb)
+5. [Creating and Executing Plans](05-using-the-planner.ipynb)
+6. [Building Memory with Embeddings](06-memory-and-embeddings.ipynb)
+7. [Creating images with DALL-E 2](07-DALL-E-2.ipynb)
+8. [Chatting with ChatGPT and Images](08-chatGPT-with-DALL-E-2.ipynb)
+
+# Run notebooks in the browser with JupyterLab
+
+You can also run the notebooks in the browser with JupyterLab. These steps
+should be sufficient to start:
+
+Install Python 3, pip, and .NET 7 on your system, then:
+
+ pip install jupyterlab
+ dotnet tool install -g Microsoft.dotnet-interactive
+ dotnet tool update -g Microsoft.dotnet-interactive
+ dotnet interactive jupyter install
+
+This command will confirm that Jupyter now supports C# notebooks:
+
+ jupyter kernelspec list
+
+Enter the notebooks folder, and run this to launch the browser interface:
+
+ jupyter-lab
+
+![image](https://user-images.githubusercontent.com/371009/216756924-41657aa0-5574-4bc9-9bdb-ead3db7bf93a.png)
+
+# Troubleshooting
+
+## NuGet
+
+If you are unable to restore the NuGet packages, first list your NuGet sources:
+
+```sh
+dotnet nuget list source
+```
+
+If you see `No sources found.`, add the NuGet official package source:
+
+```sh
+dotnet nuget add source "https://api.nuget.org/v3/index.json" --name "nuget.org"
+```
+
+Run `dotnet nuget list source` again to verify the source was added.
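The two commands above can be combined into one idempotent check, sketched below. It assumes only that the `dotnet` CLI may or may not be on `PATH` and degrades to a no-op when it is absent.

```shell
# Sketch: ensure the nuget.org package source is configured, adding it
# only when missing. Safe to re-run; skips when the dotnet CLI is absent.
ensure_nuget_source() {
  if ! command -v dotnet >/dev/null 2>&1; then
    echo "dotnet CLI not found; skipping"
    return 0
  fi
  if dotnet nuget list source | grep -qi "api.nuget.org"; then
    echo "nuget.org source already present"
  else
    dotnet nuget add source "https://api.nuget.org/v3/index.json" --name "nuget.org"
    echo "nuget.org source added"
  fi
}

ensure_nuget_source
```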
+
+## Polyglot Notebooks
+
+If the notebooks still don't work, run these commands:
+
+- Install .NET Interactive: `dotnet tool install -g Microsoft.dotnet-interactive`
+- Register .NET kernels into Jupyter: `dotnet interactive jupyter install` (this might report some errors; you can ignore them)
+- If you are still stuck, read the following pages:
+ - https://marketplace.visualstudio.com/items?itemName=ms-dotnettools.dotnet-interactive-vscode
+ - https://devblogs.microsoft.com/dotnet/net-core-with-juypter-notebooks-is-here-preview-1/
+ - https://docs.servicestack.net/jupyter-notebooks-csharp
+ - https://developers.refinitiv.com/en/article-catalog/article/using--net-core-in-jupyter-notebook
+
+Note: ["Polyglot Notebooks" used to be called ".NET Interactive Notebooks"](https://devblogs.microsoft.com/dotnet/dotnet-interactive-notebooks-is-now-polyglot-notebooks/),
+so you might find online some documentation referencing the old name.
diff --git a/samples/notebooks/dotnet/config/.gitignore b/dotnet/notebooks/config/.gitignore
similarity index 100%
rename from samples/notebooks/dotnet/config/.gitignore
rename to dotnet/notebooks/config/.gitignore
diff --git a/samples/notebooks/dotnet/config/Settings.cs b/dotnet/notebooks/config/Settings.cs
similarity index 96%
rename from samples/notebooks/dotnet/config/Settings.cs
rename to dotnet/notebooks/config/Settings.cs
index 3a6fd3bbbdc5..498d5afaae12 100644
--- a/samples/notebooks/dotnet/config/Settings.cs
+++ b/dotnet/notebooks/config/Settings.cs
@@ -59,7 +59,7 @@ public static async Task AskModel(bool _useAzureOpenAI = true, string co
else
{
// Use the best model by default, and reduce the setup friction, particularly in VS Studio.
- model = "text-davinci-003";
+ model = "gpt-3.5-turbo";
}
}
@@ -92,12 +92,12 @@ public static async Task AskApiKey(bool _useAzureOpenAI = true, string c
{
if (useAzureOpenAI)
{
- apiKey = await InteractiveKernel.GetPasswordAsync("Please enter your Azure OpenAI API key");
+ apiKey = (await InteractiveKernel.GetPasswordAsync("Please enter your Azure OpenAI API key")).GetClearTextPassword();
orgId = "";
}
else
{
- apiKey = await InteractiveKernel.GetPasswordAsync("Please enter your OpenAI API key");
+ apiKey = (await InteractiveKernel.GetPasswordAsync("Please enter your OpenAI API key")).GetClearTextPassword();
}
}
diff --git a/samples/notebooks/dotnet/config/SkiaUtils.cs b/dotnet/notebooks/config/SkiaUtils.cs
similarity index 100%
rename from samples/notebooks/dotnet/config/SkiaUtils.cs
rename to dotnet/notebooks/config/SkiaUtils.cs
diff --git a/samples/notebooks/dotnet/config/Utils.cs b/dotnet/notebooks/config/Utils.cs
similarity index 100%
rename from samples/notebooks/dotnet/config/Utils.cs
rename to dotnet/notebooks/config/Utils.cs
diff --git a/samples/notebooks/dotnet/config/settings.json.azure-example b/dotnet/notebooks/config/settings.json.azure-example
similarity index 100%
rename from samples/notebooks/dotnet/config/settings.json.azure-example
rename to dotnet/notebooks/config/settings.json.azure-example
diff --git a/samples/notebooks/dotnet/config/settings.json.openai-example b/dotnet/notebooks/config/settings.json.openai-example
similarity index 80%
rename from samples/notebooks/dotnet/config/settings.json.openai-example
rename to dotnet/notebooks/config/settings.json.openai-example
index 7b22ea437065..9d5535489ea1 100644
--- a/samples/notebooks/dotnet/config/settings.json.openai-example
+++ b/dotnet/notebooks/config/settings.json.openai-example
@@ -1,7 +1,7 @@
{
"type": "openai",
"endpoint": "NOT-USED-BUT-REQUIRED-FOR-PARSER",
- "model": "text-davinci-003",
+ "model": "gpt-3.5-turbo",
"apikey": "... your OpenAI key ...",
"org": ""
}
\ No newline at end of file
diff --git a/dotnet/nuget/nuget-package.props b/dotnet/nuget/nuget-package.props
index 5dba582410ed..69f17536329e 100644
--- a/dotnet/nuget/nuget-package.props
+++ b/dotnet/nuget/nuget-package.props
@@ -1,7 +1,7 @@
- <VersionPrefix>0.18</VersionPrefix>
+ <VersionPrefix>1.0.0-beta2</VersionPrefix>
+ <Configurations>Debug;Release;Publish</Configurations>
+ <IsPackable>true</IsPackable>
diff --git a/dotnet/samples/ApplicationInsightsExample/ApplicationInsightsExample.csproj b/dotnet/samples/ApplicationInsightsExample/ApplicationInsightsExample.csproj
index 99c7aad4187a..3957425441f5 100644
--- a/dotnet/samples/ApplicationInsightsExample/ApplicationInsightsExample.csproj
+++ b/dotnet/samples/ApplicationInsightsExample/ApplicationInsightsExample.csproj
@@ -19,14 +19,13 @@
-
-
-
-
-
-
-
+
+
+
+
+
+
diff --git a/dotnet/samples/ApplicationInsightsExample/Program.cs b/dotnet/samples/ApplicationInsightsExample/Program.cs
index e90cea2e1aa8..0432ef34cb83 100644
--- a/dotnet/samples/ApplicationInsightsExample/Program.cs
+++ b/dotnet/samples/ApplicationInsightsExample/Program.cs
@@ -12,14 +12,12 @@
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.ApplicationInsights;
using Microsoft.SemanticKernel;
+using Microsoft.SemanticKernel.Planners;
using Microsoft.SemanticKernel.Planning;
-using Microsoft.SemanticKernel.Planning.Action;
-using Microsoft.SemanticKernel.Planning.Sequential;
-using Microsoft.SemanticKernel.Planning.Stepwise;
-using Microsoft.SemanticKernel.Skills.Core;
-using Microsoft.SemanticKernel.Skills.Web;
-using Microsoft.SemanticKernel.Skills.Web.Bing;
-using NCalcSkills;
+using Microsoft.SemanticKernel.Plugins.Core;
+using Microsoft.SemanticKernel.Plugins.Web;
+using Microsoft.SemanticKernel.Plugins.Web.Bing;
+using NCalcPlugins;
///
/// Example of telemetry in Semantic Kernel using Application Insights within console application.
@@ -33,14 +31,18 @@ public sealed class Program
/// is set by default.
/// will enable logging with more detailed information, including sensitive data. Should not be used in production.
///
- private static LogLevel LogLevel = LogLevel.Information;
+ private const LogLevel MinLogLevel = LogLevel.Information;
+ /// <summary>
/// The main entry point for the application.
/// </summary>
/// <returns>A <see cref="Task"/> representing the asynchronous operation.</returns>
public static async Task Main()
{
var serviceProvider = GetServiceProvider();
var telemetryClient = serviceProvider.GetRequiredService<TelemetryClient>();
- var logger = serviceProvider.GetRequiredService<ILogger<Program>>();
+ var loggerFactory = serviceProvider.GetRequiredService<ILoggerFactory>();
using var meterListener = new MeterListener();
using var activityListener = new ActivityListener();
@@ -48,8 +50,8 @@ public static async Task Main()
ConfigureMetering(meterListener, telemetryClient);
ConfigureTracing(activityListener, telemetryClient);
- var kernel = GetKernel(logger);
- var planner = GetSequentialPlanner(kernel, logger);
+ var kernel = GetKernel(loggerFactory);
+ var planner = GetSequentialPlanner(kernel, loggerFactory);
try
{
@@ -66,7 +68,7 @@ public static async Task Main()
var result = await kernel.RunAsync(plan);
Console.WriteLine("Result:");
- Console.WriteLine(result.Result);
+ Console.WriteLine(result.GetValue<string>());
}
finally
{
@@ -92,8 +94,8 @@ private static void ConfigureApplicationInsightsTelemetry(ServiceCollection serv
services.AddLogging(loggingBuilder =>
{
- loggingBuilder.AddFilter(typeof(Program).FullName, LogLevel);
- loggingBuilder.SetMinimumLevel(LogLevel);
+ loggingBuilder.AddFilter(logLevel => logLevel == MinLogLevel);
+ loggingBuilder.SetMinimumLevel(MinLogLevel);
});
services.AddApplicationInsightsTelemetryWorkerService(options =>
@@ -102,49 +104,49 @@ private static void ConfigureApplicationInsightsTelemetry(ServiceCollection serv
});
}
- private static IKernel GetKernel(ILogger logger)
+ private static IKernel GetKernel(ILoggerFactory loggerFactory)
{
- var folder = RepoFiles.SampleSkillsPath();
+ var folder = RepoFiles.SamplePluginsPath();
var bingConnector = new BingConnector(Env.Var("Bing__ApiKey"));
- var webSearchEngineSkill = new WebSearchEngineSkill(bingConnector);
+ var webSearchEnginePlugin = new WebSearchEnginePlugin(bingConnector);
var kernel = new KernelBuilder()
- .WithLogger(logger)
+ .WithLoggerFactory(loggerFactory)
.WithAzureChatCompletionService(
Env.Var("AzureOpenAI__ChatDeploymentName"),
Env.Var("AzureOpenAI__Endpoint"),
Env.Var("AzureOpenAI__ApiKey"))
.Build();
- kernel.ImportSemanticSkillFromDirectory(folder, "SummarizeSkill", "WriterSkill");
+ kernel.ImportSemanticFunctionsFromDirectory(folder, "SummarizePlugin", "WriterPlugin");
- kernel.ImportSkill(webSearchEngineSkill, "WebSearch");
- kernel.ImportSkill(new LanguageCalculatorSkill(kernel), "advancedCalculator");
- kernel.ImportSkill(new TimeSkill(), "time");
+ kernel.ImportFunctions(webSearchEnginePlugin, "WebSearch");
+ kernel.ImportFunctions(new LanguageCalculatorPlugin(kernel), "advancedCalculator");
+ kernel.ImportFunctions(new TimePlugin(), "time");
return kernel;
}
private static ISequentialPlanner GetSequentialPlanner(
IKernel kernel,
- ILogger logger,
+ ILoggerFactory loggerFactory,
int maxTokens = 1024)
{
var plannerConfig = new SequentialPlannerConfig { MaxTokens = maxTokens };
- return new SequentialPlanner(kernel, plannerConfig).WithInstrumentation(logger);
+ return new SequentialPlanner(kernel, plannerConfig).WithInstrumentation(loggerFactory);
}
private static IActionPlanner GetActionPlanner(
IKernel kernel,
- ILogger logger)
+ ILoggerFactory loggerFactory)
{
- return new ActionPlanner(kernel).WithInstrumentation(logger);
+ return new ActionPlanner(kernel).WithInstrumentation(loggerFactory);
}
private static IStepwisePlanner GetStepwisePlanner(
IKernel kernel,
- ILogger logger,
+ ILoggerFactory loggerFactory,
int minIterationTimeMs = 1500,
int maxTokens = 2000)
{
@@ -154,7 +156,7 @@ private static IStepwisePlanner GetStepwisePlanner(
MaxTokens = maxTokens
};
- return new StepwisePlanner(kernel, plannerConfig).WithInstrumentation(logger);
+ return new StepwisePlanner(kernel, plannerConfig).WithInstrumentation(loggerFactory);
}
///
@@ -203,22 +205,22 @@ private static void ConfigureTracing(ActivityListener activityListener, Telemetr
var operations = new ConcurrentDictionary>();
// For more detailed tracing we need to attach Activity entity to Application Insights operation manually.
- Action<Activity> activityStarted = activity =>
+ void activityStarted(Activity activity)
{
var operation = telemetryClient.StartOperation(activity);
operation.Telemetry.Type = activity.Kind.ToString();
operations.TryAdd(activity.TraceId.ToString(), operation);
- };
+ }
// We also need to manually stop Application Insights operation when Activity entity is stopped.
- Action<Activity> activityStopped = activity =>
+ void activityStopped(Activity activity)
{
if (operations.TryRemove(activity.TraceId.ToString(), out var operation))
{
telemetryClient.StopOperation(operation);
}
- };
+ }
// Subscribe to all traces in Semantic Kernel
activityListener.ShouldListenTo =
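Taken together, the Program.cs hunks above follow one migration pattern that recurs through this release: a single `ILogger` gives way to an `ILoggerFactory`, `ImportSkill` becomes `ImportFunctions`, and `result.Result` becomes `result.GetValue<string>()`. A minimal sketch of the new shape, where the deployment/endpoint values and the console-logger provider are assumptions rather than part of the diff:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Plugins.Core;

// was: a single ILogger passed to .WithLogger(logger)
using ILoggerFactory loggerFactory = LoggerFactory.Create(b => b.AddConsole());

IKernel kernel = new KernelBuilder()
    .WithLoggerFactory(loggerFactory)
    .WithAzureChatCompletionService(
        "<chat-deployment>",   // placeholder: your Azure OpenAI deployment name
        "<endpoint>",
        "<api-key>")
    .Build();

// was: kernel.ImportSkill(new TimeSkill(), "time")
var timeFunctions = kernel.ImportFunctions(new TimePlugin(), "time");

// was: Console.WriteLine(result.Result)
var result = await kernel.RunAsync(timeFunctions["Today"]);
Console.WriteLine(result.GetValue<string>());
```

The same three substitutions account for most of the sample churn in the hunks that follow.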
diff --git a/dotnet/samples/ApplicationInsightsExample/RepoUtils/RepoFiles.cs b/dotnet/samples/ApplicationInsightsExample/RepoUtils/RepoFiles.cs
index dc15dfed4472..0c7d595b1bad 100644
--- a/dotnet/samples/ApplicationInsightsExample/RepoUtils/RepoFiles.cs
+++ b/dotnet/samples/ApplicationInsightsExample/RepoUtils/RepoFiles.cs
@@ -6,13 +6,13 @@
internal static class RepoFiles
{
///
- /// Scan the local folders from the repo, looking for "samples/skills" folder.
+ /// Scan the local folders from the repo, looking for "samples/plugins" folder.
///
- /// <returns>The full path to samples/skills</returns>
- public static string SampleSkillsPath()
+ /// <returns>The full path to samples/plugins</returns>
+ public static string SamplePluginsPath()
{
const string Parent = "samples";
- const string Folder = "skills";
+ const string Folder = "plugins";
bool SearchPath(string pathToFind, out string result, int maxAttempts = 10)
{
@@ -31,7 +31,7 @@ bool SearchPath(string pathToFind, out string result, int maxAttempts = 10)
if (!SearchPath(Parent + Path.DirectorySeparatorChar + Folder, out string path)
&& !SearchPath(Folder, out path))
{
- throw new DirectoryNotFoundException("Skills directory not found. The app needs the skills from the repo to work.");
+ throw new DirectoryNotFoundException("Plugins directory not found. The app needs the plugins from the repo to work.");
}
return path;
diff --git a/dotnet/samples/KernelSyntaxExamples/Example01_NativeFunctions.cs b/dotnet/samples/KernelSyntaxExamples/Example01_NativeFunctions.cs
index 6c68f07d41f7..50c2faed5548 100644
--- a/dotnet/samples/KernelSyntaxExamples/Example01_NativeFunctions.cs
+++ b/dotnet/samples/KernelSyntaxExamples/Example01_NativeFunctions.cs
@@ -2,7 +2,7 @@
using System;
using System.Threading.Tasks;
-using Microsoft.SemanticKernel.Skills.Core;
+using Microsoft.SemanticKernel.Plugins.Core;
// ReSharper disable once InconsistentNaming
public static class Example01_NativeFunctions
@@ -11,8 +11,8 @@ public static Task RunAsync()
{
Console.WriteLine("======== Functions ========");
- // Load native skill
- var text = new TextSkill();
+ // Load native plugin
+ var text = new TextPlugin();
// Use function without kernel
var result = text.Uppercase("ciao!");
diff --git a/dotnet/samples/KernelSyntaxExamples/Example02_Pipeline.cs b/dotnet/samples/KernelSyntaxExamples/Example02_Pipeline.cs
index 819f84656fd8..3e944b021bf1 100644
--- a/dotnet/samples/KernelSyntaxExamples/Example02_Pipeline.cs
+++ b/dotnet/samples/KernelSyntaxExamples/Example02_Pipeline.cs
@@ -5,28 +5,28 @@
using Microsoft.Extensions.Logging;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Orchestration;
-using Microsoft.SemanticKernel.Skills.Core;
+using Microsoft.SemanticKernel.Plugins.Core;
using RepoUtils;
// ReSharper disable once InconsistentNaming
public static class Example02_Pipeline
{
- private static readonly ILogger s_logger = ConsoleLogger.Logger;
+ private static readonly ILoggerFactory s_loggerFactory = ConsoleLogger.LoggerFactory;
public static async Task RunAsync()
{
Console.WriteLine("======== Pipeline ========");
- IKernel kernel = new KernelBuilder().WithLogger(s_logger).Build();
+ IKernel kernel = new KernelBuilder().WithLoggerFactory(s_loggerFactory).Build();
- // Load native skill
- var text = kernel.ImportSkill(new TextSkill());
+ // Load native plugin
+ var textFunctions = kernel.ImportFunctions(new TextPlugin());
- SKContext result = await kernel.RunAsync(" i n f i n i t e s p a c e ",
- text["TrimStart"],
- text["TrimEnd"],
- text["Uppercase"]);
+ KernelResult result = await kernel.RunAsync(" i n f i n i t e s p a c e ",
+ textFunctions["TrimStart"],
+ textFunctions["TrimEnd"],
+ textFunctions["Uppercase"]);
- Console.WriteLine(result);
+ Console.WriteLine(result.GetValue<string>());
}
}
diff --git a/dotnet/samples/KernelSyntaxExamples/Example03_Variables.cs b/dotnet/samples/KernelSyntaxExamples/Example03_Variables.cs
index 5d312892f49c..f44366ae70ba 100644
--- a/dotnet/samples/KernelSyntaxExamples/Example03_Variables.cs
+++ b/dotnet/samples/KernelSyntaxExamples/Example03_Variables.cs
@@ -6,28 +6,28 @@
using Microsoft.Extensions.Logging;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Orchestration;
+using Plugins;
using RepoUtils;
-using Skills;
// ReSharper disable once InconsistentNaming
public static class Example03_Variables
{
- private static readonly ILogger s_logger = ConsoleLogger.Logger;
+ private static readonly ILoggerFactory s_loggerFactory = ConsoleLogger.LoggerFactory;
public static async Task RunAsync()
{
Console.WriteLine("======== Variables ========");
- IKernel kernel = new KernelBuilder().WithLogger(s_logger).Build();
- var text = kernel.ImportSkill(new StaticTextSkill(), "text");
+ IKernel kernel = new KernelBuilder().WithLoggerFactory(s_loggerFactory).Build();
+ var textFunctions = kernel.ImportFunctions(new StaticTextPlugin(), "text");
var variables = new ContextVariables("Today is: ");
variables.Set("day", DateTimeOffset.Now.ToString("dddd", CultureInfo.CurrentCulture));
- SKContext result = await kernel.RunAsync(variables,
- text["AppendDay"],
- text["Uppercase"]);
+ KernelResult result = await kernel.RunAsync(variables,
+ textFunctions["AppendDay"],
+ textFunctions["Uppercase"]);
- Console.WriteLine(result);
+ Console.WriteLine(result.GetValue<string>());
}
}
diff --git a/dotnet/samples/KernelSyntaxExamples/Example04_CombineLLMPromptsAndNativeCode.cs b/dotnet/samples/KernelSyntaxExamples/Example04_CombineLLMPromptsAndNativeCode.cs
index 2fe345dbc951..15acd518b4c5 100644
--- a/dotnet/samples/KernelSyntaxExamples/Example04_CombineLLMPromptsAndNativeCode.cs
+++ b/dotnet/samples/KernelSyntaxExamples/Example04_CombineLLMPromptsAndNativeCode.cs
@@ -3,8 +3,8 @@
using System;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;
-using Microsoft.SemanticKernel.Skills.Web;
-using Microsoft.SemanticKernel.Skills.Web.Bing;
+using Microsoft.SemanticKernel.Plugins.Web;
+using Microsoft.SemanticKernel.Plugins.Web.Bing;
using RepoUtils;
// ReSharper disable once InconsistentNaming
@@ -23,14 +23,11 @@ public static async Task RunAsync()
}
IKernel kernel = new KernelBuilder()
- .WithLogger(ConsoleLogger.Logger)
- .WithOpenAITextCompletionService("text-davinci-002", openAIApiKey, serviceId: "text-davinci-002")
- .WithOpenAITextCompletionService("text-davinci-003", openAIApiKey)
+ .WithLoggerFactory(ConsoleLogger.LoggerFactory)
+ .WithOpenAIChatCompletionService(TestConfiguration.OpenAI.ChatModelId, openAIApiKey)
.Build();
- // Load native skill
string bingApiKey = TestConfiguration.Bing.ApiKey;
-
if (bingApiKey == null)
{
Console.WriteLine("Bing credentials not found. Skipping example.");
@@ -38,39 +35,37 @@ public static async Task RunAsync()
}
var bingConnector = new BingConnector(bingApiKey);
- var bing = new WebSearchEngineSkill(bingConnector);
- var search = kernel.ImportSkill(bing, "bing");
+ var bing = new WebSearchEnginePlugin(bingConnector);
+ var searchFunctions = kernel.ImportFunctions(bing, "bing");
- // Load semantic skill defined with prompt templates
- string folder = RepoFiles.SampleSkillsPath();
+ // Load semantic plugins defined with prompt templates
+ string folder = RepoFiles.SamplePluginsPath();
- var sumSkill = kernel.ImportSemanticSkillFromDirectory(
- folder,
- "SummarizeSkill");
+ var summarizeFunctions = kernel.ImportSemanticFunctionsFromDirectory(folder, "SummarizePlugin");
// Run
var ask = "What's the tallest building in South America";
var result1 = await kernel.RunAsync(
ask,
- search["Search"]
+ searchFunctions["Search"]
);
var result2 = await kernel.RunAsync(
ask,
- search["Search"],
- sumSkill["Summarize"]
+ searchFunctions["Search"],
+ summarizeFunctions["Summarize"]
);
var result3 = await kernel.RunAsync(
ask,
- search["Search"],
- sumSkill["Notegen"]
+ searchFunctions["Search"],
+ summarizeFunctions["Notegen"]
);
Console.WriteLine(ask + "\n");
- Console.WriteLine("Bing Answer: " + result1 + "\n");
- Console.WriteLine("Summary: " + result2 + "\n");
- Console.WriteLine("Notes: " + result3 + "\n");
+ Console.WriteLine("Bing Answer: " + result1.GetValue<string>() + "\n");
+ Console.WriteLine("Summary: " + result2.GetValue<string>() + "\n");
+ Console.WriteLine("Notes: " + result3.GetValue<string>() + "\n");
}
}
diff --git a/dotnet/samples/KernelSyntaxExamples/Example05_InlineFunctionDefinition.cs b/dotnet/samples/KernelSyntaxExamples/Example05_InlineFunctionDefinition.cs
index a34919933b00..0237389c01d2 100644
--- a/dotnet/samples/KernelSyntaxExamples/Example05_InlineFunctionDefinition.cs
+++ b/dotnet/samples/KernelSyntaxExamples/Example05_InlineFunctionDefinition.cs
@@ -3,6 +3,7 @@
using System;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;
+using Microsoft.SemanticKernel.Connectors.AI.OpenAI;
using RepoUtils;
// ReSharper disable once InconsistentNaming
@@ -12,7 +13,7 @@ public static async Task RunAsync()
{
Console.WriteLine("======== Inline Function Definition ========");
- string openAIModelId = TestConfiguration.OpenAI.ModelId;
+ string openAIModelId = TestConfiguration.OpenAI.ChatModelId;
string openAIApiKey = TestConfiguration.OpenAI.ApiKey;
if (openAIModelId == null || openAIApiKey == null)
@@ -28,14 +29,14 @@ public static async Task RunAsync()
*/
IKernel kernel = new KernelBuilder()
- .WithLogger(ConsoleLogger.Logger)
- .WithOpenAITextCompletionService(
+ .WithLoggerFactory(ConsoleLogger.LoggerFactory)
+ .WithOpenAIChatCompletionService(
modelId: openAIModelId,
apiKey: openAIApiKey)
.Build();
// Function defined using few-shot design pattern
- const string FunctionDefinition = @"
+ string promptTemplate = @"
Generate a creative reason or excuse for the given event.
Be creative and be funny. Let your imagination run wild.
@@ -48,17 +49,17 @@ Be creative and be funny. Let your imagination run wild.
Event: {{$input}}
";
- var excuseFunction = kernel.CreateSemanticFunction(FunctionDefinition, maxTokens: 100, temperature: 0.4, topP: 1);
+ var excuseFunction = kernel.CreateSemanticFunction(promptTemplate, new OpenAIRequestSettings() { MaxTokens = 100, Temperature = 0.4, TopP = 1 });
- var result = await excuseFunction.InvokeAsync("I missed the F1 final race");
- Console.WriteLine(result);
+ var result = await kernel.RunAsync("I missed the F1 final race", excuseFunction);
+ Console.WriteLine(result.GetValue<string>());
- result = await excuseFunction.InvokeAsync("sorry I forgot your birthday");
- Console.WriteLine(result);
+ result = await kernel.RunAsync("sorry I forgot your birthday", excuseFunction);
+ Console.WriteLine(result.GetValue<string>());
- var fixedFunction = kernel.CreateSemanticFunction($"Translate this date {DateTimeOffset.Now:f} to French format", maxTokens: 100);
+ var fixedFunction = kernel.CreateSemanticFunction($"Translate this date {DateTimeOffset.Now:f} to French format", new OpenAIRequestSettings() { MaxTokens = 100 });
- result = await fixedFunction.InvokeAsync();
- Console.WriteLine(result);
+ result = await kernel.RunAsync(fixedFunction);
+ Console.WriteLine(result.GetValue<string>());
}
}
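The Example05 hunks show the second recurring pattern in this release: per-call tuning parameters move from named arguments (`maxTokens:`, `temperature:`, `topP:`) into an `OpenAIRequestSettings` object, and functions are invoked through `kernel.RunAsync` rather than `InvokeAsync` on the function itself. A before/after sketch, with the model id and API key as placeholders:

```csharp
using System;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.AI.OpenAI;

IKernel kernel = new KernelBuilder()
    // placeholder credentials, not values from the diff
    .WithOpenAIChatCompletionService(modelId: "gpt-3.5-turbo", apiKey: "<api-key>")
    .Build();

// was: kernel.CreateSemanticFunction(prompt, maxTokens: 100, temperature: 0.4, topP: 1)
var excuseFunction = kernel.CreateSemanticFunction(
    "Generate a creative excuse for the given event: {{$input}}",
    new OpenAIRequestSettings { MaxTokens = 100, Temperature = 0.4, TopP = 1 });

// was: await excuseFunction.InvokeAsync("..."), then reading result.Result
var result = await kernel.RunAsync("I missed the F1 final race", excuseFunction);
Console.WriteLine(result.GetValue<string>());
```

Routing invocation through the kernel is what lets the result come back as a `KernelResult`, whose typed `GetValue<string>()` replaces the old string-only `Result` property.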
diff --git a/dotnet/samples/KernelSyntaxExamples/Example06_TemplateLanguage.cs b/dotnet/samples/KernelSyntaxExamples/Example06_TemplateLanguage.cs
index c8cee60f9a46..cd1816e54b19 100644
--- a/dotnet/samples/KernelSyntaxExamples/Example06_TemplateLanguage.cs
+++ b/dotnet/samples/KernelSyntaxExamples/Example06_TemplateLanguage.cs
@@ -3,8 +3,9 @@
using System;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;
-using Microsoft.SemanticKernel.Skills.Core;
-using Microsoft.SemanticKernel.TemplateEngine;
+using Microsoft.SemanticKernel.Connectors.AI.OpenAI;
+using Microsoft.SemanticKernel.Plugins.Core;
+using Microsoft.SemanticKernel.TemplateEngine.Basic;
using RepoUtils;
// ReSharper disable once InconsistentNaming
@@ -18,7 +19,7 @@ public static async Task RunAsync()
{
Console.WriteLine("======== TemplateLanguage ========");
- string openAIModelId = TestConfiguration.OpenAI.ModelId;
+ string openAIModelId = TestConfiguration.OpenAI.ChatModelId;
string openAIApiKey = TestConfiguration.OpenAI.ApiKey;
if (openAIModelId == null || openAIApiKey == null)
@@ -28,15 +29,15 @@ public static async Task RunAsync()
}
IKernel kernel = Kernel.Builder
- .WithLogger(ConsoleLogger.Logger)
- .WithOpenAITextCompletionService(
+ .WithLoggerFactory(ConsoleLogger.LoggerFactory)
+ .WithOpenAIChatCompletionService(
modelId: openAIModelId,
apiKey: openAIApiKey)
.Build();
- // Load native skill into the kernel skill collection, sharing its functions with prompt templates
+ // Load native plugin into the kernel function collection, sharing its functions with prompt templates
// Functions loaded here are available as "time.*"
- kernel.ImportSkill(new TimeSkill(), "time");
+ kernel.ImportFunctions(new TimePlugin(), "time");
// Semantic Function invoking time.Date and time.Time native functions
const string FunctionDefinition = @"
@@ -50,17 +51,17 @@ Is it weekend time (weekend/not weekend)?
// This allows to see the prompt before it's sent to OpenAI
Console.WriteLine("--- Rendered Prompt");
- var promptRenderer = new PromptTemplateEngine();
+ var promptRenderer = new BasicPromptTemplateEngine();
var renderedPrompt = await promptRenderer.RenderAsync(FunctionDefinition, kernel.CreateNewContext());
Console.WriteLine(renderedPrompt);
// Run the prompt / semantic function
- var kindOfDay = kernel.CreateSemanticFunction(FunctionDefinition, maxTokens: 150);
+ var kindOfDay = kernel.CreateSemanticFunction(FunctionDefinition, new OpenAIRequestSettings() { MaxTokens = 100 });
// Show the result
Console.WriteLine("--- Semantic Function result");
- var result = await kindOfDay.InvokeAsync();
- Console.WriteLine(result);
+ var result = await kernel.RunAsync(kindOfDay);
+ Console.WriteLine(result.GetValue<string>());
/* OUTPUT:
diff --git a/dotnet/samples/KernelSyntaxExamples/Example07_BingAndGooglePlugins.cs b/dotnet/samples/KernelSyntaxExamples/Example07_BingAndGooglePlugins.cs
new file mode 100644
index 000000000000..84c180a204c2
--- /dev/null
+++ b/dotnet/samples/KernelSyntaxExamples/Example07_BingAndGooglePlugins.cs
@@ -0,0 +1,193 @@
+// Copyright (c) Microsoft. All rights reserved.
+
+using System;
+using System.Threading.Tasks;
+using Microsoft.SemanticKernel;
+using Microsoft.SemanticKernel.Connectors.AI.OpenAI;
+using Microsoft.SemanticKernel.Plugins.Web;
+using Microsoft.SemanticKernel.Plugins.Web.Bing;
+using Microsoft.SemanticKernel.Plugins.Web.Google;
+using Microsoft.SemanticKernel.TemplateEngine.Basic;
+using RepoUtils;
+
+/// <summary>
+/// The example shows how to use Bing and Google to search for current data
+/// you might want to import into your system, e.g. providing AI prompts with
+/// recent information, or for AI to generate recent information to display to users.
+/// </summary>
+// ReSharper disable CommentTypo
+// ReSharper disable once InconsistentNaming
+public static class Example07_BingAndGooglePlugins
+{
+ public static async Task RunAsync()
+ {
+ string openAIModelId = TestConfiguration.OpenAI.ChatModelId;
+ string openAIApiKey = TestConfiguration.OpenAI.ApiKey;
+
+ if (openAIModelId == null || openAIApiKey == null)
+ {
+ Console.WriteLine("OpenAI credentials not found. Skipping example.");
+ return;
+ }
+
+ IKernel kernel = new KernelBuilder()
+ .WithLoggerFactory(ConsoleLogger.LoggerFactory)
+ .WithOpenAIChatCompletionService(
+ modelId: openAIModelId,
+ apiKey: openAIApiKey)
+ .Build();
+
+ // Load Bing plugin
+ string bingApiKey = TestConfiguration.Bing.ApiKey;
+ if (bingApiKey == null)
+ {
+ Console.WriteLine("Bing credentials not found. Skipping example.");
+ }
+ else
+ {
+ var bingConnector = new BingConnector(bingApiKey);
+ var bing = new WebSearchEnginePlugin(bingConnector);
+ kernel.ImportFunctions(bing, "bing");
+ await Example1Async(kernel, "bing");
+ await Example2Async(kernel);
+ }
+
+ // Load Google plugin
+ string googleApiKey = TestConfiguration.Google.ApiKey;
+ string googleSearchEngineId = TestConfiguration.Google.SearchEngineId;
+
+ if (googleApiKey == null || googleSearchEngineId == null)
+ {
+ Console.WriteLine("Google credentials not found. Skipping example.");
+ }
+ else
+ {
+ using var googleConnector = new GoogleConnector(
+ apiKey: googleApiKey,
+ searchEngineId: googleSearchEngineId);
+ var google = new WebSearchEnginePlugin(googleConnector);
+ kernel.ImportFunctions(google, "google");
+ await Example1Async(kernel, "google");
+ }
+ }
+
+ private static async Task Example1Async(IKernel kernel, string searchPluginName)
+ {
+ Console.WriteLine("======== Bing and Google Search Plugins ========");
+
+ // Run
+ var question = "What's the largest building in the world?";
+ var function = kernel.Functions.GetFunction(searchPluginName, "search");
+ var result = await kernel.RunAsync(question, function);
+
+ Console.WriteLine(question);
+ Console.WriteLine($"----{searchPluginName}----");
+ Console.WriteLine(result.GetValue<string>());
+
+ /* OUTPUT:
+
+ What's the largest building in the world?
+ ----
+ The Aerium near Berlin, Germany is the largest uninterrupted volume in the world, while Boeing's
+ factory in Everett, Washington, United States is the world's largest building by volume. The AvtoVAZ
+ main assembly building in Tolyatti, Russia is the largest building in area footprint.
+ ----
+ The Aerium near Berlin, Germany is the largest uninterrupted volume in the world, while Boeing's
+ factory in Everett, Washington, United States is the world's ...
+ */
+ }
+
+ private static async Task Example2Async(IKernel kernel)
+ {
+ Console.WriteLine("======== Use Search Plugin to answer user questions ========");
+
+ const string SemanticFunction = @"Answer questions only when you know the facts or the information is provided.
+When you don't have sufficient information you reply with a list of commands to find the information needed.
+When answering multiple questions, use a bullet point list.
+Note: make sure single and double quotes are escaped using a backslash char.
+
+[COMMANDS AVAILABLE]
+- bing.search
+
+[INFORMATION PROVIDED]
+{{ $externalInformation }}
+
+[EXAMPLE 1]
+Question: what's the biggest lake in Italy?
+Answer: Lake Garda, also known as Lago di Garda.
+
+[EXAMPLE 2]
+Question: what's the biggest lake in Italy? What's the smallest positive number?
+Answer:
+* Lake Garda, also known as Lago di Garda.
+* The smallest positive number is 1.
+
+[EXAMPLE 3]
+Question: what's Ferrari stock price? Who is the current number one female tennis player in the world?
+Answer:
+{{ '{{' }} bing.search ""what\\'s Ferrari stock price?"" {{ '}}' }}.
+{{ '{{' }} bing.search ""Who is the current number one female tennis player in the world?"" {{ '}}' }}.
+
+[END OF EXAMPLES]
+
+[TASK]
+Question: {{ $input }}.
+Answer: ";
+
+ var questions = "Who is the most followed person on TikTok right now? What's the exchange rate EUR:USD?";
+ Console.WriteLine(questions);
+
+ var oracle = kernel.CreateSemanticFunction(SemanticFunction, new OpenAIRequestSettings() { MaxTokens = 150, Temperature = 0, TopP = 1 });
+
+ var answer = await kernel.RunAsync(oracle, new(questions)
+ {
+ ["externalInformation"] = string.Empty
+ });
+
+ var result = answer.GetValue<string>()!;
+
+ // If the answer contains commands, execute them using the prompt renderer.
+ if (result.Contains("bing.search", StringComparison.OrdinalIgnoreCase))
+ {
+ var promptRenderer = new BasicPromptTemplateEngine();
+
+ Console.WriteLine("---- Fetching information from Bing...");
+ var information = await promptRenderer.RenderAsync(result, kernel.CreateNewContext());
+
+ Console.WriteLine("Information found:");
+ Console.WriteLine(information);
+
+ // Run the semantic function again, now including information from Bing
+ answer = await kernel.RunAsync(oracle, new(questions)
+ {
+ // The rendered prompt contains the information retrieved from search engines
+ ["externalInformation"] = information
+ });
+ }
+ else
+ {
+ Console.WriteLine("AI had all the information, no need to query Bing.");
+ }
+
+ Console.WriteLine("---- ANSWER:");
+ Console.WriteLine(answer.GetValue<string>());
+
+ /* OUTPUT:
+
+ Who is the most followed person on TikTok right now? What's the exchange rate EUR:USD?
+ ---- Fetching information from Bing...
+ Information found:
+
+ Khaby Lame is the most-followed user on TikTok. This list contains the top 50 accounts by number
+ of followers on the Chinese social media platform TikTok, which was merged with musical.ly in 2018.
+ [1] The most-followed individual on the platform is Khaby Lame, with over 153 million followers..
+ EUR – Euro To USD – US Dollar 1.00 Euro = 1.10 37097 US Dollars 1 USD = 0.906035 EUR We use the
+ mid-market rate for our Converter. This is for informational purposes only. You won’t receive this
+ rate when sending money. Check send rates Convert Euro to US Dollar Convert US Dollar to Euro..
+ ---- ANSWER:
+
+ * The most followed person on TikTok right now is Khaby Lame, with over 153 million followers.
+ * The exchange rate for EUR to USD is 1.1037097 US Dollars for 1 Euro.
+ */
+ }
+}
diff --git a/dotnet/samples/KernelSyntaxExamples/Example07_BingAndGoogleSkills.cs b/dotnet/samples/KernelSyntaxExamples/Example07_BingAndGoogleSkills.cs
deleted file mode 100644
index dd0222edce4f..000000000000
--- a/dotnet/samples/KernelSyntaxExamples/Example07_BingAndGoogleSkills.cs
+++ /dev/null
@@ -1,189 +0,0 @@
-// Copyright (c) Microsoft. All rights reserved.
-
-using System;
-using System.Threading.Tasks;
-using Microsoft.SemanticKernel;
-using Microsoft.SemanticKernel.SkillDefinition;
-using Microsoft.SemanticKernel.Skills.Web;
-using Microsoft.SemanticKernel.Skills.Web.Bing;
-using Microsoft.SemanticKernel.Skills.Web.Google;
-using Microsoft.SemanticKernel.TemplateEngine;
-using RepoUtils;
-
-/// <summary>
-/// The example shows how to use Bing and Google to search for current data
-/// you might want to import into your system, e.g. providing AI prompts with
-/// recent information, or for AI to generate recent information to display to users.
-/// </summary>
-// ReSharper disable CommentTypo
-// ReSharper disable once InconsistentNaming
-public static class Example07_BingAndGoogleSkills
-{
- public static async Task RunAsync()
- {
- string openAIModelId = TestConfiguration.OpenAI.ModelId;
- string openAIApiKey = TestConfiguration.OpenAI.ApiKey;
-
- if (openAIModelId == null || openAIApiKey == null)
- {
- Console.WriteLine("OpenAI credentials not found. Skipping example.");
- return;
- }
-
- IKernel kernel = new KernelBuilder()
- .WithLogger(ConsoleLogger.Logger)
- .WithOpenAITextCompletionService(
- modelId: openAIModelId,
- apiKey: openAIApiKey)
- .Build();
-
- // Load Bing skill
- string bingApiKey = TestConfiguration.Bing.ApiKey;
-
- if (bingApiKey == null)
- {
- Console.WriteLine("Bing credentials not found. Skipping example.");
- }
- else
- {
- var bingConnector = new BingConnector(bingApiKey);
- var bing = new WebSearchEngineSkill(bingConnector);
- var search = kernel.ImportSkill(bing, "bing");
- await Example1Async(kernel, "bing");
- await Example2Async(kernel);
- }
-
- // Load Google skill
- string googleApiKey = TestConfiguration.Google.ApiKey;
- string googleSearchEngineId = TestConfiguration.Google.SearchEngineId;
-
- if (googleApiKey == null || googleSearchEngineId == null)
- {
- Console.WriteLine("Google credentials not found. Skipping example.");
- }
- else
- {
- using var googleConnector = new GoogleConnector(
- apiKey: googleApiKey,
- searchEngineId: googleSearchEngineId);
- var google = new WebSearchEngineSkill(googleConnector);
- var search = kernel.ImportSkill(new WebSearchEngineSkill(googleConnector), "google");
- await Example1Async(kernel, "google");
- }
- }
-
- private static async Task Example1Async(IKernel kernel, string searchSkillId)
- {
- Console.WriteLine("======== Bing and Google Search Skill ========");
-
- // Run
- var question = "What's the largest building in the world?";
- var result = await kernel.Func(searchSkillId, "search").InvokeAsync(question);
-
- Console.WriteLine(question);
- Console.WriteLine($"----{searchSkillId}----");
- Console.WriteLine(result);
-
- /* OUTPUT:
-
- What's the largest building in the world?
- ----
- The Aerium near Berlin, Germany is the largest uninterrupted volume in the world, while Boeing's
- factory in Everett, Washington, United States is the world's largest building by volume. The AvtoVAZ
- main assembly building in Tolyatti, Russia is the largest building in area footprint.
- ----
- The Aerium near Berlin, Germany is the largest uninterrupted volume in the world, while Boeing's
- factory in Everett, Washington, United States is the world's ...
- */
- }
-
- private static async Task Example2Async(IKernel kernel)
- {
- Console.WriteLine("======== Use Search Skill to answer user questions ========");
-
- const string SemanticFunction = @"Answer questions only when you know the facts or the information is provided.
-When you don't have sufficient information you reply with a list of commands to find the information needed.
-When answering multiple questions, use a bullet point list.
-Note: make sure single and double quotes are escaped using a backslash char.
-
-[COMMANDS AVAILABLE]
-- bing.search
-
-[INFORMATION PROVIDED]
-{{ $externalInformation }}
-
-[EXAMPLE 1]
-Question: what's the biggest lake in Italy?
-Answer: Lake Garda, also known as Lago di Garda.
-
-[EXAMPLE 2]
-Question: what's the biggest lake in Italy? What's the smallest positive number?
-Answer:
-* Lake Garda, also known as Lago di Garda.
-* The smallest positive number is 1.
-
-[EXAMPLE 3]
-Question: what's Ferrari stock price? Who is the current number one female tennis player in the world?
-Answer:
-{{ '{{' }} bing.search ""what\\'s Ferrari stock price?"" {{ '}}' }}.
-{{ '{{' }} bing.search ""Who is the current number one female tennis player in the world?"" {{ '}}' }}.
-
-[END OF EXAMPLES]
-
-[TASK]
-Question: {{ $input }}.
-Answer: ";
-
- var questions = "Who is the most followed person on TikTok right now? What's the exchange rate EUR:USD?";
- Console.WriteLine(questions);
-
- var oracle = kernel.CreateSemanticFunction(SemanticFunction, maxTokens: 200, temperature: 0, topP: 1);
-
- var context = kernel.CreateNewContext();
- context.Variables["externalInformation"] = "";
- var answer = await oracle.InvokeAsync(questions, context);
-
- // If the answer contains commands, execute them using the prompt renderer.
- if (answer.Result.Contains("bing.search", StringComparison.OrdinalIgnoreCase))
- {
- var promptRenderer = new PromptTemplateEngine();
-
- Console.WriteLine("---- Fetching information from Bing...");
- var information = await promptRenderer.RenderAsync(answer.Result, context);
-
- Console.WriteLine("Information found:");
- Console.WriteLine(information);
-
- // The rendered prompt contains the information retrieved from search engines
- context.Variables["externalInformation"] = information;
-
- // Run the semantic function again, now including information from Bing
- answer = await oracle.InvokeAsync(questions, context);
- }
- else
- {
- Console.WriteLine("AI had all the information, no need to query Bing.");
- }
-
- Console.WriteLine("---- ANSWER:");
- Console.WriteLine(answer);
-
- /* OUTPUT:
-
- Who is the most followed person on TikTok right now? What's the exchange rate EUR:USD?
- ---- Fetching information from Bing...
- Information found:
-
- Khaby Lame is the most-followed user on TikTok. This list contains the top 50 accounts by number
- of followers on the Chinese social media platform TikTok, which was merged with musical.ly in 2018.
- [1] The most-followed individual on the platform is Khaby Lame, with over 153 million followers..
- EUR – Euro To USD – US Dollar 1.00 Euro = 1.10 37097 US Dollars 1 USD = 0.906035 EUR We use the
- mid-market rate for our Converter. This is for informational purposes only. You won’t receive this
- rate when sending money. Check send rates Convert Euro to US Dollar Convert US Dollar to Euro..
- ---- ANSWER:
-
- * The most followed person on TikTok right now is Khaby Lame, with over 153 million followers.
- * The exchange rate for EUR to USD is 1.1037097 US Dollars for 1 Euro.
- */
- }
-}
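The Example07 flow above — ask the model, detect `bing.search` commands in its reply, execute them via the prompt renderer, then ask again with the search results injected as `externalInformation` — can be sketched in plain Python. This is not the Semantic Kernel API; `ask` and `search` are hypothetical stand-ins for the semantic function and the Bing connector:

```python
import re

def answer_with_tools(ask, search, question):
    """ask(question, info) -> model answer; search(query) -> result text.

    First pass runs with no external information. If the model replies with
    search commands instead of facts, execute them and ask a second time.
    """
    answer = ask(question, info="")
    # Detect embedded commands of the form {{ bing.search "query" }}
    queries = re.findall(r'{{\s*bing\.search\s+"([^"]+)"\s*}}', answer)
    if not queries:
        return answer  # the model already had the facts, no search needed
    info = "\n".join(search(q) for q in queries)
    return ask(question, info=info)  # second pass, grounded in the results
```

The second `ask` call mirrors the C# sample's second `oracle.InvokeAsync`, where the rendered search output is placed into the context before re-invoking.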
diff --git a/dotnet/samples/KernelSyntaxExamples/Example08_RetryHandler.cs b/dotnet/samples/KernelSyntaxExamples/Example08_RetryHandler.cs
index d354e3cbd175..0adeec6e0967 100644
--- a/dotnet/samples/KernelSyntaxExamples/Example08_RetryHandler.cs
+++ b/dotnet/samples/KernelSyntaxExamples/Example08_RetryHandler.cs
@@ -2,12 +2,15 @@
using System;
using System.Net;
+using System.Net.Http;
+using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.SemanticKernel;
-using Microsoft.SemanticKernel.Reliability;
-using Microsoft.SemanticKernel.Skills.Core;
-using Reliability;
+using Microsoft.SemanticKernel.Http;
+using Microsoft.SemanticKernel.Plugins.Core;
+using Microsoft.SemanticKernel.Reliability.Basic;
+using Polly;
using RepoUtils;
// ReSharper disable once InconsistentNaming
@@ -15,104 +18,157 @@ public static class Example08_RetryHandler
{
public static async Task RunAsync()
{
- var kernel = InitializeKernel();
- var retryHandlerFactory = new RetryThreeTimesWithBackoffFactory();
- InfoLogger.Logger.LogInformation("============================== RetryThreeTimesWithBackoff ==============================");
- await RunRetryPolicyAsync(kernel, retryHandlerFactory);
+ await DefaultNoRetryAsync();
- InfoLogger.Logger.LogInformation("========================= RetryThreeTimesWithRetryAfterBackoff =========================");
- await RunRetryPolicyBuilderAsync(typeof(RetryThreeTimesWithRetryAfterBackoffFactory));
+ await ReliabilityBasicExtensionAsync();
- InfoLogger.Logger.LogInformation("==================================== NoRetryPolicy =====================================");
- await RunRetryPolicyBuilderAsync(typeof(NullHttpRetryHandlerFactory));
+ await ReliabilityPollyExtensionAsync();
- InfoLogger.Logger.LogInformation("=============================== DefaultHttpRetryHandler ================================");
- await RunRetryHandlerConfigAsync(new HttpRetryConfig() { MaxRetryCount = 3, UseExponentialBackoff = true });
-
- InfoLogger.Logger.LogInformation("======= DefaultHttpRetryConfig [MaxRetryCount = 3, UseExponentialBackoff = true] =======");
- await RunRetryHandlerConfigAsync(new HttpRetryConfig() { MaxRetryCount = 3, UseExponentialBackoff = true });
+ await CustomHandlerAsync();
}
- private static async Task RunRetryHandlerConfigAsync(HttpRetryConfig? httpConfig = null)
+ private static async Task DefaultNoRetryAsync()
{
- var kernelBuilder = Kernel.Builder.WithLogger(InfoLogger.Logger);
- if (httpConfig != null)
- {
- kernelBuilder = kernelBuilder.Configure(c => c.SetDefaultHttpRetryConfig(httpConfig));
- }
+ InfoLogger.Logger.LogInformation("============================== Kernel default behavior: No Retry ==============================");
+ var kernel = InitializeKernelBuilder()
+ .Build();
- // Add 401 to the list of retryable status codes
- // Typically 401 would not be something we retry but for demonstration
- // purposes we are doing so as it's easy to trigger when using an invalid key.
- kernelBuilder = kernelBuilder.Configure(c => c.DefaultHttpRetryConfig.RetryableStatusCodes.Add(HttpStatusCode.Unauthorized));
+ await ImportAndExecutePluginAsync(kernel);
+ }
- // OpenAI settings - you can set the OpenAI.ApiKey to an invalid value to see the retry policy in play
- kernelBuilder = kernelBuilder.WithOpenAITextCompletionService("text-davinci-003", "BAD_KEY");
+ private static async Task ReliabilityBasicExtensionAsync()
+ {
+ InfoLogger.Logger.LogInformation("============================== Using Reliability.Basic extension ==============================");
+ var retryConfig = new BasicRetryConfig
+ {
+ MaxRetryCount = 3,
+ UseExponentialBackoff = true,
+ };
+ retryConfig.RetryableStatusCodes.Add(HttpStatusCode.Unauthorized);
- var kernel = kernelBuilder.Build();
+ var kernel = InitializeKernelBuilder()
+ .WithRetryBasic(retryConfig)
+ .Build();
- await ImportAndExecuteSkillAsync(kernel);
+ await ImportAndExecutePluginAsync(kernel);
}
- private static IKernel InitializeKernel()
+ private static async Task ReliabilityPollyExtensionAsync()
{
- var kernel = Kernel.Builder
- .WithLogger(InfoLogger.Logger)
- // OpenAI settings - you can set the OpenAI.ApiKey to an invalid value to see the retry policy in play
- .WithOpenAITextCompletionService("text-davinci-003", "BAD_KEY")
+ InfoLogger.Logger.LogInformation("============================== Using Reliability.Polly extension ==============================");
+ var kernel = InitializeKernelBuilder()
+ .WithRetryPolly(GetPollyPolicy(InfoLogger.LoggerFactory))
.Build();
- return kernel;
+ await ImportAndExecutePluginAsync(kernel);
}
- private static async Task RunRetryPolicyAsync(IKernel kernel, IDelegatingHandlerFactory retryHandlerFactory)
+ private static async Task CustomHandlerAsync()
{
- kernel.Config.SetHttpRetryHandlerFactory(retryHandlerFactory);
- await ImportAndExecuteSkillAsync(kernel);
+ InfoLogger.Logger.LogInformation("============================== Using a Custom Http Handler ==============================");
+ var kernel = InitializeKernelBuilder()
+ .WithHttpHandlerFactory(new MyCustomHandlerFactory())
+ .Build();
+
+ await ImportAndExecutePluginAsync(kernel);
}
- private static async Task RunRetryPolicyBuilderAsync(Type retryHandlerFactoryType)
+ private static KernelBuilder InitializeKernelBuilder()
{
- var kernel = Kernel.Builder.WithLogger(InfoLogger.Logger)
- .WithRetryHandlerFactory((Activator.CreateInstance(retryHandlerFactoryType) as IDelegatingHandlerFactory)!)
- // OpenAI settings - you can set the OpenAI.ApiKey to an invalid value to see the retry policy in play
- .WithOpenAITextCompletionService("text-davinci-003", "BAD_KEY")
- .Build();
+ return Kernel.Builder
+ .WithLoggerFactory(InfoLogger.LoggerFactory)
+ // OpenAI settings - you can set the OpenAI.ApiKey to an invalid value to see the retry policy in play
+ .WithOpenAIChatCompletionService(TestConfiguration.OpenAI.ChatModelId, "BAD_KEY");
+ }
- await ImportAndExecuteSkillAsync(kernel);
+ private static AsyncPolicy<HttpResponseMessage> GetPollyPolicy(ILoggerFactory? logger)
+ {
+ // Handle 429 and 401 errors
+ // Typically 401 would not be something we retry but for demonstration
+ // purposes we are doing so as it's easy to trigger when using an invalid key.
+ const int TooManyRequests = 429;
+ const int Unauthorized = 401;
+
+ return Policy
+ .HandleResult<HttpResponseMessage>(response =>
+ (int)response.StatusCode is TooManyRequests or Unauthorized)
+ .WaitAndRetryAsync(new[]
+ {
+ TimeSpan.FromSeconds(2),
+ TimeSpan.FromSeconds(4),
+ TimeSpan.FromSeconds(8)
+ },
+ (outcome, timespan, retryCount, _)
+ => InfoLogger.Logger.LogWarning("Error executing action [attempt {RetryCount} of 3], pausing {PausingMilliseconds}ms. Outcome: {StatusCode}",
+ retryCount,
+ timespan.TotalMilliseconds,
+ outcome.Result.StatusCode));
}
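Both retry configurations above encode the same schedule: up to three retries with exponential backoff (2s, 4s, 8s) on 429, plus 401 purely for demonstration. A minimal Python sketch of that loop, using invented names (`send_with_retry`, `WAITS`) rather than the SK or Polly API:

```python
import time

RETRYABLE = {429, 401}  # 401 is retried here only for demonstration, as in the sample
WAITS = [2, 4, 8]       # seconds: exponential backoff, at most three retries

def send_with_retry(send, request, sleep=time.sleep):
    """send(request) -> (status, body). Retries retryable statuses with backoff."""
    status, body = send(request)
    for wait in WAITS:
        if status not in RETRYABLE:
            break
        sleep(wait)  # back off before the next attempt, doubling each time
        status, body = send(request)
    return status, body
```

With a valid key the first call succeeds and no waits occur; with the sample's `"BAD_KEY"` every attempt returns 401, so the loop exhausts all three waits before giving up.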
- private static async Task ImportAndExecuteSkillAsync(IKernel kernel)
+ private static async Task ImportAndExecutePluginAsync(IKernel kernel)
{
- // Load semantic skill defined with prompt templates
- string folder = RepoFiles.SampleSkillsPath();
+ // Load semantic plugin defined with prompt templates
+ string folder = RepoFiles.SamplePluginsPath();
- kernel.ImportSkill(new TimeSkill(), "time");
+ kernel.ImportFunctions(new TimePlugin(), "time");
- var qaSkill = kernel.ImportSemanticSkillFromDirectory(
+ var qaPlugin = kernel.ImportSemanticFunctionsFromDirectory(
folder,
- "QASkill");
+ "QAPlugin");
var question = "How popular is Polly library?";
InfoLogger.Logger.LogInformation("Question: {0}", question);
// To see the retry policy in play, you can set the OpenAI.ApiKey to an invalid value
- var answer = await kernel.RunAsync(question, qaSkill["Question"]);
- InfoLogger.Logger.LogInformation("Answer: {0}", answer);
+#pragma warning disable CA1031 // Do not catch general exception types
+ try
+ {
+ var answer = await kernel.RunAsync(question, qaPlugin["Question"]);
+ InfoLogger.Logger.LogInformation("Answer: {0}", answer.GetValue<string>());
+ }
+ catch (Exception ex)
+ {
+ InfoLogger.Logger.LogInformation("Error: {0}", ex.Message);
+ }
+#pragma warning restore CA1031 // Do not catch general exception types
+ }
+
+ // Basic custom retry handler factory
+ public sealed class MyCustomHandlerFactory : HttpHandlerFactory<MyCustomHandler>
+ {
+ }
+
+ // Basic custom empty retry handler
+ public sealed class MyCustomHandler : DelegatingHandler
+ {
+ public MyCustomHandler(ILoggerFactory loggerFactory)
+ {
+ }
+
+ protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
+ {
+ // Your custom http handling implementation
+ return Task.FromResult(new HttpResponseMessage(HttpStatusCode.BadRequest)
+ {
+ Content = new StringContent("My custom bad request override")
+ });
+ }
}
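`MyCustomHandlerFactory` plugs `MyCustomHandler` into the kernel's HTTP pipeline, and the handler short-circuits every request with a canned 400 instead of forwarding it. The delegating-handler pattern can be sketched in Python with hypothetical names (this is not the .NET `DelegatingHandler` API):

```python
class Handler:
    """Base handler: by default, forwards the request to the inner handler."""
    def __init__(self, inner=None):
        self.inner = inner

    def send(self, request):
        return self.inner.send(request)

class MyCustomHandler(Handler):
    """Like the C# sample: override send() to short-circuit the pipeline."""
    def send(self, request):
        # Never consults self.inner - every request gets a canned 400 response
        return (400, "My custom bad request override")
```

A handler that instead called `self.inner.send(request)` after inspecting the request or response would behave like the retry handlers above, wrapping rather than replacing the underlying transport.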
private static class InfoLogger
{
- internal static ILogger Logger => LogFactory.CreateLogger