OllamaPromptExecutionSettings is missing ResponseFormat property for structured outputs #10036
Closed · sven634231 started this conversation in General
Replies: 1 comment
Duplicate of #9919.
As Ollama also supports structured outputs in JSON format (see https://github.com/ollama/ollama/blob/main/docs/api.md#request-structured-outputs), the ability to define a response format and schema is missing from OllamaPromptExecutionSettings, similar to what is already available for OpenAI models (https://devblogs.microsoft.com/semantic-kernel/using-json-schema-for-structured-output-in-net-for-openai-models/#structured-outputs-with-system.type).
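For reference, below is a minimal sketch of the OpenAI-style configuration described in the linked blog post, which an Ollama equivalent would ideally mirror. The `MovieResult` type, the prompt text, and the `RunAsync` wrapper are hypothetical and used only for illustration; the `ResponseFormat = typeof(...)` pattern is the one shown in the post for `OpenAIPromptExecutionSettings`, and `OllamaPromptExecutionSettings` currently has no counterpart.

```csharp
// Sketch only: structured output with the OpenAI connector via ResponseFormat,
// as described in the linked blog post. MovieResult and the prompt are
// illustrative; OllamaPromptExecutionSettings has no equivalent property today.
using System;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// Hypothetical result type; the connector derives a JSON schema from it.
public sealed class MovieResult
{
    public string Title { get; set; } = string.Empty;
    public int Year { get; set; }
}

public static class StructuredOutputExample
{
    public static async Task RunAsync(Kernel kernel)
    {
        var settings = new OpenAIPromptExecutionSettings
        {
            // The OpenAI connector converts this Type into a JSON schema and
            // sends it as the response_format with the request.
            ResponseFormat = typeof(MovieResult)
        };

        var result = await kernel.InvokePromptAsync(
            "Name one science-fiction movie and its release year.",
            new KernelArguments(settings));

        Console.WriteLine(result); // JSON matching the MovieResult schema
    }
}
```

On the Ollama side, such a property would presumably map to the `format` field of the request, which per the linked Ollama API docs accepts a JSON schema.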