.Net: LLM invoked KernelFunction but final answer does not use result from function #5622
Comments
What's happening in this case is as follows:
You are expecting the function call result to be returned, but this is not guaranteed because the LLM decides on the final response. We are investigating adding the ability to let the developer decide whether or not the LLM is called after a function call, so the return value can be used as the final answer.
Tracking via this request #5436 (comment)
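For reference, a minimal sketch of what such developer control might look like, assuming the auto function invocation filter shape that later Semantic Kernel versions expose; the `IAutoFunctionInvocationFilter` interface, the `Terminate` flag, and the registration call are assumptions, not something available in the version this issue was filed against:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;

// Hypothetical filter: stop the auto-invocation loop right after a function runs,
// so the raw function result becomes the final answer instead of being sent back
// to the LLM for another completion. Interface name, Terminate flag, and the
// registration call below are assumptions about a later SK API surface.
public sealed class ReturnFunctionResultFilter : IAutoFunctionInvocationFilter
{
    public async Task OnAutoFunctionInvocationAsync(
        AutoFunctionInvocationContext context,
        Func<AutoFunctionInvocationContext, Task> next)
    {
        await next(context);      // let the kernel invoke the plugin function
        context.Terminate = true; // skip the follow-up LLM call
    }
}

// Registration (assumed):
// kernel.AutoFunctionInvocationFilters.Add(new ReturnFunctionResultFilter());
```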
@markwallace-microsoft thanks for the explanation, but there should be a way to prompt the LLM to base its answer on the result from the plugin, right? I think its behavior just depends on our prompt.
It doesn't make sense for the LLM to decide to use a function to get a solution and then change its mind and use its own answer instead. I think we can ask it to use the function result as the source of truth.
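As a rough illustration of that prompt-based workaround, here is a minimal self-contained sketch; the plugin, model id, and system-message wording are assumptions, and the instruction only nudges the model rather than guaranteeing the function result is repeated:

```csharp
using System;
using System.ComponentModel;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

var builder = Kernel.CreateBuilder();
builder.AddOpenAIChatCompletion("gpt-4o-mini", Environment.GetEnvironmentVariable("OPENAI_API_KEY")!);
builder.Plugins.AddFromType<AnswerPlugin>();
var kernel = builder.Build();

var chat = kernel.GetRequiredService<IChatCompletionService>();

// System prompt asking the model to treat function results as the source of truth.
var history = new ChatHistory(
    "When you call a function, treat its return value as the source of truth. " +
    "Report the function result exactly as returned; do not recalculate or invent a value.");
history.AddUserMessage("What is the answer?");

var settings = new OpenAIPromptExecutionSettings
{
    ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
};

var reply = await chat.GetChatMessageContentAsync(history, settings, kernel);
Console.WriteLine(reply.Content);

// Hypothetical plugin with a fixed return value, mirroring the bug report below.
public class AnswerPlugin
{
    [KernelFunction, Description("Returns the answer.")]
    public int GetAnswer() => 10;
}
```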
Hi Mark, has this issue been resolved? With `OpenAIPromptExecutionSettings openAIPromptExecutionSettings = new()` this runs a function, let's say "Turn_on_lights".
How do I avoid the reply in step 2?
Context: Is there any way to disable passing the invoked function response back to the LLM?
Describe the bug
I have a function that the LLM invoked. The function returns a fixed value of 10, but the LLM does not use it; it makes up its own answer.
To Reproduce
Code to reproduce the behavior:
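A hypothetical minimal sketch of the reported setup, assuming a plugin whose function always returns 10 and automatic tool invocation via `OpenAIPromptExecutionSettings`; the plugin name, prompt, and model id are illustrative, not the original repro code:

```csharp
using System;
using System.ComponentModel;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;

var builder = Kernel.CreateBuilder();
builder.AddOpenAIChatCompletion("gpt-4o-mini", Environment.GetEnvironmentVariable("OPENAI_API_KEY")!);
builder.Plugins.AddFromType<FixedValuePlugin>();
var kernel = builder.Build();

// Let the model call plugin functions automatically.
var settings = new OpenAIPromptExecutionSettings
{
    ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
};

// The plugin always returns 10; the reported bug is that the final answer
// sometimes ignores this value and the model invents its own.
var result = await kernel.InvokePromptAsync(
    "Use the available function to get the value and tell me what it is.",
    new KernelArguments(settings));

Console.WriteLine(result);

public class FixedValuePlugin
{
    [KernelFunction, Description("Returns a fixed value.")]
    public int GetFixedValue() => 10;
}
```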
Expected behavior
The LLM should use the answer that the function provided.
Screenshots
Platform