Ollama
#1534
@lenzenc asked: Is there any support for or example of custom code that can be used to call local models via Ollama?
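For reference, a minimal sketch of the kind of custom code being asked about, assuming Ollama is running locally on its default port 11434, the `requests` package is installed, and a model (the name "llama2" below is just a placeholder) has already been pulled; the `@tool` wrapper follows promptflow's custom Python tool pattern:

```python
# Minimal sketch: a promptflow custom Python tool that calls a locally
# running Ollama server. Assumes Ollama listens on its default port 11434
# and that the requested model has already been pulled with `ollama pull`.
import requests
from promptflow import tool  # promptflow's custom-tool decorator


@tool
def ollama_generate(prompt: str, model: str = "llama2") -> str:
    # Ollama's generate endpoint; stream=False returns one JSON object
    # whose "response" field holds the full completion text.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]
```

A function like this could presumably be used as a Python node in a flow in place of the built-in LLM node.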
-
@lenzenc Currently, promptflow only supports models deployed to AML through the model catalog; local models are not supported. + @prakharg-msft
-
Any update on whether Promptflow has support for locally deployed models?
-
@lenzenc I think you can use an API proxy system such as one-api or new-api; that is what I use now.
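For anyone following this route, a rough sketch of the proxy idea, assuming the proxy (one-api in this example) exposes an OpenAI-compatible endpoint at http://localhost:3000/v1 and maps a model name to a locally served backend; the URL, API key, and model name are all placeholders:

```python
# Sketch of routing OpenAI-style calls through an API proxy (e.g. one-api)
# that forwards to a local model. All values below are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:3000/v1",  # proxy's OpenAI-compatible endpoint
    api_key="sk-placeholder",             # token issued by the proxy
)

response = client.chat.completions.create(
    model="llama2",  # whatever name the proxy maps to the local backend
    messages=[{"role": "user", "content": "Say hello from a local model."}],
)
print(response.choices[0].message.content)
```

The same base URL and key would presumably go into promptflow's OpenAI connection settings, so existing LLM nodes call the proxy instead of api.openai.com.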