OpenAI compatible API support #5
I noticed that the URL in this crate is hardcoded, which means it currently only supports each AI service provider's official API endpoint.
Would there be a plan to make the endpoint configurable? For example, allowing users to specify Azure OpenAI endpoints through configuration would greatly enhance flexibility.
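For reference, Azure OpenAI differs from the official api.openai.com endpoint in both host and path, so a configurable endpoint needs more than a host swap. A minimal sketch of the URL shapes involved; `resource`, `deployment`, and `api_version` are placeholders for values from an Azure account, not anything defined by this crate:

```rust
// Official OpenAI chat endpoint (what a hardcoded URL targets):
//   https://api.openai.com/v1/chat/completions
//
// Azure OpenAI uses a per-resource host and a per-deployment path.
fn azure_chat_url(resource: &str, deployment: &str, api_version: &str) -> String {
    format!(
        "https://{resource}.openai.azure.com/openai/deployments/{deployment}/chat/completions?api-version={api_version}"
    )
}
```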
@Zane-XY yes, endpoints will be configurable per adapter kind. I need to find the right way to do it (e.g., host/port vs. path ...)
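To make the host/port-vs-path trade-off concrete, here is one hypothetical shape such an override could take; this is only a sketch of the design space, not genai's actual API:

```rust
/// Hypothetical endpoint override (not genai's actual API): either
/// swap the base (host/port) and keep the adapter's default path,
/// or pin the full URL including the path.
enum EndpointOverride {
    /// e.g. "http://localhost:11434" -- adapter appends its default path
    Base(String),
    /// e.g. "https://my-gateway.internal/llm/v1/chat/completions"
    FullUrl(String),
}

fn resolve(ep: &EndpointOverride, default_path: &str) -> String {
    match ep {
        EndpointOverride::Base(base) => {
            format!("{}{}", base.trim_end_matches('/'), default_path)
        }
        EndpointOverride::FullUrl(url) => url.clone(),
    }
}
```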
@Zane-XY btw, feel free to explain your particular use case. I will make sure it gets covered.
In my use case, the service URL and models are different, but the service is OpenAI compatible. Really appreciate the fast response!
@Zane-XY thanks. Is this AWS Bedrock / Google Vertex AI, or a custom service somewhere?
It's an enterprise-hosted AI service.
Ok, that will probably be a Custom Adapter then. I will get to it; genai should support this use case.
👍 +1 I'm using Jan.ai, TabbyML, and LM Studio to run local models, with a local API server exposing an OpenAI-compatible API.
Hi! +1 for this feature. Basically, it would be better to just make the API base URL a variable that can be changed in the constructor. That's what I did to use Ollama (it was not supported at the time).
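For anyone wanting the same base-URL-as-a-variable pattern outside this crate, here is a plain reqwest sketch against Ollama's OpenAI-compatible server (assumes reqwest with the `json` feature, tokio, and serde_json; the default port 11434 and the model name are assumptions about a local setup):

```rust
use serde_json::json;

#[tokio::main]
async fn main() -> Result<(), reqwest::Error> {
    // Base URL as a variable instead of a hardcoded host; Ollama's
    // OpenAI-compatible server listens on 11434 by default.
    let base_url = std::env::var("OPENAI_BASE_URL")
        .unwrap_or_else(|_| "http://localhost:11434/v1".to_string());

    let res = reqwest::Client::new()
        .post(format!("{base_url}/chat/completions"))
        .json(&json!({
            "model": "llama3", // placeholder; use a model you have pulled
            "messages": [{"role": "user", "content": "Hello!"}]
        }))
        .send()
        .await?
        .json::<serde_json::Value>()
        .await?;

    println!("{res}");
    Ok(())
}
```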
Seems it's supported now.