Huggingface inference API not working with openai_compatible #517
bassamsdata started this conversation in General
-
What does your adapter currently look like? I would have thought the `openai_compatible` adapter would work just fine. Something like:

```lua
require("codecompanion").setup({
  adapters = {
    hugging_face = function()
      return require("codecompanion.adapters").extend("openai_compatible", {
        -- placeholders are filled in from the env table below
        url = "${url}/${model}/${chat_url}",
        env = {
          url = "https://api-inference.huggingface.co/models",
          model = "schema.model.default", -- resolved from the schema
          chat_url = "v1/chat/completions",
          api_key = "YOUR_HF_KEY",
        },
        schema = {
          model = {
            default = "meta-llama/Llama-3.2-3B-Instruct",
          },
        },
      })
    end,
  },
})
```
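If I'm reading the adapter code right, the `${url}`, `${model}`, and `${chat_url}` placeholders are substituted from the `env` table, and an env value of `"schema.model.default"` is resolved from the adapter's schema. So switching models should only mean changing the schema's `default`; the URL is rebuilt from `${model}` automatically.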
-
I'm trying to integrate the Hugging Face Inference API. I tested it with the `openai` and `openai_compatible` adapters, but it didn't work. The challenge lies in its unique URL structure, where the model is part of the URL itself. This makes switching between multiple models tricky, as changing the repo/model (e.g., `meta-llama/Llama-3.2-3B-Instruct`) requires altering the URL as well.

If anyone can assist, I'd greatly appreciate it. If there's currently no solution, I'd be happy to add a Hugging Face adapter to CodeCompanion, given its extensive model library (complete with tests, of course).

Thank you.

Edit: added a curl example.
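Something along these lines (a minimal sketch; it assumes the standard Inference API chat-completions endpoint and a Hugging Face token in `$HF_TOKEN`):

```sh
# The model repo id sits in the URL path, which is what makes
# switching models awkward with a fixed-URL adapter config.
curl "https://api-inference.huggingface.co/models/meta-llama/Llama-3.2-3B-Instruct/v1/chat/completions" \
  -H "Authorization: Bearer $HF_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "meta-llama/Llama-3.2-3B-Instruct",
        "messages": [{ "role": "user", "content": "Hello" }],
        "stream": false
      }'
```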