Issues: xusenlinzy/api-for-open-llm
- #316: Can I manually modify the code so that the new model can be supported? (opened Dec 10, 2024 by zhanglixuan0720; 2 tasks done)
- #308: With TASKS=llm,rag, a threading error occurs: RuntimeError: Cannot re-initialize CUDA in forked subprocess. To use CUDA with multiprocessing, you must use the 'spawn' start method (opened Aug 21, 2024 by syusama; 2 tasks done)
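The error reported in #308 is a general CUDA/multiprocessing pitfall: a forked child inherits an already-initialized CUDA context, which CUDA forbids. A minimal sketch of the standard fix, using Python's `multiprocessing` with the "spawn" start method (plain CPU arithmetic stands in for GPU work here, so it runs without CUDA):

```python
import multiprocessing as mp

def square(x):
    # Work that would touch CUDA in the real service. With "spawn",
    # each worker is a fresh interpreter, so initializing CUDA here
    # does not collide with a fork-inherited context.
    return x * x

if __name__ == "__main__":
    # "spawn" starts workers via a new process instead of fork(),
    # avoiding "Cannot re-initialize CUDA in forked subprocess".
    ctx = mp.get_context("spawn")
    with ctx.Pool(2) as pool:
        print(pool.map(square, [1, 2, 3]))
```

Equivalently, `mp.set_start_method("spawn")` can be called once at program startup before any pools or processes are created.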
- #301: 💡 [REQUEST] - Could China Telecom's Telechat model be supported? The pipeline runs, but the response content gets truncated [question] (opened Aug 2, 2024 by Song345381185)
- #273: When will the Qwen 1.5 function-call feature be fixed? [question] (opened May 11, 2024 by skyliwq)
- #245: Qwen1.5 does not support tool_choice [enhancement] (opened Mar 12, 2024 by YunmengLiu0; 2 tasks done)
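For context on #245: `tool_choice` is the OpenAI-style chat-completions parameter that forces the model to call a named function rather than reply in free text. A hedged sketch of such a request payload; the endpoint URL and model name are assumptions for illustration, not confirmed project defaults:

```python
import json

# Request body for an OpenAI-compatible /v1/chat/completions endpoint.
payload = {
    "model": "qwen1.5-7b-chat",  # hypothetical model name
    "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical function for illustration
            "description": "Look up current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
    # tool_choice pins the response to a call of the named function;
    # issue #245 reports this parameter is not honored for Qwen1.5.
    "tool_choice": {"type": "function", "function": {"name": "get_weather"}},
}
body = json.dumps(payload).encode()
# POST `body` with Content-Type: application/json to a running server,
# e.g. via urllib.request or the `requests` library.
```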
- #242: qwen1.5-7b-chat fails to deploy when run with Docker vLLM: Fatal Python error: Bus error (opened Mar 2, 2024 by syusama; 2 tasks done)
- #241: Qwen1.5 inference fails with RuntimeError: cannot reshape tensor of 0 elements into shape [-1, 0] because the unspecified dimension size -1 can be any value and is ambiguous (opened Feb 28, 2024 by syusama; 2 tasks done)
- #105: 💡 [REQUEST] - How can Qwen/Qwen-VL-Chat be supported? [question] (opened Sep 6, 2023 by wangschang)