
Naas Chat Plugin - Check prompt tokens #2222

Closed
FlorentLvr opened this issue Sep 27, 2023 · 0 comments · Fixed by #2215
Assignees
FlorentLvr
Labels
enhancement New feature or request

Comments

@FlorentLvr
Contributor

This notebook checks whether the number of tokens in a prompt exceeds the maximum limit of the model being used. You can set a limit as a percentage of the model's maximum token count, which ensures conversations stay within the context window. The function determines the model's maximum token count automatically.
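A minimal sketch of the check described above (not the actual notebook code). The model context sizes and the ~4-characters-per-token estimate are assumptions for illustration; a real implementation would use an exact tokenizer such as tiktoken.

```python
# Assumed context-window sizes (tokens) for a few models -- illustrative only.
MODEL_MAX_TOKENS = {
    "gpt-3.5-turbo": 4096,
    "gpt-3.5-turbo-16k": 16384,
    "gpt-4": 8192,
}


def estimate_tokens(text: str) -> int:
    """Rough estimate: roughly 4 characters per token for English text."""
    return max(1, len(text) // 4)


def check_prompt_tokens(prompt: str, model: str = "gpt-3.5-turbo",
                        limit_pct: float = 0.8) -> dict:
    """Check whether a prompt fits within limit_pct of the model's context window.

    The maximum token count is looked up automatically from the model name.
    """
    max_tokens = MODEL_MAX_TOKENS[model]
    limit = int(max_tokens * limit_pct)
    used = estimate_tokens(prompt)
    return {
        "tokens": used,
        "limit": limit,
        "max_tokens": max_tokens,
        "within_limit": used <= limit,
    }


result = check_prompt_tokens("Summarize the quarterly sales report.")
print(result)
```

Expressing the limit as a percentage (here `limit_pct=0.8`) leaves headroom for the model's reply, since the completion shares the same context window as the prompt.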

@FlorentLvr FlorentLvr self-assigned this Sep 27, 2023
@FlorentLvr FlorentLvr added the enhancement New feature or request label Sep 27, 2023
@FlorentLvr FlorentLvr linked a pull request Sep 27, 2023 that will close this issue