
[Feature Request] New item in the settings Api #5310

Open
investing0 opened this issue Aug 23, 2024 · 1 comment

@investing0

Hello, is it possible to add a "max new tokens" option to the settings, to control the length of the response from the LLM? Some OpenAI-compatible APIs apply a low default limit (for example, 400 tokens per response), but if the limit can be set explicitly — SillyTavern, for example, has a "Response (tokens)" setting that can be raised to 2000 tokens — the response length will match the setting and will not be cut off.
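For context, a minimal sketch of what the requested setting would map to on the wire. `max_tokens` is the standard OpenAI Chat Completions parameter that caps the number of generated ("new") tokens; the helper name and settings key below are illustrative assumptions, not NextChat's actual code.

```typescript
// Illustrative sketch: build an OpenAI-compatible chat request body,
// including max_tokens only when the user has configured it, so the
// server's own default applies otherwise.
interface ChatRequest {
  model: string;
  messages: { role: string; content: string }[];
  max_tokens?: number; // the proposed user-configurable setting
}

function buildChatRequest(prompt: string, maxNewTokens?: number): ChatRequest {
  const body: ChatRequest = {
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: prompt }],
  };
  if (maxNewTokens !== undefined) {
    body.max_tokens = maxNewTokens;
  }
  return body;
}
```

Leaving `max_tokens` out when the setting is unset keeps today's behavior for users who never touch it, while a value like 2000 would prevent the truncation described above.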

@nextchat-manager

Please follow the issue template to update title and description of your issue.

@investing0 investing0 changed the title New item in the settings Api [Feature Request] New item in the settings Api Aug 23, 2024