Update vision model check in utils.ts #4941
Conversation
compatible with openrouter model format, see https://openrouter.ai/models/google/gemini-flash-1.5
@wolf-joe is attempting to deploy a commit to the NextChat Team on Vercel. A member of the Team first needs to authorize it.
Walkthrough: The update enhances the vision model detection in app/utils.ts by adding new Gemini model versions to the visionKeywords array.
Actionable comments posted: 0
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files selected for processing (1)
- app/utils.ts (1 hunks)
Additional comments not posted (1)
app/utils.ts (1)
Lines 258-259: LGTM! The new model versions "gemini-pro-1.5" and "gemini-flash-1.5" have been correctly added to the visionKeywords array.
I believe that using the custom model feature is more appropriate here. We can't keep adding identical models indefinitely just because different providers name them differently.
The issue is that when the model is set to …
This can now be configured through CUSTOM_MODELS, so there is no need to add models that are not in the official list to the built-in model list.
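As a sketch of the CUSTOM_MODELS approach the maintainer suggests (the exact syntax is documented in the NextChat README; the model name here is taken from the OpenRouter link in the PR description):

```shell
# Add the OpenRouter-named Gemini model via the CUSTOM_MODELS env var
# instead of hard-coding it in app/utils.ts; "+name" adds a model.
CUSTOM_MODELS="+google/gemini-flash-1.5"
```

This keeps provider-specific aliases out of the built-in list while still making the model selectable in the UI.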