
[Feature Request]: When will the open-source Gemma be supported? #4119

Closed
ynclub opened this issue Feb 24, 2024 · 20 comments
Labels
duplicate (This issue or pull request already exists), enhancement (New feature or request)

Comments

@ynclub

ynclub commented Feb 24, 2024

Problem Description

There is no setting for the open-source Gemma model.

Solution Description

When will the open-source Gemma be supported? Or, if it can be used already, how do I set it up?

Alternatives Considered

No response

Additional Context

No response

@ynclub ynclub added the enhancement New feature or request label Feb 24, 2024
@nextchat-manager

Please follow the issue template to update the title and description of your issue.


@liheji

liheji commented Feb 25, 2024

#4030 Ollama integration is on the roadmap.


@fred-bf fred-bf added the duplicate This issue or pull request already exists label Feb 26, 2024
@fred-bf fred-bf closed this as completed Feb 26, 2024
@fred-bf
Contributor

fred-bf commented Feb 27, 2024

@ynclub @liheji The latest release of NextChat supports Ollama. You can check out the documentation here to use Gemma through Ollama: https://docs.nextchat.dev/models/ollama
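For anyone who wants to try it locally before reading the docs, a minimal sketch (the gemma model tag and the 11434 port are Ollama defaults, not something specific to NextChat):

ollama pull gemma
ollama serve

Then point NextChat at http://localhost:11434 as described in the documentation above.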

@Leroy-X

Leroy-X commented Mar 4, 2024

@ynclub @liheji The latest release of NextChat supports Ollama. You can check out the documentation here to use Gemma through Ollama: https://docs.nextchat.dev/models/ollama

Hi, I added OLLAMA_ORIGINS=tauri://localhost to my environment variables and got the following error:
[screenshot]

So I replaced tauri with http, but NextChat (web) still can't reach it and returns a 403 error:
[screenshot]

@fred-bf
Contributor

fred-bf commented Mar 4, 2024

@ynclub @liheji The latest release of NextChat supports Ollama. You can check out the documentation here to use Gemma through Ollama: https://docs.nextchat.dev/models/ollama

Hi, I added OLLAMA_ORIGINS=tauri://localhost to my environment variables and got the following error: [screenshot]

So I replaced tauri with http, but NextChat (web) still can't reach it and returns a 403 error: [screenshot]

You can try changing the env OLLAMA_ORIGINS value to *://localhost
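A quick way to check whether the new value actually reached the server is to send a request with an explicit Origin header and see whether it comes back 200 instead of 403 (a sketch, assuming the default port 11434; /api/tags is just a convenient read-only endpoint to test against):

curl -i -H "Origin: http://localhost" http://localhost:11434/api/tags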

@Leroy-X

Leroy-X commented Mar 4, 2024

You can try changing the env OLLAMA_ORIGINS value to *://localhost

Hi, still getting an error :(
[screenshot]

@eryajf

eryajf commented Mar 5, 2024

You can try changing the env OLLAMA_ORIGINS value to *://localhost

Hi, still getting an error :( [screenshot]

It's most likely because your environment variables weren't actually set. At first I started it with the following command:

OLLAMA_ORIGINS=*://localhost;OLLAMA_HOST="0.0.0.0";ollama serve

Requests then still returned 403. After switching to the following way of starting it, requests went through normally:

export OLLAMA_ORIGINS=*://localhost;export OLLAMA_HOST="0.0.0.0";ollama serve
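The difference is that with semicolons the OLLAMA_ORIGINS=... and OLLAMA_HOST=... assignments are separate shell-local statements and never reach the ollama process, while export puts them into the environment it inherits. An equivalent sketch, relying only on standard shell behaviour, is to prefix the assignments to the command itself (no semicolons), so they apply just to that process:

OLLAMA_ORIGINS="*://localhost" OLLAMA_HOST="0.0.0.0" ollama serve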


@Leroy-X

Leroy-X commented Mar 7, 2024

[screenshot]
Still 403, with or without export. On Windows, is there any other way besides setting environment variables? I've seen people mention modifying the startup file, but I'm not sure whether that's the same thing as an environment variable.


@wanix1988

@Leroy-X Did you ever solve this? I'm seeing the same problem in my Linux environment.
export OLLAMA_ORIGINS=*://localhost;export OLLAMA_HOST="0.0.0.0";ollama serve

[screenshot]
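If Ollama is running as a systemd service on Linux rather than being started from the shell, variables exported in a terminal won't reach it; a sketch of the service-level equivalent (assuming the stock ollama.service unit), followed by a daemon reload and a restart:

sudo systemctl edit ollama.service
# add under [Service]:
#   Environment="OLLAMA_ORIGINS=*"
sudo systemctl daemon-reload && sudo systemctl restart ollama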


@Leroy-X

Leroy-X commented Mar 8, 2024

@wanix1988 No, I'm on Windows.


@Leroy-X

Leroy-X commented Mar 16, 2024

@Leroy-X Did you ever solve this? I'm seeing the same problem in my Linux environment. export OLLAMA_ORIGINS=*://localhost;export OLLAMA_HOST="0.0.0.0";ollama serve

Hi, I followed #4219 and solved it by adding the environment variable OLLAMA_ORIGINS=*.

It would be great if the developers could note this in the documentation. Thanks.
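For reference, a sketch of how that value can be set persistently on Windows from a regular command prompt (setx writes a user-level environment variable and only affects newly started processes, so quit and restart Ollama afterwards):

setx OLLAMA_ORIGINS "*"

For a single PowerShell session, setting $env:OLLAMA_ORIGINS = "*" before running ollama serve works as well.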


@cnliucheng

On macOS, if you want to use 127.0.0.1:11434 or localhost:11434, you need to start it like this:
export OLLAMA_ORIGINS=*://localhost;export OLLAMA_HOST="127.0.0.1";ollama serve
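If Ollama is running via the macOS menu-bar app rather than ollama serve in a terminal, exported shell variables won't reach it; a sketch of the app-level equivalent (the variables are read at startup, so restart Ollama afterwards):

launchctl setenv OLLAMA_ORIGINS "*://localhost"
launchctl setenv OLLAMA_HOST "127.0.0.1"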

