ollama in practice #44

Open
myml opened this issue Sep 13, 2024 · 0 comments
myml commented Sep 13, 2024

Installation

On Linux, install with a single command: curl -fsSL https://ollama.com/install.sh | sh
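
To confirm the install worked, here is a minimal sanity check (llama3 is only an example model name; any model from the ollama library works):

# check the CLI version (also confirms the binary is on PATH)
ollama --version
# download an example model and start an interactive chat with it
ollama pull llama3
ollama run llama3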

Update

To update, simply run curl -fsSL https://ollama.com/install.sh | sh again.
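
The install script manages ollama as a systemd service, so after an update it is worth checking that the service came back up; a minimal sketch (11434 is ollama's default port):

# confirm the service restarted cleanly
systemctl status ollama
# confirm the API is answering
curl http://localhost:11434/api/version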

Translation integration

Run systemctl edit ollama.service and add the following configuration:

[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"

Do not edit /etc/systemd/system/ollama.service directly, because that file is overwritten when ollama is updated.
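
After saving the override, reload systemd and restart the service so the new environment variables take effect; a minimal sketch (192.168.1.10 is a placeholder for the machine running ollama):

sudo systemctl daemon-reload
sudo systemctl restart ollama
# from another machine, check that the server is reachable and lists its models
curl http://192.168.1.10:11434/api/tags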

In your browser, install the "Immersive Translate" (沉浸式翻译) extension, then open the extension settings -> Translation Services -> OpenAI -> Settings.
The model can be set manually to one you have already downloaded with ollama.
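
Since the extension talks to an OpenAI-style API, you can verify ollama's OpenAI-compatible endpoint before wiring it up; a minimal sketch (the host address and the qwen2 model name are only examples):

curl http://192.168.1.10:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen2",
    "messages": [{"role": "user", "content": "Translate to English: 你好"}]
  }'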

vscode integration

Install the Continue extension in vscode, open the extension's sidebar, and click the settings button at the bottom to edit its configuration to use ollama. A reference configuration is shown below.

  "models": [
    {
      "title": "Llama Code 13b",
      "provider": "ollama",
      "model": "codellama:13b",
      "apiBase": "https://xxx.xxx.xxx"
    }
  ],
  "tabAutocompleteModel": {
    "title": "deepseekCoder 2",
    "provider": "ollama",
    "model": "deepseek-coder-v2",
    "apiBase": "https://xxx.xxx.xxx"
  },
  "embeddingsProvider": {
    "provider": "ollama",
    "model": "nomic-embed-text",
    "apiBase": "https://xxx.xxx.xxx"
  }
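
For this configuration to work, the referenced models need to be present on the ollama host; a minimal sketch of pulling them (model tags taken from the config above):

ollama pull codellama:13b
ollama pull deepseek-coder-v2
ollama pull nomic-embed-text

If ollama runs on the same machine as vscode, apiBase typically points at http://localhost:11434, ollama's default address.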