
Ollama support #198

Open
michaelmior opened this issue Sep 11, 2024 · 2 comments
Labels
enhancement New feature or request

Comments

@michaelmior

Feature request

Is your feature request related to a problem? Please describe.

I'd like to be able to use a local model with Ollama.

Describe the solution you'd like

I want to be able to use a local Ollama instance with seed.

Describe alternatives you've considered

Cloud-based models require an Internet connection as well as an account with a third party.
It would be possible to use something like LocalAI, which provides an OpenAI-compatible API for local models, but that is a fairly heavyweight solution.

Additional context

LangChain.js already has Ollama support.
It seems like this could be as simple as allowing an OLLAMA_MODEL environment variable that can be used to construct a ChatOllama instance.
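A rough sketch of what this might look like on the LangChain.js side (assuming the `@langchain/ollama` package; the `OLLAMA_MODEL`/`OLLAMA_HOST` variable names are just illustrative, not existing configuration):

```ts
import { ChatOllama } from "@langchain/ollama";

// If OLLAMA_MODEL is set, build a local Ollama chat model;
// otherwise fall back to whatever cloud model is configured today.
const ollamaModel = process.env.OLLAMA_MODEL;

const chatModel = ollamaModel
  ? new ChatOllama({
      model: ollamaModel, // e.g. "llama3.1"
      baseUrl: process.env.OLLAMA_HOST ?? "http://localhost:11434",
    })
  : undefined;

// Example usage once selected:
// const response = await chatModel?.invoke("Hello from a local model!");
```

Since `ChatOllama` implements the same chat-model interface as the other LangChain.js providers, the rest of the pipeline shouldn't need to change.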

@michaelmior added the enhancement (New feature or request) label on Sep 11, 2024
@BRAVO68WEB

+1

1 similar comment
@chirag3003

+1
