
Support for custom OpenAI-Compatible backends #1231

Open
keinsell opened this issue Feb 11, 2024 · 3 comments
Labels
feature A new feature

keinsell commented Feb 11, 2024

What's the problem?

Local large language models can be served through LocalAI, LM Studio, and similar tools, all of which provide an OpenAI-compatible API, but the application needs to expose a setting for changing the base URL.

What's the outcome?

  • Ability to use Azure OpenAI services, for people who want an alternative host to OpenAI.
  • Ability to use custom large language models (e.g. local ones) that are compatible with the OpenAI API.

Related Issues: #415 #1094
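
As a rough illustration of what a configurable base URL would enable (this is not bloop's actual implementation, which lives in Rust; the endpoint URL, API key, and model name below are placeholders), any OpenAI-compatible backend could be targeted with a standard client:

```python
# Hypothetical illustration: any OpenAI-compatible backend (LocalAI, LM Studio,
# etc.) can be reached by the standard client once the base URL is configurable.
# The URL, key, and model name below are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # e.g. a LocalAI / LM Studio endpoint
    api_key="not-needed-for-local-models",
)

response = client.chat.completions.create(
    model="local-model",  # whatever model the local backend serves
    messages=[{"role": "user", "content": "Hello from a custom backend"}],
)
print(response.choices[0].message.content)
```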

@keinsell keinsell added the feature A new feature label Feb 11, 2024

rodion-m commented Feb 23, 2024

The ability to use custom backends is really needed, and Azure OpenAI support as well, even as a paid feature.

@recursionbane

Yes, Azure OpenAI support, please.

@ggordonhall (Contributor)

We've open-sourced the OpenAI API logic (https://github.com/BloopAI/bloop/tree/oss/server/bleep/src/llm). You can now build and run bloop with a custom OpenAI API key (see: https://github.com/BloopAI/bloop?tab=readme-ov-file#building-from-source).

Feel free to open a PR adding support for Azure OpenAI 😀
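
For anyone picking this up, here is a hedged sketch of what the Azure OpenAI side of the request looks like, using the official `openai` Python client purely to illustrate the configuration differences (bloop's own LLM code is Rust; the endpoint, API version, and deployment name here are placeholders):

```python
# Hypothetical illustration of Azure OpenAI configuration: Azure uses a
# per-resource endpoint, an api-version query parameter, and a deployment
# name in place of the model name. All values below are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",  # placeholder resource
    api_version="2023-05-15",                               # placeholder API version
    api_key="<azure-api-key>",
)

response = client.chat.completions.create(
    model="my-gpt4-deployment",  # Azure deployment name, not a raw model id
    messages=[{"role": "user", "content": "Hello from Azure OpenAI"}],
)
print(response.choices[0].message.content)
```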
