[Feature]: Add a custom AI that can call rest API endpoint #990
Comments
Hi @lili-wan, thanks for creating this issue.
Would this work for you, or do you have a completely different API?
Would it be helpful to provide a LocalAI sample project so users can integrate with their self-hosted LLM? Just a suggestion. @AlexsJones
There are lots of tutorials on how to use k8sgpt with LocalAI.
I've tried to set up LocalAI to point to a local endpoint made with Hugging Face TGI, but I get an error. Running a test query works fine, though.
Checklist
Is this feature request related to a problem?
Yes
Problem Description
Our company developed another platform on top of OpenAI for compliance and security reasons, and we can only use our internal API (which has a signature similar to OpenAI's). Currently the OpenAI integration calls CreateChatCompletion directly through the Go client, which does not work for our use case.
Solution Description
Can we have an AI option that calls an AI API as a REST endpoint? It would take the following parameters as configurable input from the CLI/API
Benefits
This provides more flexibility on the client side, allowing users to configure the REST endpoint directly.
Potential Drawbacks
No response
Additional Information
No response