
[Bug]: All backends using gpt-3.5-turbo as default model #1192

Open
3 of 4 tasks
gyliu513 opened this issue Jul 12, 2024 · 1 comment

@gyliu513
Contributor

Checklist

  • I've searched for similar issues and couldn't find anything matching
  • I've included steps to reproduce the behavior

Affected Components

  • K8sGPT (CLI)
  • K8sGPT Operator

K8sGPT Version

v0.3.38

Kubernetes Version

v1.30

Host OS and its Version

No response

Steps to reproduce

gyliu@guangyas-air k8sgpt % ./bin/k8sgpt auth add --backend watsonxai
Warning: model input is empty, will use the default value: gpt-3.5-turbo
Provider with same name already exists.
gyliu@guangyas-air k8sgpt % ./bin/k8sgpt auth add --backend localai
Warning: model input is empty, will use the default value: gpt-3.5-turbo
localai added to the AI backend provider list

From the above, we can see that both watsonxai and localai set gpt-3.5-turbo as the default model, which is not right.

Expected behaviour

Each LLM backend should have its own default model
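One way to express the expected behaviour is a per-backend default-model lookup with a single fallback. This is only an illustrative sketch, not k8sgpt's actual code: the function name `defaultModelFor`, the map, and the non-OpenAI model names below are all assumptions chosen for the example.

```go
package main

import "fmt"

// Hypothetical per-backend default-model table. The backend names mirror the
// providers mentioned in this issue; the model IDs for localai and watsonxai
// are illustrative placeholders, not confirmed project defaults.
var defaultModels = map[string]string{
	"openai":    "gpt-3.5-turbo",
	"localai":   "ggml-gpt4all-j",
	"watsonxai": "ibm/granite-13b-chat-v2",
}

// defaultModelFor returns the backend-specific default when one is defined,
// and falls back to gpt-3.5-turbo only for backends without an entry
// (the current behaviour reported in this issue).
func defaultModelFor(backend string) string {
	if m, ok := defaultModels[backend]; ok {
		return m
	}
	return "gpt-3.5-turbo"
}

func main() {
	fmt.Println(defaultModelFor("watsonxai"))
	fmt.Println(defaultModelFor("some-unknown-backend"))
}
```

With a table like this, `auth add --backend watsonxai` would warn with a watsonx-appropriate default instead of gpt-3.5-turbo, while unrecognized backends would keep the existing fallback.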

Actual behaviour

No response

Additional Information

No response

@AlexsJones
Member

This is certainly something we could change

@gyliu513 gyliu513 changed the title [Bug]: All backends using got-3.5-turbo as default model [Bug]: All backends using gpt-3.5-turbo as default model Aug 22, 2024
Status: Proposed

2 participants