[Bug]: Result CR not showing any details under Specs.Details section #400

Open
3 of 4 tasks
kodefoundry opened this issue Apr 3, 2024 · 3 comments

Checklist

  • I've searched for similar issues and couldn't find anything matching
  • I've included steps to reproduce the behavior

Affected Components

  • K8sGPT (CLI)
  • K8sGPT Operator

K8sGPT Version

v0.3.29

Kubernetes Version

v1.29.3+rke2r1

Host OS and its Version

SLES15 SP 4

Steps to reproduce

  1. Deploy K8sGPT following the README.md
  2. Verify the scan result using "kubectl describe result" (see the example commands below)
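
For reference, the verification commands look roughly like this (the namespace matches the manifest below; the Result name will differ per cluster):

kubectl get results -n k8sgpt
kubectl describe result <result-name> -n k8sgpt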

Manifest
apiVersion: core.k8sgpt.ai/v1alpha1
kind: K8sGPT
metadata:
  name: k8sgpt-sample
  namespace: k8sgpt
spec:
  ai:
    enabled: true
    model: gpt-3.5-turbo
    backend: openai
    secret:
      name: opemai-secret
      key: openai-api-key
    # anonymized: false
    # language: english
  noCache: false
  repository: ghcr.io/k8sgpt-ai/k8sgpt
  version: v0.3.29
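
For completeness, a secret like the one referenced above can be created along these lines (the key value is redacted; the secret name is copied verbatim from the manifest):

kubectl create secret generic opemai-secret -n k8sgpt \
  --from-literal=openai-api-key=<OPENAI_API_KEY>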

Expected behaviour

The Spec.Details section should be populated with the details returned from the OpenAI backend.

Actual behaviour

The Spec.Details section is empty.
Output from kubectl describe result jarvis -n k8sgpt:

Name:         jarvis
Namespace:    k8sgpt
Labels:       k8sgpts.k8sgpt.ai/backend=openai
              k8sgpts.k8sgpt.ai/name=k8sgpt-sample
              k8sgpts.k8sgpt.ai/namespace=k8sgpt
Annotations:  <none>
API Version:  core.k8sgpt.ai/v1alpha1
Kind:         Result
Metadata:
  Creation Timestamp:  2024-04-03T12:49:47Z
  Generation:          1
  Resource Version:    3369563
  UID:                 5345ff82-5e7b-4874-9fbe-218e8c6e1762
Spec:
  Backend:  openai
  Details:
  Error:
    Sensitive:
      Masked:    KlkvdlVI
      Unmasked:  jarvis
    Text:  jarvis has condition of type EtcdIsVoter, reason MemberNotLearner: Node is a voting member of the etcd cluster
  Kind:           Node
  Name:           jarvis
  Parent Object:
Status:
  Lifecycle:  historical
Events:  <none>
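
The empty field can also be confirmed directly with jsonpath, which returns an empty string here:

kubectl get result jarvis -n k8sgpt -o jsonpath='{.spec.details}'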

Additional Information

Logs from the k8sgpt POD

{"level":"info","ts":1712149239.235297,"caller":"server/log.go:50","msg":"request completed","duration_ms":1009,"method":"/schema.v1.ServerService/Analyze","request":"backend:"openai" anonymize:true language:"english" max_concurrency:10 output:"json"","remote_addr":"...:59408"}
{"level":"info","ts":1712149275.4206443,"caller":"server/log.go:50","msg":"request completed","duration_ms":1007,"method":"/schema.v1.ServerService/Analyze","request":"backend:"openai" anonymize:true language:"english" max_concurrency:10 output:"json"","remote_addr":"
.**.
.:49692"}
{"level":"info","ts":1712149311.6114714,"caller":"server/log.go:50","msg":"request completed","duration_ms":1007,"method":"/schema.v1.ServerService/Analyze","request":"backend:"openai" anonymize:true language:"english" max_concurrency:10 output:"json"","remote_addr":"
..*.:59432"}


qdrddr commented Apr 3, 2024

Same issue here.

Checklist

  • I've searched for similar issues and couldn't find anything matching
  • I've included steps to reproduce the behavior

Affected Components

  • K8sGPT (CLI)
  • K8sGPT Operator

K8sGPT Version
v0.3.29

Kubernetes Version
v1.27.12+rke2r1

Host OS and its Version
Ubuntu 22.04.04

Local-AI Version:
v2.11.0-aio-cpu

Description

In my Result.example.yaml, the Spec.Details field does not show any details from my Local-AI. Also, in the localai pod's debug log I do not see any HTTP requests while running k8sgpt:v0.3.29 with localai:v2.11.0-aio-cpu and the ggml-gpt4all-j model; both localai and k8sgpt are in the same namespace.

Expected behavior

The Spec.Details section of the Result CR should be populated with the responses returned by the Local-AI backend.

Steps to reproduce

I deployed k8sgpt as a helm-chart operator and added this CR:

apiVersion: core.k8sgpt.ai/v1alpha1
kind: K8sGPT
metadata:
  name: k8sgpt-resource
  namespace: k8sgpt-operator-system
spec:
  ai:
    enabled: true
    backend: localai
    model: ggml-gpt4all-j_f5d8f27287d3
    baseUrl: http://local-ai.k8sgpt-operator-system.svc.cluster.local/v1
    anonymized: true
    language: english
  noCache: false
  version: v0.3.29
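
A quick way to confirm that the baseUrl is reachable from inside the cluster and that the model name is known to LocalAI is something like the following (the throwaway pod name and curl image are just examples):

kubectl run curl-debug -n k8sgpt-operator-system --rm -it --restart=Never \
  --image=curlimages/curl --command -- \
  curl -s http://local-ai.k8sgpt-operator-system.svc.cluster.local/v1/models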

Actual behavior

In the Result.example.yaml CR manifest, the Spec.Details field is empty.

Additional Information

However, when I curl LocalAI directly, it works just fine: in the localai pod's debug log I can see the request and a response from my localai model ggml-gpt4all-j_f5d8f27287d3:

curl http://local-ai.k8sgpt-operator-system.svc.cluster.local/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{ "model": "ggml-gpt4all-j_f5d8f27287d3", "messages": [{"role": "user", "content": "How are you doing?", "temperature": 0.1}] }'
{"created":1712108557,"object":"chat.completion","id":"c264e40f-98a4-4ff3-9acf-9bad12aebd08","model":"ggml-gpt4all-j_f5d8f27287d3","choices":[{"index":0,"finish_reason":"stop","message":{"role":"assistant","content":"As an AI language model, I am doing well. Thank you for asking!"}}],"usage":{"prompt_tokens":0,"completion_tokens":0,"total_tokens":0}}

My log files are attached:
kube-rbac-proxy.log
k8sgpt.log
k8sgpt-manager.log


qdrddr commented Apr 3, 2024

I just noticed this in the k8sgpt-operator-controller-manager pod logs, from the manager container:

failed to call Analyze RPC: rpc error: code = Unknown desc = failed while calling AI provider localai: error, status code: 520, message: invalid character '<' looking for beginning of value"}
Finished Reconciling k8sGPT with error: failed to call Analyze RPC: rpc error: code = Unavailable desc = error reading from server: EOF
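
That decode error ("invalid character '<' looking for beginning of value") means the body returned by the AI provider started with '<', i.e. it was HTML (an error page, consistent with the 520 status) rather than JSON. Re-running the earlier curl with -i, so the status line and headers are included, should show what the backend actually returns when this happens:

curl -s -i http://local-ai.k8sgpt-operator-system.svc.cluster.local/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{ "model": "ggml-gpt4all-j_f5d8f27287d3", "messages": [{"role": "user", "content": "How are you doing?"}] }'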


qdrddr commented Apr 4, 2024

I have deployed k8sgpt-operator v0.1.3 and now my Result manifests are no longer created.
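
To double-check, I look at whether any Result objects exist at all and what the controller is logging (the deployment name here is inferred from the pod name mentioned above):

kubectl get results -A
kubectl -n k8sgpt-operator-system logs deploy/k8sgpt-operator-controller-manager -c manager --tail=50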
