
Fix: Upstream URL for llama2 completion example #7242

Merged 2 commits into main from dascole-patch-1 on Apr 16, 2024
Conversation

dascole (Contributor) commented Apr 16, 2024

Description

The example provided in the curl command references the ollama endpoint, http://ollama-server.local:11434/v1/chat, which returns a 404.

Per the docs (referenced earlier on this page), the correct endpoint should be /api/chat, which I can confirm works.

https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-chat-completion

example:

$ curl localhost:11434/api/generate -d '{ "model": "llama2", "prompt":"Why does Kong make the best gateway?", "stream":false }' -i
HTTP/1.1 200 OK

$ curl localhost:11434/v1/generate -d '{ "model": "llama2", "prompt":"Why does Kong make the best gateway?", "stream":false }' -i
HTTP/1.1 404 Not Found
Content-Type: text/plain
Date: Tue, 16 Apr 2024 01:56:56 GMT

404 page not found
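The same contrast between the working /api/... path and the broken /v1/... path can be sketched in Python. This is a minimal illustration, not part of the PR: the base URL, model name, and helper function are assumptions, and only request construction is shown (no live ollama server is required).

```python
import json

# Assumption: ollama's default local address and port.
OLLAMA_BASE = "http://localhost:11434"

def build_chat_request(model: str, prompt: str, stream: bool = False):
    """Build the URL and JSON body for ollama's chat completion endpoint.

    Per the ollama API docs linked above, chat completions live under
    /api/chat; the /v1/chat path used in the old example returns a 404.
    """
    url = f"{OLLAMA_BASE}/api/chat"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    })
    return url, body

url, body = build_chat_request("llama2", "Why does Kong make the best gateway?")
print(url)  # http://localhost:11434/api/chat
```

The returned url and body could then be sent with any HTTP client (e.g. the `-d`/`-i` curl invocation shown above).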

Checklist

For example, if this change is for an upcoming 3.6 release, enclose your content in {% if_version gte:3.6.x %} <content> {% endif_version %} tags (or if_plugin_version tags for plugins).

Use any of the following keys:

  • gte:<version> - greater than or equal to a specific version
  • lte:<version> - less than or equal to a specific version
  • eq:<version> - exactly equal to a specific version

You can do the same for older versions.
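As a sketch of the version-gating convention described above (the wrapped sentence is illustrative, not from the PR):

```liquid
{% if_version gte:3.6.x %}
This note applies to Kong Gateway 3.6.x and later.
{% endif_version %}
```

The same pattern works with lte: or eq: keys to scope content to older or exact versions.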

dascole added the review:general label (Review for general accuracy and presentation. Does the doc work? Does it output correctly?) on Apr 16, 2024
dascole requested a review from a team as a code owner April 16, 2024 01:58

netlify bot commented Apr 16, 2024

Deploy Preview for kongdocs ready!

🔨 Latest commit: 3acaefb
🔍 Latest deploy log: https://app.netlify.com/sites/kongdocs/deploys/661ee19c4cf1760008653767
😎 Deploy Preview: https://deploy-preview-7242--kongdocs.netlify.app
Lighthouse
9 paths audited
Performance: 93 (no change from production)
Accessibility: 93 (no change from production)
Best Practices: 98 (🟢 up 8 from production)
SEO: 91 (no change from production)
PWA: -
View the detailed breakdown and full score reports


lena-larionova changed the title from "Update _llama2.md" to "Fix: Upstream URL for llama2 completion example" on Apr 16, 2024
lena-larionova (Contributor) left a comment


Applied the same fix to the curl-format example as well. LGTM, thanks!

lena-larionova merged commit 0434574 into main Apr 16, 2024
15 checks passed
lena-larionova deleted the dascole-patch-1 branch April 16, 2024 22:12