
Add llm-prompt-default-max-tokens, OpenAI token-limit fixes, parallel tool use fixes

@ahyatt ahyatt released this 07 Sep 21:11
e6bc34d

What's Changed

  • Fix breakage with OpenAI's llm-chat-token-limit by @ahyatt in #77
  • Fix parallel tool-call use for Vertex and OpenAI by @ahyatt in #78
  • Add variable llm-prompt-default-max-tokens by @ahyatt in #79
  • Fix how we look for ollama models in integration tests by @ahyatt in #80
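
The new llm-prompt-default-max-tokens variable can be set in your init file. A minimal sketch, assuming the variable acts as a fallback token budget for prompt filling when a provider's own limit is not used (the value 4096 is illustrative, not a recommendation from this release):

```elisp
;; Illustrative only: cap prompt filling at a default token budget.
(require 'llm-prompt)
(setq llm-prompt-default-max-tokens 4096)
```

See the package's documentation for the variable's exact semantics.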

Full Changelog: 0.17.3...0.17.4