
llm-prompt added, a way to define and fill prompts for LLMs

@ahyatt released this 13 Jul 21:52
  • Introduced llm-prompt for prompt management and for filling prompts from
    generators (see the sketch after this list).
  • Removed Gemini and Vertex token counting: llm-prompt counts tokens
    frequently, and a quick estimate is preferable to a slower, more
    expensive, but more accurate count.
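
A minimal sketch of the define-and-fill workflow, assuming the
`llm-defprompt` / `llm-prompt-fill` names and the `{{variable}}` template
syntax described in the llm-prompt documentation; treat the exact signatures
and the `my-provider` variable as illustrative assumptions, not the
definitive API:

```elisp
(require 'llm-prompt)

;; Define a named prompt template; {{text}} marks a variable to fill.
(llm-defprompt summarize-prompt
  "You are a concise technical writer.
Summarize the following text for the user:
{{text}}")

;; Fill the template's variables for a provider. The provider is
;; consulted for token counting so fills can be truncated to fit its
;; context window; this is why the cheap token estimate mentioned
;; above matters.
(llm-prompt-fill 'summarize-prompt my-provider
                 :text "llm-prompt manages reusable prompt templates.")
```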