Added `llm-prompt`, a way to define and fill prompts for LLMs.
- Introduced `llm-prompt` for prompt management and creation from generators.
- Removed Gemini and Vertex token counting: `llm-prompt` counts tokens often, and a quick estimate is preferable to a slower, more accurate count.
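The trade-off above can be sketched with a common heuristic: roughly four characters per token for English text. This is a hypothetical illustration, not the library's actual API; the function name and the 4-chars/token ratio are assumptions.

```python
# Hypothetical sketch of a cheap token estimate (not the library's API).
# Assumes ~4 characters per token, a common rule of thumb for English text.

def estimate_tokens(text: str) -> int:
    """Fast token estimate: about 4 chars per token, rounded up."""
    return max(1, (len(text) + 3) // 4)

prompt = "Summarize the following article in three bullet points."
print(estimate_tokens(prompt))
```

An estimate like this runs in constant time per string, whereas an exact count requires invoking the provider's tokenizer (or an API call), which is why a quick approximation suits code that counts tokens frequently.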