Updated 9 months ago

I have a suggestion.

There's an issue with adapting the prompts for various languages and models. Sometimes it goes smoothly, like moving from GPT-4 to Claude or vice versa.
Other times it struggles, like when using those prompts with multilingual models, small models, custom specialized models, and so on.

So how about a centralized hub for storing these prompt sets that can be set dynamically through Settings? The community could create and share entire prompt sets.
18 comments
That's definitely on the roadmap 🙂
The first step is finding all the prompts in the codebase and centralizing them in some sort of registry.
From there, customizing is straightforward.
Probably a P2 priority internally right now, though.
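As a rough illustration of that registry idea, here is a minimal sketch assuming hypothetical names (`PromptRegistry`, the prompt keys, and the sample templates are all made up for illustration, not real framework APIs):

```python
# A minimal sketch of a centralized prompt registry. Every built-in prompt is
# registered under a stable key, and users (or community prompt sets) override
# keys in one place instead of patching individual classes.

from dataclasses import dataclass, field
from typing import Dict


@dataclass
class PromptRegistry:
    """Maps stable prompt keys to template strings."""
    _prompts: Dict[str, str] = field(default_factory=dict)

    def register(self, key: str, template: str) -> None:
        self._prompts[key] = template

    def get(self, key: str) -> str:
        return self._prompts[key]

    def update(self, prompt_set: Dict[str, str]) -> None:
        """Apply a community-shared prompt set (e.g. loaded from JSON)."""
        self._prompts.update(prompt_set)


# Library defaults, registered once at import time.
registry = PromptRegistry()
registry.register(
    "summarize.tree",
    "Summarize the following context:\n{context}\nSummary:",
)

# A user-supplied prompt set, e.g. tuned for a small multilingual model.
multilingual_set = {
    "summarize.tree": "Fasse den folgenden Kontext zusammen:\n{context}\nZusammenfassung:",
}
registry.update(multilingual_set)

print(registry.get("summarize.tree").format(context="..."))
```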
Are you all tracking issues outside GitHub?
Other than the language one, I don't see a corresponding issue.
Yeah, we have an internal list of priorities/projects; the prompt template thing is one of them 👍

For full transparency, here's the currently planned work for the next month or two:
  • improved callbacks (better interface, more coverage)
  • better docs (focused on actual API documentation, rather than flooding with random notebooks)
  • converting underlying query engines/agents to use new query pipeline syntax (and improve the query pipeline while doing this)
  • refactor the current KG Index, introduce better abstractions for building/querying KGs
This is on top of normal bug fixes and other small things, of course.
Speaking of callbacks, I'm not sure if it's in your tracker, but some LLM APIs have their own built-in token counters. An abstraction could probably be added to reflect the actual values from them, rather than relying on guesswork from tiktoken.
OpenAI (and APIs with a similar interface) already pull token usage from the response 👍
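For reference, a small sketch contrasting the two approaches with the OpenAI Python SDK: a tiktoken estimate versus the `usage` block the API returns. It assumes `openai>=1.0`, `tiktoken`, a valid `OPENAI_API_KEY`, and network access; the model name is only an example.

```python
import tiktoken
from openai import OpenAI

client = OpenAI()
prompt = "Explain prompt registries in one sentence."

# 1) Estimate with tiktoken (guesswork: ignores chat formatting overhead).
enc = tiktoken.encoding_for_model("gpt-3.5-turbo")
estimated_prompt_tokens = len(enc.encode(prompt))

# 2) Read the authoritative counts the API returns in `usage`.
resp = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}],
)

print("estimated prompt tokens:", estimated_prompt_tokens)
print("actual prompt tokens:   ", resp.usage.prompt_tokens)
print("actual completion tokens:", resp.usage.completion_tokens)
```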
OK. Bedrock's responses are different for different providers.

Will dig into this later.
Classic Bedrock 😆 A single API definition? Nahhh
I wonder how Vertex does it.
Or any of the public APIs for other LLMs.
I think the logic is that the vendors get to define their APIs, so the REST responses you get through Anthropic's public API and through Bedrock (or from Cohere and other vendors) are exactly the same.
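To illustrate why this is awkward to support generically, here is a hedged sketch of extracting token usage from already-parsed Bedrock response bodies. The field names (`usage`, `inputTextTokenCount`, `tokenCount`, and the model ID prefixes) are assumptions based on each vendor's own format and should be verified against the current provider docs.

```python
from typing import Optional, Tuple


def extract_token_usage(model_id: str, body: dict) -> Optional[Tuple[int, int]]:
    """Return (input_tokens, output_tokens) from a parsed Bedrock response body."""
    if model_id.startswith("anthropic."):
        # Anthropic-style messages responses carry a `usage` block.
        usage = body.get("usage", {})
        return usage.get("input_tokens"), usage.get("output_tokens")
    if model_id.startswith("amazon.titan"):
        # Titan-style text responses report counts at the top level / per result.
        results = body.get("results", [{}])
        return body.get("inputTextTokenCount"), results[0].get("tokenCount")
    # Other providers (Cohere, Meta, Mistral, ...) each have their own shape;
    # fall back to None so the caller can estimate instead.
    return None


# Example with an Anthropic-shaped body, as it would look after
# json.loads(response["body"].read()) on a bedrock-runtime invoke_model call:
anthropic_body = {
    "content": [{"type": "text", "text": "hi"}],
    "usage": {"input_tokens": 12, "output_tokens": 3},
}
print(extract_token_usage("anthropic.claude-3-sonnet-20240229-v1:0", anthropic_body))
```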
Ah I see, that makes sense. But it's also extremely hard to support nicely in a framework haha
yes

I wonder if we can define some abstraction layer and pass it to the provider-specific API class.
For example, for the Anthropic provider, use the Anthropic class rather than redefining everything in the Bedrock class.
Yeah, I was thinking the same. It's basically just a different auth layer on top of Anthropic, for example.
The class could probably be refactored to handle this 🤔
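A minimal sketch of that delegation idea, using hypothetical names (`AnthropicMessageCodec`, `BedrockAnthropicLLM`, `FakeTransport`): the Anthropic request/response formatting lives in one class, and the Bedrock variant only swaps in a differently-authenticated transport.

```python
from dataclasses import dataclass
from typing import Protocol


class Transport(Protocol):
    """Anything that can send a provider payload and return a parsed body."""
    def send(self, payload: dict) -> dict: ...


@dataclass
class AnthropicMessageCodec:
    """Owns the Anthropic request/response format, regardless of transport."""
    model: str

    def build_request(self, prompt: str) -> dict:
        return {
            "model": self.model,
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 512,
        }

    def parse_response(self, body: dict) -> str:
        return body["content"][0]["text"]


@dataclass
class BedrockAnthropicLLM:
    """Bedrock variant: same codec, different (AWS-authenticated) transport."""
    codec: AnthropicMessageCodec
    transport: Transport

    def complete(self, prompt: str) -> str:
        request = self.codec.build_request(prompt)
        raw = self.transport.send(request)  # e.g. an invoke_model call under the hood
        return self.codec.parse_response(raw)


class FakeTransport:
    """Stand-in for a boto3 bedrock-runtime call, for illustration only."""
    def send(self, payload: dict) -> dict:
        text = payload["messages"][0]["content"]
        return {"content": [{"type": "text", "text": f"echo: {text}"}]}


llm = BedrockAnthropicLLM(AnthropicMessageCodec(model="claude-3-sonnet"), FakeTransport())
print(llm.complete("hello"))
```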