Do issues in GitHub occasionally get buried with no interaction?
No, it may get delayed a bit, but they definitely get checked.
I mean, usually if it gets buried, it's because OP never responded
Was thinking of improving the ReAct agent, but this is a dependency.
Yea that is a bigger job πŸ˜… You are welcome to work on it. Technically some of the prompts are already "pre-filled", like the core RAG prompts. Although this really only applies to non-chat models I think?

Not every LLM supports stop tokens either

This general setup could be introduced as a new agent maybe?
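For context, a minimal sketch of that kind of pre-fill-plus-stop-token setup. The function and parameter names here are made up for illustration and are not the library's actual API:

```python
from typing import Callable

# Hypothetical sketch only: illustrates the "pre-filled prompt + stop token"
# style of ReAct step for completion-style (non-chat) models.
def react_step(
    complete: Callable[..., str],  # any completion-style LLM call
    history: str,
) -> str:
    # Pre-fill the start of the next block so the model continues in the
    # expected "Thought: ... Action: ..." format.
    prompt = history + "\nThought:"
    # Stop before the model fabricates the tool output itself; this only
    # works if the underlying API supports stop sequences.
    return complete(prompt, stop=["\nObservation:"])
```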
What's the standard keyword for stop_tokens? Some LLMs have a different name for it.
Or is it something not yet defined?
Different name, or some APIs just straight up don't expose that as an option.

You could manually expose it, but that would only work for streaming
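A rough sketch of that manual, streaming-only workaround: truncate the stream client-side once the stop string appears. All names are illustrative, not from any real SDK:

```python
from typing import Iterable, Iterator

def stream_with_stop(chunks: Iterable[str], stop: str) -> Iterator[str]:
    """Yield streamed text chunks, cutting off once `stop` is seen.

    Only possible when you control the stream client-side; a non-streaming
    API call can't be cut short this way.
    """
    buffer = ""
    for chunk in chunks:
        buffer += chunk
        idx = buffer.find(stop)
        if idx != -1:
            # Emit everything before the stop sequence, then end the stream.
            yield buffer[:idx]
            return
        # Hold back a small tail in case the stop string is split across
        # chunk boundaries; emit only the safe prefix.
        safe = len(buffer) - (len(stop) - 1)
        if safe > 0:
            yield buffer[:safe]
            buffer = buffer[safe:]
    if buffer:
        yield buffer
```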
I mean, having that as a standard in BaseLLM would be easier to deal with. Like max tokens, which has various names but is still standardized across Bedrock.
With the default being None.
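A hedged sketch of what that standardization could look like: one `stop` argument, defaulting to None, mapped onto whatever kwarg each backend expects. The provider kwarg names below are assumptions for illustration, not verified against any SDK:

```python
from typing import Optional, Sequence

# Illustrative mapping from provider name to that provider's stop kwarg.
# These names are assumptions for the sketch, not confirmed API parameters.
PROVIDER_STOP_KWARG = {
    "openai": "stop",
    "anthropic": "stop_sequences",
}

def build_kwargs(provider: str, stop: Optional[Sequence[str]] = None) -> dict:
    """Translate a standardized `stop` argument into provider-specific kwargs."""
    kwargs: dict = {}
    if stop is not None:
        key = PROVIDER_STOP_KWARG.get(provider)
        if key is None:
            # Some APIs simply don't expose stop sequences at all.
            raise ValueError(f"{provider} does not support stop sequences")
        kwargs[key] = list(stop)
    return kwargs
```

The point is only that callers pass `stop=` once and the base class handles the per-provider naming, the same way max tokens is already normalized.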
Changing these base classes is always a ton of work, so that's my main reason for not jumping on this 😛 Hard to prioritize it in favor of other things personally, but as always, community PRs are welcome.