
Hi, I was running into the same issue. One problem I've found is that if I set the max tokens manually, I might get an error from sending too many tokens.
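
A minimal sketch of the failure mode being described, assuming the gpt_index-era Python API (the `data` path and token count are illustrative, not from the thread): raising `max_tokens` on the LLM alone doesn't tell the index to pack smaller prompts, so prompt plus completion can exceed the model's context window.

```python
from langchain.llms import OpenAI
from gpt_index import GPTSimpleVectorIndex, LLMPredictor, SimpleDirectoryReader

# Raising max_tokens reserves more room for the completion, but the
# index still packs prompts against its default budget, so
# prompt + completion can overflow the model's context window.
llm_predictor = LLMPredictor(llm=OpenAI(max_tokens=2048))

documents = SimpleDirectoryReader("data").load_data()
index = GPTSimpleVectorIndex(documents, llm_predictor=llm_predictor)

# This call can fail with a "maximum context length" / too-many-tokens
# error, because the prompt was sized without knowing about max_tokens.
response = index.query("What does the documentation say about tokens?")
```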
@statsman could you post the code you're using? Depending on which LLM you're using, you may also need to create a custom PromptHelper.
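
A custom PromptHelper might look roughly like this, as a sketch against the gpt_index-era constructor; the exact numbers depend on your model's context window and are assumptions here, not values from the thread.

```python
from langchain.llms import OpenAI
from gpt_index import (
    GPTSimpleVectorIndex,
    LLMPredictor,
    PromptHelper,
    SimpleDirectoryReader,
)

max_input_size = 4096   # model context window (illustrative)
num_output = 1024       # tokens reserved for the completion
max_chunk_overlap = 20  # overlap between text chunks

# Give the prompt packer and the LLM the same output budget, so
# prompts are trimmed to leave room for the completion.
prompt_helper = PromptHelper(max_input_size, num_output, max_chunk_overlap)
llm_predictor = LLMPredictor(
    llm=OpenAI(temperature=0, model_name="text-davinci-003", max_tokens=num_output)
)

documents = SimpleDirectoryReader("data").load_data()
index = GPTSimpleVectorIndex(
    documents, llm_predictor=llm_predictor, prompt_helper=prompt_helper
)
```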
Got it, I'm going to read more into PromptHelper. I actually downloaded your documentation into a GPT_Index and have been querying it. It's fantastic!
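
Indexing a downloaded documentation folder and querying it, as described here, might look like the following under the same gpt_index-era API (`docs` is a hypothetical path):

```python
from gpt_index import GPTSimpleVectorIndex, SimpleDirectoryReader

# Load the downloaded documentation and build a vector index over it.
documents = SimpleDirectoryReader("docs").load_data()
index = GPTSimpleVectorIndex(documents)

print(index.query("How do I create a custom PromptHelper?"))
```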