Updated 2 years ago
statsman · 2 years ago
Hi, I was running into the same issue. One problem I've found is that if I set the max tokens manually, I might get an error by sending too many tokens.
3 comments
jerryjliu0 · 2 years ago
@statsman could you post the code you're using? Depending on which LLM you're using, you may also need to create a custom PromptHelper.
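For context, the "too many tokens" error above typically happens when the prompt plus the requested completion exceeds the model's context window, which is the kind of budgeting PromptHelper handles. Below is a minimal pure-Python sketch of that budgeting logic; the function name and parameters are hypothetical illustrations, not the actual gpt_index API (whose PromptHelper constructor has changed across versions), so check the docs for your installed release.

```python
# Illustrative sketch only: how a PromptHelper-style token budget avoids
# "too many tokens" errors. All names here are hypothetical, not gpt_index API.

def available_completion_tokens(max_input_size: int,
                                prompt_tokens: int,
                                num_output: int) -> int:
    """Return how many completion tokens can safely be requested.

    max_input_size: the model's total context window in tokens.
    prompt_tokens:  tokens already consumed by the prompt.
    num_output:     the completion length you would like to request.

    Raises ValueError when the prompt alone fills the context window,
    which mirrors the failure mode of setting max tokens too high by hand.
    """
    budget = max_input_size - prompt_tokens
    if budget <= 0:
        raise ValueError(
            f"prompt uses {prompt_tokens} tokens but the model only "
            f"accepts {max_input_size}"
        )
    # Never request more output tokens than the remaining budget allows.
    return min(num_output, budget)


# Example: a 4096-token window with a 3900-token prompt leaves room
# for only 196 completion tokens, even if 256 were requested.
print(available_completion_tokens(4096, 3900, 256))  # → 196
```

The point is that the safe max-tokens value depends on the prompt size, so a hand-set constant will work for short prompts and fail for long ones.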
statsman · 2 years ago
Got it, I'm going to read more into PromptHelper. I actually downloaded your documentation into a GPT_Index and have been querying it. It's fantastic!
jerryjliu0 · 2 years ago
nice!