
Updated 5 months ago

Langchain

At a glance
Hi guys, I'm having an issue with LangChain since it updates every day. My QnA model needs to stay pinned to langchain version 0.0.113, and now it is throwing this error:

InvalidRequestError: This model's maximum context length is 8191 tokens, however you requested 8724 tokens (8724 in your prompt; 0 for the completion). Please reduce your prompt; or completion length

Can anyone tell me how to avoid creating more tokens than permitted?
I'm not an expert with langchain (this is a llama index discord 😅), but one of your texts is probably too long?
Yes, I am scraping a whole website and all the hyperlinks connected to it, which makes a large text file. That goes through a text splitter, embedding, and then doc search similarity.
Is there any way I can limit token creation to 8000?
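One way to keep chunks under the model's window is to split the scraped text into bounded pieces before embedding. A minimal sketch, not LangChain's API: it counts whitespace-separated words as a rough proxy for tokens (the 6000-word budget and the words-to-tokens ratio are assumptions; for exact counts you would run a real tokenizer such as tiktoken over the text instead).

```python
def split_by_words(text: str, max_words: int = 6000) -> list[str]:
    """Split text into chunks of at most max_words words each.

    Words are a rough stand-in for tokens: one English word is often
    ~1.3 tokens, so ~6000 words usually stays under an 8191-token
    window. Both numbers are assumptions, not exact limits.
    """
    words = text.split()
    return [
        " ".join(words[i:i + max_words])
        for i in range(0, len(words), max_words)
    ]
```

Each chunk can then be embedded separately, so no single piece of text exceeds the context window on its own.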
With langchain, I think you have to preprocess your text to limit it. Again, definitely not a langchain expert :p If you use llama-index, it should handle it all for you.
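Independent of how the text is split, the request that actually hits the token limit is the assembled prompt: the question plus the most similar retrieved chunks. So another place to enforce the budget is while building that prompt. A hypothetical sketch (the function name, the word budget, and using words as a token proxy are all assumptions, not part of any library API):

```python
def build_prompt(question: str, ranked_chunks: list[str],
                 max_words: int = 5000) -> str:
    """Append retrieved chunks, most similar first, until the budget is hit.

    Stops adding context once the next chunk would push the total word
    count (question included) past max_words, so the final prompt stays
    under the model's context window.
    """
    kept = []
    used = len(question.split())  # the question always counts
    for chunk in ranked_chunks:
        n = len(chunk.split())
        if used + n > max_words:
            break  # drop the remaining, less similar chunks
        kept.append(chunk)
        used += n
    return "\n\n".join(kept + [question])
```

Dropping the least similar chunks first is a simple policy; a fancier version could summarize overflow chunks instead of discarding them.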