Hi guys, I'm having an issue with LangChain since it updates every day. My QnA model is throwing this error on langchain version 0.0.113 (I need to stay pinned to this version):
InvalidRequestError: This model's maximum context length is 8191 tokens, however you requested 8724 tokens (8724 in your prompt; 0 for the completion). Please reduce your prompt; or completion length
Can anyone tell me how to avoid sending more tokens than permitted?
Yes, I am scraping a whole website, including all the hyperlinks connected to it, which produces a large text file. That text goes through a text splitter, then embedding, then a doc search similarity step.
With langchain, I think you have to preprocess your text to limit it. Again, definitely not a langchain expert :p If you use llama-index, it should handle that for you.
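Rough sketch of what that preprocessing could look like: chunk the scraped text into pieces small enough to embed, with some overlap so context isn't cut mid-thought. Real OpenAI token counts come from the tiktoken library (langchain also has token-based splitters for this); here whitespace words stand in as a stand-alone approximation, and all names and numbers are illustrative, not from the original posts.

```python
# Hypothetical sketch: split a long scraped document into overlapping chunks
# so no single chunk exceeds the embedding model's limit. Word counts are a
# crude stand-in for real token counts (use tiktoken for those).

def split_into_chunks(text, max_units=500, overlap=50):
    """Split text into chunks of at most max_units words, overlapping by `overlap`."""
    words = text.split()
    chunks = []
    step = max_units - overlap
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + max_units]))
        if start + max_units >= len(words):
            break
    return chunks

doc = "word " * 1200  # pretend this is the scraped site text
chunks = split_into_chunks(doc, max_units=500, overlap=50)
print(len(chunks))  # 3
```

Each chunk then gets embedded separately, and the similarity search pulls back only the most relevant chunks instead of the whole file, so the prompt stays under the 8191-token ceiling.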