
I see the warning "Token indices sequence length is longer than the specified maximum sequence length for this model"

@kapa.ai I see the warning
"""
Token indices sequence length is longer than the specified maximum sequence length for this model (1770 > 1024). Running this sequence through the model will result in indexing errors
"""
What is the fix?
4 comments
Please stand by @Rouzbeh as I search the knowledge base. We are experiencing especially high traffic right now and might take longer than usual. :wave:
----------------------
Hmm, I don't know enough to give you a confident answer yet. However, you can refer to the LlamaIndex documentation for more information on customizing LLMs and handling token limits: Custom LLMs.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
This is fine; it's a benign warning (it only happens on Python 3.8).
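The message is a log-level warning from the tokenizer's length check, not an exception from the model call. If you want to hide it anyway, a minimal sketch using Python's standard logging module, assuming the warning is emitted by the Hugging Face transformers tokenizer logger (the module-based logger name below is an assumption and may differ across transformers versions):

```python
import logging

# The "Token indices sequence length is longer than ..." message is
# emitted at WARNING level. Raising the logger's threshold to ERROR
# filters it out without changing any tokenizer behavior.
# Assumption: the logger is named after the transformers module that
# emits the warning; adjust the name if your version differs.
logging.getLogger("transformers.tokenization_utils_base").setLevel(logging.ERROR)
```

This only suppresses the message; if you actually feed the over-length sequence into a model (rather than just counting tokens), the indexing error it warns about can still occur.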
I see - thanks