I see the warning """Token indices sequence length is longer than the specified maximum sequence length for this model"""
At a glance
A community member encountered a warning that the token indices sequence length was longer than the model's specified maximum sequence length, which could result in indexing errors. Another community member responded that this is a benign warning that only occurs on Python 3.8 and can be safely ignored. The community members also discussed referring to the LlamaIndex documentation for more information on customizing LLMs and handling token limits, but no explicitly marked answer was provided.
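For context, this warning comes from a Hugging Face tokenizer whose model_max_length (1024 here, matching GPT-2) is shorter than the tokenized input. If you are tokenizing text directly with transformers, a minimal sketch of silencing it by truncating to the model's limit (the model name and sample text are assumptions for illustration):

```python
from transformers import AutoTokenizer

# "gpt2" is an assumed model choice: its 1024-token limit matches the warning above.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

long_text = "some document text " * 500  # long enough to exceed 1024 tokens

# Without truncation, encoding this emits the "Token indices sequence length
# is longer than the specified maximum sequence length" warning.
encoded = tokenizer(
    long_text,
    truncation=True,                        # drop tokens past the limit
    max_length=tokenizer.model_max_length,  # 1024 for gpt2
)
print(len(encoded["input_ids"]))  # <= 1024
```

If the tokenizer is only being used for token counting (as LlamaIndex does on Python 3.8, per the thread), the warning is benign and this truncation step is unnecessary.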
@kapa.ai I see the warning """ Token indices sequence length is longer than the specified maximum sequence length for this model (1770 > 1024). Running this sequence through the model will result in indexing errors """ What is the fix?
Please stand by @Rouzbeh as I search the knowledge base. We are experiencing especially high traffic right now and might take longer than usual. :wave:
----------------------
Hmm, I don't know enough to give you a confident answer yet. However, you can refer to the LlamaIndex documentation for more information on customizing LLMs and handling token limits: Custom LLMs.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
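On the LlamaIndex side, the usual fix when the warning reflects a real limit is to tell the framework how large the model's window actually is, so prompts and chunks are sized to fit. A minimal sketch, assuming llama_index >= 0.10 where the global Settings object exposes these knobs; the values are assumptions matching the 1024-token model from the warning, and the exact API for your version is covered on the Custom LLMs page:

```python
from llama_index.core import Settings

# Assumed values for a model with a 1024-token context window,
# as in the warning above (1770 > 1024).
Settings.context_window = 1024  # total window the model supports
Settings.num_output = 256       # tokens reserved for the model's reply
Settings.chunk_size = 512       # keep retrieved chunks well under the window
```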