@kapa.ai I see the warning """ Token indices sequence length is longer than the specified maximum sequence length for this model (1770 > 1024). Running this sequence through the model will result in indexing errors """ What is the fix?
Please stand by @Rouzbeh as I search the knowledge base. We are experiencing especially high traffic right now and might take longer than usual. :wave:
----------------------
Hmm, I don't know enough to give you a confident answer yet. However, you can refer to the LlamaIndex documentation for more information on customizing LLMs and handling token limits: Custom LLMs.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
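For anyone hitting the same warning: it is emitted by a Hugging Face `transformers` tokenizer when the encoded input (1770 tokens here) exceeds the model's maximum context length (1024, typical of GPT-2-family models). Common remedies are tokenizing with `truncation=True` and `max_length=...`, switching to a model with a larger context window, or splitting the token sequence into overlapping windows before feeding it to the model. Below is a minimal sketch of the windowing approach; the `chunk_tokens` helper and the 128-token overlap are illustrative choices, not part of the LlamaIndex or `transformers` API:

```python
def chunk_tokens(token_ids, max_len=1024, overlap=128):
    """Split a long token-id sequence into windows the model can accept.

    Consecutive windows share `overlap` tokens so context is not lost
    at the boundaries.
    """
    step = max_len - overlap
    return [token_ids[i:i + max_len] for i in range(0, len(token_ids), step)]

# Stand-in for the 1770-token sequence from the warning message.
ids = list(range(1770))
chunks = chunk_tokens(ids)

# Every window now fits inside the model's 1024-token limit.
print([len(c) for c in chunks])
```

Each chunk can then be run through the model separately and the results merged. With a real tokenizer, the simpler fix is often `tokenizer(text, truncation=True, max_length=1024)`, at the cost of silently dropping the trailing tokens.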