I wonder if my llama_index `as_chat_engine` is stuffing too much chat history into the prompt and that's what's causing this? Any good ways of managing this? Are there settings I can adjust to condense the chat history, or will I need to build a custom layer to summarize it myself?
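For context, this is roughly the kind of thing I'm hoping already exists (a rough sketch, assuming `as_chat_engine` accepts a `memory` argument and that `condense_question` mode behaves the way I think it does; `index` is my existing index):

```python
from llama_index.memory import ChatMemoryBuffer

# Cap the history kept in the prompt at a fixed token budget
memory = ChatMemoryBuffer.from_defaults(token_limit=1500)

chat_engine = index.as_chat_engine(
    chat_mode="condense_question",  # rewrites history + new message into one standalone question
    memory=memory,
)
response = chat_engine.chat("Follow-up question here")
```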
Well, I'm passing it into the `system_prompt` setting of the `as_chat_engine` call. I also put it in the `ServiceContext`, but the chatbot wasn't picking that one up for some reason.
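Here's roughly what I've got right now (simplified from my actual code; `SYSTEM_PROMPT` and the `./data` path are stand-ins, and I'm assuming `ServiceContext.from_defaults` still takes a `system_prompt` kwarg in my version):

```python
from llama_index import ServiceContext, SimpleDirectoryReader, VectorStoreIndex

SYSTEM_PROMPT = "You are a helpful assistant for my docs."

documents = SimpleDirectoryReader("./data").load_data()

# I set the prompt in both places; only the as_chat_engine one seems to take effect
service_context = ServiceContext.from_defaults(system_prompt=SYSTEM_PROMPT)
index = VectorStoreIndex.from_documents(documents, service_context=service_context)

chat_engine = index.as_chat_engine(system_prompt=SYSTEM_PROMPT)
```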