Is anyone having problems setting up Groq as their LLM in LlamaIndex? At the top I'm using `Settings.llm = new Groq()`, but it's still defaulting to OpenAI and using it for both embeddings and queries.
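In LlamaIndex.TS, `Settings.llm` only controls the LLM used for chat/queries; the embedding model is a separate setting (`Settings.embedModel`) and still falls back to OpenAI unless you override it too. A minimal configuration sketch, with the caveat that import paths vary by llamaindex version (newer releases split providers into packages like `@llamaindex/groq`; older ones export everything from `llamaindex`), and the HuggingFace model name below is just an assumed example:

```typescript
import { Settings } from "llamaindex";
// Provider packages in newer releases; exported from "llamaindex" in older ones.
import { Groq } from "@llamaindex/groq";
import { HuggingFaceEmbedding } from "@llamaindex/huggingface";

// LLM used for chat / query synthesis:
Settings.llm = new Groq({ apiKey: process.env.GROQ_API_KEY });

// Groq doesn't serve embeddings, so unless you set an embedding
// model explicitly, OpenAI stays the default for embeddings:
Settings.embedModel = new HuggingFaceEmbedding({
  modelType: "BAAI/bge-small-en-v1.5", // assumed model; any local/HF model works
});
```

Setting both before building the index should stop the OpenAI fallback for embeddings.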
Hey guys, I'm new to LlamaIndex. I'm having trouble saving chat history between user sessions. If a user closes or disconnects, execution on the backend ends, and so does the ContextChatEngine's message history, so next time there is no chat history. Has anyone overcome this by persisting chat history in an index or something?