Updated 8 months ago

Urgent help needed !

Hey guys, I am new to LlamaIndex. I am having trouble saving chat history between user sessions. If a user closes or disconnects, the execution on the backend has to end, and so does the ContextChatEngine message history; the next session starts with no chat history. Has anyone overcome this issue by persisting chat history in an index or something?
4 comments
you can get the chat history with chat_engine.chat_history

Then you can pass in the chat history with chat_engine.chat("Hello!", chat_history=chat_history)

chat_history is just a list of objects you can serialize.

Or, you can use a chat store
https://docs.llamaindex.ai/en/stable/module_guides/storing/chat_stores/?h=chat+store
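The persist-and-restore pattern described above can be sketched in plain Python. This is a minimal sketch, not the library's own persistence layer: it assumes the chat history can be serialized as a list of role/content dicts and stored as JSON between requests (the file name and helper functions here are hypothetical; the commented-out `chat_engine.chat(...)` call is where the real LlamaIndex engine would go).

```python
import json
from pathlib import Path

# Hypothetical on-disk location for the serialized history.
HISTORY_PATH = Path("chat_history.json")

def save_history(messages):
    """Persist chat history (a list of {"role", "content"} dicts) to disk."""
    HISTORY_PATH.write_text(json.dumps(messages))

def load_history():
    """Load prior history, or start fresh if none has been saved yet."""
    if HISTORY_PATH.exists():
        return json.loads(HISTORY_PATH.read_text())
    return []

# On each incoming request: load prior turns, run the chat, save again.
history = load_history()
history.append({"role": "user", "content": "Hello!"})
# response = chat_engine.chat("Hello!", chat_history=history)  # real engine call
save_history(history)
```

A chat store (linked above) does essentially this for you, keyed per user, so you don't have to manage the file yourself.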
Logan, thanks for replying. I am actually using TypeScript and I can't see any ChatMemoryBuffer in there.
@Logan M I am able to get the chat history, but I think there is currently no support for chat memory in the TypeScript version of LlamaIndex.