Hi, the chat example uses ConversationBufferMemory as the langchain memory. Has anyone used a vector store to save the conversation history long term?
You can actually use LlamaIndex modules as memory components too!
Would be curious for you to try it out and let me know what you think.
I am testing now and will let you know my findings.
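The idea being discussed can be sketched without either library: store each conversation turn as a vector and retrieve the most relevant past turns at query time. This is an illustrative, dependency-free toy (the bag-of-words "embedding" and the `VectorMemory` class are stand-ins invented for this example, not the actual LangChain or LlamaIndex API):

```python
# Toy sketch of vector-store-backed conversation memory.
# In a real project you would use a proper embedding model and a real
# vector store; here a bag-of-words Counter stands in for an embedding.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Stand-in "embedding": word-count vector.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class VectorMemory:
    """Stores every turn; retrieves the most similar past turns for a query."""
    def __init__(self):
        self.turns = []  # list of (text, vector) pairs

    def add(self, text: str) -> None:
        self.turns.append((text, embed(text)))

    def relevant(self, query: str, k: int = 2) -> list:
        qv = embed(query)
        ranked = sorted(self.turns, key=lambda t: cosine(qv, t[1]), reverse=True)
        return [text for text, _ in ranked[:k]]

mem = VectorMemory()
mem.add("user: my favorite color is green")
mem.add("user: I live in Berlin")
mem.add("user: I like hiking on weekends")
print(mem.relevant("what color does the user like", k=1))
```

Unlike a buffer memory, nothing is ever truncated: old turns stay in the store and only the most relevant ones are pulled back into the prompt.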
why do all examples use temperature=0?
To make the outputs deterministic by default 🙂 at temperature=0 the model always picks the most likely token, so it's easier to reproduce issues.