Updated 2 months ago
Hi the chat example uses
Tiago Freitas
2 years ago
Hi, the chat example uses ConversationBufferMemory as the langchain memory. Has anyone used a vector store to save the conversation history long term?
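[Editor's note: a minimal, self-contained sketch of the idea in the question — persisting chat turns in a vector store and retrieving only the most relevant ones, instead of replaying the whole buffer. The `embed`, `cosine`, and `VectorStoreMemory` names below are illustrative stand-ins, not the actual LangChain or LlamaIndex API; a real setup would use a proper embedding model and vector database.]

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; a real setup would call an embedding model.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class VectorStoreMemory:
    """Stores every chat turn with a vector; retrieval returns the most
    similar past turns rather than the full conversation buffer."""

    def __init__(self):
        self.store = []  # list of (vector, text) pairs

    def save_turn(self, text):
        self.store.append((embed(text), text))

    def load_relevant(self, query, k=2):
        qv = embed(query)
        ranked = sorted(self.store, key=lambda item: cosine(item[0], qv),
                        reverse=True)
        return [text for _, text in ranked[:k]]

memory = VectorStoreMemory()
memory.save_turn("user: my favourite colour is blue")
memory.save_turn("user: I live in Lisbon")
memory.save_turn("user: the weather is nice today")
print(memory.load_relevant("which colour is my favourite", k=1))
# -> ['user: my favourite colour is blue']
```

The contrast with a buffer memory: the buffer grows without bound and is replayed verbatim into the prompt, while a vector store keeps history indefinitely and injects only the top-k relevant turns per query.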
7 comments
jerryjliu0
2 years ago
You can actually use LlamaIndex modules as memory components too!
jerryjliu0
2 years ago
Would be curious for you to try it out + let me know what you think
jerryjliu0
2 years ago
https://gpt-index.readthedocs.io/en/latest/how_to/using_with_langchain.html#llama-demo-notebook-tool-memory-module
Tiago Freitas
2 years ago
I am testing now and will let you know my findings
Tiago Freitas
2 years ago
why do all examples use temperature=0?
jerryjliu0
2 years ago
to make the outputs a bit less random by default
jerryjliu0
2 years ago
easier to reproduce issues
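[Editor's note: a small self-contained sketch of why temperature=0 makes outputs reproducible. The `sample` function below is an illustrative toy, not any library's API: logits are softmax-sampled at a given temperature, and the temperature=0 case collapses to a deterministic argmax.]

```python
import math
import random

def sample(logits, temperature):
    """Sample a token index from logits at the given temperature.
    temperature == 0 is treated as greedy argmax (deterministic)."""
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    # Softmax over temperature-scaled logits (max-subtracted for stability).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Inverse-CDF sampling from the resulting distribution.
    r = random.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(logits) - 1

logits = [2.0, 1.0, 0.5]
# At temperature=0 every call picks the highest-logit token,
# so repeated runs are identical -- which is what makes issues reproducible.
print({sample(logits, 0) for _ in range(100)})
# -> {0}
```

At higher temperatures the same call draws from the full softmax distribution, so repeated runs can differ, which is exactly what makes bugs harder to reproduce.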