
Updated 5 months ago

Memory

At a glance

The community member wants to use LlamaIndex to give langchain's conversational memory effectively unlimited capacity, but is having difficulty integrating it into their existing LLMChain agent. Another community member shares an example of using LlamaIndex as a memory module in langchain, and the original poster expresses gratitude, believing it will be integral to building personalized conversational models. A third community member had the same idea and is glad the original poster has found a pathway to the implementation.

Useful resources
I became aware of LlamaIndex because of the possibility of giving langchain's conversational memory effectively infinite capacity. I'm having a hard time using it in my agent, which was originally an LLMChain. Does anyone have a few more examples of langchain + llamaindex?
3 comments
There's an example of using llama index as a memory module at the bottom of this notebook:
https://github.com/jerryjliu/llama_index/blob/main/examples/langchain_demo/LangchainDemo.ipynb

You can use any type of index/graph for it too πŸ‘
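For readers unfamiliar with the pattern, the core idea behind the linked notebook (index-backed chat memory instead of a fixed-size buffer) can be sketched in plain Python. Everything below is a hypothetical illustration of the concept only: the class name `IndexBackedMemory` and the word-overlap retrieval are stand-ins, not LlamaIndex's or langchain's actual API; see the notebook for the real wrapper.

```python
class IndexBackedMemory:
    """Toy chat memory: every turn is stored in a growing 'index', and on each
    query the most relevant past turns are retrieved, so the usable memory
    grows without bound instead of fitting in a fixed context window."""

    def __init__(self, top_k=2):
        self.turns = []   # list of (speaker, text) pairs acting as the index
        self.top_k = top_k

    def save_context(self, user_input, ai_output):
        # Persist both sides of the exchange.
        self.turns.append(("human", user_input))
        self.turns.append(("ai", ai_output))

    def load_memory_variables(self, query):
        # Score each stored turn by naive word overlap with the query and
        # return the top_k matches as context for the next LLM call.
        # (A real index would use embeddings, not word overlap.)
        q = set(query.lower().split())
        scored = sorted(
            self.turns,
            key=lambda turn: len(q & set(turn[1].lower().split())),
            reverse=True,
        )
        return [text for _speaker, text in scored[: self.top_k]]


memory = IndexBackedMemory(top_k=1)
memory.save_context("my name is Bob", "Nice to meet you, Bob!")
memory.save_context("i like basketball", "Cool hobby!")
print(memory.load_memory_variables("what is my name"))  # ['my name is Bob']
```

The retrieval step is what lets the "memory" scale: only the turns relevant to the current query are injected into the prompt, regardless of how long the conversation history grows.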
@Logan M thank you! I did use this specifically as the inspiration for my existing code. I was trying to convert my existing langchain conversational models; I'll share more details later. I'm glad you've made this possible, as I believe it will be integral for me to create personalized conversational models.
I had the same idea, but you already have a pathway to the exact implementation.