The community member is interested in using LlamaIndex to give langchain conversational memory effectively unlimited capacity, but is having difficulty integrating it into their existing LLMChain agent. Another community member provides an example of using LlamaIndex as a memory module in langchain, and the original poster expresses gratitude, saying this will be integral to creating personalized conversational models. A third community member had the same idea and is glad the original poster has found a path to the implementation.
I became aware of LlamaIndex because of the possibility of giving langchain conversational memory effectively infinite capacity. I'm having a hard time using it in my agent, which was originally an LLMChain. Does anyone have a few more examples of langchain + llamaindex?
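The core idea being asked about is index-backed conversational memory: instead of stuffing the whole chat history into the prompt, every turn is stored in an index and only the turns relevant to the current query are retrieved, so stored history can grow without bound. Below is a minimal pure-Python sketch of that idea. It deliberately does not use the real LlamaIndex or LangChain APIs; the class and method names (`IndexBackedMemory`, `save_context`, `load_relevant`) are hypothetical, and keyword overlap stands in for the vector-similarity search a real index would perform.

```python
# Conceptual sketch only -- plain Python, no real LlamaIndex/LangChain calls.
# Illustrates index-backed chat memory: store every turn, retrieve only
# the turns relevant to the current query, so the prompt stays small
# while the stored history can grow without bound.

class IndexBackedMemory:
    """Toy stand-in for an index-backed memory module (hypothetical names)."""

    def __init__(self):
        self.turns = []  # full history, unbounded

    def save_context(self, user_msg, ai_msg):
        # Persist the full exchange; a real index would also embed it.
        self.turns.append((user_msg, ai_msg))

    def load_relevant(self, query, top_k=2):
        # Naive keyword overlap in place of vector-similarity search.
        q = set(query.lower().split())
        scored = sorted(
            self.turns,
            key=lambda t: len(q & set((t[0] + " " + t[1]).lower().split())),
            reverse=True,
        )
        return scored[:top_k]


memory = IndexBackedMemory()
memory.save_context("My name is Ada.", "Nice to meet you, Ada!")
memory.save_context("I like hiking.", "Hiking is great exercise.")
relevant = memory.load_relevant("What is my name?")
print(relevant[0][0])  # the turn mentioning the name scores highest
```

In a real integration the same shape applies: the memory object writes each exchange into a LlamaIndex index and, on each agent step, queries that index to build the context passed to the LLMChain, rather than replaying the entire transcript.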
@Logan M thank you! I did use this specifically as the inspiration for my existing code. I was trying to convert my existing langchain conversational models. I will give you more details later! I'm glad you've made this possible, as I believe it will be integral to creating personalized conversational models.