Updated 2 years ago

Hi all, new to LlamaIndex here. I'm trying …

Hi all, new to LlamaIndex here. I'm trying to figure out how to add 'short memory', i.e. adding the query and response text of the conversation into the next prompt. Is that possible? I know I'd hit the max token limit quite fast, but it would be useful anyway.
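(A minimal sketch of the idea described above: keep recent question/answer pairs and prepend them to the next query by hand. The `SimpleDirectoryReader` / `VectorStoreIndex` / `as_query_engine` names assume a recent LlamaIndex release; older gpt-index-era versions used different class names.)

```python
# Hand-rolled "short memory": prepend the last few Q/A turns to the next query.
# Assumes a recent LlamaIndex (llama_index.core); "data" is a placeholder folder.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("data").load_data()
query_engine = VectorStoreIndex.from_documents(documents).as_query_engine()

history = []  # list of (question, answer) tuples

def ask(question, max_turns=3):
    # Inline the most recent turns so the model sees the recent conversation.
    context = "\n".join(f"User: {q}\nAssistant: {a}" for q, a in history[-max_turns:])
    prompt = f"{context}\nUser: {question}" if context else question
    answer = str(query_engine.query(prompt))
    history.append((question, answer))
    return answer
```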
You'll want to integrate with LangChain to have memory. LlamaIndex on its own is more of a search engine/tool that a LangChain agent might use.
Yup, something like that!
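(A rough sketch of that pattern: LlamaIndex exposed as a tool inside a LangChain agent that carries conversation memory. The LangChain imports and the "conversational-react-description" agent string reflect the 0.0.x releases from around the time of this thread; newer LangChain versions have moved or deprecated these APIs.)

```python
# Sketch only: LlamaIndex as a LangChain tool, with the agent holding chat history.
# LangChain names match the 0.0.x-era API; the LlamaIndex side uses the newer
# llama_index.core imports, so adjust to whichever versions you actually have.
from langchain.agents import Tool, initialize_agent
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

index = VectorStoreIndex.from_documents(SimpleDirectoryReader("data").load_data())
query_engine = index.as_query_engine()

tools = [
    Tool(
        name="LlamaIndex",
        func=lambda q: str(query_engine.query(q)),
        description="Answers questions about the indexed documents.",
    )
]

# ConversationBufferMemory stores the running dialogue, and the agent injects it
# into each new prompt -- the "short memory" asked about above.
memory = ConversationBufferMemory(memory_key="chat_history")
agent = initialize_agent(
    tools,
    OpenAI(temperature=0),
    agent="conversational-react-description",
    memory=memory,
    verbose=True,
)

print(agent.run("What does the document say about pricing?"))
print(agent.run("Summarize that in one sentence."))  # follow-up relies on memory
```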

There is also a tutorial that uses some extra wrappers; you might prefer that method as well:
https://gpt-index.readthedocs.io/en/latest/guides/tutorials/building_a_chatbot.html
Ok, thanks. I'll keep trying to grasp what all that means. 😛