
Updated 11 months ago

@Logan M question - are there any memory modules available that can be injected into bots, something like a rolling-window memory? If there is one, can you direct me to the notebook?
10 comments
All our agents/chat engines use ChatMemoryBuffer by default, which is essentially a rolling window

You could also configure this manually

Plain Text
from llama_index.memory import ChatMemoryBuffer

memory = ChatMemoryBuffer.from_defaults(token_limit=3900)

chat_engine = index.as_chat_engine(
    chat_mode="condense_plus_context",
    memory=memory,
    verbose=False,
)
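For intuition, the rolling-window behavior can be sketched without llama_index at all: keep appending messages, and once a token budget is exceeded, drop the oldest ones. This is an illustration only -- the token counting here is a naive word count, whereas the real ChatMemoryBuffer uses the model's tokenizer:

```python
# Minimal sketch of a rolling-window chat buffer (illustration only).

class RollingWindowBuffer:
    def __init__(self, token_limit=3900):
        self.token_limit = token_limit
        self.messages = []  # list of (role, content) tuples

    def _tokens(self, text):
        # Naive stand-in for a real tokenizer: count whitespace-split words.
        return len(text.split())

    def put(self, role, content):
        self.messages.append((role, content))
        # Drop oldest messages until the window fits the budget again.
        while sum(self._tokens(c) for _, c in self.messages) > self.token_limit:
            self.messages.pop(0)

    def get(self):
        return list(self.messages)


buf = RollingWindowBuffer(token_limit=5)
buf.put("user", "one two three")
buf.put("assistant", "four five six")  # total is now 6 words > 5, oldest dropped
print(buf.get())  # [('assistant', 'four five six')]
```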
yeah -- I am trying to see how I can extract it and save it externally
is that possible
Since it's a pydantic object, yes πŸ‘

You should be able to either pickle it, or turn it into json and save/load it
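Both round-trips look roughly like the sketch below. The message dicts here are a plain-data stand-in for the pydantic ChatMessage objects the buffer actually holds; the exact serialization helpers ChatMemoryBuffer exposes depend on your llama_index version:

```python
import json
import pickle

# Stand-in for the chat history held inside the memory object.
history = [
    {"role": "user", "content": "hello"},
    {"role": "assistant", "content": "hi there"},
]

# Option 1: pickle -- fast, Python-only, preserves object types.
blob = pickle.dumps(history)
restored_pickle = pickle.loads(blob)

# Option 2: JSON -- portable and human-readable; write the string
# to disk (or a database) and load it back later.
text = json.dumps(history)
restored_json = json.loads(text)

assert restored_pickle == restored_json == history
```

JSON is usually the safer choice for long-term storage, since pickled blobs can break across library versions.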
Is it possible, instead of using a chat engine, to use a retriever and then pass it the memory as a parameter when I call query? I hope I have explained myself. thanks
Not really? Like you could do that, but it would be a little manual. Anything is possible with a bit of elbow grease
Plain Text
retriever = index.as_retriever(similarity_top_k=2)
nodes = retriever.retrieve("query")

from llama_index import get_response_synthesizer
response_synthesizer = get_response_synthesizer()
response = response_synthesizer.synthesize("query", nodes)
I am currently using a RetrieverQueryEngine with a custom prompt (no memory right now) -- where would you suggest I start/look?
If you don't want to use a chat engine, you can use the above to either inject the chat history into the query, or modify the prompts to include chat history.
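The "inject the chat history into the query" option can be as simple as prepending prior turns to the query string before handing it to the (stateless) query engine. The "role: content" formatting below is just one reasonable convention, not a llama_index API:

```python
# Sketch: fold prior turns into the query string so a stateless
# query engine still sees the conversation context.

def inject_history(chat_history, query):
    # chat_history: (role, content) pairs, e.g. built from memory.get()
    lines = [f"{role}: {content}" for role, content in chat_history]
    lines.append(f"user: {query}")
    return "\n".join(lines)


history = [
    ("user", "What is a vector index?"),
    ("assistant", "A structure for similarity search."),
]
print(inject_history(history, "How do I build one?"))
```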

You can use the memory manually as well

Plain Text
from llama_index.memory import ChatMemoryBuffer
from llama_index.llms import ChatMessage

memory = ChatMemoryBuffer.from_defaults(token_limit=1500)
memory.put(ChatMessage(role="user", content="hello"))
chat_history = memory.get()
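To combine that manual buffer with a custom prompt, one option is to render the stored messages into a history block and format it into the prompt template alongside the retrieved context. The template text and helper below are hypothetical, for illustration:

```python
# Hypothetical prompt template with a slot for rendered chat history.
TEMPLATE = (
    "Previous conversation:\n{history}\n\n"
    "Context:\n{context}\n\n"
    "Question: {question}\n"
)

def render_history(messages):
    # messages: (role, content) pairs, e.g. adapted from memory.get()
    return "\n".join(f"{role}: {content}" for role, content in messages)

prompt = TEMPLATE.format(
    history=render_history([("user", "hello"), ("assistant", "hi!")]),
    context="(retrieved node text would go here)",
    question="What did I just say?",
)
print(prompt)
```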
thanks, let me try. I'll try to understand how to inject the history first. thanks!