Find answers from the community

markns
Joined October 24, 2024
Hey, having experimented with langgraph previously, I was hoping/expecting to see something like thread-level memory in LlamaIndex. However, I notice in the create-llama-app example that the entire conversation history is being passed from the frontend to the backend on each interaction. Is that the expected paradigm for chat engines in LlamaIndex?
I did see these docs, but they're not very comprehensive tbh, and they don't cover, for example, how to create a separate memory per user/conversation.
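The per-user/per-conversation part of the question can be sketched framework-free. This is a minimal, illustrative pattern (the `ConversationStore` class and its method names are made up for this sketch, not a LlamaIndex API): the backend keys one history per `(user_id, conversation_id)`, so the frontend only needs to send the newest message plus an identifier instead of replaying the whole transcript. In LlamaIndex itself the analogous server-side object would be a memory instance (e.g. a `ChatMemoryBuffer`) created per conversation and handed to the chat engine.

```python
from collections import defaultdict

# Illustrative sketch: server-side, thread-level memory keyed by
# (user_id, conversation_id). Names here are hypothetical, not a
# LlamaIndex API; the point is only the keying pattern.

class ConversationStore:
    """Keeps one message history per (user_id, conversation_id)."""

    def __init__(self):
        self._histories = defaultdict(list)

    def append(self, user_id, conversation_id, role, content):
        # Each thread accumulates its own messages independently.
        self._histories[(user_id, conversation_id)].append(
            {"role": role, "content": content}
        )

    def history(self, user_id, conversation_id):
        # Return a copy so callers can't mutate the stored history.
        return list(self._histories[(user_id, conversation_id)])


store = ConversationStore()
store.append("alice", "thread-1", "user", "Hello")
store.append("alice", "thread-1", "assistant", "Hi!")
store.append("alice", "thread-2", "user", "New topic")

# Each thread sees only its own history, so the frontend only has to
# send the latest message and the conversation_id on each turn.
print(len(store.history("alice", "thread-1")))  # → 2
print(len(store.history("alice", "thread-2")))  # → 1
```

With this in place, passing the full conversation from the frontend on every interaction becomes an implementation choice of the create-llama example, not a requirement of the chat-engine paradigm.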
3 comments
Hey, sorry if this is a real newbie question, but is there a good example somewhere of how to combine a chat_engine with a workflow? Does it make sense to wrap a workflow as a tool that can be used in a ReAct agent?
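The "workflow as a tool" idea can be sketched generically. Below, the `Tool` dataclass and `research_workflow` function are stand-ins invented for this sketch (in LlamaIndex the real pieces would be something like `FunctionTool` and an agent constructed from a list of tools): the workflow's async `run`-style entry point is exposed as a plain coroutine, and that coroutine is registered as a named, described tool the agent can choose to invoke during its reasoning loop.

```python
import asyncio
from dataclasses import dataclass
from typing import Awaitable, Callable

# Sketch of wrapping a multi-step workflow as an agent tool.
# All names here are illustrative, not a real LlamaIndex API.

async def research_workflow(query: str) -> str:
    """Stand-in for a workflow's async run(query=...) entry point.

    A real workflow would fan out to retrieval, synthesis, etc.;
    here we just return a deterministic string.
    """
    return f"workflow result for: {query}"

@dataclass
class Tool:
    """Minimal tool wrapper: a name, a description the agent can
    read when deciding what to call, and the async function."""
    name: str
    description: str
    afn: Callable[[str], Awaitable[str]]

    async def acall(self, query: str) -> str:
        return await self.afn(query)

research_tool = Tool(
    name="research",
    description="Runs the multi-step research workflow on a query.",
    afn=research_workflow,
)

result = asyncio.run(research_tool.acall("thread-level memory"))
print(result)  # → workflow result for: thread-level memory
```

The design choice this illustrates: the agent never needs to know the workflow's internals, only the tool's name and description, so any workflow with a callable entry point can be plugged into a ReAct-style loop the same way.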
2 comments