Hi, maybe this is a stupid question, but I'm new to LlamaIndex and AI in general.

I want to use a QueryPipeline in a chat with context (chat_history). Is that possible? I only see examples that call .run()
3 comments
I feel like it's possible; I've been meaning to make an example for this.
It might require some custom components to wrap/handle the memory.
😃 Thanks! I'll keep trying!
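One possible shape for those custom components, as a rough sketch rather than an official recipe: keep a ChatMemoryBuffer outside the pipeline, use an FnComponent to fold the stored chat history and the retrieved context into a single prompt, and write both turns back into memory after each .run() call. The import paths assume llama_index 0.10+ packaging; the OpenAI model, the "data" directory, and the token limit are illustrative placeholders.

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.llms import ChatMessage
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.core.query_pipeline import FnComponent, InputComponent, QueryPipeline
from llama_index.llms.openai import OpenAI

llm = OpenAI(model="gpt-3.5-turbo")
index = VectorStoreIndex.from_documents(SimpleDirectoryReader("data").load_data())
retriever = index.as_retriever(similarity_top_k=3)

# Memory lives outside the pipeline; the prompt component only reads from it.
memory = ChatMemoryBuffer.from_defaults(token_limit=3000)

def build_prompt(query_str, nodes):
    """Fold prior chat turns and retrieved context into one prompt string."""
    history = "\n".join(f"{m.role.value}: {m.content}" for m in memory.get())
    context = "\n\n".join(n.get_content() for n in nodes)
    return (
        f"Chat history:\n{history}\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query_str}\nAnswer:"
    )

pipeline = QueryPipeline(verbose=True)
pipeline.add_modules(
    {
        "input": InputComponent(),
        "retriever": retriever,
        "prompt": FnComponent(fn=build_prompt),
        "llm": llm,
    }
)
pipeline.add_link("input", "retriever")
pipeline.add_link("input", "prompt", dest_key="query_str")
pipeline.add_link("retriever", "prompt", dest_key="nodes")
pipeline.add_link("prompt", "llm")

def chat(user_msg: str) -> str:
    # One chat turn: run the pipeline, then store both messages for next time.
    response = pipeline.run(input=user_msg)
    memory.put(ChatMessage(role="user", content=user_msg))
    memory.put(ChatMessage(role="assistant", content=str(response)))
    return str(response)

print(chat("What does the report say about revenue?"))
print(chat("How does that compare to last year?"))  # second turn sees chat_history
```

The key design choice here is that the memory is handled in the chat() wrapper around the pipeline rather than inside it; a fancier version could wrap the put/get calls in their own custom components.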