
Updated last year


Hey, I'm trying to figure out how to use RouterQueryEngine with chat history. Is it even possible?
12 comments
Mmm, not by default. You'd either have to stuff the prompt with the previous messages, write a custom selector that uses the previous messages, or use a data agent setup with the router engine as a tool.
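The first option above, manually "stuffing" prior messages into the query string before sending it to the router engine, can be sketched in plain Python. The helper name and message format here are illustrative assumptions, not part of the llama_index API:

```python
# Sketch of the prompt-stuffing approach: flatten prior chat turns into
# the query text so the router engine sees the conversation context.
def stuff_history(history: list[tuple[str, str]], question: str) -> str:
    """Prepend prior (role, message) turns to the new question."""
    lines = [f"{role}: {msg}" for role, msg in history]
    lines.append(f"user: {question}")
    lines.append("Given the conversation above, answer the last user message.")
    return "\n".join(lines)

history = [
    ("user", "What docs cover billing?"),
    ("assistant", "The billing guide covers invoices and refunds."),
]
query = stuff_history(history, "And how do refunds work?")
# query can now be passed to an existing router engine, e.g.
# response = router_engine.query(query)
```

This keeps the router engine stateless; the caller is responsible for accumulating and truncating the history so the prompt stays within the model's context window.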
Hey, thanks. I guess stuffing the history into the prompt would be easiest, but what do you think would be better?
Probably best to use a chat engine / data agent that uses the query engine.

You could use an OpenAIAgent with the existing router engine

Or if you switch to a retriever router, a context chat engine might also work πŸ‘€
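The OpenAIAgent route suggested above might look like the following sketch. It assumes a pre-0.10 llama_index import layout, an existing `router_engine` (the RouterQueryEngine from the question), and an `OPENAI_API_KEY` in the environment; the tool name and description are placeholders:

```python
from llama_index.agent import OpenAIAgent
from llama_index.tools import QueryEngineTool, ToolMetadata

# Wrap the existing router engine as a tool the agent can call.
# `router_engine` is assumed to be a RouterQueryEngine built elsewhere.
router_tool = QueryEngineTool(
    query_engine=router_engine,
    metadata=ToolMetadata(
        name="router",
        description="Answers questions over the indexed documents.",
    ),
)

# The agent maintains the chat history itself and decides when to
# invoke the router tool, so the query engine stays history-free.
agent = OpenAIAgent.from_tools([router_tool], verbose=True)
response = agent.chat("What did we discuss earlier about billing?")
```

The design point: chat memory lives in the agent, so the router engine never needs to know about previous turns.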
Great, thanks a lot for the advice πŸ™‚
Hey Logan, I'm trying to understand where and how I can change the llama_index_starter_pack react demo to use Azure instead of OpenAI. Can you help me with that?
Thanks! Is there a more updated fullstack demo by any chance? I'm really struggling to make it fit the current llamaindex version.
ah yea, it's a little old πŸ˜…

What part was tripping you up?
haha yeah.. basically what I want to achieve is exactly the server of the demo, only updated and using Azure πŸ™‚
Hmm, following that guide above should help, no? Set up the LLM and embed model, set the global service context, and off you go πŸ™
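Those steps, under the legacy (pre-0.10) `ServiceContext` API, come out roughly like this sketch. The deployment names, endpoint, API key, and API version are placeholders to replace with your own Azure OpenAI resource values:

```python
from llama_index import ServiceContext, set_global_service_context
from llama_index.llms import AzureOpenAI
from llama_index.embeddings import AzureOpenAIEmbedding

# Azure-hosted chat model; "engine" is your Azure deployment name
# (placeholder values throughout).
llm = AzureOpenAI(
    engine="my-gpt-35-deployment",
    model="gpt-35-turbo",
    api_key="<azure-api-key>",
    azure_endpoint="https://<resource>.openai.azure.com/",
    api_version="2023-07-01-preview",
)

# Azure-hosted embedding model, also referenced by deployment name.
embed_model = AzureOpenAIEmbedding(
    model="text-embedding-ada-002",
    deployment_name="my-embedding-deployment",
    api_key="<azure-api-key>",
    azure_endpoint="https://<resource>.openai.azure.com/",
    api_version="2023-07-01-preview",
)

# Any index or query engine built after this point uses Azure for both
# LLM calls and embeddings, with no other code changes in the demo.
service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)
set_global_service_context(service_context)
```

Setting the global service context is what lets the rest of the starter-pack server stay untouched: everything downstream picks up the Azure models by default.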
Yeah, I'm going to try it this week. I'll definitely bug you about it πŸ˜†
You should come visit Israel, maybe do a webinar πŸ˜‰