Is there a way for LlamaIndex, instead of giving a "no context" response, to invoke a function such as a web downloader, read the response, and add it as context to attempt to resolve an actual answer?
5 comments
I think if it was an agent, it could do that. Or some kind of custom query engine + router/sub question query engine

(cc @emrgnt_cmplxty -- this is exactly the type of use-case a sciphi retriever might solve)
Do you think it's possible if I build an agent with a query engine as a tool, along with a web retriever?
I don't know how I would detect when the engine doesn't give context
I think it would have to either decide ahead of time based on tool descriptions
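To illustrate "decide ahead of time based on tool descriptions": in a real agent the LLM reads the tool descriptions and picks one, but a toy word-overlap scorer shows the shape of that decision. All names below (`pick_tool`, the tool names and descriptions) are made up for the sketch, not LlamaIndex APIs.

```python
def pick_tool(query: str, tools: dict) -> str:
    """Return the name of the tool whose description best overlaps the query.

    Stand-in for the LLM's tool-selection step: score each tool by how many
    words its description shares with the query, and pick the highest.
    """
    query_words = set(query.lower().split())

    def score(item):
        _name, description = item
        return len(query_words & set(description.lower().split()))

    return max(tools.items(), key=score)[0]


# Hypothetical tools an agent might choose between.
tools = {
    "index_query_engine": "answer questions from the locally indexed documents",
    "web_retriever": "download web pages to gather fresh context for a question",
}

print(pick_tool("download fresh web context about the latest release", tools))
```

The point is that the choice happens before any engine runs, so a weak or empty index answer never gets a second chance; the custom-query-engine approach below avoids that by checking the answer afterwards.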

You could also implement a custom query engine that first calls a query engine, then calls the LLM to decide whether a "no context" response was given, and if so, calls another service such as a web retriever tool
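A minimal sketch of that fallback pattern, using plain-Python stand-ins rather than real LlamaIndex classes (an actual implementation would subclass LlamaIndex's `CustomQueryEngine`). The names `FallbackQueryEngine` and `looks_like_no_context` are hypothetical, and the marker-matching heuristic is a cheap stand-in for the "ask the LLM to judge" step.

```python
NO_CONTEXT_MARKERS = ("no context", "i don't know", "cannot answer")


def looks_like_no_context(response: str) -> bool:
    """Heuristic stand-in for asking the LLM whether the answer is empty."""
    text = response.lower()
    return any(marker in text for marker in NO_CONTEXT_MARKERS)


class FallbackQueryEngine:
    """Try the primary engine; fall back to a second engine on a no-context answer."""

    def __init__(self, primary, fallback):
        self.primary = primary      # e.g. an index-backed query engine
        self.fallback = fallback    # e.g. a web-retriever-backed engine

    def query(self, question: str) -> str:
        answer = self.primary(question)
        if looks_like_no_context(answer):
            # Primary engine couldn't answer: gather fresh context instead.
            return self.fallback(question)
        return answer


# Usage with toy callables standing in for real query engines:
engine = FallbackQueryEngine(
    primary=lambda q: "Sorry, no context was found for that.",
    fallback=lambda q: "Answer built from freshly downloaded web pages.",
)
print(engine.query("What changed in the latest release?"))
```

Replacing `looks_like_no_context` with a real LLM call ("does this answer actually address the question?") makes the check much more robust than string matching.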