The community members are discussing the possibility of enhancing LlamaIndex to avoid "no context" responses by invoking a web downloader function to retrieve additional information and provide a more comprehensive answer. The comments suggest that this could be achieved with a custom query engine, an agent, or a combination of tools like a web retriever. However, there is no explicitly marked answer in the provided information.
Is there a way for LlamaIndex, instead of giving a "no context" response, to invoke a function like a web downloader, read the response, and add it as context to attempt to resolve an actual answer?
I think it would have to decide ahead of time, based on the tool descriptions, which tool to call.
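The "decide ahead of time" approach could be sketched as a tool router: each tool carries a description, and something picks a tool by matching the query against those descriptions before any retrieval happens. The sketch below is hypothetical and framework-free; the word-overlap scorer is a crude stand-in for the LLM call an agent would actually make (LlamaIndex's router/agent abstractions do this with an LLM, not keyword overlap).

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Tool:
    """Hypothetical tool record: a name, a description the router reads,
    and a callable that actually runs the tool."""
    name: str
    description: str
    run: Callable[[str], str]

def choose_tool(query: str, tools: List[Tool]) -> Tool:
    """Naive stand-in for an LLM's tool choice: pick the tool whose
    description shares the most words with the query."""
    query_words = set(query.lower().split())
    def overlap(tool: Tool) -> int:
        return len(query_words & set(tool.description.lower().split()))
    return max(tools, key=overlap)
```

In a real agent the scoring would be a prompt listing each tool's name and description, with the LLM returning the tool to invoke; the routing structure is the same.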
You could also implement a custom query engine that first calls a base query engine, then calls the LLM to decide whether a "no context" response was given, and if so, calls another service like a web retriever tool.
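The query-then-fallback idea above could be sketched roughly as follows. This is a framework-free outline, not LlamaIndex code: `primary_query`, `web_retrieve`, and `synthesize` are hypothetical callables you would wire up to your query engine, web retriever tool, and LLM, and the marker-based check is a cheap placeholder for the "call the LLM to decide" step.

```python
from typing import Callable

# Placeholder heuristic; the suggestion in the thread is to have the LLM
# itself judge whether the answer amounts to "no context".
NO_CONTEXT_MARKERS = ("no context", "i don't know", "cannot answer")

def is_no_context(response: str) -> bool:
    """Return True if the response looks like a no-context answer."""
    text = response.lower()
    return any(marker in text for marker in NO_CONTEXT_MARKERS)

def query_with_web_fallback(
    question: str,
    primary_query: Callable[[str], str],   # your base query engine
    web_retrieve: Callable[[str], str],    # e.g. a web downloader/retriever tool
    synthesize: Callable[[str, str], str], # LLM call: answer from new context
) -> str:
    """Try the local index first; on a no-context answer, fetch web
    context and re-synthesize an answer from it."""
    answer = primary_query(question)
    if not is_no_context(answer):
        return answer
    web_context = web_retrieve(question)
    return synthesize(question, web_context)
```

In LlamaIndex terms this is the shape of a custom query engine: the same check-and-fallback logic would live inside its query method, with the web retriever exposed as just another tool.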