The community member wants to use a search engine API such as SerpAPI or Serper to retrieve search results and use them as context for answering questions. They would also like to crawl the links returned by the API with headless Chrome and feed that content into the context window. They ask whether anything close to this is available out of the box in LlamaIndex, and note that latency is not an issue because they can batch their requests.
In the comments, another community member suggests either using an agent with a SerpAPI tool or creating a custom query engine that uses SerpAPI under the hood; a sketch of the latter approach follows the original question below.
Hopefully this is a quick question, but I could not find much info in the docs right away. I want to use a search engine API like SerpAPI or the Serper API and use those results as context for answering questions. Ideally, it would be good if the links could be crawled through headless Chrome as well and pumped into the context window. Is there anything close to this out of the box within LlamaIndex? Latency is not an issue as I can batch my requests.
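A minimal sketch of the custom-query-engine suggestion from the comments, not an out-of-the-box LlamaIndex feature. It assumes a `SERPAPI_API_KEY` environment variable, the `google-search-results` package for SerpAPI, the `playwright` package with Chromium installed for headless crawling, and LlamaIndex's default OpenAI settings for embeddings and the LLM. The class `SerpSearchQueryEngine` and the helper functions are illustrative names, not part of the library.

```python
import os

from playwright.sync_api import sync_playwright
from serpapi import GoogleSearch

from llama_index.core import Document, VectorStoreIndex
from llama_index.core.query_engine import CustomQueryEngine


def serp_search(query: str, num_results: int = 5) -> list[dict]:
    """Fetch organic results from SerpAPI for the given query."""
    search = GoogleSearch({
        "q": query,
        "num": num_results,
        "api_key": os.environ["SERPAPI_API_KEY"],
    })
    return search.get_dict().get("organic_results", [])[:num_results]


def crawl_with_headless_chrome(urls: list[str]) -> list[Document]:
    """Render each URL in headless Chromium and keep the visible page text."""
    docs = []
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        for url in urls:
            page.goto(url, wait_until="domcontentloaded")
            docs.append(Document(text=page.inner_text("body"), metadata={"url": url}))
        browser.close()
    return docs


class SerpSearchQueryEngine(CustomQueryEngine):
    """Illustrative query engine: search -> crawl -> index -> answer."""

    def custom_query(self, query_str: str) -> str:
        results = serp_search(query_str)
        urls = [r["link"] for r in results if "link" in r]
        docs = crawl_with_headless_chrome(urls)
        # Index the crawled pages and answer the question over that context.
        index = VectorStoreIndex.from_documents(docs)
        response = index.as_query_engine().query(query_str)
        return str(response)


# Usage (batch over a list of questions if latency is not a concern):
# engine = SerpSearchQueryEngine()
# print(engine.query("What did the latest LlamaIndex release add?"))
```

Swapping the Playwright helper for one of LlamaIndex's web page readers, or registering a search function like `serp_search` as a tool on an agent, would cover the other suggestion from the comments.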