Hello, I'm following this notebook:
https://github.com/run-llama/llamacloud-demo/blob/main/examples/advanced_rag/corrective_rag_workflow.ipynb

I want to run it fully locally. What is the local equivalent of Tavily?
```python
# If any document is found irrelevant, transform the query string for better search results.
if "no" in relevancy_results:
    prompt = DEFAULT_TRANSFORM_QUERY_TEMPLATE.format(query_str=query_str)
    result = self.llm.complete(prompt)
    transformed_query_str = result.text
    # Conduct a search with the transformed query string and collect the results.
    search_results = self.tavily_tool.search(
        transformed_query_str, max_results=5
    )
    search_text = "\n".join([result.text for result in search_results])
else:
    search_text = ""
```
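For context, the only thing the workflow needs from `tavily_tool` is a `.search(query, max_results=...)` call returning objects with a `.text` attribute, so any local backend (e.g. a self-hosted SearxNG instance, or the free `duckduckgo-search` package) can be wrapped to match that shape. Below is a minimal, hedged sketch of that adapter pattern; `make_search_tool`, `SearchResult`, and `stub_backend` are hypothetical names I'm introducing for illustration, and the stub should be replaced with a real local search call:

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class SearchResult:
    # Mirrors the one attribute the notebook reads from each Tavily result.
    text: str


def make_search_tool(backend: Callable[[str, int], List[str]]):
    """Wrap a plain (query, max_results) -> [str] backend so it exposes the
    same .search() interface the notebook expects from tavily_tool."""

    class Tool:
        def search(self, query: str, max_results: int = 5) -> List[SearchResult]:
            return [SearchResult(text=t) for t in backend(query, max_results)]

    return Tool()


# Hypothetical stub backend for illustration only; swap in a call to your
# local search service (SearxNG HTTP API, duckduckgo-search, etc.).
def stub_backend(query: str, max_results: int) -> List[str]:
    return [f"result {i} for {query!r}" for i in range(max_results)]


tool = make_search_tool(stub_backend)
results = tool.search("corrective RAG", max_results=2)
search_text = "\n".join(r.text for r in results)
```

With a wrapper like this, the notebook's `self.tavily_tool` can be replaced by the local tool without touching the rest of the workflow code.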