----------------------
LlamaIndex allows you to use any data loader, including those from LangChain, the LlamaIndex core repository, or LlamaHub, as an "on-demand" data query Tool within a LangChain agent. When invoked, the Tool loads data using the data loader, indexes it, and queries it, returning the response in an ad-hoc manner.
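The on-demand pattern above can be sketched as follows. This is a minimal example based on the OnDemandLoaderTool tutorial, using the older (pre-0.10) llama_index import paths to match the snippet below; the WikipediaReader is just an illustrative loader, and running it requires the llama-index and wikipedia packages plus an LLM API key.

```python
# Sketch: wrap a data loader as an "on-demand" tool. Each call loads
# data, indexes it, and queries it in one step.
from llama_index.readers import WikipediaReader
from llama_index.tools.ondemand_loader_tool import OnDemandLoaderTool

tool = OnDemandLoaderTool.from_defaults(
    WikipediaReader(),
    name="Wikipedia Tool",
    description="A tool for loading and querying articles from Wikipedia",
)

# Convert to a LangChain-compatible tool so an agent can call it.
lc_tool = tool.to_langchain_tool()
```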
Here is an example of how you can wrap a LlamaIndex query engine as a Tool for a LangChain agent:
```python
from llama_index.langchain_helpers.agents import (
    IndexToolConfig,
    LlamaIndexTool,
)

# `query_engine` is a LlamaIndex query engine built beforehand,
# e.g. via index.as_query_engine()
tool_config = IndexToolConfig(
    query_engine=query_engine,
    name="Vector Index",
    description="useful for when you want to answer queries about X",
    tool_kwargs={"return_direct": True},
)
tool = LlamaIndexTool.from_tool_config(tool_config)
```
In this example, a Tool is created directly from a QueryEngine. The resulting tool can then be used within a LangChain agent.
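As a rough sketch, the tool could be handed to an agent like this. The agent type, memory setup, and model choice here are illustrative assumptions, not requirements, and running this needs the langchain package and an OpenAI API key.

```python
# Sketch: use the LlamaIndex-backed tool inside a LangChain agent.
from langchain.agents import initialize_agent
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory

llm = ChatOpenAI(temperature=0)
memory = ConversationBufferMemory(memory_key="chat_history")

# `tool` is the LlamaIndexTool created above; because the tool was
# configured with return_direct=True, its answer is returned as-is.
agent = initialize_agent(
    [tool],
    llm,
    agent="conversational-react-description",
    memory=memory,
)

response = agent.run("Answer a query about X")
```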
For more detailed instructions and examples, refer to the LangChain integration guide and the OnDemandLoaderTool Tutorial in the LlamaIndex documentation.
----------------------
Note: you can ask me a follow-up question by @-mentioning me again :speech_balloon:
----------------------