Updated last year

Is it possible to integrate Langchain code in LlamaIndex?

@kapa.ai Is it possible to integrate Langchain code in LlamaIndex?
Please stand by @Matthews as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
Yes, it is possible to integrate LangChain with LlamaIndex. LlamaIndex provides both Tool abstractions for a LangChain agent and a memory module.

For instance, you can use LlamaIndex as a callable tool with a LangChain agent. Here is an example of how to do this:

```python
from langchain.agents import Tool, initialize_agent
from langchain.chains.conversation.memory import ConversationBufferMemory
from langchain.chat_models import ChatOpenAI

from llama_index import VectorStoreIndex, SimpleDirectoryReader

# Build an index over the example essay data
documents = SimpleDirectoryReader("../paul_graham_essay/data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Expose the index's query engine as a LangChain Tool
tools = [
    Tool(
        name="LlamaIndex",
        func=lambda q: str(index.as_query_engine().query(q)),
        description=(
            "useful for when you want to answer questions about the author. "
            "The input to this tool should be a complete English sentence."
        ),
        return_direct=True,
    ),
]

memory = ConversationBufferMemory(memory_key="chat_history")
llm = ChatOpenAI(temperature=0)
agent_executor = initialize_agent(
    tools, llm, agent="conversational-react-description", memory=memory
)
```

(Source: LangchainDemo.ipynb)
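The `str(...)` wrapper in `func` matters because a LlamaIndex query engine returns a `Response` object rather than a plain string, while the agent expects its tool to return text. A minimal sketch of that wrapping pattern, using a stand-in query engine (both classes below are hypothetical stand-ins, not real LlamaIndex types):

```python
class FakeResponse:
    """Mimics a LlamaIndex Response: not a plain string, but str() yields the answer."""
    def __init__(self, text):
        self.text = text

    def __str__(self):
        return self.text


class FakeQueryEngine:
    """Stand-in for index.as_query_engine()."""
    def query(self, q):
        return FakeResponse(f"answer to: {q}")


engine = FakeQueryEngine()

# The Tool's func must return a string, hence the str(...) conversion:
tool_func = lambda q: str(engine.query(q))
print(tool_func("Who is the author?"))  # answer to: Who is the author?
```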

You can also use LlamaIndex as a memory module. Here is an example:

```python
from langchain.llms import OpenAIChat
from langchain.agents import initialize_agent

from llama_index import SummaryIndex
from llama_index.langchain_helpers.memory_wrapper import GPTIndexChatMemory

# Start with an empty index; the memory wrapper inserts conversation turns into it
index = SummaryIndex([])
memory = GPTIndexChatMemory(
    index=index,
    memory_key="chat_history",
    query_kwargs={"response_mode": "compact"},
    return_source=True,
    return_messages=True,
)
llm = OpenAIChat(temperature=0)
agent_executor = initialize_agent(
    [], llm, agent="conversational-react-description", memory=memory
)
```
(Source: LangchainDemo.ipynb)
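Under the hood, a LangChain memory object exposes `save_context` (to record a turn) and `load_memory_variables` (to supply history under `memory_key`); `GPTIndexChatMemory` implements these against the wrapped index. A rough sketch of that interface with a stand-in class (hypothetical, not the real wrapper):

```python
class SketchIndexMemory:
    """Stand-in for GPTIndexChatMemory: a list plays the role of the index."""

    def __init__(self, memory_key="chat_history"):
        self.memory_key = memory_key
        self._index = []  # stand-in for the LlamaIndex SummaryIndex

    def save_context(self, inputs, outputs):
        # Each conversation turn is inserted into the "index"
        self._index.append(f"Human: {inputs['input']}")
        self._index.append(f"AI: {outputs['output']}")

    def load_memory_variables(self, inputs):
        # The real wrapper would query the index; here we return everything
        return {self.memory_key: "\n".join(self._index)}


mem = SketchIndexMemory()
mem.save_context({"input": "hi"}, {"output": "hello"})
print(mem.load_memory_variables({})["chat_history"])
```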

For more details, you can refer to the LangChain integration guide and the Building a Chatbot Tutorial (LangChain + LlamaIndex) in the LlamaIndex documentation.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------