Yes, it is possible to integrate LangChain with LlamaIndex. LlamaIndex provides Tool abstractions that plug into a LangChain agent, as well as a memory module.
For instance, you can expose a LlamaIndex query engine as a callable tool for a LangChain agent:
```python
from langchain.agents import Tool, initialize_agent
from langchain.chains.conversation.memory import ConversationBufferMemory
from langchain.chat_models import ChatOpenAI
from llama_index import VectorStoreIndex, SimpleDirectoryReader

# Build a LlamaIndex index over the documents.
documents = SimpleDirectoryReader("../paul_graham_essay/data").load_data()
index = VectorStoreIndex.from_documents(documents=documents)

# Wrap the index's query engine as a LangChain Tool.
tools = [
    Tool(
        name="LlamaIndex",
        func=lambda q: str(index.as_query_engine().query(q)),
        description="useful for when you want to answer questions about the author. The input to this tool should be a complete english sentence.",
        return_direct=True,
    ),
]

# Give the agent conversational memory and an LLM.
memory = ConversationBufferMemory(memory_key="chat_history")
llm = ChatOpenAI(temperature=0)
agent_executor = initialize_agent(
    tools, llm, agent="conversational-react-description", memory=memory
)
```
(Source: LangchainDemo.ipynb)
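The key point in the example above is that a Tool's `func` is just a callable mapping a query string to an answer string. A minimal stand-alone sketch of that contract, with a stub in place of the LlamaIndex query engine so it runs without an API key (`FakeQueryEngine` is illustrative only, not part of either library):

```python
# A stand-in for index.as_query_engine(): any object with a
# query(str) -> answer method satisfies the tool contract.
class FakeQueryEngine:
    def __init__(self, answers):
        self.answers = answers  # canned question -> answer mapping

    def query(self, q):
        return self.answers.get(q, "I don't know.")

engine = FakeQueryEngine({"Who is the author?": "Paul Graham"})

# This mirrors the Tool(func=...) wrapping above: the agent only
# ever sees a plain str -> str callable.
llama_tool = lambda q: str(engine.query(q))

print(llama_tool("Who is the author?"))  # Paul Graham
```

Because the agent treats the tool as an opaque callable, you can swap the stub for a real LlamaIndex query engine without changing the agent code.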
You can also use LlamaIndex as the memory module for a LangChain agent. Here is an example:
```python
from langchain.llms import OpenAI
from langchain.llms import OpenAIChat
from langchain.agents import initialize_agent
from llama_index import SummaryIndex
from llama_index.langchain_helpers.memory_wrapper import GPTIndexChatMemory
# Start with an empty index; chat history is inserted into it as the conversation proceeds.
index = SummaryIndex([])