@kapa.ai

Let's say I had a List index, vector index, and a kw index.

Is there a way to query from all of the indexes and use information from all three indices to formulate an answer?
Please stand by @ryanglambert as I search the knowledge base. We are experiencing especially high traffic right now and might take longer than usual. :wave:
----------------------
Yes, you can query all three indices and combine their information into a single answer by composing a graph over your existing data. Specifically, you can compose a GPTListIndex over your subindices. A list index inherently combines information from each node, allowing it to synthesize an answer across your heterogeneous data sources.

Here's an example of how to compose a graph with a GPTListIndex over your subindices:

Plain Text
from llama_index import GPTListIndex, GPTSimpleVectorIndex, GPTSimpleKeywordTableIndex
from llama_index.indices.composability import ComposableGraph

index1 = GPTListIndex.from_documents(notion_docs)           # your List index
index2 = GPTSimpleVectorIndex.from_documents(slack_docs)    # your Vector index
index3 = GPTSimpleKeywordTableIndex.from_documents(...)     # your Keyword Table index

graph = ComposableGraph.from_indices(
    GPTListIndex,
    [index1, index2, index3],
    index_summaries=["summary1", "summary2", "summary3"],
)
response = graph.query("<query_str>", mode="recursive", query_configs=...)


For more information, refer to the Synthesis over Heterogeneous Data (https://gpt-index.readthedocs.io/en/latest/use_cases/queries.html#synthesis-over-heterogeneous-data) section in the documentation.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
That looks about right (but I think kapa isn't quite updated with the new docs yet, weird)
@Logan M how would one do this if they also wanted to allow the model to "decide" which of the 3 tools to use?

I don't want to always use all 3, and I don't always want to use just 1. I would like to use the tools that would be helpful. 1, 2, or 3.
@kapa.ai How can I let the model decide which tools to use? There are multiple tools; it can use any or all of them.
Let me check the knowledge base for you. One moment! :books:
----------------------
You can let the model decide which tools to use by integrating them into a conversational agent built with Langchain. In the example provided in the Langchain Demo (https://github.com/jerryjliu/gpt_index/tree/main/examples/langchain_demo/LangchainDemo.ipynb), multiple tools are defined and added to the agent chain. The agent then decides which tool to use based on the input.

Here's a code snippet from the Langchain Demo that shows how to define multiple tools and integrate them into the agent chain:

Plain Text
from langchain.agents import Tool, initialize_agent
from langchain.chains.conversation.memory import ConversationBufferMemory
from langchain import OpenAI

from gpt_index import GPTSimpleVectorIndex

index = GPTSimpleVectorIndex.load_from_disk('../vector_indices/index_simple.json')

tools = [
    Tool(
        name = "GPT Index",
        func=lambda q: str(index.query(q)),
        description="useful for when you want to answer questions about the author. The input to this tool should be a complete english sentence.",
        return_direct=True
    ),
    # Add more tools here
]

memory = ConversationBufferMemory(memory_key="chat_history")
llm = OpenAI(temperature=0)
agent_chain = initialize_agent(tools, llm, agent="conversational-react-description", memory=memory)


In this example, you can add more tools to the tools list, and the agent will decide which tool to use based on the input.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
I think if you have multiple tools and you want to use any number of them, that's not quite supported yet.

The router query engine kind of does this, but it only uses one tool https://github.com/jerryjliu/llama_index/blob/main/docs/examples/query_engine/RouterQueryEngine.ipynb
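
The basic wiring from that notebook is roughly this (from memory, so treat it as a sketch; list_index, vector_index, and the descriptions are just placeholders):

Plain Text
from llama_index.query_engine.router_query_engine import RouterQueryEngine
from llama_index.selectors.llm_selectors import LLMSingleSelector
from llama_index.tools.query_engine import QueryEngineTool

# wrap each existing index's query engine as a "tool" with a description
list_tool = QueryEngineTool.from_defaults(
    query_engine=list_index.as_query_engine(),
    description="Useful for summarization questions over the documents.",
)
vector_tool = QueryEngineTool.from_defaults(
    query_engine=vector_index.as_query_engine(),
    description="Useful for retrieving specific context from the documents.",
)

# the LLM-based selector picks exactly ONE tool per query
query_engine = RouterQueryEngine(
    selector=LLMSingleSelector.from_defaults(),
    query_engine_tools=[list_tool, vector_tool],
)
response = query_engine.query("<query_str>")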

Another option is setting up a graph with a keyword index at the top level. Then any matching sub-indexes will get used.
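
Something like this, reusing kapa's graph snippet from above but with the keyword table index at the root (index1/index2/index3 and the summaries are the same placeholders, and the query call is the same older-style one):

Plain Text
from llama_index import GPTSimpleKeywordTableIndex
from llama_index.indices.composability import ComposableGraph

# keyword table at the root: only the sub-indexes whose summaries match the
# keywords extracted from the query actually get queried
graph = ComposableGraph.from_indices(
    GPTSimpleKeywordTableIndex,
    [index1, index2, index3],
    index_summaries=["summary1", "summary2", "summary3"],
)
response = graph.query("<query_str>", mode="recursive", query_configs=...)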

Otherwise, maybe there needs to be a PR to pick MULTIPLE tools to route a query to with the router engine... but then the LLM would also have to summarize the responses from each tool/index
Cool

Yeah right now it's "one" or "all",

definitely would be nice to have "some". I'll think on it!
CORRECTION
@Logan M looks like there actually is a tool that does this! 🙂

llama_index.selectors.llm_selectors.LLMMultiSelector
oh wow hahaha
I need to keep up with these updates
Not sure how to use it (there's no example in the code lol)
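Best guess at the usage (untested; assuming it plugs into the router query engine the same way the single selector does and has the same from_defaults()):

Plain Text
from llama_index.query_engine.router_query_engine import RouterQueryEngine
from llama_index.selectors.llm_selectors import LLMMultiSelector
from llama_index.tools.query_engine import QueryEngineTool

# a third tool, wrapped the same way as list_tool / vector_tool above
keyword_tool = QueryEngineTool.from_defaults(
    query_engine=keyword_index.as_query_engine(),
    description="Useful for keyword lookups over the documents.",
)

# the multi selector should be able to pick 1, 2, or all of the tools
query_engine = RouterQueryEngine(
    selector=LLMMultiSelector.from_defaults(),
    query_engine_tools=[list_tool, vector_tool, keyword_tool],
)
response = query_engine.query("<query_str>")

No idea how it combines the responses when it picks more than one tool, though.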
dude, they're going so fast! who can keep up!
I was on a contract for a month and had no time to look at the code,

I come back and the entire architecture is changed! (I like the ServiceContext architecture, the naming is taking some getting used to though)
hahaha yea, like half the codebase was re-written lately