I got this from the Mendable search snippet on the llama-index docs when asking your query:
To create a new context chat engine that uses nodes from both VectorStoreIndex A and VectorStoreIndex B, you would first need to combine the nodes from both indexes into a new index. Here's a simple way to do it:
First, you need to retrieve the nodes from both VectorStoreIndex A and VectorStoreIndex B. Note that VectorStoreIndex has no get_all_nodes() method; with the default in-memory storage, each index's nodes live in its document store, so you can pull them out via the docstore:
nodes_A = list(vector_store_index_A.docstore.docs.values())
nodes_B = list(vector_store_index_B.docstore.docs.values())
Next, you can combine the nodes from both indexes into a single list.
combined_nodes = nodes_A + nodes_B
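If the two indexes were built from overlapping documents, simple concatenation can leave duplicate nodes in the combined list. A minimal sketch of de-duplicating by node id — using a plain stand-in class here for illustration, on the assumption that llama-index nodes expose a node_id attribute:

```python
from dataclasses import dataclass


@dataclass
class Node:
    """Stand-in for a llama-index node; only node_id matters here."""
    node_id: str
    text: str


def dedupe_nodes(nodes):
    """Drop duplicate nodes, keeping the first occurrence of each id."""
    seen = {}
    for node in nodes:
        if node.node_id not in seen:
            seen[node.node_id] = node
    return list(seen.values())


nodes_A = [Node("a", "alpha"), Node("b", "beta")]
nodes_B = [Node("b", "beta"), Node("c", "gamma")]
combined_nodes = dedupe_nodes(nodes_A + nodes_B)
print([n.node_id for n in combined_nodes])  # ['a', 'b', 'c']
```

If your two indexes are guaranteed disjoint, plain `nodes_A + nodes_B` is fine as-is.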
Now, you can create a new VectorStoreIndex using the combined nodes.
from llama_index import VectorStoreIndex
combined_index = VectorStoreIndex(combined_nodes)
Finally, you can create a new context chat engine using the combined index.
from llama_index.memory import ChatMemoryBuffer
memory = ChatMemoryBuffer.from_defaults(token_limit=1500)
combined_chat_engine = combined_index.as_chat_engine(
chat_mode="context",
memory=memory,
system_prompt="You are a chatbot, able to have normal interactions, as well as talk about a variety of topics.",
)
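The token_limit passed to ChatMemoryBuffer bounds how much prior conversation is fed back to the model. Conceptually it keeps the most recent messages that fit within the limit — here is a plain-Python sketch of that idea (not llama-index's actual implementation; it uses a crude whitespace token count for illustration):

```python
def trim_history(messages, token_limit):
    """Keep the most recent messages whose combined (whitespace-split)
    token count fits within token_limit, preserving original order."""
    kept = []
    total = 0
    for msg in reversed(messages):  # walk from newest to oldest
        tokens = len(msg.split())
        if total + tokens > token_limit:
            break  # adding this older message would exceed the budget
        kept.append(msg)
        total += tokens
    return list(reversed(kept))


history = ["hello there", "how are you doing today", "fine thanks"]
print(trim_history(history, 7))  # ['how are you doing today', 'fine thanks']
```

With token_limit=1500, older turns get dropped first once the conversation grows past the budget, so retrieved context plus recent chat stays within the model's window.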
Now, your combined_chat_engine will be able to retrieve and use information from both VectorStoreIndex A and VectorStoreIndex B when answering queries.
You can also check this source:
https://gpt-index.readthedocs.io/en/latest/examples/agent/multi_document_agents.html