
Updated last year

I have built a context chat engine

I have built a context chat engine (which is working quite well!). I have two separate functions that create VectorStoreIndex A and VectorStoreIndex B respectively when they are called.

I want to create a case that if VectorStoreIndex A is created and VectorStoreIndex B also gets created, then a new context chat engine to be created that will be using nodes of both VectorStoreIndex A and VectorStoreIndex B. How can I make this work? I explored the documentation and some session videos but didn't find quite the solution I wanted. What would you suggest? I am just an enthusiast, so lack some technical knowledge.
12 comments
hello! can you share your code?
But I'm not going to be much help :) I'm a newbie and want to make a chat bot with some context, but I'm only getting one answer.
I asked your query to the Mendable snippet on the llama-index docs and got this:

To create a new context chat engine that uses nodes from both VectorStoreIndex A and VectorStoreIndex B, you would first need to combine the nodes from both indexes into a new index. Here's a simple way to do it:

First, you need to retrieve the nodes from both VectorStoreIndex A and VectorStoreIndex B. You can do this by calling the get_all_nodes() method on each index. This will return a list of all nodes in the index.
nodes_A = vector_store_index_A.get_all_nodes()
nodes_B = vector_store_index_B.get_all_nodes()
Next, you can combine the nodes from both indexes into a single list.
combined_nodes = nodes_A + nodes_B
Now, you can create a new VectorStoreIndex using the combined nodes.
from llama_index import VectorStoreIndex

combined_index = VectorStoreIndex(combined_nodes)
Finally, you can create a new context chat engine using the combined index.
from llama_index.memory import ChatMemoryBuffer

memory = ChatMemoryBuffer.from_defaults(token_limit=1500)
combined_chat_engine = combined_index.as_chat_engine(
    chat_mode="context",
    memory=memory,
    system_prompt="You are a chatbot, able to have normal interactions, as well as talk about a variety of topics.",
)
Now, your combined_chat_engine will be able to retrieve and use information from both VectorStoreIndex A and VectorStoreIndex B when answering queries.

You can also check this source https://gpt-index.readthedocs.io/en/latest/examples/agent/multi_document_agents.html
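The "only build the combined engine once both indexes exist" condition from the original question could be wired up like this. This is just a minimal sketch; `maybe_build_combined` and `build_engine` are hypothetical helper names, not LlamaIndex APIs:

```python
# Minimal sketch of "create a combined chat engine only once both
# indexes have been created". All names here are hypothetical stand-ins.

def maybe_build_combined(index_a, index_b, build_engine):
    """Return a combined chat engine only when both indexes exist."""
    if index_a is None or index_b is None:
        return None  # one of the indexes is missing, nothing to combine yet
    return build_engine(index_a, index_b)
```

You would pass a `build_engine` callable that concatenates the nodes and calls `as_chat_engine(...)` as in the snippet above.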
wow this is so cool!
@elmatero Thanks! Yeah, I had also thought about similar logic, just not with get_all_nodes(). What do we know about get_all_nodes()? There doesn't seem to be anything about it in the LlamaIndex docs.
@Антон My code is integrated with another service and its user data, arguments, APIs, etc. Since you're a beginner it might just confuse you, so I'd advise reading the docs on how to build a chat bot. That's how I started from zero. I started from here: https://gpt-index.readthedocs.io/en/latest/core_modules/query_modules/chat_engines/modules.html
You're right, maybe using something like:
storage_context_vector_1.docstore
storage_context_vector_2.docstore
Yeah, seems like this doesn't work either.

I get:

Nodes A: <llama_index.storage.docstore.simple_docstore.SimpleDocumentStore object at 0x14aada550>
Nodes B: <llama_index.storage.docstore.simple_docstore.SimpleDocumentStore object at 0x14fba0610>
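Printing the docstore only shows the object itself, not the nodes. Assuming each docstore exposes its nodes as a `docs` mapping of `{node_id: node}` (which `SimpleDocumentStore` does), the nodes might be pulled out and merged like this. A sketch under that assumption, not a tested recipe:

```python
# Sketch: merge the nodes held by two docstores into one list.
# Assumes each docstore exposes a `docs` mapping of {node_id: node};
# the resulting list could then be passed to VectorStoreIndex(nodes).

def combined_node_list(docstore_a, docstore_b):
    return list(docstore_a.docs.values()) + list(docstore_b.docs.values())
```

So something like `combined_index = VectorStoreIndex(combined_node_list(storage_context_vector_1.docstore, storage_context_vector_2.docstore))` might work, if the `docs` property behaves as assumed.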