hi all would appreciate some direction

hi all, would appreciate some direction on whether llama-index can help for my scenario:
say i have two manuals: how to fix a bike and how to fix a car. both are very long documents.
i ask a question about fixing a car.
i need an agent to figure out which manual to consult -> consult the appropriate manual -> give me the right answer.
can you pls recommend what i could employ for this scenario?
thx much!
12 comments
Yea, this is exactly what the router query engine does. We also have our own llama-index agents which should do a very similar thing (but with conversation history)
thank you Logan!
will go learn about the router query engine now
Logan, do I understand correctly that the router engine redirects queries between a list index and a vector index, which are built over the same data (docs in a Paul Graham folder)? My case is a bit different: I have two different files and will create two vector indexes (since they are long).
do i create two vector indices with different names, and just note in the "description" which index is useful for what?
The router engine can route between any number of indexes, and indexes of any type 🙂

So yea, you can create two vector indexes with different names and descriptions, and give those to the router query engine. Then, it should work 🙏
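For reference, a rough sketch of that setup, assuming a 0.x release of llama-index (the file names and tool descriptions below are placeholders for the two manuals, and exact import paths can vary between versions):

# Build one vector index per manual and let the router pick which one to query.
from llama_index import SimpleDirectoryReader, VectorStoreIndex
from llama_index.query_engine import RouterQueryEngine
from llama_index.tools import QueryEngineTool

# Placeholder file names for the two manuals.
bike_docs = SimpleDirectoryReader(input_files=["bike_manual.pdf"]).load_data()
car_docs = SimpleDirectoryReader(input_files=["car_manual.pdf"]).load_data()

bike_index = VectorStoreIndex.from_documents(bike_docs)
car_index = VectorStoreIndex.from_documents(car_docs)

# The descriptions are what the router's LLM uses to decide where to send a query.
bike_tool = QueryEngineTool.from_defaults(
    query_engine=bike_index.as_query_engine(),
    description="Useful for questions about repairing a bike.",
)
car_tool = QueryEngineTool.from_defaults(
    query_engine=car_index.as_query_engine(),
    description="Useful for questions about repairing a car.",
)

router_engine = RouterQueryEngine.from_defaults(
    query_engine_tools=[bike_tool, car_tool],
)

response = router_engine.query("How do I change a flat tire on my car?")
print(response)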
sorry, it's probably a very stupid question, but how is that different from the langchain implementation https://python.langchain.com/docs/modules/chains/additional/multi_retrieval_qa_router ?
tbh it's probably pretty similar lol

Just that ours works with llama-index indexes (and also has the option to let the LLM select/query multiple indexes)
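As a rough sketch of that multi-select option, reusing the two tools from the example above (the selector import path below is from 0.x releases and may differ in newer versions):

# Swap in a multi selector so the LLM may pick one or both manuals per query.
from llama_index.query_engine import RouterQueryEngine
from llama_index.selectors.llm_selectors import LLMMultiSelector

multi_router = RouterQueryEngine(
    selector=LLMMultiSelector.from_defaults(),
    query_engine_tools=[bike_tool, car_tool],  # defined in the sketch above
)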
yes, i did, thank you, but i got confused because it was creating an index over the whole folder and i needed two separate ones, one over each file
You can do separate files with SimpleDirectoryReader, something like this, for a group of file(s):

from llama_index import SimpleDirectoryReader  # top-level import in llama-index 0.x

documents = SimpleDirectoryReader(input_files=["file.txt"]).load_data()