Hey everyone, how are you?

I'm new to using LlamaIndex!
I have the following problem and would appreciate it if you could help me think it through:

I have a VectorStoreIndex, and when I run a query it returns a Node. The information I need to synthesize the answer is split between that returned Node and another Node (which is not returned) that immediately follows that chunk in the Document.

How can I make the Synthesizer take into account the Nodes surrounding the Node the Retriever returned, so that no information is missing when generating the answer?
You could use this node postprocessor to always fetch the previous and next nodes:

https://docs.llamaindex.ai/en/stable/core_modules/query_modules/node_postprocessors/modules.html#beta-prevnextnodepostprocessor

# import path assumes a recent LlamaIndex release; older versions expose it under llama_index.postprocessor
from llama_index.core.postprocessor import PrevNextNodePostprocessor

index.as_query_engine(node_postprocessors=[PrevNextNodePostprocessor(...)])
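The idea behind the postprocessor is simple: after retrieval, each hit is expanded with its neighboring chunks in document order before the synthesizer sees them. Here is a plain-Python sketch of that expansion logic (no LlamaIndex dependency; the function name `expand_with_neighbors` and the sample chunks are illustrative, not the library's actual implementation):

```python
def expand_with_neighbors(doc_chunks, retrieved_ids, num_nodes=1):
    """Expand retrieved chunk indices with up to `num_nodes` neighbors
    on each side, returning chunks in document order without duplicates."""
    expanded = set()
    for i in retrieved_ids:
        lo = max(0, i - num_nodes)
        hi = min(len(doc_chunks) - 1, i + num_nodes)
        expanded.update(range(lo, hi + 1))
    return [doc_chunks[i] for i in sorted(expanded)]

chunks = ["A", "B", "C", "D", "E"]
# The retriever found only chunk index 2, but its neighbors come along too:
print(expand_with_neighbors(chunks, [2]))  # ['B', 'C', 'D']
```

In LlamaIndex the same effect is controlled by the postprocessor's `num_nodes` and `mode` ("previous", "next", or "both") arguments, and it looks the nodes up in the index's docstore rather than a plain list.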
Thanks! Will try this!