At a glance

The community member is running into query failures when using a combination of text splitters, markdown splitters, and multilingual models to retrieve answers. Even basic questions like "what is the topic about" fail, with the query engine reporting that there are no nodes containing the information. The community member is wondering what could be the reason and has attached a code block and a markdown file as an example. They are seeking tips or troubleshooting advice to resolve the issue.

In the comments, another community member suggests that questions like "what is the topic about" are not really useful for a vector index, as it will retrieve the top-k most similar results, which may not be helpful. A common pattern suggested is routing between two indexes (vector vs. summary) for queries that require all nodes vs. a top-k.

The original community member thanks the other for the quick reply and asks if there is a way to resolve this kind of issue. They explain their use case, which is to go through a few posts and query for things like what the comments are mentioning, when the post was posted, and a 5-bullet-point explanation of the post. The goal is to analyze the posts and get insights, potentially by chatting with the LlamaIndex community. However, after custom chunking and embedding, the community member says the approach seems to be failing, or their methodology may be wrong.

Hi there, I am having trouble troubleshooting a querying issue. I am using a combination of text splitters, markdown splitters, and multilingual models to retrieve answers. However, the query fails even on basic questions such as "what is the topic about", citing that there are no nodes that contain the information. I am wondering what the reason could be. I am attaching the code block and also the markdown files; I am not sure why this is the case. The example md file is here: https://drive.google.com/file/d/1XiZD4QIcwi4eQjbh0vn4nn5YGlWpDIhI/view?usp=sharing . Any tips or troubleshooting that could help?
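The attached code block is not reproduced in this thread, so the following is only a rough sketch of what a setup like the one described (markdown splitting plus a multilingual embedding model) might look like with recent llama-index packages; the embedding model name, chunk sizes, and file name are assumptions, not the poster's actual choices.

```python
# Not the poster's attached code (it is not shown in this thread); just an assumed
# reconstruction of a markdown-splitter + multilingual-embedding pipeline.
# The embedding model name and chunk sizes are illustrative choices.
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.ingestion import IngestionPipeline
from llama_index.core.node_parser import MarkdownNodeParser, SentenceSplitter
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Multilingual embedding model (assumed; requires llama-index-embeddings-huggingface).
Settings.embed_model = HuggingFaceEmbedding(model_name="intfloat/multilingual-e5-small")

documents = SimpleDirectoryReader(input_files=["example.md"]).load_data()

# Split on markdown structure first, then cap node size with a sentence splitter.
pipeline = IngestionPipeline(
    transformations=[
        MarkdownNodeParser(),
        SentenceSplitter(chunk_size=512, chunk_overlap=50),
    ]
)
nodes = pipeline.run(documents=documents)

index = VectorStoreIndex(nodes)
print(index.as_query_engine(similarity_top_k=3).query("what is the topic about"))
```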
3 comments
Questions like "what is the topic about" are not really useful for a vector index.

Like, it will take that, and retrieve the top-k most similar nodes, which, thinking about that query text, won't be very helpful.
A common pattern is routing between two indexes (vector vs. summary) for queries that require all nodes vs. a top-k.
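A minimal sketch of that routing pattern, assuming the llama-index >= 0.10 package layout (a RouterQueryEngine choosing between a vector index for pointed top-k questions and a summary index for "what is this about" questions); the file name and tool descriptions are placeholders.

```python
from llama_index.core import SimpleDirectoryReader, SummaryIndex, VectorStoreIndex
from llama_index.core.query_engine import RouterQueryEngine
from llama_index.core.selectors import LLMSingleSelector
from llama_index.core.tools import QueryEngineTool

documents = SimpleDirectoryReader(input_files=["example.md"]).load_data()

# Vector index: good for pointed questions answered by a few chunks (top-k).
vector_index = VectorStoreIndex.from_documents(documents)
# Summary index: touches every node, so broad "what is the topic about" queries work.
summary_index = SummaryIndex.from_documents(documents)

vector_tool = QueryEngineTool.from_defaults(
    query_engine=vector_index.as_query_engine(similarity_top_k=3),
    description="Useful for specific questions about details in the post.",
)
summary_tool = QueryEngineTool.from_defaults(
    query_engine=summary_index.as_query_engine(response_mode="tree_summarize"),
    description="Useful for summarizing or describing the whole post.",
)

# The selector routes each query to whichever index fits it better.
query_engine = RouterQueryEngine(
    selector=LLMSingleSelector.from_defaults(),
    query_engine_tools=[vector_tool, summary_tool],
)
print(query_engine.query("What is the topic about?"))
```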
@Logan M thank you for the quick reply. Is there a way to resolve this kind of issue? The use case is to go through a few such posts and query, for example:
a) What are the comments mentioning?
b) When was the post posted?
c) Explain the post in 5 bullet points.

This way the posts could be analysed to get insights. The posts are from Reddit, so it would help to get a quick overview. It could be something like chatting with the LlamaIndex community, for example. However, after custom chunking and embedding, it seems that it fails to work, or my methodology is totally wrong.
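One approach not discussed above, but commonly used for questions like "when was the post posted": attach the Reddit post date (and other post metadata) to each Document so it travels with every chunk and is visible to the LLM, instead of hoping a chunk happens to mention the date. This is only a hedged sketch; the field names and values are made up for illustration.

```python
# Hedged sketch: store post metadata on the Document; by default LlamaIndex
# includes metadata in each node's text, so it is embedded and shown to the LLM.
# Field names ("post_date", "subreddit", "title") are hypothetical.
from llama_index.core import Document, VectorStoreIndex

post_text = "...full post body plus comments..."  # placeholder content
doc = Document(
    text=post_text,
    metadata={
        "post_date": "2024-05-01",
        "subreddit": "LlamaIndex",
        "title": "Example post",
    },
)

index = VectorStoreIndex.from_documents([doc])
print(index.as_query_engine().query("When was the post posted?"))
```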