
Top k

Hi! Is there any way to see how big the text nodes that will be sent as context are? I need to know in advance because in some cases, when similarity_top_k is 5, the text nodes are too big and an exception is thrown; in those cases I would rather search with similarity_top_k of 4 or 3.
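One way to estimate this up front is to look at the retrieved texts first and shrink k until the combined size fits a budget. This is only a sketch of that arithmetic with plain strings standing in for retrieved node texts; budget_chars is a hypothetical limit you would derive from your model's context window:

```python
def largest_k_within_budget(node_texts, budget_chars):
    """Return the largest k such that the first k node texts fit the budget."""
    total = 0
    k = 0
    for text in node_texts:
        if total + len(text) > budget_chars:
            break
        total += len(text)
        k += 1
    return k

# Example: five ~19,000-character chunks against a 40,000-character budget.
chunks = ["x" * 19_000] * 5
print(largest_k_within_budget(chunks, 40_000))  # only 2 chunks fit
```

You would then pass the returned value as similarity_top_k instead of a fixed 5.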
LlamaIndex should be handling any size of retrieved context, though?

You can probably implement a custom node postprocessor to check this though
https://gpt-index.readthedocs.io/en/stable/core_modules/query_modules/node_postprocessors/usage_pattern.html#custom-node-postprocessor
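A custom postprocessor along those lines could apply logic like the following. This is a minimal sketch with hypothetical names (nodes are represented here as (text, score) pairs rather than real llama_index node objects); it keeps nodes in score order until a character budget is used up, truncating the first node that overflows:

```python
def cap_nodes_by_chars(nodes, max_chars):
    """nodes: list of (text, score) pairs, highest score first.

    Keeps whole nodes while they fit, truncates the first node that
    would overflow the budget, and drops everything after it.
    """
    kept, used = [], 0
    for text, score in nodes:
        remaining = max_chars - used
        if remaining <= 0:
            break
        if len(text) > remaining:
            # Keep only the part of this node that still fits.
            kept.append((text[:remaining], score))
            break
        kept.append((text, score))
        used += len(text)
    return kept
```

In a real postprocessor this logic would live inside the node-processing method of your BaseNodePostprocessor subclass; truncating versus dropping the overflowing node is a design choice.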
@Logan M Well, I thought the same, but I already hit this exception when I set top_k = 5 and it sent 5 chunks of ~19,000 characters each; not sure why.
How are you using llama index? I can set the top k to 100 and it works fine πŸ˜…
query_engine = index.as_chat_engine(
    verbose=True,
    chat_mode="context",
    similarity_top_k=5,
    system_prompt=prepared_system_prompt,
)
actually, I had never hit this issue until yesterday
Ohhh you are using chat engine
yes, is that bad?
Nah it just works a little differently
Yea using node postprocessors is probably the best solution
Yeah, I see. It doesn't look complex, but I'm not sure how I would actually use it
should I pass my postprocessor as a param?
Yea, like this

as_chat_engine(..., node_postprocessors=[...])
cool, I see. Let me try it. Thanks!
Hmm, the example is not working. My version is 7.19 and the code is the same, but it can't find: from llama_index.indices.postprocessor.base import BaseNodePostprocessor
Ok, it's in .types not in .base
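Since the import path (and the exact hook method name) varies across llama_index versions, here is a version-agnostic sketch of the shape such a postprocessor could take. The dataclasses are stand-ins for llama_index's real node types; in the library you would subclass BaseNodePostprocessor instead of writing a plain class:

```python
from dataclasses import dataclass

# Stand-ins for llama_index's TextNode / NodeWithScore (hypothetical).
@dataclass
class FakeNode:
    text: str

@dataclass
class FakeNodeWithScore:
    node: FakeNode
    score: float

class CharBudgetPostprocessor:
    """Drop retrieved nodes once a total character budget is exceeded."""

    def __init__(self, max_chars: int):
        self.max_chars = max_chars

    def postprocess_nodes(self, nodes, query_bundle=None):
        kept, used = [], 0
        for n in nodes:
            if used + len(n.node.text) > self.max_chars:
                break
            kept.append(n)
            used += len(n.node.text)
        return kept
```

You would then pass an instance via node_postprocessors=[CharBudgetPostprocessor(max_chars=...)], with max_chars chosen to suit your model's context window.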
@Logan M should I instantiate it or just pass a class name?
@Logan M this is the code, but the node postprocessor is never hit, not sure why
query_engine = index.as_chat_engine(
    verbose=True,
    chat_mode="context",
    similarity_top_k=5,
    system_prompt=prepared_system_prompt,
    node_postprocessors=[CustomPostprocessor()],
)
What version of llamaindex do you have? I know this was added somewhat recently πŸ€”
Ohhhh yea it won't be in that version πŸ˜…
which minimum version should I move to in order to make it work?
yeah, I see. will I have to make a lot of fixes if I move to the latest version?
I doubt it? Worth a shot anyways lol
haha totally