High context length LLM vs Low context length
Yj · 8 months ago
If I use a model with a higher context length (128k) versus a lower context length (16k), will it help when I do RAG with the higher-context-length model?
Does a larger-context-length model use a larger chunk size or node size?
Can someone please explain, or link me to an article that covers this? Thank you.
5 comments
Yj · 8 months ago
An explanation is needed, thank you.
Yj · 8 months ago
Did some research: I can configure the similarity top K to retrieve more chunks.
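For anyone else landing here, a minimal sketch of what raising the similarity top K does at retrieval time. This is a pure-Python toy, not LlamaIndex's actual implementation; the chunk texts and 2-dimensional embeddings are made up for illustration:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, chunks, top_k):
    """Return the top_k chunk texts most similar to the query vector.

    chunks: list of (text, embedding) pairs -- toy data, not a real index.
    """
    ranked = sorted(chunks, key=lambda c: cosine(query_vec, c[1]), reverse=True)
    return [text for text, _ in ranked[:top_k]]

# Hypothetical mini-corpus with hand-made embeddings.
chunks = [
    ("chunk about cats", [1.0, 0.0]),
    ("chunk about dogs", [0.8, 0.6]),
    ("chunk about cars", [0.0, 1.0]),
]

# With top_k=1 only the best match is passed to the LLM;
# with top_k=2 the runner-up is included too.
print(retrieve([1.0, 0.1], chunks, top_k=2))
# → ['chunk about cats', 'chunk about dogs']
```

A larger top K only changes how many retrieved chunks get stuffed into the prompt; the model's context length is what caps how many of them fit.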
Yj · 8 months ago
Let's say I use Claude 3 models, which have a much higher context length. Is a higher similarity top K always better than a lower one, or are there scenarios where I'd want a lower similarity top K?
Teemu · 8 months ago
If you retrieve too much data, it will be harder for the LLM to synthesize an accurate response.
Teemu · 8 months ago
Also, the latency and cost will be higher.