Introducing Contextual Retrieval
lucaswillkill
4 weeks ago
https://www.anthropic.com/news/contextual-retrieval
4 comments
Logan M
4 weeks ago
This is pretty old -- but it certainly makes sense. Attaching metadata, whether for embeddings or for LLM prompting, can help.
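The core idea can be sketched in a few lines: prepend chunk-specific context (or metadata) to each chunk before embedding it, so the vector captures document-level situating information. Here `generate_context` is a hypothetical stub standing in for the LLM call; in a real pipeline you would prompt a model with the full document plus the chunk.

```python
def generate_context(document: str, chunk: str) -> str:
    # Stub: a real implementation asks an LLM to situate the chunk
    # within the whole document in one or two sentences.
    title = document.splitlines()[0]
    return f"This chunk is from the document titled '{title}'."

def contextualize(document: str, chunk: str) -> str:
    """Return the text that actually gets embedded/indexed."""
    context = generate_context(document, chunk)
    return f"{context}\n\n{chunk}"

doc = "ACME Q2 2023 Report\nRevenue grew 3% over the previous quarter."
chunk = "Revenue grew 3% over the previous quarter."
print(contextualize(doc, chunk))
```

The contextualized text is what gets embedded (and optionally BM25-indexed); the original chunk is still what you return to the LLM at query time.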
lucaswillkill
4 weeks ago
Could you please tell me what the most up-to-date sources would be?
lucaswillkill
4 weeks ago
My idea is to run an open-source LLM with LlamaIndex instead of using Claude, with Milvus as the vector store.
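That stack might be wired up roughly like this. This is a minimal sketch, not a tested setup: it assumes the `llama-index-llms-ollama`, `llama-index-embeddings-huggingface`, and `llama-index-vector-stores-milvus` integration packages, a running local Milvus instance, and an Ollama model already pulled (the model name "llama3" and the `./data` directory are placeholders).

```python
from llama_index.core import (Settings, SimpleDirectoryReader,
                              StorageContext, VectorStoreIndex)
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.ollama import Ollama
from llama_index.vector_stores.milvus import MilvusVectorStore

# Open-source LLM and embedding model instead of Claude.
Settings.llm = Ollama(model="llama3", request_timeout=120.0)
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

vector_store = MilvusVectorStore(
    uri="http://localhost:19530",  # assumed local Milvus endpoint
    dim=384,                       # bge-small-en-v1.5 embedding size
    overwrite=True,
)
storage_context = StorageContext.from_defaults(vector_store=vector_store)

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

print(index.as_query_engine().query("What is contextual retrieval?"))
```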
Logan M
4 weeks ago
Here's one using a dedicated extractor:
https://docs.llamaindex.ai/en/stable/examples/metadata_extraction/DocumentContextExtractor/
Here's another, built from scratch:
https://docs.llamaindex.ai/en/stable/examples/cookbooks/contextual_retrieval/
The main thing that makes this feasible, though, is Anthropic's ability to cache prompts. Without caching, it will be slower and more expensive.
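A back-of-envelope comparison shows why caching matters: contextualizing N chunks means sending the full document N times. With prompt caching, the document tokens are written to the cache once at a premium and then read at a steep discount on later calls. The multipliers below follow Anthropic's published pricing (~1.25x for a cache write, ~0.1x for a cache read), but treat the exact numbers as assumptions that vary by provider; without a caching provider, the full-price path applies.

```python
def context_gen_cost(doc_tokens, chunk_tokens, n_chunks, price_per_tok=1.0,
                     cached=False, write_mult=1.25, read_mult=0.10):
    """Input-token cost of generating context for every chunk of one document."""
    if not cached:
        # Every call pays for the full document plus the chunk.
        return n_chunks * (doc_tokens + chunk_tokens) * price_per_tok
    # First call writes the document to the cache; the rest read it cheaply.
    first = doc_tokens * write_mult + chunk_tokens
    rest = (n_chunks - 1) * (doc_tokens * read_mult + chunk_tokens)
    return (first + rest) * price_per_tok

no_cache = context_gen_cost(10_000, 200, 50)          # 50 chunks, 10k-token doc
with_cache = context_gen_cost(10_000, 200, 50, cached=True)
print(no_cache, with_cache)
```

With these illustrative numbers the cached run is several times cheaper (510,000 vs 71,500 token-units), and the gap widens with longer documents and more chunks.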