Introducing Contextual Retrieval

This is pretty old, but it certainly makes sense. Attaching metadata, either for embeddings or for LLM prompting, can help.
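
To make the metadata point concrete, here's a minimal sketch (mine, not from the thread) using LlamaIndex's TextNode. Field values are illustrative, and it assumes a recent llama-index-core where excluded_embed_metadata_keys / excluded_llm_metadata_keys control what the embedding model and the LLM each get to see:

```python
from llama_index.core.schema import MetadataMode, TextNode

node = TextNode(
    text="Revenue grew 3% over the previous quarter.",
    metadata={
        "source_doc": "ACME Q2 2023 10-Q",   # illustrative document name
        "section": "Management Discussion",
        "ingest_run_id": "2024-06-01",       # bookkeeping only
    },
    # keep the bookkeeping field out of the text that gets embedded...
    excluded_embed_metadata_keys=["ingest_run_id"],
    # ...and out of the text the LLM sees when synthesizing an answer
    excluded_llm_metadata_keys=["ingest_run_id"],
)

# what the embedding model embeds (metadata prepended to the chunk text)
print(node.get_content(metadata_mode=MetadataMode.EMBED))
# what gets placed into the LLM prompt at query time
print(node.get_content(metadata_mode=MetadataMode.LLM))
```
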
Could you please tell me what the most up-to-date sources would be?
My idea is to implement this with an open-source LLM via LlamaIndex instead of using Claude, with Milvus as the vector store.
Here's one using a dedicated extractor (DocumentContextExtractor):
https://docs.llamaindex.ai/en/stable/examples/metadata_extraction/DocumentContextExtractor/
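
A rough sketch of that extractor route, to show where it plugs in. The constructor arguments here (a docstore holding the full documents plus an LLM that writes the per-chunk context) are assumptions based on that notebook, so check the page for the exact signature in your llama-index version:

```python
from llama_index.core.extractors import DocumentContextExtractor
from llama_index.core.ingestion import IngestionPipeline
from llama_index.core.node_parser import SentenceSplitter
from llama_index.core.storage.docstore import SimpleDocumentStore
from llama_index.llms.ollama import Ollama  # or any other open-source LLM wrapper

documents = [...]  # your llama_index Document objects

# the extractor needs the full parent documents to situate each chunk
docstore = SimpleDocumentStore()
docstore.add_documents(documents)

extractor = DocumentContextExtractor(
    docstore=docstore,
    llm=Ollama(model="llama3.1"),  # placeholder model name
)

pipeline = IngestionPipeline(
    transformations=[SentenceSplitter(chunk_size=512), extractor]
)
nodes = pipeline.run(documents=documents)
# assumption: each node's metadata now carries the generated context
```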

Here's one more from scratch
https://docs.llamaindex.ai/en/stable/examples/cookbooks/contextual_retrieval/

The main thing that makes this feasible, though, is Anthropic's prompt caching. Without caching it will be slower and more expensive, since the full document has to be re-sent to the LLM for every chunk.
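
For the open-source-LLM + LlamaIndex + Milvus setup asked about above, a from-scratch version looks roughly like the sketch below (mine, in the spirit of the second cookbook, not copied from it). The model names, embedding model, file path, and Milvus URI/dim are placeholders; the structural point is that the whole document goes to the LLM once per chunk, which is exactly the cost that prompt caching amortizes:

```python
from llama_index.core import Document, Settings, StorageContext, VectorStoreIndex
from llama_index.core.node_parser import SentenceSplitter
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.ollama import Ollama
from llama_index.vector_stores.milvus import MilvusVectorStore

# local, open-source stack (placeholders; swap in whatever you run)
llm = Ollama(model="llama3.1")
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")  # 384-dim

doc = Document(text=open("report.txt").read())  # hypothetical source file
nodes = SentenceSplitter(chunk_size=512, chunk_overlap=64).get_nodes_from_documents([doc])

CONTEXT_PROMPT = (
    "<document>\n{doc}\n</document>\n"
    "Here is a chunk from that document:\n<chunk>\n{chunk}\n</chunk>\n"
    "Write a short, succinct context that situates this chunk within the overall "
    "document, to improve search retrieval of the chunk. Answer with only the context."
)

# the expensive loop: the whole document is sent to the LLM once per chunk,
# which is the cost that Anthropic's prompt caching amortizes
for node in nodes:
    ctx = llm.complete(
        CONTEXT_PROMPT.format(doc=doc.text, chunk=node.get_content())
    ).text.strip()
    node.set_content(f"{ctx}\n\n{node.get_content()}")  # prepend context before embedding

vector_store = MilvusVectorStore(uri="./milvus_contextual.db", dim=384, overwrite=True)
index = VectorStoreIndex(
    nodes, storage_context=StorageContext.from_defaults(vector_store=vector_store)
)
retriever = index.as_retriever(similarity_top_k=5)
```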