S3

Hey guys,

I was trying to save a BM25Retriever into S3 following this doc: https://docs.llamaindex.ai/en/stable/module_guides/storing/save_load/

But unfortunately, it's not working. Can anyone please help me out on how to save a retriever into S3?
Hmm, it only saves to disk

https://docs.llamaindex.ai/en/stable/api_reference/retrievers/bm25/#llama_index.retrievers.bm25.BM25Retriever

I don't think it's possible to pass an fsspec filesystem either, tbh

You'd need to put your nodes in a docstore, and save the docstore to S3 using s3fs
Yeah, just found that out the hard way.
@Logan M I have another question. Is it possible to use a RetrieverQueryEngine as a ChatEngine? How would I do that?
You can put the query engine as a tool for an agent, or just use the retriever inside a CondensePlusContextChatEngine or ContextChatEngine
@Logan M I couldn't find documentation on how to add a retriever to the condense chat engine.
Not sure what you mean

Python
from llama_index.core.chat_engine import CondensePlusContextChatEngine

chat_engine = CondensePlusContextChatEngine.from_defaults(retriever, llm=llm)
Thanks a lot 😁