Hey, I could use some help understanding the following:
- I want to use hybrid search with a Weaviate DB in a chat engine
- I see how to do that with a query engine via `index.as_query_engine(vector_store_query_mode="hybrid")`, but I want to use a chat engine
- I'm using the CONDENSE_PLUS_CONTEXT chat mode, and I see in the source that it does:
```python
return CondensePlusContextChatEngine.from_defaults(
    retriever=self.as_retriever(**kwargs),
    llm=llm,
    **kwargs,
)
```
Would `index.as_chat_engine(vector_store_query_mode="hybrid")` work?
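Roughly what I have in mind is the sketch below. I'm assuming the extra kwargs get forwarded through `as_retriever(**kwargs)` into the underlying vector retriever; the Weaviate connection, the `"MyDocs"` index name, and the `alpha` value are just placeholders from my side:

```python
import weaviate
from llama_index.core import VectorStoreIndex
from llama_index.vector_stores.weaviate import WeaviateVectorStore

# My existing Weaviate collection wrapped as a LlamaIndex vector store
client = weaviate.connect_to_local()  # v4 client; adjust for your deployment
vector_store = WeaviateVectorStore(weaviate_client=client, index_name="MyDocs")
index = VectorStoreIndex.from_vector_store(vector_store)

# The hope: these kwargs flow through as_retriever(**kwargs) into the
# vector retriever, so the Weaviate queries actually run in hybrid mode
chat_engine = index.as_chat_engine(
    chat_mode="condense_plus_context",
    vector_store_query_mode="hybrid",
    similarity_top_k=5,
    alpha=0.5,  # weighting between keyword and vector scores
)
print(chat_engine.chat("What does the doc say about hybrid search?"))
```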
What if I want to do something more complicated, like implementing this as the retriever?
https://docs.llamaindex.ai/en/stable/examples/retrievers/auto_merging_retriever.html
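If `as_chat_engine` can't handle that, is the manual route below the way to go, i.e. building the retriever myself and handing it straight to `CondensePlusContextChatEngine.from_defaults()`? (`storage_context`, `llm`, and the top-k values here are placeholders from my setup, following that docs example.)

```python
from llama_index.core.chat_engine import CondensePlusContextChatEngine
from llama_index.core.retrievers import AutoMergingRetriever

# Base retriever over the leaf nodes, still running hybrid against Weaviate
base_retriever = index.as_retriever(
    vector_store_query_mode="hybrid",
    similarity_top_k=6,
)

# storage_context holds the full node hierarchy, as in the auto-merging example
retriever = AutoMergingRetriever(base_retriever, storage_context, verbose=True)

chat_engine = CondensePlusContextChatEngine.from_defaults(
    retriever=retriever,
    llm=llm,
)
print(chat_engine.chat("Summarize the section on retrieval."))
```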