GeoloeG
Joined September 25, 2024
Dear fellow LlamaIndexers, is it possible to get a chat_engine from a query_engine? I am currently deriving a chat_engine from a vector index, but I would like to derive a chat engine from a router query engine built on both a vector index and a summary index.
3 comments
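One way this could work, sketched below under the assumption of the llama-index-core package layout, is to build a router query engine over the two indices and then wrap it in CondenseQuestionChatEngine, which turns any query engine into a chat engine by condensing the chat history into a standalone question. The documents variable and the tool descriptions are placeholders, not part of the original question.

Python
from llama_index.core import Document, SummaryIndex, VectorStoreIndex
from llama_index.core.chat_engine import CondenseQuestionChatEngine
from llama_index.core.query_engine import RouterQueryEngine
from llama_index.core.tools import QueryEngineTool

# Placeholder documents; in practice these come from your own loaders.
documents = [Document(text="...")]

vector_index = VectorStoreIndex.from_documents(documents)
summary_index = SummaryIndex.from_documents(documents)

# Router query engine that picks one of the two indices per question.
router_engine = RouterQueryEngine.from_defaults(
    query_engine_tools=[
        QueryEngineTool.from_defaults(
            query_engine=vector_index.as_query_engine(),
            description="Useful for specific, fact-based questions.",
        ),
        QueryEngineTool.from_defaults(
            query_engine=summary_index.as_query_engine(),
            description="Useful for high-level summaries of the documents.",
        ),
    ],
)

# Wrap the router query engine as a chat engine; the chat history is
# condensed into a standalone question before each underlying query.
chat_engine = CondenseQuestionChatEngine.from_defaults(query_engine=router_engine)
response = chat_engine.chat("What does the document say about X?")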
I am trying to run IngestionPipeline locally, but it has a dependency on llama_index_client, which comes from a poetry module mapping:
Plain Text
poetry_requirements(
    name="poetry",
    module_mapping={"llamaindex-py-client": ["llama_index_client"]},
)

which is not installed by default, and even when installing llamaindex-py-client with pip, there is no way to get this mapping.

Therefore, I am wondering: can IngestionPipeline be used standalone?
8 comments
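For purely local ingestion, a minimal sketch follows, assuming a recent llama-index-core install where IngestionPipeline runs entirely in-process with local transformations; the llamaindex-py-client dependency appears tied only to the managed/hosted ingestion path. The text and splitter parameters here are illustrative.

Python
from llama_index.core import Document
from llama_index.core.ingestion import IngestionPipeline
from llama_index.core.node_parser import SentenceSplitter

# Purely local pipeline: split documents into nodes, no remote services.
pipeline = IngestionPipeline(
    transformations=[SentenceSplitter(chunk_size=512, chunk_overlap=64)],
)

nodes = pipeline.run(documents=[Document(text="Some local text to ingest.")])
print(len(nodes))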