Hi, is there a way to make the pandas query engine more contextual, e.g. taking previous input/output into account for follow-up questions?
You can try creating a chat_engine from the pandas query engine.

There are different chat modes you can use for chat engines; the following example creates a "condense_question" engine, which condenses the chat history and the new message into a standalone query.

Plain Text
# import path for recent llama_index versions
from llama_index.core.chat_engine import CondenseQuestionChatEngine

chat_engine = CondenseQuestionChatEngine.from_defaults(
    query_engine=pandas_query_engine,
)
Will try this. Also, the pandas query engine is able to plot inline in a Jupyter notebook a few times, but I can't find any .jpg or plot instance in the response returned by the pandas query engine. How can I save those plots and show them in another chatbot?
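The plots are a side effect of the generated pandas code calling matplotlib; the engine's response object doesn't carry figure data. One workaround (a sketch, not part of llama_index; `save_open_figures` is a hypothetical helper) is to save whatever figures are open after each query and hand the image files to the chatbot front end:

```python
import os
import tempfile

import matplotlib
matplotlib.use("Agg")  # headless backend so saving works outside a notebook
import matplotlib.pyplot as plt


def save_open_figures(out_dir):
    """Save every currently open matplotlib figure as a PNG and return the paths."""
    paths = []
    for num in plt.get_fignums():
        path = os.path.join(out_dir, f"plot_{num}.png")
        plt.figure(num).savefig(path)
        paths.append(path)
    plt.close("all")  # avoid re-saving the same figures on the next query
    return paths


# Stand-in for a query whose generated code produced a plot
plt.plot([1, 2, 3], [4, 5, 6])

out_dir = tempfile.mkdtemp()
saved = save_open_figures(out_dir)
```

Calling this right after each `query()` call collects the images; your chatbot UI can then render the saved files.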
Also, how can I pass column descriptions as metadata to the pandas query engine? It is getting confused by a few similar column names.
Can we bring inference time below a minute for ContextChatEngine without a GPU, by utilising all CPU cores?
You'd probably need to customize the prompt template to include details of the data then
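For example (a sketch; `COLUMN_NOTES` and `describe_columns` are hypothetical names, not library APIs), you can render a per-column description block from the dataframe plus your own notes, then splice it into a custom prompt -- in recent llama_index versions, presumably via `query_engine.update_prompts({"pandas_prompt": ...})`:

```python
import pandas as pd

# Hypothetical user-supplied notes disambiguating similar column names
COLUMN_NOTES = {
    "rev_q1": "Revenue for fiscal Q1, in USD thousands",
    "rev_q1_adj": "Q1 revenue adjusted for one-off items",
}


def describe_columns(df, notes):
    """Render a per-column description block to embed in the prompt template."""
    lines = []
    for col in df.columns:
        note = notes.get(col, "no description")
        lines.append(f"- {col} ({df[col].dtype}): {note}")
    return "\n".join(lines)


df = pd.DataFrame({"rev_q1": [1.0], "rev_q1_adj": [0.9]})
context = describe_columns(df, COLUMN_NOTES)
# `context` would then be embedded in a custom PromptTemplate and applied
# with query_engine.update_prompts(...) -- see the prompts usage docs linked below.
```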
Are you running a local model? Most likely it's already using all the resources it can
Yes, using a Code Llama instruct model. But many times it fails to give a correct answer. And how do I make the pandas query engine work with multiple CSV files?
It doesn't really support multiple CSV files on its own -- each query engine uses a single dataframe. You could combine with an agent or a sub-question query engine to work with multiple.
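As a plain-pandas illustration of the one-dataframe-per-engine idea (this keyword router is a stand-in for the library's agent or sub-question query engine, not its actual implementation):

```python
import pandas as pd

# Each "engine" owns exactly one dataframe, plus a description the
# router matches against -- mirroring one PandasQueryEngine per CSV.
engines = {
    "sales": (pd.DataFrame({"amount": [100, 200]}),
              "sales transactions with an amount column"),
    "users": (pd.DataFrame({"age": [30, 40]}),
              "user profiles with an age column"),
}


def route(question):
    """Pick the dataframe whose description shares the most words with the question."""
    q_words = set(question.lower().split())
    best = max(engines, key=lambda k: len(q_words & set(engines[k][1].split())))
    return best, engines[best][0]


name, df = route("what is the total sales amount?")
```

In llama_index itself, you'd wrap each dataframe's query engine in a tool with a description and let the agent or sub-question engine do this routing with the LLM instead of keyword overlap.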

many times it fails to give correct answer -- yea, modifying the prompt template will likely help
https://docs.llamaindex.ai/en/stable/module_guides/models/prompts/usage_pattern.html#getting-and-setting-custom-prompts