
At a glance

The community members discuss ways to make the pandas query engine more contextual, such as creating a chat engine that can handle follow-up questions. They also cover plotting and saving plots from the query engine, passing column descriptions as metadata to help the engine distinguish similar column names, reducing the inference time of a context chat engine without a GPU, and making the query engine work with multiple CSV files. There is no explicitly marked answer in the comments.

Hi , is there a way to make pandas query engine more contextual like taking previous input/output into context for follow up questions
8 comments
You can try creating a chat_engine from the pandas query engine.

There are different chat modes you can use for chat engines, the following example creates a "condense_question" one which creates a standalone query from the chat history.

Plain Text
from llama_index.core.chat_engine import CondenseQuestionChatEngine

chat_engine = CondenseQuestionChatEngine.from_defaults(
    query_engine=pandas_query_engine,
)
Will try this. Also, the pandas query engine is sometimes able to plot inline in a Jupyter notebook, but I can't see any .jpg or plot instance in the response returned by the pandas query engine. How can I save those plots and show them in another chatbot?
Also, how can I pass column descriptions as metadata to the pandas query engine? It is getting confused by a few similar column names.
Can we bring the inference time under a minute for a context chat engine without a GPU, by utilising all CPU cores?
You'd probably need to customize the prompt template to include details of the data then
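Building on that suggestion, one way to get column descriptions into the prompt is to render a small glossary string from the dataframe and splice it into the engine's prompt template. A sketch with hypothetical column names and descriptions (the `descriptions` dict is something you would write yourself):

```python
import pandas as pd

df = pd.DataFrame({"sales_q1": [100, 200], "sales_qty": [3, 5]})

# Hypothetical human-written descriptions for similarly named columns.
descriptions = {
    "sales_q1": "revenue in Q1, in USD",
    "sales_qty": "number of units sold",
}

# One bullet per column, falling back gracefully for undocumented columns.
column_notes = "\n".join(
    f"- {col}: {descriptions.get(col, 'no description')}" for col in df.columns
)
prompt_context = f"Column reference:\n{column_notes}"
print(prompt_context)
```

The resulting `prompt_context` string can then be worked into the pandas query engine's prompt template (for example via its prompt-customization API, per the docs linked later in the thread), so the model sees what each column actually means.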
Are you running a local model? Most likely it's already using all the resources it can
Yes, using a Llama code-instruct model. But many times it fails to give the correct answer. And how can I make the pandas query engine work with multiple CSV files?
It doesn't really support multiple CSV files on its own -- each query engine uses a single dataframe. You could combine them with an agent or a sub-question query engine to work with multiple files.
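The shape of that "one engine per CSV, with a router on top" idea can be sketched without llama_index at all: each dataframe gets its own handler, and a small router picks one per question. In llama_index this routing role would be played by an agent or a sub-question query engine wrapping one PandasQueryEngine per file; the keyword router below is only an illustration, not the library's API.

```python
import pandas as pd

# One dataframe per CSV file (data here is hypothetical).
frames = {
    "sales": pd.DataFrame({"amount": [10, 20, 30]}),
    "users": pd.DataFrame({"age": [25, 31, 47]}),
}

def route(question: str) -> str:
    # Naive keyword router: pick the dataframe whose name appears in the
    # question, then answer from that single dataframe -- a stand-in for
    # dispatching to the matching per-file query engine.
    for name, df in frames.items():
        if name in question.lower():
            return f"{name}: {len(df)} rows, columns={list(df.columns)}"
    return "no matching dataframe"

print(route("How many rows are in the sales data?"))
# → sales: 3 rows, columns=['amount']
```

A real setup would replace the keyword match with an LLM-driven selector (that is what the agent / sub-question engine contributes), but the single-dataframe-per-engine constraint stays the same.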

"many times it fails to give the correct answer" -- yeah, modifying the prompt template will likely help:
https://docs.llamaindex.ai/en/stable/module_guides/models/prompts/usage_pattern.html#getting-and-setting-custom-prompts