Find answers from the community

ihshan :)
Joined September 25, 2024
I have a markdown file that contains text and a table. Can you give me suggestions on how to ensure that my table is not chunked in the middle? I want the table to stay intact.
6 comments
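One approach, sketched here with the standard library only (in llama_index itself, a Markdown-aware node parser that keeps tables as their own nodes may also help): split the markdown into blocks, treat consecutive lines starting with `|` as one atomic table block, and pack blocks into chunks without ever splitting a table. The function name and size limit below are illustrative, not from any library.

```python
def chunk_markdown(text: str, max_chars: int = 500) -> list[str]:
    """Split markdown into chunks without ever splitting a table.

    Consecutive lines that start with '|' (a markdown table) are grouped
    into one atomic block that is always emitted whole.
    """
    lines = text.splitlines()
    blocks: list[str] = []
    i = 0
    while i < len(lines):
        if lines[i].lstrip().startswith("|"):
            # Collect the whole table as a single, unsplittable block.
            j = i
            while j < len(lines) and lines[j].lstrip().startswith("|"):
                j += 1
            blocks.append("\n".join(lines[i:j]))
            i = j
        else:
            blocks.append(lines[i])
            i += 1

    chunks: list[str] = []
    current = ""
    for block in blocks:
        candidate = (current + "\n" + block) if current else block
        if len(candidate) > max_chars and current:
            chunks.append(current)
            current = block  # a table longer than max_chars still stays whole
        else:
            current = candidate
    if current:
        chunks.append(current)
    return chunks
```

A table larger than `max_chars` produces one oversized chunk rather than being cut, which is usually the right trade-off for retrieval.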
Is there a way to get metadata when I am using a chat engine? I tried response.metadata, but I get an empty list. Basically, I want to know which document/filename the response comes from.
7 comments
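For context: in llama_index the retrieved nodes are usually available on the response as `response.source_nodes`, and each node carries the metadata attached at ingestion (e.g. a `file_name` key when documents are loaded with SimpleDirectoryReader). A minimal stand-alone sketch of the extraction step, using plain dicts to stand in for the node objects (the real access would be `node.metadata` on each entry of `response.source_nodes`):

```python
def source_filenames(source_nodes: list[dict]) -> list[str]:
    """Collect the distinct filenames the retrieved nodes came from.

    `source_nodes` stands in for llama_index's response.source_nodes;
    each node's metadata dict typically has a 'file_name' key when the
    documents were loaded with SimpleDirectoryReader.
    """
    seen: list[str] = []
    for node in source_nodes:
        name = node.get("metadata", {}).get("file_name")
        if name and name not in seen:
            seen.append(name)
    return seen
```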
I am currently using an agent with PandasQueryEngine as a tool for reading an Excel file. It can read whatever information is available in the file, but is there a way for it to also read the formulas in the Excel file?
5 comments
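One option worth noting (an assumption on my part, not something PandasQueryEngine does out of the box): pandas only sees computed values, but openpyxl returns the raw formula strings when a workbook is loaded with `data_only=False`, so those strings could be extracted separately and handed to the agent alongside the values. A self-contained sketch that builds its own demo workbook:

```python
import os
import tempfile

from openpyxl import Workbook, load_workbook

# Build a small workbook so the sketch is self-contained.
path = os.path.join(tempfile.mkdtemp(), "demo.xlsx")
wb = Workbook()
ws = wb.active
ws["A1"] = 2
ws["A2"] = 3
ws["A3"] = "=SUM(A1:A2)"  # stored as a formula, not a value
wb.save(path)

# data_only=False (the default) yields formula strings;
# data_only=True would yield the last cached computed values instead.
formulas = load_workbook(path, data_only=False)
cell = formulas.active["A3"]
print(cell.value)  # the formula text, e.g. =SUM(A1:A2)
```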
Hi, I am using a ReAct agent with query_engine_tools. Can the ReAct agent pick more than one tool when processing a query? Is that the default behavior, something like LLMMultiSelector? FYI, I am using PandasQueryEngine for query_engine_tools.
6 comments
Hi, I want to use RouterQueryEngine. Is there a way to get a chat history, as well as the metadata I can use for citations?
18 comments
I have implemented an agent that queries over PandasQueryEngine as a tool. It works fine. However, how can I get metadata once it returns results? I need the metadata, for example, for citation purposes.
12 comments
@kapa.ai I want to do pre-processing for uploaded PDF files. I used PyMuPDF4LLM, but it is not thread-safe. I chose that library because it removes headers and footers and turns the documents into markdown, which I can then pre-process. Do you have any suggestions for turning uploaded PDF files into markdown, so that I can remove headers and footers while still using multithreading?
6 comments
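A common workaround when a library is not thread-safe: keep the thread pool for upload handling and post-processing, but serialize the unsafe call behind a lock (or move it into a `ProcessPoolExecutor`, where nothing is shared between workers at all). A stdlib sketch of the lock pattern; `to_markdown` below is a stand-in where the real code would call `pymupdf4llm.to_markdown(path)`:

```python
import threading
from concurrent.futures import ThreadPoolExecutor

_pdf_lock = threading.Lock()

def to_markdown(path: str) -> str:
    # Stand-in for the real conversion, e.g. pymupdf4llm.to_markdown(path).
    return f"# {path}\n\nconverted"

def convert_safely(path: str) -> str:
    """Serialize the non-thread-safe conversion behind one lock.

    Everything else each thread does (receiving the upload, stripping
    leftover headers/footers from the markdown) still runs concurrently;
    only the conversion itself is one-file-at-a-time.
    """
    with _pdf_lock:
        return to_markdown(path)

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(convert_safely, ["a.pdf", "b.pdf", "c.pdf"]))
```

If the conversion dominates total runtime, the process-pool variant is usually the better choice, since a lock removes the parallelism for that step entirely.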
@kapa.ai I am currently using CitationQueryEngine, which provides nice citations. However, it does not keep a conversation history the way a chat engine does. Can you give me some ideas/code that would let me use CitationQueryEngine while maintaining context and preserving the conversation history, so that it has the abilities of a chat engine at the same time?
10 comments
@kapa.ai Here is my code. I am wondering if there is a way to pass the conversation history as a parameter, i.e. chat_engine.chat(query, conversation_history) instead of chat_engine.chat(query). The idea is that I wouldn't have to re-instantiate the chat engine every time there is new conversation history.

from llama_index.core.memory import ChatMemoryBuffer
from llama_index.core.chat_engine import CondensePlusContextChatEngine

def create_chat_engine(self):
    # Create a chat memory buffer for conversation history
    memory = ChatMemoryBuffer.from_defaults(
        token_limit=default_configuration.TOKEN_LIMIT_MEMORY_BUFFER_CHAT_ENGINE
    )

    # Set the conversation history into the memory as chat history
    memory.set(chat_messages)

    # Create a chat engine
    chat_engine = CondensePlusContextChatEngine.from_defaults(
        index.as_retriever(),
        memory=memory,
        llm=llm,
        context_prompt=(
            "You are a chatbot, able to have normal interactions as well as talk in a professional manner. "
            "Here are the relevant documents for the context:\n"
            "{context_str}"
            "\nInstruction: Use the previous chat history and the context above to interact and help the user."
        ),
        verbose=False,
    )

    return chat_engine

def use_chat_engine(self):
    chat_engine = self.create_chat_engine()
    query = "Please tell me about LlamaIndex"
    chat_engine.chat(query)
9 comments
@kapa.ai
I am using this code to maintain a conversation history. However, since I use Flask, the per-request variables are garbage-collected after the response is sent to the frontend, so the values of memory and self.chat_engine are always wiped out once the response goes out. Do you have any suggestions for persisting memory and self.chat_engine, given how clients and servers (including Flask) work?

# Create a chat memory buffer for conversation history
memory = ChatMemoryBuffer.from_defaults(token_limit=3900)

# Create a chat engine
self.chat_engine = CondensePlusContextChatEngine.from_defaults(
    index.as_retriever(),
    memory=memory,
    llm=llm,
    context_prompt=(
        "You are a chatbot, able to have normal interactions, as well as talk in a professional manner "
        "about the attached document(s). "
        "Here are the relevant documents for the context:\n"
        "{context_str}"
        "\nInstruction: Use the previous chat history, or the context above, to interact and help the user."
    ),
    verbose=False,
)

return self.chat_engine
21 comments
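A common pattern for this (a sketch under assumptions, not Flask-specific API): module-level state outlives a single request for as long as the worker process lives, so the per-user history can live in a dict keyed by a session id rather than on a per-request object. With multiple workers or restarts you would persist the messages externally (Redis, a database) and rebuild the memory and engine from them on each request. A stdlib sketch of the session store; the function names and the echo reply are placeholders:

```python
# Module-level store: survives across requests within one worker process.
# (With several workers, back this with Redis or a database instead.)
_sessions: dict[str, list[dict]] = {}

def get_history(session_id: str) -> list[dict]:
    """Return the stored chat history for this session, creating it lazily."""
    return _sessions.setdefault(session_id, [])

def handle_request(session_id: str, user_message: str) -> str:
    history = get_history(session_id)
    history.append({"role": "user", "content": user_message})
    # Real code would rebuild the chat engine here from `history`
    # (e.g. memory.set(...) with ChatMessage objects) and call .chat().
    reply = f"echo: {user_message}"  # placeholder for the engine's answer
    history.append({"role": "assistant", "content": reply})
    return reply
```

Rebuilding the engine per request from stored messages is cheap compared to the LLM call itself, and it sidesteps the lifetime problem entirely.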