
Hi team, I am building a customized chatbot based on user-uploaded PDF files. I'm wondering how I can find the location of the response source within a PDF file (narrowed down to the page or specific lines) when I use chat_engine.
5 comments
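One way to trace a response back to a page and line range is to attach location metadata to each chunk at ingestion time; in LlamaIndex, `response.source_nodes` typically exposes each source node's `metadata` (for PDFs loaded with SimpleDirectoryReader this usually includes a `page_label`). A minimal plain-Python sketch of the idea, with a hypothetical chunking scheme and field names (not LlamaIndex internals):

```python
# Sketch: attach page/line metadata to each chunk at ingestion time, so a
# response can later be traced back to its source location. The chunk ids,
# chunk size, and dict structure are illustrative only.

def ingest(pages):
    """pages: list of (page_number, text). Returns a chunk store keyed by id."""
    store = {}
    chunk_id = 0
    for page_no, text in pages:
        lines = text.splitlines()
        # one chunk per 2 lines, remembering where each chunk came from
        for start in range(0, len(lines), 2):
            chunk_lines = lines[start:start + 2]
            store[chunk_id] = {
                "text": "\n".join(chunk_lines),
                "page": page_no,
                "lines": (start + 1, start + len(chunk_lines)),
            }
            chunk_id += 1
    return store

def source_of(store, chunk_id):
    """Format the stored location metadata for one retrieved chunk."""
    meta = store[chunk_id]
    return f"page {meta['page']}, lines {meta['lines'][0]}-{meta['lines'][1]}"

store = ingest([(1, "alpha\nbeta\ngamma")])
print(source_of(store, 1))  # → page 1, lines 3-3
```

Whatever metadata you store per chunk here is what you can report back as the "source" alongside the chat engine's answer.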
I have a question about the system's responses. I've observed that when I issue the command 'summarize the document,' the system correctly summarizes the content. However, when I use the command 'what is this document about,' it responds with, "this document is about the different tools available for natural language processing (NLP) tasks. It provides an overview of the tools and their functionalities, including default query engines, etc." How can I address this inconsistency in responses?
19 comments
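One common way to make the two phrasings behave consistently is to route summary-style questions to a dedicated summarization path rather than the default retrieval path. A hedged sketch of such intent routing in plain Python (the cue list and engine names are illustrative, not a LlamaIndex API):

```python
# Sketch: route summary-style questions to a summary engine and everything
# else to a vector/retrieval engine. Cues and engine names are illustrative.

SUMMARY_CUES = ("summarize", "summary", "what is this document about")

def route(query):
    """Return the name of the engine that should handle this query."""
    q = query.lower()
    if any(cue in q for cue in SUMMARY_CUES):
        return "summary_engine"
    return "vector_engine"

print(route("summarize the document"))       # → summary_engine
print(route("what is this document about"))  # → summary_engine
print(route("who wrote section 3?"))         # → vector_engine
```

With both phrasings routed to the same engine, both should produce the same style of summary.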
Which reader should I use to load the document from an object without a file path? When a user uploads a document such as a PDF, I want to store this file when it passes some test. In this case, I don't have any file path, and I only have an object to read and load.
3 comments
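When there is no file path, you can work with the uploaded bytes through a file-like object (e.g. `io.BytesIO`) and build the document record in memory. A minimal sketch, assuming plain-text content (a real PDF would need a PDF parser at the marked line; the returned dict is illustrative, where with LlamaIndex you would construct a `Document` from the extracted text instead):

```python
import io

# Sketch: wrap uploaded bytes in a file-like object and build a document
# record without touching the filesystem. The dict fields are illustrative.

def load_from_object(fileobj, name="upload"):
    """fileobj: any object with .read() returning bytes (e.g. io.BytesIO)."""
    data = fileobj.read()
    text = data.decode("utf-8")  # a PDF would need a PDF parser here instead
    return {"id": name, "text": text, "size": len(data)}

doc = load_from_object(io.BytesIO(b"hello upload"), name="memo")
print(doc["text"], doc["size"])  # → hello upload 12
```

The same pattern works for your validation step: run the test against the in-memory object, and only persist or index it afterwards.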
Hi all, I have a question regarding the custom VectorStore Postgres table.
  1. How do I add other columns? PGVectorStore creates only id, text, metadata, node_id, and embedding. How can I add another column, like user_id?
  2. When I generate indexes, I want to generate a vector index, a keyword index, and a list index. How can I store all three indexes in one table, with columns like id, text, metadata, node_id, embedding, vector_index, keyword_index, and list_index?
  3. How do I store a keyword index in Postgres? In the tutorial, keyword_index is generated by GPTKeywordTableIndex, and I modified the code as below:
vector_store = PGVectorStore.from_params(... )
storage_context = StorageContext.from_defaults(vector_store=vector_store)
storage_context.docstore.add_documents(nodes)
keyword_index = GPTKeywordTableIndex(nodes, storage_context=storage_context)
I am not sure whether this is working, because I cannot find any data in the assigned table or in any other table in the database, and there are no error messages.
4 comments
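On (1) and (3): rather than adding columns to the vector store's own table, a common pattern is to keep extra fields such as user_id in each node's metadata, and to persist the keyword index as its own structure (an inverted word-to-node-id map). A plain-Python sketch of building such a map (field names and node shape are illustrative, not PGVectorStore's actual schema):

```python
# Sketch: keep extra fields like user_id in each node's metadata, and build
# the keyword index as a separate inverted mapping from word -> node ids.
# Node shape and field names are illustrative only.

def build_keyword_index(nodes):
    """nodes: list of dicts with 'id', 'text', and 'metadata' keys."""
    index = {}
    for node in nodes:
        # set() so a word repeated inside one node is recorded only once
        for word in set(node["text"].lower().split()):
            index.setdefault(word, []).append(node["id"])
    return index

nodes = [
    {"id": "n1", "text": "invoice total due", "metadata": {"user_id": "u42"}},
    {"id": "n2", "text": "shipping invoice", "metadata": {"user_id": "u42"}},
]
kw = build_keyword_index(nodes)
print(sorted(kw["invoice"]))  # → ['n1', 'n2']
```

A mapping like this serializes naturally to its own Postgres table (word, node_id) or to a JSONB column, separate from the embedding table.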
Hi guys. I'm having issues with my custom chat prompts. Previously, I set up a rule that if a user asks a non-business-domain question, the bot should respond with 'Please ask questions regarding the business domain.' However, it's not working as expected. For instance, when I asked, 'Where is San Francisco located?', it provided the location instead of the expected refusal.

This prompt used to work but stopped working last Thursday. I've also tested other prompts, and it seems like they aren't being processed correctly. Can someone help me figure out what's going wrong?

import os

from llama_index import (
    GPTVectorStoreIndex,
    ServiceContext,
    SimpleDirectoryReader,
    StorageContext,
    load_index_from_storage,
)
from llama_index.llms import ChatMessage, MessageRole, OpenAI
from llama_index.prompts import ChatPromptTemplate

def construct_index(file_dir="./docs", index_dir="./data"):
    service_context = ServiceContext.from_defaults(llm=OpenAI(model="gpt-3.5-turbo"))
    documents = SimpleDirectoryReader(file_dir).load_data()
    index = GPTVectorStoreIndex.from_documents(documents, service_context=service_context)
    index.storage_context.persist(persist_dir=os.path.join(index_dir, "index"))
    return index

index_stored = construct_index()

chat_text_qa_msgs = [
    ChatMessage(
        role=MessageRole.SYSTEM,
        content=(
            "You are an expert Business Professional chatbot. Ensure that users' "
            "questions are directly related to the business domain. If users ask a "
            "question outside the business domain, you will reply with, 'Please ask "
            "questions regarding the business domain.'"
        ),
    ),
    ChatMessage(
        role=MessageRole.USER,
        content=(
            "Context information is below.\n"
            "{context_str}\n"
            "Answer the question: {query_str}\n"
        ),
    ),
]
text_qa_template = ChatPromptTemplate(chat_text_qa_msgs)

service_context = ServiceContext.from_defaults(
    llm=OpenAI(temperature=0.2, model="gpt-3.5-turbo"))

index = load_index_from_storage(
    StorageContext.from_defaults(persist_dir="./data/index"))

chat_engine = index.as_chat_engine(
    chat_mode="openai",
    service_context=service_context,
    text_qa_template=text_qa_template,
)

response = chat_engine.chat("Where is San Francisco located?")
print(response.response)
5 comments
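System-prompt-only gating like this is fragile, since the model can simply answer anyway. A more robust pattern is to filter questions in your own code before they ever reach the chat engine. A hedged sketch (the keyword list and pass-through callable are illustrative, not part of any LlamaIndex API):

```python
# Sketch: gate questions before they reach the chat engine, instead of
# relying on the system prompt alone. The keyword list is illustrative; a
# real gate might use a classifier instead of substring matching.

BUSINESS_TERMS = ("invoice", "revenue", "contract", "customer", "pricing")

def guarded_chat(question, chat_fn):
    """chat_fn: callable that forwards the question to the real chat engine."""
    if not any(term in question.lower() for term in BUSINESS_TERMS):
        return "Please ask questions regarding the business domain."
    return chat_fn(question)

reply = guarded_chat("Where is San Francisco located?", lambda q: "LLM answer")
print(reply)  # → Please ask questions regarding the business domain.
```

In the original code, `chat_fn` would wrap `chat_engine.chat`, so off-topic questions are refused deterministically regardless of how the model interprets the system prompt.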