ChatGPT

Hi guys. I'm having issues with my custom chat prompts. Previously, I set up a rule that if a user asks a question outside the business domain, the bot should respond with 'Please ask questions regarding the business domain.' However, it's not working as expected. For instance, when I asked, 'Where is San Francisco located?', instead of that canned response it answered with the location.

This prompt used to work but stopped working last Thursday. I've also tested other prompts, and it seems like they aren't being processed correctly. Can someone help me figure out what's going wrong?

import os

# Imports assume the legacy, ServiceContext-era LlamaIndex used in this snippet.
from llama_index import (
    GPTVectorStoreIndex,
    ServiceContext,
    SimpleDirectoryReader,
    StorageContext,
    load_index_from_storage,
)
from llama_index.llms import ChatMessage, MessageRole, OpenAI
from llama_index.prompts import ChatPromptTemplate


def construct_index(file_dir="./docs", index_dir="./data"):
    # Build a vector index over the documents and persist it to disk.
    service_context = ServiceContext.from_defaults(llm=OpenAI(model="gpt-3.5-turbo"))
    documents = SimpleDirectoryReader(file_dir).load_data()
    index = GPTVectorStoreIndex.from_documents(documents, service_context=service_context)
    index.storage_context.persist(persist_dir=os.path.join(index_dir, "index"))
    return index


index_stored = construct_index()

# Prompt template for the question-answering step.
chat_text_qa_msgs = [
    ChatMessage(
        role=MessageRole.SYSTEM,
        content=(
            "You are an expert Business Professional chatbot. Ensure that users' "
            "questions are directly related to the business domain. If users ask a "
            "question outside the business domain, you will reply with, "
            "'Please ask questions regarding the business domain.'"
        ),
    ),
    ChatMessage(
        role=MessageRole.USER,
        content=(
            "Context information is below.\n"
            "{context_str}\n"
            "Answer the question: {query_str}\n"
        ),
    ),
]
text_qa_template = ChatPromptTemplate(chat_text_qa_msgs)

service_context = ServiceContext.from_defaults(
    llm=OpenAI(temperature=0.2, model="gpt-3.5-turbo")
)

index = load_index_from_storage(
    StorageContext.from_defaults(persist_dir="./data/index")
)

chat_engine = index.as_chat_engine(
    chat_mode="openai",
    service_context=service_context,
    text_qa_template=text_qa_template,
)

response = chat_engine.chat("Where is San Francisco located?")
print(response.response)
5 comments
You'll probably also want to set a system prompt.

Your template only gets used in the query engine. But the query engine only gets used if the agent decides to call it:

index.as_chat_engine(..., system_prompt="...")
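
For example, a minimal sketch reusing the names from your snippet (I'm assuming here that the openai chat mode forwards system_prompt to the agent; worth verifying on your version):

chat_engine = index.as_chat_engine(
    chat_mode="openai",
    service_context=service_context,
    system_prompt=(
        "You are an expert Business Professional chatbot. If a user asks a "
        "question outside the business domain, reply only with: "
        "'Please ask questions regarding the business domain.'"
    ),
)

response = chat_engine.chat("Where is San Francisco located?")
print(response.response)  # should now return the refusal message
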
Could you please point me to the correct documentation for fixing this code? I was referring to the following link: https://docs.llamaindex.ai/en/stable/examples/customization/prompts/chat_prompts.html

Yeah, that example is just applied to query engines. But you are using a chat engine.

Think of it like two layers -- the agent on top, and the query engine underneath.

The agent looks at the chat history and the newest message, and decides if it needs to use a tool (i.e. the query engine).

If so, it writes a query for that query engine, passes it in, reads the response, and then answers the user's original message.

So the prompt template you are passing in is only used for the query engine. The system prompt example I gave above is applied to the top-level agent to control its behavior.
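
Roughly, in terms of your own call (a sketch; the point is which layer each kwarg reaches, not the exact signature):

chat_engine = index.as_chat_engine(
    chat_mode="openai",
    service_context=service_context,
    text_qa_template=text_qa_template,  # reaches only the underlying query engine
    system_prompt="...",                # reaches the top-level agent
)
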
Here's an example of building an agent with lower-level APIs; it's a little more customizable this way too. as_chat_engine() is automating this:

https://docs.llamaindex.ai/en/stable/examples/agent/openai_agent_with_query_engine.html
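
A condensed sketch of that pattern, assuming the same legacy imports as your snippet (the tool name and description are illustrative, not required values):

from llama_index.agent import OpenAIAgent
from llama_index.tools import QueryEngineTool, ToolMetadata

# Lower layer: the query engine is where text_qa_template actually applies.
query_engine = index.as_query_engine(
    service_context=service_context,
    text_qa_template=text_qa_template,
)

query_engine_tool = QueryEngineTool(
    query_engine=query_engine,
    metadata=ToolMetadata(
        name="business_docs",  # illustrative name
        description="Answers questions about the business documents.",
    ),
)

# Upper layer: the agent; system_prompt controls its behavior, including refusals.
agent = OpenAIAgent.from_tools(
    [query_engine_tool],
    llm=OpenAI(model="gpt-3.5-turbo", temperature=0.2),
    system_prompt=(
        "You are an expert Business Professional chatbot. If a question is outside "
        "the business domain, reply only with: 'Please ask questions regarding the "
        "business domain.'"
    ),
)

response = agent.chat("Where is San Francisco located?")
print(response)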