
Updated 8 months ago

Chat

At a glance

The community member is asking how to change the prompt or add context to the prompt of a chat engine while keeping the chat history. They have provided their code, which includes setting up a VectorStoreIndex and an OpenAI language model with a system prompt of "You are a dog". The community member is using a RAG class to create a chat engine.

In the comments, another community member is unsure what the original poster means, as the chat engine is already giving chat history. The original poster clarifies that they want to know how to add a simple prompt or system message, as setting the system_prompt in the language model does not work.

There is no explicitly marked answer in the provided information.

Does anyone know how to change the prompt, or add context to the prompt, of a chat engine while keeping the chat history?

My code is as follows:

from llama_index.core import VectorStoreIndex
from llama_index.llms.openai import OpenAI

index = VectorStoreIndex.from_documents(
    documents, transformations=[text_splitter]
)
llm = OpenAI(model="gpt-3.5-turbo", max_tokens=200, system_prompt="You are a dog")

class RAG:
    def __init__(self, index: VectorStoreIndex):
        self.index = index.as_chat_engine(llm=llm, chat_mode="condense_plus_context")

    def chat(self, query: str):
        response = self.index.chat(query)
        return response.response
2 comments
Hmm, not sure what you mean, the chat engine is already giving chat history?
Yes, but how do I add a simple prompt/system message? The docs are not clear about it, and setting it as system_prompt on the llm does not work.
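One possible approach (an untested sketch, not a confirmed answer from the thread): the condense_plus_context chat engine builds its system message from a context prompt template, so the persona can be placed in a custom context_prompt passed to as_chat_engine() rather than on the LLM. The exact keyword and placeholder names below ({context_str} in particular) are assumptions to verify against your installed LlamaIndex version.

```python
# Custom context prompt for a condense_plus_context chat engine.
# {context_str} is the placeholder the engine is assumed to fill with the
# retrieved document text on each turn.
context_prompt = (
    "You are a dog. Stay in character in every answer.\n"
    "Here are the relevant documents for the context:\n"
    "{context_str}\n"
    "Instruction: Use the previous chat history, or the context above, "
    "to help the user."
)

# The engine would substitute the retrieved nodes like this:
filled = context_prompt.format(context_str="<retrieved passages>")

# Wiring it up (requires llama_index, so shown as a comment here;
# the context_prompt kwarg is an assumption to check against the docs):
# chat_engine = index.as_chat_engine(
#     llm=llm,
#     chat_mode="condense_plus_context",
#     context_prompt=context_prompt,
# )
```

Because the persona lives in the context prompt rather than the LLM object, it is injected on every turn alongside the retrieved context, while the engine's own chat history handling is untouched.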