
Does anyone know how to change the prompt, or add context to the prompt, of a chat engine while keeping the chat history?

My code is as follows:

```python
from llama_index.core import VectorStoreIndex
from llama_index.llms.openai import OpenAI

# documents and text_splitter are defined earlier
index = VectorStoreIndex.from_documents(documents, transformations=[text_splitter])
llm = OpenAI(model="gpt-3.5-turbo", max_tokens=200, system_prompt="You are a dog")

class RAG:
    def __init__(self, index: VectorStoreIndex):
        self.index = index.as_chat_engine(llm=llm, chat_mode="condense_plus_context")

    def chat(self, query: str):
        response = self.index.chat(query)
        return response.response
```
2 comments
Hmm, not sure what you mean; the chat engine already keeps the chat history?
Yes, but how do I add a simple prompt/system message? The docs are not clear about it, and passing it as system_prompt on the LLM does not work.
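
For reference, here is a minimal sketch of one way to do this: pass the prompts and the memory to the chat engine itself rather than to the LLM. It assumes the condense_plus_context chat mode forwards system_prompt, context_prompt, and memory keyword arguments (as in recent LlamaIndex versions); treat the exact parameter names, the ChatMemoryBuffer settings, and the {context_str} placeholder as assumptions to verify against your installed version.

```python
from llama_index.core import VectorStoreIndex
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.llms.openai import OpenAI

# documents and text_splitter are assumed to be defined as in the snippet above
index = VectorStoreIndex.from_documents(documents, transformations=[text_splitter])
llm = OpenAI(model="gpt-3.5-turbo", max_tokens=200)

# Chat history lives in a memory buffer owned by the chat engine
memory = ChatMemoryBuffer.from_defaults(token_limit=3000)

chat_engine = index.as_chat_engine(
    llm=llm,
    chat_mode="condense_plus_context",
    memory=memory,
    # Prompts go to the chat engine, not the LLM
    system_prompt="You are a dog.",
    context_prompt=(
        "Here is some context that may be relevant:\n"
        "{context_str}\n"
        "Answer the user's question using this context and stay in character."
    ),
)

response = chat_engine.chat("What do you like to chew on?")
print(response.response)
print(chat_engine.chat_history)  # previous turns are preserved here
```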