memory object (into a file so that I can load it later), as shown in the following sample code:

from llama_index.core.llms import ChatMessage
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.llms.openai import OpenAI

llm = OpenAI(model="gpt-3.5-turbo", api_key=api_key)
memory = ChatMemoryBuffer.from_defaults(token_limit=2000)
index = create_index(text1, text2)  # create_index is my own helper that builds an index from the two texts
chat_engine = index.as_chat_engine(
    llm=llm,
    chat_mode="context",
    memory=memory,
    prefix_messages=[
        ChatMessage(role="system", content="..."),
        ChatMessage(role="assistant", content="Hello, and Welcome, please introduce yourself"),
    ],
)
# repeat a few turns so that some chat history accumulates in memory
for i in range(5):
    resp = chat_engine.chat("Hello my name is John").response
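
What I'm after is something like the following (a rough sketch on my part, not tested, assuming a recent llama-index release where ChatMemoryBuffer can be backed by a SimpleChatStore; the file name and chat_store_key are placeholders):

from llama_index.core.memory import ChatMemoryBuffer
from llama_index.core.storage.chat_store import SimpleChatStore

# saving: back the memory with an explicit chat store, then write it to disk
chat_store = SimpleChatStore()
memory = ChatMemoryBuffer.from_defaults(
    token_limit=2000,
    chat_store=chat_store,
    chat_store_key="user_john",  # placeholder key; any stable per-user id
)
# ... run the chat engine with this memory, as in the code above ...
chat_store.persist(persist_path="chat_store.json")

# loading: rebuild the memory from the persisted file in a later session
loaded_store = SimpleChatStore.from_persist_path(persist_path="chat_store.json")
memory = ChatMemoryBuffer.from_defaults(
    token_limit=2000,
    chat_store=loaded_store,
    chat_store_key="user_john",
)

Is this the intended way to do it, or is there a supported serialization API on the memory object itself?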