Lazarus
Can anyone please help me out here? I want to save the memory object to a file so that I can load it later, as shown in the following sample code:
from llama_index.core.llms import ChatMessage
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.llms.openai import OpenAI

llm = OpenAI(model="gpt-3.5-turbo", api_key=api_key)
memory = ChatMemoryBuffer.from_defaults(token_limit=2000)
index = create_index(text1, text2)
chat_engine = index.as_chat_engine(
    llm=llm,
    chat_mode="context",
    memory=memory,
    prefix_messages=[
        ChatMessage(role="system", content="..."),
        ChatMessage(role="assistant", content="Hello, and welcome, please introduce yourself"),
    ],
)

for i in range(5):
    resp = chat_engine.chat("Hello my name is John").response
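A minimal sketch of one way to do this, assuming a recent LlamaIndex version: back the ChatMemoryBuffer with a SimpleChatStore, which can be persisted to and reloaded from a JSON file. The chat_store_key and the file path here are illustrative placeholders, not anything from the original post.

from llama_index.core.memory import ChatMemoryBuffer
from llama_index.core.storage.chat_store import SimpleChatStore

# Back the memory buffer with a chat store so the history can be persisted.
chat_store = SimpleChatStore()
memory = ChatMemoryBuffer.from_defaults(
    token_limit=2000,
    chat_store=chat_store,
    chat_store_key="user1",  # hypothetical key for this conversation
)

# ... run the chat engine as above ...

# Save the conversation history to disk ("chat_store.json" is an example path).
chat_store.persist(persist_path="chat_store.json")

# Later: reload the store and rebuild an equivalent memory buffer from it.
loaded_store = SimpleChatStore.from_persist_path(persist_path="chat_store.json")
memory = ChatMemoryBuffer.from_defaults(
    token_limit=2000,
    chat_store=loaded_store,
    chat_store_key="user1",
)

The memory object itself is not pickled; instead the underlying chat history is serialized, and a fresh buffer is reconstructed around it on load.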
4 comments
Hi all, I'm currently developing a chat engine that uses the context of a vector index created from a plain text snippet. But I couldn't find anything in the LlamaIndex docs that uses raw text instead of feeding a list of files through SimpleDirectoryReader. Can anyone help me achieve this?
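A minimal sketch of one approach, assuming a recent LlamaIndex version: wrap each raw string in a Document and build the index with VectorStoreIndex.from_documents, bypassing SimpleDirectoryReader entirely (text1 and text2 stand in for whatever snippets you have).

from llama_index.core import Document, VectorStoreIndex

# Wrap plain text snippets in Document objects directly;
# no files or SimpleDirectoryReader needed.
documents = [Document(text=text1), Document(text=text2)]

# Build the vector index from the in-memory documents.
index = VectorStoreIndex.from_documents(documents)

chat_engine = index.as_chat_engine(chat_mode="context")

SimpleDirectoryReader is just one way to produce Document objects; anything that yields a list of them can feed the index constructor.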
1 comment