Troubleshooting Chatbot History and Context Maintenance

Hello everyone, I've had an issue over the past couple of days and the documentation doesn't seem to give enough detail about it.
I am trying to test the chatbot by asking it questions and giving it answers, to check whether the conversation history is maintained, but it fails every time. I asked it my name, it failed, I gave it the answer, and then I asked again and it was unable to tell me, meaning it does not save the context in the history.
Plain Text
async def ignite(self) -> None:
    """Initialize chat engine with vector store index"""
    try:
        # Other code

        index = VectorStoreIndex.from_vector_store(vector_store=vector_store)

        print("we have the index set up so now the engine")
        print(f"System Prompt: {self.system_prompt}")
        print(f"Memory Buffer: {self.memory_buffer}")

        engine = index.as_chat_engine(
            chat_mode=ChatMode.CONDENSE_PLUS_CONTEXT,
            memory=self.memory_buffer,
            system_prompt=self.system_prompt,
            verbose=True,
        )
        self.chat_engine = engine
        logger.info(f"Chat engine created: {self.chat_engine}")
    except Exception as e:
        logger.error(f"Failed to initialize chat engine: {str(e)}")
        raise


# Querying over a document
response = self.chat_engine.chat(message=message)

logger.info(f"Chat response received successfully {response}")
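For what it's worth, a rough way to check whether messages are actually accumulating between turns is to dump the buffer after each call (this assumes memory_buffer is a ChatMemoryBuffer; the example message is just for illustration):

Plain Text
# Sanity check: ask something, then print what the memory buffer holds
response = self.chat_engine.chat("My name is Alice")
for msg in self.memory_buffer.get_all():
    print(msg.role, msg.content)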
5 comments
I'm assuming the response line is not from the method shared above?

Also, what LLM are you using?
One thing that worked for me was to append instructions like this to the system prompt.

Plain Text
self.system_prompt = self.system_prompt + f"The user's name is {user_name}. Always refer to the user by name when talking!"
So I thought of updating the memory buffer each time there's an update, but that would defeat the purpose of using chat, right?
For example, self.memory_buffer.put(ChatMessage(...)) for each new message.
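Something like this, I guess (the imports and the example message are just for illustration):

Plain Text
from llama_index.core.llms import ChatMessage, MessageRole

# Manually push a message into the same buffer the chat engine was built with
self.memory_buffer.put(
    ChatMessage(role=MessageRole.USER, content="My name is Alice")
)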
It won't matter that much, as that context is just static text that is only used when there is an LLM call.
So when a user comes in, you can attach the user name, since it's a one-time thing, and then you can have the conversation going.
Re-creating the chat engine every time will only work if the memory buffer stays updated (probably it's getting destroyed and losing memory? Hard to say without seeing more code).
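If that's what's happening, a minimal sketch of the usual fix: create the memory buffer once (for example in __init__) and hand that same instance to every chat engine you build, so re-creating the engine doesn't wipe the history. The ChatService class and token_limit value below are just illustrative.

Plain Text
from llama_index.core.memory import ChatMemoryBuffer

class ChatService:
    def __init__(self) -> None:
        # Created once and reused: rebuilding the chat engine later still
        # points at this same buffer, so the conversation history survives.
        self.memory_buffer = ChatMemoryBuffer.from_defaults(token_limit=3000)
        self.chat_engine = None  # built/rebuilt in ignite(), passing memory=self.memory_buffer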