
Here is my code. I am wondering if there is a way to put conversation history as parameter

@kapa.ai Here is my code. I am wondering if there is a way to pass the conversation history as a parameter, i.e. chat_engine.chat(query, conversation_history) instead of chat_engine.chat(query). The whole idea is that I don't have to re-instantiate the chat engine every time there is a new conversation history.

def create_chat_engine(self):
    # Create a chat memory buffer for conversation history
    memory = ChatMemoryBuffer.from_defaults(
        token_limit=default_configuration.TOKEN_LIMIT_MEMORY_BUFFER_CHAT_ENGINE
    )

    # Set the conversation history into memory as chat_history
    memory.set(chat_messages)

    # Create a chat engine
    chat_engine = CondensePlusContextChatEngine.from_defaults(
        index.as_retriever(),
        memory=memory,
        llm=llm,
        context_prompt=(
            "You are a chatbot, able to have normal interactions as well as talk in a professional manner. "
            "Here are the relevant documents for the context:\n"
            "{context_str}"
            "\nInstruction: Use the previous chat history and the context above to interact and help the user."
        ),
        verbose=False,
    )

    return chat_engine

def use_chat_engine(self):
    chat_engine = self.create_chat_engine()
    query = "Please tell me about LlamaIndex"
    chat_engine.chat(query)
9 comments
Please stand by @ihshan :) as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
Yes, you can pass the conversation history as a parameter to the chat method of the chat engine. In the LlamaIndex API, the chat method of the CondensePlusContextChatEngine class accepts an optional chat_history parameter, which is a list of ChatMessage objects.

Here is how you can modify your use_chat_engine method to include a conversation history:

Plain Text
def use_chat_engine(self, chat_history):
    chat_engine = self.create_chat_engine()
    query = "Please tell me about LlamaIndex"
    return chat_engine.chat(query, chat_history)


In this code, chat_history is a list of ChatMessage objects that represent the conversation history. You would need to create this list before calling use_chat_engine.

Here is an example of how you can create a ChatMessage object:

Plain Text
from llama_index.core.llms import ChatMessage, MessageRole

chat_history = [
    ChatMessage(
        role=MessageRole.USER,
        content="Hello assistant, we are having an insightful discussion about Paul Graham today.",
    ),
    ChatMessage(role=MessageRole.ASSISTANT, content="Okay, sounds good."),
]


You can find more details in the LlamaIndex API reference.
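To make the shape of that history concrete without pulling in LlamaIndex, here is a minimal sketch. FakeChatMessage is a hypothetical stand-in that only mimics the role/content fields of the real ChatMessage:

```python
from dataclasses import dataclass

# Hypothetical stand-in for llama_index.core.llms.ChatMessage, reduced to
# the two fields the chat history actually needs: a role and the text.
@dataclass
class FakeChatMessage:
    role: str      # "user" or "assistant" in this sketch
    content: str

# The conversation history is simply an ordered list of these messages,
# alternating roles turn by turn.
chat_history = [
    FakeChatMessage(role="user", content="Hello assistant."),
    FakeChatMessage(role="assistant", content="Okay, sounds good."),
]

roles = [m.role for m in chat_history]
print(roles)  # ['user', 'assistant']
```

The condense-plus-context engine uses this list, together with the new query, to rewrite the query into a standalone question before retrieval, which is why the order of the messages matters.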
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai in the code that you suggested, do I always have to instantiate chat_engine before calling chat_engine.chat(query, chat_history)? I don't want to instantiate it every time. I just want to do that once
Let me check the knowledge base for you. One moment! :books:
----------------------
Yes, you can instantiate the chat engine once and then use it multiple times with different conversation histories. However, you need to ensure that the memory buffer of the chat engine is reset before each new conversation. Here is how you can modify your code:

Plain Text
class MyChatEngine:
    def __init__(self):
        # Create a chat memory buffer for conversation history
        self.memory = ChatMemoryBuffer.from_defaults(
            token_limit=default_configuration.TOKEN_LIMIT_MEMORY_BUFFER_CHAT_ENGINE
        )

        # Create a chat engine
        self.chat_engine = CondensePlusContextChatEngine.from_defaults(
            index.as_retriever(),
            memory=self.memory,
            llm=llm,
            context_prompt=(
                "You are a chatbot, able to have normal interactions as well as talk in a professional manner. "
                "Here are the relevant documents for the context:\n"
                "{context_str}"
                "\nInstruction: Use the previous chat history and the context above to interact and help the user."
            ),
            verbose=False,
        )

    def use_chat_engine(self, query, chat_history):
        # Set the conversation history into memory as chat_history
        self.memory.set(chat_history)
        return self.chat_engine.chat(query)


In this code, MyChatEngine is a class that wraps the chat engine. You can create an instance of MyChatEngine once, and then call the use_chat_engine method multiple times with different conversation histories.
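The reuse pattern can be sketched without LlamaIndex itself. The classes below are hypothetical stand-ins for ChatMemoryBuffer and the chat engine; they only count instantiations to show the engine is built a single time while the history changes per call:

```python
# Hypothetical stand-ins: FakeMemory and FakeEngine mimic just enough of
# ChatMemoryBuffer / CondensePlusContextChatEngine to show that one engine
# instance can serve many different conversation histories.
class FakeMemory:
    def __init__(self):
        self.messages = []

    def set(self, messages):
        # Replace the stored history wholesale, like ChatMemoryBuffer.set
        self.messages = list(messages)

class FakeEngine:
    instantiations = 0

    def __init__(self, memory):
        FakeEngine.instantiations += 1
        self.memory = memory

    def chat(self, query):
        return f"{query} (seen {len(self.memory.messages)} prior messages)"

memory = FakeMemory()
engine = FakeEngine(memory)      # built exactly once

memory.set(["hi", "hello"])      # first conversation's history
first = engine.chat("q1")

memory.set(["a", "b", "c"])      # a different conversation's history
second = engine.chat("q2")

assert FakeEngine.instantiations == 1
print(first, "|", second)
```

Because the engine holds a reference to the memory object, swapping the history via set() is visible to the already-constructed engine on the next call.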

Please note that the set method of the memory buffer will replace the existing conversation history in the memory buffer with the new one. If you want to append new messages to the existing conversation history, you should use the put method of the memory buffer instead.
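The replace-versus-append distinction can be illustrated with a small list-backed buffer. FakeBuffer is a hypothetical stand-in; the real ChatMemoryBuffer exposes set and put with analogous semantics:

```python
class FakeBuffer:
    """Hypothetical list-backed buffer mirroring set/put on ChatMemoryBuffer."""
    def __init__(self):
        self._messages = []

    def set(self, messages):
        # Overwrites whatever history was there before
        self._messages = list(messages)

    def put(self, message):
        # Appends one message, preserving the existing history
        self._messages.append(message)

    def get(self):
        return list(self._messages)

buf = FakeBuffer()
buf.set(["user: hi", "assistant: hello"])
buf.put("user: tell me about LlamaIndex")
print(len(buf.get()))  # 3 -- set kept two messages, put added one on top

buf.set(["user: new topic"])
print(len(buf.get()))  # 1 -- set discarded the three previous messages
```

So for a chatbot that accumulates turns within one conversation, put is the right call; set is what you want when switching to a different conversation's history.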
You can find more details in the LlamaIndex API reference.
----------------------
----------------------
You can pass chat_history in the .chat() method
Thanks a lot. Yeah I got it