----------------------
Yes, you can pass the conversation history as a parameter to the `chat` method of the chat engine. In the LlamaIndex API, the `chat` method of the `CondensePlusContextChatEngine` class accepts an optional `chat_history` parameter, which is a list of `ChatMessage` objects.
Here is how you can modify your `use_chat_engine` method to include a conversation history:
```python
def use_chat_engine(self, chat_history):
    chat_engine = self.create_chat_engine()
    query = "Please tell me about LlamaIndex"
    # Pass the prior messages alongside the new query.
    response = chat_engine.chat(query, chat_history)
    return response
```
In this code, `chat_history` is a list of `ChatMessage` objects that represent the conversation history. You would need to create this list before calling `use_chat_engine`.
Here is an example of how you can create a `ChatMessage` object:
```python
from llama_index.core.llms import ChatMessage, MessageRole

chat_history = [
    ChatMessage(
        role=MessageRole.USER,
        content="Hello assistant, we are having an insightful discussion about Paul Graham today.",
    ),
    ChatMessage(role=MessageRole.ASSISTANT, content="Okay, sounds good."),
]
```
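Putting it together, here is a minimal sketch of the full flow. It assumes you already have a `VectorStoreIndex` named `index` built from your documents (how you build it is up to you); the `chat_mode="condense_plus_context"` shortcut is one way to get a `CondensePlusContextChatEngine`:

```python
from llama_index.core.llms import ChatMessage, MessageRole

# Assumes `index` is an existing VectorStoreIndex built from your documents.
chat_engine = index.as_chat_engine(chat_mode="condense_plus_context")

chat_history = [
    ChatMessage(
        role=MessageRole.USER,
        content="Hello assistant, we are having an insightful discussion about Paul Graham today.",
    ),
    ChatMessage(role=MessageRole.ASSISTANT, content="Okay, sounds good."),
]

# The engine condenses the new question together with the prior conversation.
response = chat_engine.chat("Please tell me about LlamaIndex", chat_history=chat_history)
print(response)
```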
You can find more details in the LlamaIndex API reference.
----------------------
Note: you can ask me a follow-up question by @-mentioning me again :speech_balloon:
----------------------