----------------------
Based on the information provided in the extracts, it seems you are trying to use a chat model in LlamaIndex. Here is an example of how to construct chat messages and send them to an LLM using LlamaIndex:
from llama_index.llms import ChatMessage, OpenAI

# Any LlamaIndex LLM works here; OpenAI is used as a placeholder
# (an assumption, not part of the original docs example).
llm = OpenAI()

# Construct mock chat history
history_message = ChatMessage(
    **{
        "role": "user",
        "content": (
            "When asked 'who are you?' respond as 'I am qblocks llm model'"
            " everytime."
        ),
    }
)
current_message = ChatMessage(**{"role": "user", "content": "Who are you?"})

response = llm.chat([history_message, current_message])
print(response)
This example is taken from the LlamaIndex documentation.
However, the error message you're seeing seems to be related to the Transformers library, not LlamaIndex. The extracts provided do not contain information about the TransformersChat class or the error you're encountering.
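If your goal is to drive a local Hugging Face Transformers model through LlamaIndex, the HuggingFaceLLM wrapper is one common route. Here is a minimal sketch, assuming the same pre-0.10 import path as above and a hypothetical model name:

from llama_index.llms import HuggingFaceLLM

# Hypothetical model; substitute the Transformers checkpoint you are loading.
llm = HuggingFaceLLM(
    model_name="HuggingFaceH4/zephyr-7b-beta",
    tokenizer_name="HuggingFaceH4/zephyr-7b-beta",
    context_window=2048,
    max_new_tokens=256,
    device_map="auto",
)

response = llm.chat([history_message, current_message])
print(response)

This requires the transformers and torch packages to be installed alongside llama-index.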
For more detailed guidance, you may want to refer to the official documentation of the library you're using or seek help from the community or support forums.
----------------------
Note: you can ask me a follow-up question by @ mentioning me again :speech_balloon:
----------------------