You need to use a chat model in order the use role blocks like with user():! Perhaps you meant to use the TransformersChat class?

@kapa.ai I'm getting this error: "You need to use a chat model in order the use role blocks like with user():! Perhaps you meant to use the TransformersChat class?" How can I deal with it? I am using the llama-index-question-gen-guidance library.
Please stand by @balanp as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
Based on the information provided in the extracts, it seems you are trying to use a chat model in LlamaIndex. Here is an example of how to construct chat messages and send them to a chat model using LlamaIndex:

Python
from llama_index.llms import ChatMessage, OpenAI

# Any chat-capable LLM works here; OpenAI is used for illustration.
llm = OpenAI(model="gpt-3.5-turbo")

# Construct mock chat history
history_message = ChatMessage(
    role="user",
    content=(
        "When asked 'who are you?' respond as 'I am qblocks llm model'"
        " every time."
    ),
)
current_message = ChatMessage(role="user", content="Who are you?")

# Send the history plus the new message to the chat model
response = llm.chat([history_message, current_message])
print(response)

This example is taken from the LlamaIndex documentation.
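Here, llm.chat returns a ChatResponse object; printing it displays the assistant's reply message.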

However, the error message you're seeing is raised by the guidance library (which llama-index-question-gen-guidance uses under the hood), not by LlamaIndex or Transformers itself: role blocks such as with user(): only work with a chat-style guidance model, and TransformersChat is guidance's chat wrapper for local Transformers models. The extracts provided do not contain further information about this error.
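As a hedged sketch of a likely fix (not confirmed against your exact package versions): construct a chat-capable guidance model and pass it to LlamaIndex's question generator. The import paths and the model name below are assumptions and may differ between guidance and LlamaIndex releases:

Python
from guidance.models import TransformersChat

from llama_index.question_gen.guidance import GuidanceQuestionGenerator

# Role blocks like `with user():` require a chat model, so wrap a
# chat-tuned local model with guidance's TransformersChat rather than
# the plain Transformers class. The model name below is illustrative;
# substitute whatever chat-tuned model you are actually using.
guidance_llm = TransformersChat("meta-llama/Llama-2-7b-chat-hf")

# Pass the chat-capable guidance model to the question generator.
question_gen = GuidanceQuestionGenerator.from_defaults(
    guidance_llm=guidance_llm,
    verbose=False,
)

On older LlamaIndex versions the generator may instead be imported from llama_index.question_gen.guidance_generator; check the version you have installed.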

For more detail, you may want to refer to the official documentation of the libraries you're using (guidance and LlamaIndex) or ask in their community or support forums.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------