
Does LlamaIndex support JSON mode and a system prompt when using Groq?

In the example given, I see this:

Plain Text
llm.complete(prompt)
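
For context, llm in that example is a Groq instance, constructed roughly like this (the model name is just a placeholder):

Plain Text
from llama_index.llms.groq import Groq

# Assumes GROQ_API_KEY is set in the environment; otherwise pass api_key=...
llm = Groq(model="llama3-70b-8192")
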


Is there a way to send a system prompt and turn JSON mode on?
4 comments
Okay, I can send a system message like this:

Plain Text
from llama_index.core.llms import ChatMessage

# llm is the Groq instance constructed earlier
messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="What is your name?"),
]
resp = llm.chat(messages)


But what about JSON mode?
I don't think it's supported, at least not explicitly.

Probably you can pass it as a kwarg that gets forwarded into the underlying API call, e.g. llm.complete(..., json=True) or whatever the option is actually called (I don't know offhand).
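
If it helps, here is a minimal sketch. It assumes Groq honours the OpenAI-style response_format={"type": "json_object"} option and that LlamaIndex forwards additional_kwargs into the underlying API call; the model name is just a placeholder:

Plain Text
from llama_index.core.llms import ChatMessage
from llama_index.llms.groq import Groq

# Assumption: additional_kwargs is forwarded to the chat/completions request,
# and Groq accepts the OpenAI-style JSON mode flag.
llm = Groq(
    model="llama3-70b-8192",  # placeholder model name
    additional_kwargs={"response_format": {"type": "json_object"}},
)

# Mentioning JSON in the prompt mirrors OpenAI's JSON-mode requirement.
messages = [
    ChatMessage(role="system", content="You are a pirate. Answer only with valid JSON."),
    ChatMessage(role="user", content="What is your name? Reply as a JSON object."),
]
resp = llm.chat(messages)
print(resp.message.content)
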