Marko911__
Joined September 25, 2024
Has anyone had this problem where using a Pydantic output parser with an LLMProgram and local Ollama returns escaped underscores in field names and breaks the JSON parsing?
Plain Text
{
"type": "PLAY\_CARD",
"card\_id": "b473cd1a-0370-4971-8db5-c5a7f007fee1",
"end\_turn": false
}

This is what the LLM returns.
3 comments
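One workaround for the escaped underscores (a minimal sketch, not part of llama_index — the helper name `parse_llm_json` is made up): unescape `\_` before the text reaches `json.loads` or the output parser.

```python
import json

def parse_llm_json(raw: str) -> dict:
    # Some local models (e.g. Mistral via Ollama) emit markdown-escaped
    # underscores inside JSON keys, which json.loads rejects.
    return json.loads(raw.replace("\\_", "_"))

raw = r"""{
  "type": "PLAY\_CARD",
  "card\_id": "b473cd1a-0370-4971-8db5-c5a7f007fee1",
  "end\_turn": false
}"""
choice = parse_llm_json(raw)  # {'type': 'PLAY_CARD', ...}
```

This only patches the symptom; a stricter system prompt ("return raw JSON, no markdown") can also reduce the escaping.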
SimpleChatEngine and output parsing don't work together.
If I do something like
Plain Text
from llama_index.core.chat_engine import SimpleChatEngine
from llama_index.core.output_parsers import PydanticOutputParser
from llama_index.llms.ollama import Ollama

llm = Ollama(
    model="mistral",
)
chat_engine = SimpleChatEngine.from_defaults(
    llm=llm,
    system_prompt=system_prompt,
    output_parser=PydanticOutputParser(output_cls=Choice),
)

The output doesn't get returned as the Pydantic output class; I just get a paragraph of plain text from the LLM.
3 comments
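One workaround (a sketch, assuming Pydantic v2 and that `Choice` matches the JSON from the first question — the field names here are assumptions): take the chat response as plain text and validate it with the Pydantic model yourself.

```python
from pydantic import BaseModel

class Choice(BaseModel):
    type: str
    card_id: str
    end_turn: bool

# With a chat engine, the response arrives as plain text:
# resp = chat_engine.chat("Your move?")
# choice = Choice.model_validate_json(resp.response)

# Validating a raw JSON string directly:
choice = Choice.model_validate_json(
    '{"type": "PLAY_CARD", '
    '"card_id": "b473cd1a-0370-4971-8db5-c5a7f007fee1", '
    '"end_turn": false}'
)
```

Alternatively, llama_index's structured-output programs (e.g. `LLMTextCompletionProgram`) are designed for exactly this; check the current docs for the API.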
Does anyone know if it's possible to use SimpleChatEngine with a local model via Ollama but without an index? I just want a chat with memory using a local model.
23 comments