
I see `from llama_index.program.openai import OpenAIPydanticProgram`. Is there an equivalent for Ollama? I can't seem to find one.
I'm simply attempting to get structured Pydantic output from a local llama3 model.
You can use the `LLMTextCompletionProgram`:
from llama_index.core.program import LLMTextCompletionProgram
You'll probably want to turn JSON mode on for Ollama:

Ollama(..., json_mode=True)
That worked. Output validation failed, but I'll work on that problem next.