
Updated 10 months ago

Ollama - Instructor

Hi! Quick question about coercing query engine outputs to Pydantic objects. Given that Ollama has launched an experimental OpenAI API compatibility layer (https://ollama.com/blog/openai-compatibility), which is used by Instructor (https://jxnl.github.io/instructor/examples/ollama/#ollama), has LlamaIndex updated its query engine support for Ollama to use the function calling feature? (https://docs.llamaindex.ai/en/stable/examples/query_engine/pydantic_query_engine.html#create-the-index-query-engine-openai)
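For context, the Instructor approach in the linked doc points an OpenAI client at Ollama's compatibility endpoint in JSON mode, then validates the reply into a Pydantic model. A minimal sketch, assuming Instructor's `from_openai` API and an illustrative model/endpoint (the live call needs a running Ollama server, so it is shown in comments; the coercion step itself is just Pydantic validation):

```python
# Sketch of the Instructor + Ollama pattern from the linked docs.
# Model name, endpoint, and the Character schema are illustrative.
from pydantic import BaseModel


class Character(BaseModel):
    name: str
    age: int


# The live call would look roughly like this (requires `pip install instructor openai`
# and a local Ollama server):
#
#   import instructor
#   from openai import OpenAI
#
#   client = instructor.from_openai(
#       OpenAI(base_url="http://localhost:11434/v1", api_key="ollama"),
#       mode=instructor.Mode.JSON,
#   )
#   char = client.chat.completions.create(
#       model="llama2",
#       response_model=Character,
#       messages=[{"role": "user", "content": "Tell me about Harry Potter"}],
#   )
#
# Under the hood, the "coercion" step is plain Pydantic validation of the
# JSON text the model emits (illustrative payload):
raw = '{"name": "Harry Potter", "age": 11}'
char = Character.model_validate_json(raw)
print(char.name, char.age)
```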
2 comments
not yet -- does their client support the same tools API though? Tbh I'd be surprised if it did

Instructor is just setting the mode to JSON, which is not the same as OpenAI's function calling
I've been meaning to make a change to better support LLMs that offer some kind of JSON mode, but it would be a bigger underlying structural change