Ollama - Instructor
Qile · 10 months ago
Hi! Quick question about coercing query engine outputs to Pydantic objects. Given that Ollama has launched an experimental OpenAI API compatibility layer (https://ollama.com/blog/openai-compatibility), which is used by Instructor (https://jxnl.github.io/instructor/examples/ollama/#ollama), has LlamaIndex updated its query engine support for Ollama to use the function calling feature? (https://docs.llamaindex.ai/en/stable/examples/query_engine/pydantic_query_engine.html#create-the-index-query-engine-openai)
Logan M · 10 months ago
Not yet -- does their client support the same tools API though? Tbh I would be surprised if it does.
Instructor is just setting the mode to JSON, which is not the same as OpenAI's function calling.
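For illustration, the Instructor + Ollama setup from the linked docs is roughly the following: a plain chat completion sent through Ollama's OpenAI-compatible endpoint in JSON mode, not through the tools API. This is a sketch; the local URL, placeholder API key, and model name are assumptions, and newer Instructor versions expose a slightly different entry point.

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel


class Character(BaseModel):
    name: str
    age: int


# Point the OpenAI client at Ollama's OpenAI-compatible endpoint
# (local URL and dummy API key are assumptions).
client = instructor.patch(
    OpenAI(base_url="http://localhost:11434/v1", api_key="ollama"),
    mode=instructor.Mode.JSON,  # JSON mode, not the OpenAI tools API
)

character = client.chat.completions.create(
    model="llama2",  # model name is an assumption
    messages=[{"role": "user", "content": "Tell me about Harry Potter."}],
    response_model=Character,  # Instructor validates the JSON into the model
)
print(character)
```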
Logan M · 10 months ago
Been meaning to make a change to better support LLMs that offer some kind of JSON mode, but it would be a bit of a bigger underlying structural change
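In the meantime, a prompt-based workaround along these lines should work with Ollama, since it relies on format instructions in the prompt rather than function calling. This is a sketch, not the structural change described above; the model name and import paths are assumptions.

```python
from pydantic import BaseModel

from llama_index.core.program import LLMTextCompletionProgram
from llama_index.llms.ollama import Ollama


class Song(BaseModel):
    title: str
    length_seconds: int


# LLMTextCompletionProgram injects the Pydantic schema as format
# instructions into the prompt and parses the raw completion back
# into the model -- no function calling required.
program = LLMTextCompletionProgram.from_defaults(
    output_cls=Song,
    prompt_template_str="Write a song about {topic}.",
    llm=Ollama(model="llama2"),  # model name is an assumption
)

song = program(topic="the ocean")
print(song.title, song.length_seconds)
```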