Qile
I remember there was a LlamaIndex docs page that broke down feature support by vector store. I can't seem to find it in the site docs anymore. Does anyone know where it is?
L
Hi! Quick question about coercing query engine outputs to Pydantic objects. Given that Ollama has launched an experimental OpenAI API compatibility layer (https://ollama.com/blog/openai-compatibility), which Instructor already uses (https://jxnl.github.io/instructor/examples/ollama/#ollama), has LlamaIndex updated its Ollama query engine support to use the function calling feature? (https://docs.llamaindex.ai/en/stable/examples/query_engine/pydantic_query_engine.html#create-the-index-query-engine-openai)
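For concreteness, here is roughly the pattern I mean, a minimal sketch based on the linked Pydantic query engine docs. The `api_base` wiring against Ollama at the end is my assumption about how it might look, not something I've confirmed works:

```python
# Sketch: coercing a query engine response into a Pydantic object.
# The OpenAI path below follows the linked LlamaIndex docs; whether the
# same output_cls path works against Ollama's OpenAI-compatible endpoint
# is exactly my question.
from pydantic import BaseModel


class Biography(BaseModel):
    """Structured answer we'd like the query engine to return."""
    name: str
    best_known_for: list[str]


# With OpenAI this goes through function calling (per the linked docs).
# Requires a built index and an API key, so left commented out here:
#
# from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
# from llama_index.llms.openai import OpenAI
#
# llm = OpenAI(model="gpt-3.5-turbo")
# index = VectorStoreIndex.from_documents(
#     SimpleDirectoryReader("data").load_data()
# )
# query_engine = index.as_query_engine(output_cls=Biography, llm=llm)
# response = query_engine.query("Who is Paul Graham?")
# # response.response would then be a Biography instance.
#
# My question is whether the same output_cls path can now target Ollama's
# OpenAI-compatible endpoint, e.g. (hypothetical wiring, unverified):
# llm = OpenAI(api_base="http://localhost:11434/v1", api_key="ollama",
#              model="llama2")

# Minimal check that the target type validates as expected:
bio = Biography(name="Paul Graham", best_known_for=["Y Combinator", "essays"])
print(bio.name)
```

That is, I'm hoping the Ollama-backed query engine can return `Biography` instances directly instead of free text I have to parse.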