Updated 2 years ago

Function API

Does LlamaIndex support OpenAI's function calling at the moment?
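For context, OpenAI's function calling works by sending the model a JSON-schema description of each available function; the model replies with the name and JSON arguments of the function it wants called, and your code dispatches it. A minimal sketch of that flow (the function name, fields, and dispatcher here are hypothetical, not from LlamaIndex or OpenAI's SDK):

```python
import json

# A hypothetical function schema in the shape OpenAI's function-calling
# API expects: a name, a description, and a JSON-schema "parameters" object.
get_weather_schema = {
    "name": "get_current_weather",
    "description": "Get the current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name, e.g. 'Paris'"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
        },
        "required": ["city"],
    },
}

def dispatch(function_call: dict) -> str:
    """Parse the model's function_call payload and run the matching function."""
    args = json.loads(function_call["arguments"])
    if function_call["name"] == "get_current_weather":
        return f"Weather in {args['city']}: 21C"  # stubbed result
    raise ValueError("unknown function")

# Simulated model output: the model chose a function and filled its arguments.
print(dispatch({"name": "get_current_weather",
                "arguments": '{"city": "Paris"}'}))
```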
7 comments
I'm also working on an integration to use it for the router query engine (should hopefully help make the index selectors more stable by using pydantic)
Great, thank you! I saw the OpenAIAgent in the docs earlier but wasn't sure if it was the same thing haha
Is the reason for putting the functions in a vector store so that the LLM can decide which function to use by checking the embeddings to find the most relevant function?
Attachment: image.png
Yup pretty much! It's useful for when you have more tools than can fit in the model input
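The retrieval step can be sketched with plain cosine similarity: embed each tool's description, embed the user query, and hand only the top-k most similar tools to the model. This toy sketch uses made-up vectors in place of real embedding-model output:

```python
import math

# Toy "embeddings" standing in for real embedding-model output,
# keyed by (hypothetical) tool name.
TOOL_EMBEDDINGS = {
    "search_docs": [0.9, 0.1, 0.0],
    "send_email": [0.1, 0.9, 0.0],
    "run_sql": [0.0, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k_tools(query_embedding, k=2):
    """Return the k tool names whose embeddings are closest to the query."""
    ranked = sorted(
        TOOL_EMBEDDINGS,
        key=lambda name: cosine(query_embedding, TOOL_EMBEDDINGS[name]),
        reverse=True,
    )
    return ranked[:k]

# Only the retrieved subset would be passed to the model as functions.
print(top_k_tools([0.8, 0.2, 0.1], k=2))  # -> ['search_docs', 'send_email']
```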
Would it make sense to create a function that takes in no params but then returns a JSON object of themes it found in a doc?

I basically want the input to be the context from my uploaded docs and then return a "summary" key for example with the value the LLM generated as the output
Wouldn't the input to the function be the themes it found then? πŸ‘€
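Right, that's the usual trick for structured extraction with function calling: instead of a zero-argument function, you declare the fields you want back (summary, themes, etc.) as the function's *parameters*, and the model "calls" the function by filling them in from the document context. A hypothetical sketch (schema and function names are illustrative, not a LlamaIndex API):

```python
import json

# Hypothetical extraction schema: the model fills these parameters from
# the document text, so the function-call arguments ARE the structured output.
record_doc_analysis_schema = {
    "name": "record_doc_analysis",
    "description": "Record the summary and themes found in a document",
    "parameters": {
        "type": "object",
        "properties": {
            "summary": {
                "type": "string",
                "description": "One-paragraph summary of the document",
            },
            "themes": {
                "type": "array",
                "items": {"type": "string"},
                "description": "Themes identified in the document",
            },
        },
        "required": ["summary", "themes"],
    },
}

def record_doc_analysis(summary: str, themes: list) -> dict:
    # The body can simply echo the structured data back.
    return {"summary": summary, "themes": themes}

# Simulated model output for the arguments payload:
args = json.loads('{"summary": "A report on X.", "themes": ["growth", "risk"]}')
result = record_doc_analysis(**args)
print(result["themes"])
```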