
Hi all. Setting up a Function Calling Mistral Agent

At a glance

A community member is setting up a Function Calling Mistral Agent on the Together AI platform but encounters an error saying the model "mistralai/Mistral-7B-Instruct-v0.2" does not support the function calling API. They ask where to find a list of LLMs that do support function calling. Another community member responds that the local and hosted versions of Mistral differ because the hosted version handles tool prompting behind the API, and Mistral has not shared the details.

Hi all. Setting up a Function Calling Mistral Agent following this example: https://docs.llamaindex.ai/en/stable/examples/agent/mistral_agent/.

I am using Together AI to run Mistral, so my LLM setup is as follows:

import os
from llama_index.llms.together import TogetherLLM

llm = TogetherLLM(
    model="mistralai/Mistral-7B-Instruct-v0.2",
    api_key=os.getenv("TOGETHERAI_API_KEY"),
)

However, when I run the example code with this LLM setup, I get an error:

ValueError: Model name mistralai/Mistral-7B-Instruct-v0.2 does not support function calling API.

Can someone tell me where I can find the list of LLMs that support function calling? I can't think of any reason why the Together AI version of Mistral would be functionally different from a locally hosted one.
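
There doesn't seem to be a single published list; in LlamaIndex each LLM class reports support through its metadata, and that flag appears to be what the agent checks before raising the ValueError above. A minimal way to inspect a given setup, assuming the standard LlamaIndex LLM interface:

import os
from llama_index.llms.together import TogetherLLM

llm = TogetherLLM(
    model="mistralai/Mistral-7B-Instruct-v0.2",
    api_key=os.getenv("TOGETHERAI_API_KEY"),
)

# The function calling agent checks this flag before issuing tool calls;
# it is False for this model, which is what triggers the ValueError.
print(llm.metadata.is_function_calling_model)

If the hosted endpoint does support native tool calls for a particular model, TogetherLLM (an OpenAILike subclass) also accepts an is_function_calling_model=True constructor flag to override the check, though that only helps if the endpoint genuinely honors tool calls.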
3 comments
The local and hosted ones are different because the hosted version handles the tool prompting behind the API (and I don't think Mistral has shared how they do it)
Thanks! makes sense πŸ™‚
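
As a fallback, a prompt-based agent such as LlamaIndex's ReActAgent drives tools through plain prompting rather than a native function calling API, so it should work with models that fail this check. A rough sketch (the multiply tool here is purely illustrative):

from llama_index.core.agent import ReActAgent
from llama_index.core.tools import FunctionTool

def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

# ReAct prompts the model to emit tool calls as text, so no native
# function calling support is required from the hosted model.
agent = ReActAgent.from_tools(
    tools=[FunctionTool.from_defaults(fn=multiply)],
    llm=llm,  # the TogetherLLM instance from the question above
    verbose=True,
)
print(agent.chat("What is 12 times 34?"))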