
Hi all. Setting up a Function Calling Mistral Agent following this example: https://docs.llamaindex.ai/en/stable/examples/agent/mistral_agent/.

I am using Together AI to run Mistral, so my LLM setup is as follows:

import os

from llama_index.llms.together import TogetherLLM

llm = TogetherLLM(
    model="mistralai/Mistral-7B-Instruct-v0.2",
    api_key=os.getenv("TOGETHERAI_API_KEY"),
)

However, when I run the example code with this LLM setup, I get an error:

ValueError: Model name mistralai/Mistral-7B-Instruct-v0.2 does not support function calling API. 

Can someone tell me where I can find the list of LLMs that support function calling? I can't think of any reason why the Together AI version of Mistral would be functionally different from a locally hosted one.
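
For reference, LlamaIndex records this per model in the LLM's metadata, and the function calling agent raises the ValueError above when that flag is off. A minimal check, using the llm instance defined above:

# The agent refuses any LLM whose metadata flag is False,
# regardless of what the underlying weights can actually do.
print(llm.metadata.is_function_calling_model)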
3 comments
The local and hosted ones are different because the hosted API handles all the tool prompting behind the scenes (and I don't think Mistral has shared how they do it)
Thanks! Makes sense πŸ™‚
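
For anyone hitting the same error: one workaround (a sketch, not something from the thread) is LlamaIndex's ReActAgent, which drives tool use through plain-text prompting rather than the provider's function calling API, so it should accept the TogetherLLM setup above. The multiply tool here is just a stand-in for the tools in the linked example:

from llama_index.core.agent import ReActAgent
from llama_index.core.tools import FunctionTool

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b

# ReAct prompts the model to emit tool calls as text and parses them,
# so it works with models whose hosted API lacks function calling.
agent = ReActAgent.from_tools(
    [FunctionTool.from_defaults(fn=multiply)],
    llm=llm,
    verbose=True,
)
print(agent.chat("What is 121 * 3?"))

The trade-off is that ReAct relies on the model following the prompt format, so tool calls tend to be less reliable than with a native function calling endpoint.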