cognicurious
Joined September 25, 2024
Hi all. Setting up a Function Calling Mistral Agent following this example: https://docs.llamaindex.ai/en/stable/examples/agent/mistral_agent/.

I am using Together AI to run Mistral, so my LLM setup is as follows:

import os
from llama_index.llms.together import TogetherLLM

llm = TogetherLLM(
    model="mistralai/Mistral-7B-Instruct-v0.2",
    api_key=os.getenv("TOGETHERAI_API_KEY"),
)

However, when I run the example code with this LLM setup, I get an error:

ValueError: Model name mistralai/Mistral-7B-Instruct-v0.2 does not support function calling API.

Can someone tell me where I can find the list of LLMs that support function calling? I can't think of any reason why the Together AI-hosted Mistral would behave differently from a locally hosted one.
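For what it's worth, the error seems to come from the LlamaIndex LLM wrapper class rather than the model weights themselves: the function-calling agent checks a metadata flag on the LLM object and refuses to proceed when it is False, regardless of what the underlying model could do. Below is a minimal stand-alone sketch of that kind of guard, using hypothetical names (`LLMMetadata`, `ensure_function_calling`) to illustrate the mechanism; it is not the actual LlamaIndex source.

```python
from dataclasses import dataclass


# Hypothetical stand-in for an LLM's metadata, mirroring the kind of
# is_function_calling_model flag that LlamaIndex LLM classes carry.
@dataclass
class LLMMetadata:
    model_name: str
    is_function_calling_model: bool


def ensure_function_calling(meta: LLMMetadata) -> None:
    # The guard depends only on the flag the wrapper reports, not on
    # the model's real capabilities. If the Together AI wrapper reports
    # False for this model name, the agent raises before any API call.
    if not meta.is_function_calling_model:
        raise ValueError(
            f"Model name {meta.model_name} does not support function calling API."
        )


meta = LLMMetadata("mistralai/Mistral-7B-Instruct-v0.2", False)
try:
    ensure_function_calling(meta)
except ValueError as e:
    print(e)  # reproduces the style of error from the question
```

So the "list of supported LLMs" is effectively whatever the wrapper class reports for each model name, which is why a hosted model can be rejected even when the same weights run fine locally.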
3 comments