3. **Create the agent**: Wrap the tools and the Mistral LLM in a `FunctionCallingAgentWorker`, then turn it into an agent:
```python
from llama_index.llms.mistralai import MistralAI
from llama_index.core.agent import FunctionCallingAgentWorker

# Requires a Mistral API key (e.g. via the MISTRAL_API_KEY environment variable).
llm = MistralAI(model="Mistral-7B-Instruct-v0.3")

agent_worker = FunctionCallingAgentWorker.from_tools(
    [multiply_tool, add_tool],
    llm=llm,
    verbose=True,
    allow_parallel_tool_calls=False,
)
agent = agent_worker.as_agent()
```
**Source** (<https://github.com/jerryjliu/llama_index/blob/main/docs/docs/examples/cookbooks/mistralai.ipynb>)
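If the `multiply_tool` and `add_tool` from the earlier step aren't already in scope, they are typically plain Python functions wrapped with `FunctionTool.from_defaults`. A minimal sketch, with the function names assumed from the cookbook:

```python
from llama_index.core.tools import FunctionTool

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b

def add(a: int, b: int) -> int:
    """Add two integers and return the result."""
    return a + b

# Wrap the functions so the agent can call them as tools.
multiply_tool = FunctionTool.from_defaults(fn=multiply)
add_tool = FunctionTool.from_defaults(fn=add)
```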
4. **Use the agent**: You can now use the agent to perform tasks. Here's an example:
```python
response = agent.chat("What is (26 * 2) + 2024?")
print(response)
```
**Source** (<https://github.com/jerryjliu/llama_index/blob/main/docs/docs/examples/cookbooks/mistralai.ipynb>)
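To check which tools the agent actually invoked, you can usually inspect the `sources` attribute of the chat response (LlamaIndex's `AgentChatResponse` keeps a list of tool outputs there). A small sketch, assuming that interface:

```python
# Each entry in response.sources records one tool invocation.
for tool_output in response.sources:
    print(tool_output.tool_name, "->", tool_output.raw_output)
```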
Please note that you need to replace `"Mistral-7B-Instruct-v0.3"` with the actual model name of the new Mistral model. Also, the function definitions and the agent usage are just examples; you need to adapt them to your specific use case.