
MistralAI has just released Mistral-7B-Instruct-v0.3, a local LLM capable of function calling. How do I use that capability with LlamaIndex, preferably interfacing via Ollama?
Please stand by @Vicent W. as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
To use the function-calling capability of the newly released Mistral-7B-Instruct-v0.3 with LlamaIndex, you would need to follow these steps:

1. **Set up the LLM**: You need to set up the LLM to use the new Mistral model. You can do this globally or locally. Here's how:
```python
# Global settings
from llama_index.core import Settings
from llama_index.llms.ollama import Ollama

Settings.llm = Ollama(model="Mistral-7B-Instruct-v0.3", request_timeout=60.0)

# Local settings
index.as_query_engine(llm=Ollama(model="Mistral-7B-Instruct-v0.3", request_timeout=60.0))
```

**Source** ()
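
As a quick sanity check that Ollama is wired up, you can issue a one-off completion first. A minimal sketch — the model tag below (`mistral:v0.3`) is hypothetical; use whatever `ollama list` shows for your local pull:
```python
from llama_index.llms.ollama import Ollama

# Hypothetical tag -- match it to the output of `ollama list`
llm = Ollama(model="mistral:v0.3", request_timeout=60.0)

# complete() sends a single prompt and returns a CompletionResponse
print(llm.complete("Say hello in one sentence.").text)
```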

2. **Define your functions**: You need to define the functions that you want the LLM to call. Here's an example of how you can define a function and convert it into a tool the LLM can use:
```python
from llama_index.core.tools import FunctionTool

def multiply(a: int, b: int) -> int:
    """Multiple two integers and returns the result integer"""
    return a * b

multiply_tool = FunctionTool.from_defaults(fn=multiply)
```

**Source** ()
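
Before wiring the tool into an agent, you can sanity-check it by invoking it directly; `FunctionTool.from_defaults` infers the tool's name and description from the function signature and docstring. A small sketch using the `multiply_tool` defined above:
```python
# Calling the tool directly returns a ToolOutput wrapping the raw result
result = multiply_tool.call(3, 4)
print(result.content)     # "12" -- the stringified output the LLM would see
print(result.raw_output)  # 12   -- the actual return value of multiply()
```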

3. **Create an agent**: You can create an agent that uses the LLM and the tools you've defined. Here's an example:
```python
from llama_index.llms.mistralai import MistralAI
from llama_index.core.tools import FunctionTool
from llama_index.core.agent import FunctionCallingAgentWorker

# Define more tools

def add(a: int, b: int) -> int:
    """Add two integers and return the result integer."""
    return a + b

add_tool = FunctionTool.from_defaults(fn=add)

# Create the agent

llm = MistralAI(model="Mistral-7B-Instruct-v0.3")
agent_worker = FunctionCallingAgentWorker.from_tools(
    [multiply_tool, add_tool],
    llm=llm,
    verbose=True,
    allow_parallel_tool_calls=False,
)
agent = agent_worker.as_agent()
```
**Source** (<https://github.com/jerryjliu/llama_index/blob/main/docs/docs/examples/cookbooks/mistralai.ipynb>)

4. **Use the agent**: You can now use the agent to perform tasks. Here's an example:
```python
response = agent.chat("What is (26 * 2) + 2024?")
print(response)
```
**Source** (<https://github.com/jerryjliu/llama_index/blob/main/docs/docs/examples/cookbooks/mistralai.ipynb>)

Please note that you need to replace `"Mistral-7B-Instruct-v0.3"` with the actual model name of the new Mistral model. Also, the function definitions and agent usage are just examples; you will need to adapt them to your specific use case.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
Ollama doesn't have a tool usage API, so you have to do raw prompting (not very user-friendly)
Hugging Face has tool support in their tokenizer chat templates, but llama-index doesn't make use of it just yet (waiting on it to become more official)
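
For context, "raw prompting" here means serializing the tool schema into the prompt yourself and parsing the model's reply back out of plain text. A rough sketch of what that might look like — assuming the `ollama` Python client, a hypothetical local tag `mistral:v0.3`, and the `[AVAILABLE_TOOLS]` token convention from the Mistral v0.3 model card (verify against the card before relying on it):
```python
import json
import ollama  # assumes the `ollama` Python client is installed

# Tool schema in the JSON format the Mistral v0.3 model card describes
tools = [{
    "type": "function",
    "function": {
        "name": "multiply",
        "description": "Multiply two integers",
        "parameters": {
            "type": "object",
            "properties": {"a": {"type": "integer"}, "b": {"type": "integer"}},
            "required": ["a", "b"],
        },
    },
}]

# raw=True skips Ollama's chat template so we control the special tokens ourselves
prompt = (
    f"[AVAILABLE_TOOLS]{json.dumps(tools)}[/AVAILABLE_TOOLS]"
    "[INST] What is 26 * 2? [/INST]"
)
result = ollama.generate(model="mistral:v0.3", prompt=prompt, raw=True)

# The model is expected to reply with a [TOOL_CALLS] block containing JSON
# arguments; extracting, validating, and executing that call is left to you --
# which is exactly the "not very user-friendly" part.
print(result["response"])
```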
Sounds like I'm a too-early adopter haha
(Nice seeing you again Logan! I've been buried in work for too deep in the last couple of months.)