Hello guys! I am facing an issue and would be very grateful for any help!
I noticed there aren't any frameworks supporting function calling for local LLMs like LlamaCpp (I tried the ReAct agent, but it isn't giving promising results). So I decided to create a wrapper around LlamaIndex's FunctionCallingLLM class, powered by a Llama 3 model fine-tuned for function calling by Trelis on Hugging Face.
I am running into several issues: the datatypes aren't matching, and when I pass the tools to the AgentRunner they don't seem to reach the LLM. They get lost somewhere in the middle, so the LLM seems to know nothing about the tools when responding. Also, I'm pretty much a novice at all this...
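One thing worth checking for the "LLM knows nothing of the tools" symptom: the tool definitions only reach a local model if the wrapper serializes their schemas into the prompt (usually the system message). Here's a minimal, stdlib-only sketch of what that serialization step looks like; `function_to_schema` and `get_weather` are my own illustrative names, not LlamaIndex APIs, and the exact schema shape your fine-tuned checkpoint expects may differ:

```python
import inspect
import json

def get_weather(city: str, unit: str = "celsius") -> str:
    """Get the current weather for a city."""
    return f"Sunny in {city}"

def function_to_schema(fn):
    """Build a minimal JSON-schema-style tool description from a Python
    function's signature (roughly what a tool abstraction derives for you)."""
    type_map = {str: "string", int: "integer", float: "number", bool: "boolean"}
    props, required = {}, []
    for name, param in inspect.signature(fn).parameters.items():
        props[name] = {"type": type_map.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "parameters": {"type": "object", "properties": props, "required": required},
    }

schema = function_to_schema(get_weather)
# This JSON must actually be embedded in the prompt sent to the model;
# if the wrapper drops it, the model has no way to know the tools exist.
print(json.dumps(schema, indent=2))
```

A quick way to debug your wrapper is to log the final prompt string right before it goes to LlamaCpp and confirm a blob like this is present.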
Have any of you done anything like this? If so, let me know and we can connect to figure it out.
Sorry, the code was too long to embed here, so I am providing the GitHub link:
https://github.com/devpateltech007/LlamaIndexLocalFunctionCallingLLM/blob/main/FunctionCallingLlamaCpp.py

I chose MistralAI's base.py and modified it, since the tool-calling format for the fine-tuned Llama model is the same as Mistral's. To be honest, I needed a starting point to understand the code and figure out a way to bend it to my use case.
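Since the model follows Mistral's tool-calling convention, the other half of the problem is parsing the raw completion back into structured tool calls. Here's a small, hedged sketch of that parsing step; the `[TOOL_CALLS]` marker is an assumption based on Mistral's format, so swap it for whatever token your fine-tuned checkpoint actually emits:

```python
import json

def parse_tool_calls(text: str):
    """Extract (name, arguments) pairs from Mistral-style model output.
    Assumes a '[TOOL_CALLS]' marker followed by a JSON list of calls;
    adjust the marker to match the fine-tuned model's actual output."""
    marker = "[TOOL_CALLS]"
    if marker not in text:
        return []  # plain chat response, no tool use requested
    payload = text.split(marker, 1)[1].strip()
    calls = json.loads(payload)
    return [(c["name"], c.get("arguments", {})) for c in calls]

raw = '[TOOL_CALLS] [{"name": "get_weather", "arguments": {"city": "Paris"}}]'
print(parse_tool_calls(raw))  # [('get_weather', {'get' omitted}...)]
```

If the datatype mismatches you're hitting are in the response path, printing the raw completion and running it through a parser like this in isolation can quickly show whether the model's output or the wrapper's conversion is at fault.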