Updated 8 months ago

Can I run a llama_index function calling agent with a Groq client?
3 comments
Function calling hasn't quite made it into the Groq integration yet; I've been meaning to add that.
I managed to get it working with:

from llama_index.llms.groq import Groq
from llama_index.agent.openai import OpenAIAgentWorker
from llama_index.core.agent import AgentRunner

# Mark the Groq LLM as function-calling so the OpenAI agent worker accepts it
llm = Groq(model="llama3-8b-8192", is_function_calling_model=True)
openai_step_engine = OpenAIAgentWorker.from_tools(query_engine_tools, llm=llm, verbose=True)
agent = AgentRunner(openai_step_engine)

but yeah, that looks like a hack