zby · 7 months ago
Can I run a llama_index function calling agent with a Groq client?
3 comments
Logan M · 7 months ago
Function calling hasn't quite made it into the Groq implementation yet; I've been meaning to add that.
zby · 7 months ago
I managed to have it with:

# imports for the llama_index >= 0.10 split-package layout
from llama_index.llms.groq import Groq
from llama_index.core.agent import AgentRunner
from llama_index.agent.openai import OpenAIAgentWorker

# query_engine_tools is assumed to be defined earlier in the session
llm = Groq(model="llama3-8b-8192", is_function_calling_model=True)
openai_step_engine = OpenAIAgentWorker.from_tools(query_engine_tools, llm=llm, verbose=True)
agent = AgentRunner(openai_step_engine)
zby · 7 months ago
but yeah - that looks like a hack
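For context on why the workaround above can function at all: Groq serves an OpenAI-compatible chat endpoint, so an agent worker built for OpenAI can send its tool-calling payloads to a Groq model unchanged. A minimal sketch of the OpenAI-style tool schema such a worker attaches to a request (the helper name `to_openai_tool` is illustrative, not a llama_index internal):

```python
import json

def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

# Build an OpenAI-style "tools" entry by hand: the same JSON shape an
# agent worker sends alongside the chat request. Any OpenAI-compatible
# backend accepts this format.
def to_openai_tool(fn, params):
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": fn.__doc__ or "",
            "parameters": {
                "type": "object",
                "properties": params,
                "required": list(params),
            },
        },
    }

tool = to_openai_tool(multiply, {
    "a": {"type": "integer"},
    "b": {"type": "integer"},
})
print(json.dumps(tool, indent=2))
```

Setting `is_function_calling_model=True` is what lets the hack through: it tells llama_index to treat the model as tool-capable rather than rejecting it up front, which is why it works but still looks like a hack.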