Updated 6 months ago

@Logan M is it possible to change the FunctionCallingAgent's API so it doesn't require OpenAI? I want to use Ollama, but I'm only familiar with pointing its base_url at "localhost:XXXX" and having it connect. Is there a way to do that in this scenario?
4 comments
Ollama doesn't have a function-calling API
so it can't be used with a function calling agent
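(For reference: a common workaround is a ReAct-style agent, which drives tool use through prompting rather than a native function-calling API, so it works with LLMs like Ollama's. A minimal sketch, assuming llama-index 0.10-style imports; the model name and tool are placeholders:)
Python
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.ollama import Ollama

def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

# ReAct prompts the model to emit tool calls as text, so no native
# function-calling API is required from the underlying LLM.
llm = Ollama(model="llama2", base_url="http://localhost:11434")  # placeholder model
agent = ReActAgent.from_tools(
    [FunctionTool.from_defaults(fn=multiply)],
    llm=llm,
    verbose=True,
)
print(agent.chat("What is 6 times 7?"))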
Gotcha. Is there a way to point the base_url of the system somewhere else, though, if I can find another LLM that serves on a localhost port like that?
Hmm, yeah, you can set the base_url of Ollama.

The default is:
Plain Text
llm = Ollama(..., base_url="http://localhost:11434")
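(Spelled out a bit more, pointing the client at your local server — a minimal sketch assuming llama-index 0.10-style imports; the model name, port, and timeout value are placeholders:)
Python
from llama_index.llms.ollama import Ollama

# Point the client at whatever host/port your local Ollama server uses.
llm = Ollama(
    model="llama2",                     # placeholder: any model you've pulled
    base_url="http://localhost:11434",  # swap in your own host:port here
    request_timeout=120.0,              # local models can be slow to respond
)
print(llm.complete("Say hello."))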