Is it possible to create a chat agent (chatbot) with a local LLM (e.g. llm = LlamaCPP(...))?
With an OpenAI LLM instance, I do this:
Python
llm = OpenAI(model="gpt-3.5-turbo-0613")
agent = OpenAIAgent.from_tools([weather_tool], llm=llm, verbose=True)
response = agent.chat(
    "What's the weather like in San Francisco, Tokyo, and Paris?"
)