
Updated 2 months ago

chat agent with OSS models

Is it possible to create a chat agent (chatbot) with a local LLM (e.g. llm = LlamaCPP(...))?

With an OpenAI LLM instance, I do this:
Plain Text
from llama_index.llms import OpenAI
from llama_index.agent import OpenAIAgent

llm = OpenAI(model="gpt-3.5-turbo-0613")
agent = OpenAIAgent.from_tools([weather_tool], llm=llm, verbose=True)
response = agent.chat(
    "What's the weather like in San Francisco, Tokyo, and Paris?"
)
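One possible direction (a sketch, not a confirmed answer): `OpenAIAgent` depends on OpenAI's function-calling API, so a local model served through `LlamaCPP` typically needs a prompt-based agent instead, such as LlamaIndex's `ReActAgent`. The snippet below assumes the same `weather_tool` from the question; the model path is a placeholder, not a real file.

```python
# Sketch: swapping OpenAIAgent for ReActAgent with a local LlamaCPP model.
# Assumes the weather_tool defined elsewhere in the question.
from llama_index.llms import LlamaCPP
from llama_index.agent import ReActAgent

# Placeholder path -- point this at an actual GGUF model file.
llm = LlamaCPP(model_path="path/to/model.gguf")

# ReActAgent drives tool use via ReAct-style prompting rather than
# OpenAI function calling, so it works with open-source LLMs.
agent = ReActAgent.from_tools([weather_tool], llm=llm, verbose=True)
response = agent.chat(
    "What's the weather like in San Francisco, Tokyo, and Paris?"
)
```

Whether the tool calls are reliable depends heavily on the local model's instruction-following quality.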