Andrew
Hello, I am trying to run the FastAPI Python backend for a full-stack application (generated by the create-llama command), but with the default LLM provider replaced: I want to swap the OpenAI LLM for a PaLM LLM and the OpenAI embeddings for PaLM embeddings, and I am running into an issue. The full listing is here: https://gist.github.com/OTR/2eeca8d7fa8d5087397a3f9944b6a0fb
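Roughly, the swap I am attempting looks like the sketch below. This is only a sketch assuming the llama-index Settings API; the exact import paths, class names, and the GOOGLE_API_KEY variable name are assumptions on my side and depend on the installed llama-index version and its PaLM integration packages (llama-index-llms-palm, llama-index-embeddings-google).

import os

from llama_index.core import Settings
from llama_index.llms.palm import PaLM
from llama_index.embeddings.google import GooglePaLMEmbedding


def init_settings() -> None:
    # Replace the default OpenAI LLM and embedding model with PaLM.
    # The environment variable name is an assumption for this sketch.
    api_key = os.environ["GOOGLE_API_KEY"]
    Settings.llm = PaLM(api_key=api_key)
    Settings.embed_model = GooglePaLMEmbedding(api_key=api_key)

From what I can tell, the generated backend calls an init_settings()-style hook on startup, so replacing the OpenAI defaults there should leave the rest of the generated FastAPI app unchanged.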