
Hi πŸ‘‹ Is it possible to use llamaindex-cli rag with a local model? (Preferably the ones the llama.cpp CLI runs really fast on an M1 Mac: https://huggingface.co/ggml-org.) Looking at the source code, it seems OpenAI is hardcoded into the CLI.
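
The answer that prompted the reply below wasn't captured in this thread, but as a rough sketch of the general workaround: rather than going through the CLI's hardcoded OpenAI default, you can wire a local llama.cpp model into the llama-index library itself via its global Settings. The package names (llama-index-llms-llama-cpp, llama-index-embeddings-huggingface), the model path, and the embedding model below are assumptions for illustration, not anything confirmed in this thread.

```python
# A minimal sketch, assuming the post-v0.10 llama-index package split.
# Extra installs: llama-index-llms-llama-cpp, llama-index-embeddings-huggingface.
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms.llama_cpp import LlamaCPP
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Point the LLM at a local GGUF file (placeholder path); any llama.cpp-
# compatible model works, e.g. one from huggingface.co/ggml-org.
Settings.llm = LlamaCPP(
    model_path="/path/to/model.gguf",       # hypothetical local path
    temperature=0.1,
    context_window=4096,
    model_kwargs={"n_gpu_layers": -1},      # offload all layers to Metal on an M1
)

# Use a local embedding model too, so nothing falls back to OpenAI.
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# RAG over a local folder, fully on the local stack.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)
print(index.as_query_engine().query("What do these documents cover?"))
```

Setting `n_gpu_layers=-1` is what makes this fast on Apple Silicon, since llama.cpp offloads every layer to the Metal backend; both the LLM and the embedding model run locally, so no OpenAI key is needed.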
ok, thank you, I'll give that a try πŸ™‚