
Updated 3 months ago

What's the best local model inference API server?

What's the best local model inference API server to use with LlamaIndex? vLLM? Ollama? LM Studio?
4 comments
I think Ollama is probably the best supported at the moment
vLLM is a close second
(but I think the vLLM integration has a few bugs; I've been meaning to clean it up)
Thank you @Logan M ! Always appreciated