Hi all,
I am new to LlamaIndex and was reading about how to get started. I cannot use OpenAI, as my free credits have expired. The documentation gives an alternative using Ollama, but that requires 32 GB of RAM (https://docs.llamaindex.ai/en/stable/getting_started/starter_example_local/). Is there a simpler way to get started with llama-index, since my system does not have that much RAM?
1 comment
Ollama now has smaller LLMs as well, like Microsoft's Phi-3.

But you'll still need some juice, at least 16 GB of RAM, to run it efficiently.

Since you are just starting out, I would suggest using Google Colab. You can run Llama 2 or Phi-3 there via Ollama; see the sketch below.
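Here is a minimal sketch of what that setup can look like with LlamaIndex's Ollama integration. It assumes you have installed `llama-index`, `llama-index-llms-ollama`, and `llama-index-embeddings-huggingface`, pulled the model with `ollama pull phi3`, have an Ollama server running, and have a `./data` folder with a few text files (the folder path and query string are just placeholders):

```python
# Minimal local RAG sketch with LlamaIndex + Ollama (Phi-3), no OpenAI key needed.
# Assumes: pip install llama-index llama-index-llms-ollama llama-index-embeddings-huggingface
#          ollama pull phi3   (and an Ollama server running locally)
#          a ./data folder containing a few text files (placeholder path)
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Use a small local LLM instead of the default OpenAI model.
Settings.llm = Ollama(model="phi3", request_timeout=120.0)
# Use a small local embedding model so embeddings also need no API key.
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# Load documents, build an in-memory vector index, and query it.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
print(query_engine.query("What is this document about?"))
```

On Colab you would run the Ollama server in the notebook (or point the `Ollama` LLM at wherever your server is reachable) and the rest of the code stays the same.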