Updated 4 months ago

At a glance

The community member is new to LlamaIndex and is looking for a simpler way to get started, since their system does not have the 32GB of RAM required for the recommended Ollama approach. A comment suggests running smaller LLMs such as Microsoft Phi-3 via Ollama on Google Colab, which needs at least 16GB of RAM to run efficiently. The comment indicates this could be a more suitable option for someone who is just starting out.

Hi all,
I am new to LlamaIndex and was reading about how to get started. I cannot use OpenAI, as my free credits have expired. The documentation gives an alternative using Ollama, but that requires 32GB of RAM (https://docs.llamaindex.ai/en/stable/getting_started/starter_example_local/). Is there a simpler way to get started with LlamaIndex, as my system does not have that much RAM?
Attachment: Screenshot_2.png
1 comment
Ollama now has smaller LLMs as well, like Microsoft Phi-3.

But yeah, you'll still need some juice, at least 16GB of RAM, to run it efficiently.

Since you are just starting out, I would suggest using Google Colab. You can run Llama 2 or Phi-3 with Ollama there.