The community member is new to LlamaIndex and is looking for a simpler way to get started, as their system does not have the 32GB of RAM required for the recommended Ollama approach. A comment suggests using Google Colab to run smaller LLMs such as Microsoft Phi-3, which would require at least 16GB of RAM, and notes that this could be a more suitable option for someone just starting out.
Hi all, I am new to LlamaIndex and was reading about how to get started. I cannot use OpenAI because my free credits have expired. The documentation gives an alternative using Ollama, but that requires 32GB of RAM (https://docs.llamaindex.ai/en/stable/getting_started/starter_example_local/). Is there a simpler way to get started with llama-index, as my system does not have that much RAM?
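One possible direction, along the lines of the Phi-3 suggestion above: the linked local starter example can be run with a much smaller model than the default by changing the model name passed to Ollama. The sketch below assumes the `llama-index-llms-ollama` and `llama-index-embeddings-huggingface` integration packages are installed, that an Ollama server is running locally, and that `ollama pull phi3` has been done beforehand; it is a sketch under those assumptions, not a definitive setup.

```python
# Minimal sketch of the local starter example using a smaller model.
# Assumes: Ollama is installed and running, and `ollama pull phi3` has
# been executed (phi3 is a few-GB model, far below the 32GB guidance).
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.ollama import Ollama

# Small local embedding model, so no OpenAI credits are needed
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")
# Swap the larger default model for phi3 served by Ollama
Settings.llm = Ollama(model="phi3", request_timeout=120.0)

# "data" is a hypothetical folder of your own documents
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
response = index.as_query_engine().query("What is this document about?")
print(response)
```

The same script should work in a Google Colab notebook, which may be the easiest route if local RAM is the constraint.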