The community member is looking for recommendations on using LlamaIndex without an OpenAI API key at all, with no embeddings, indexes, or vector data coming from OpenAI. In the comments, another community member suggests swapping in open-source language models and embedding models in place of OpenAI and persisting all data locally, and shares links to the LlamaIndex documentation on using local LLMs and embedding models. However, the last link they provided, for setting the service context globally, is broken.
Any recommendations for doing Discover LlamaIndex: Bottoms-Up Development With LLMs without any OpenAI API key usage at all? No embeddings/index/vector data from OpenAI to be used in the LlamaIndex setup/dev.
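A minimal sketch of the suggested approach, assuming a ServiceContext-era LlamaIndex release (pre-0.10, matching the "set the service context globally" docs mentioned above). Ollama as the local LLM backend, the BAAI/bge-small-en-v1.5 embedding model, and the ./data and ./storage directories are illustrative choices, not requirements; newer LlamaIndex versions replace ServiceContext with the global Settings object.

```python
# Sketch: fully local LlamaIndex setup with no OpenAI usage
# (assumes a 0.9.x-style release where ServiceContext is still the API).
from llama_index import (
    ServiceContext,
    SimpleDirectoryReader,
    StorageContext,
    VectorStoreIndex,
    load_index_from_storage,
    set_global_service_context,
)
from llama_index.embeddings import HuggingFaceEmbedding
from llama_index.llms import Ollama

# Local LLM served by Ollama and a local Hugging Face embedding model,
# so no requests are sent to OpenAI. Model names are example assumptions.
llm = Ollama(model="llama2", request_timeout=120.0)
embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# Set the service context globally so every index and query engine
# picks up the local models by default.
service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)
set_global_service_context(service_context)

# Build an index from local files and persist everything to disk.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)
index.storage_context.persist(persist_dir="./storage")

# Later runs can reload the persisted index instead of re-embedding.
storage_context = StorageContext.from_defaults(persist_dir="./storage")
index = load_index_from_storage(storage_context)

print(index.as_query_engine().query("What topics do these documents cover?"))
```

This keeps both inference and embeddings on the local machine, and the persisted storage directory means the documents only need to be embedded once.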