Is there a way to create a simple LlamaIndex app that does not involve OpenAI at all? I'd like to avoid any risk of communicating with external LLMs or embedding models. I have my own classes for the llm and embed_model, but even importing basic things from llama_index triggers errors like this one: "You tried to access openai.Completion, but this is no longer supported in openai>=1.0.0..."