from llama_index import Document, VectorStoreIndex, StorageContext, load_index_from_storage

text_list = ["..."]  # raw text strings to index
documents = [Document(t) for t in text_list]
index = VectorStoreIndex.from_documents(documents)
index.storage_context.persist()
storage_context = StorageContext.from_defaults(persist_dir="./storage")
index = load_index_from_storage(storage_context)
query_engine = index.as_query_engine()
while True:
    user_input = input("Please enter a question: ")
    if user_input.lower() == "exit":
        break
    response = query_engine.query(user_input)
    print(response)
When I use this example from the docs I get the following error. How can I solve it?

    File "main.py", line 5, in <module>
        from llama_index.llms import OpenAI
    ModuleNotFoundError: No module named 'llama_index.llms'
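A quick way to narrow this down is to check which package layout is actually installed, without importing anything. This is a diagnostic sketch I am adding, not part of the original question: `probe` is a hypothetical helper, and the two module paths are the pre-0.10 layout (`llama_index.llms`) versus the 0.10+ layout (`llama_index.llms.openai`), where integrations were split into separate packages.

```python
import importlib.util

def probe(module_name: str) -> bool:
    """Return True if `module_name` is importable, without actually importing it."""
    try:
        return importlib.util.find_spec(module_name) is not None
    except ModuleNotFoundError:
        # A missing parent package (e.g. llama_index itself) raises here.
        return False

# Pre-0.10 layout: everything lived under the single `llama_index` package.
print("old layout (llama_index.llms):", probe("llama_index.llms"))
# 0.10+ layout: integrations are separate pip packages, e.g. llama-index-llms-openai.
print("new layout (llama_index.llms.openai):", probe("llama_index.llms.openai"))
```

If the new-layout probe succeeds, the usual fix is to change the import to `from llama_index.llms.openai import OpenAI` (after `pip install llama-index-llms-openai`); if neither probe succeeds, llama-index is not installed in the active environment at all.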