llama-index-question-gen-guidance
I want to use guidance with an open-source LLM, following https://docs.llamaindex.ai/en/stable/examples/output_parsing/guidance_sub_question.html, but upon executing `from llama_index.question_gen.guidance import GuidanceQuestionGenerator` I get the error `ImportError: cannot import name 'LLM' from 'llama_index.core.llms'`. Installing `llama-index-question-gen-guidance` also breaks my `llama_index` installation. Any advice?
I resolved this by upgrading `llama-index`. Now I have llama-index==0.10.11 and things work.
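In case it helps, you can confirm what actually got installed after the upgrade (these are the standard llama-index distribution names; adjust to whichever integration packages you use):

```python
from importlib.metadata import version

# Check the installed distribution versions of the split llama-index packages
print(version("llama-index"))                        # e.g. 0.10.11
print(version("llama-index-core"))
print(version("llama-index-question-gen-guidance"))
```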
However, the new version moves most of the key modules to `llama_index.core`, so if you were using an older version you will get an import error for `from llama_index import VectorStoreIndex, SimpleDirectoryReader, ServiceContext`. You need to change it to `from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, ServiceContext` for the imports to work.
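For anyone else hitting this, here is a minimal sketch of the 0.10.x layout (the `./data` folder is a placeholder, and it assumes an OpenAI key is configured for the default LLM and embedding model):

```python
# llama-index >= 0.10 keeps the core abstractions under llama_index.core
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

# Load documents from a local folder and build an in-memory vector index
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Query with the default settings
query_engine = index.as_query_engine()
print(query_engine.query("What do these documents cover?"))
```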