Updated 4 months ago

At a glance

The community member is using llama_index.set_global_handler("simple") but is not seeing any verbosity or debug messages when building the vector store index. Other community members suggest using show_progress=True on the VectorStoreIndex to see progress, and also provide additional logging options to get more output. There is no explicitly marked answer, but the community members provide helpful suggestions to address the issue.

hm... with llama_index.set_global_handler("simple") I am not seeing any verbosity/debug messages for building the vector store index
16 comments
it's really meant purely for llm inputs/outputs
when i use the external TEI server I get more output
this is the first time i've done a massive embedding of local stuff
The arize integration is probably our best observability tool right now
all local tracing, nice ui
i don't need that much
would just be nice to know that like it's progressing and not frozen
but, personal nit, really
there's a show_progress=True kwarg you can set
ah! that's the one
is that on the VectorStoreIndex?
Yup! πŸ‘
yassss that was it
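
Putting the suggestion together, a minimal sketch (assumes a recent llama-index release with the `llama_index.core` import path; older versions used `from llama_index import ...`, and `"data"` is a placeholder directory):

```python
# Hedged sketch: guarded so it degrades gracefully if llama-index is absent.
try:
    from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

    documents = SimpleDirectoryReader("data").load_data()
    # show_progress=True renders a progress bar while nodes are parsed and
    # embedded, so a long local embedding run is visibly progressing
    # rather than appearing frozen.
    index = VectorStoreIndex.from_documents(documents, show_progress=True)
except ImportError:
    print("llama-index not installed; this is only a sketch")
```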
not sure if helpful/already there but

Plain Text
import logging
import sys

logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)
logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))


and verbose=True for more logging πŸ™‚
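
One caveat with the logging snippet above: logging.basicConfig already attaches a StreamHandler to the root logger when none is configured, so the extra addHandler call registers a second one and every record prints twice. A minimal stdlib check:

```python
import logging
import sys

root = logging.getLogger()
# Clear any handlers left over from earlier configuration.
for handler in list(root.handlers):
    root.removeHandler(handler)

# basicConfig attaches one StreamHandler to the root logger...
logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)
print(len(root.handlers))  # prints 1

# ...so adding another means every record is emitted twice.
root.addHandler(logging.StreamHandler(stream=sys.stdout))
print(len(root.handlers))  # prints 2
```

If you see duplicated log lines, the basicConfig call alone is enough.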