hm... with llama_index.set_global_handler("simple") I am not seeing any verbosity/debug messages when building the vector store index
16 comments
it's really meant purely for LLM inputs/outputs
when i use the external TEI server I get more output
this is the first time i've done a massive embedding of local stuff
The arize integration is probably our best observability tool right now
all local tracing, nice ui
i don't need that much
would just be nice to know that like it's progressing and not frozen
but, personal nit, really
there's a show_progress=True kwarg you can set
ah! that's the one
is that on the VectorStoreIndex
yassss that was it
not sure if helpful/already there but

import logging
import sys

# Send DEBUG-level logs to stdout; basicConfig attaches a stdout handler itself,
# so calling logging.getLogger().addHandler(...) on top of this would print
# every message twice
logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)


and verbose=True for more logging 🙂