```
import logging
import sys
from importlib import reload

# Reset any logging configuration a previous cell may have applied,
# then route DEBUG-level logs to stdout.
reload(logging)
logging.basicConfig(
    format='%(asctime)s %(levelname)s:%(message)s',
    stream=sys.stdout,
    level=logging.DEBUG,
    datefmt='%I:%M:%S',
)
logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))

from llama_index.response_synthesizers import get_response_synthesizer
from llama_index.response.notebook_utils import display_response

# service_context, prompt, and retrieved_nodes are defined in earlier notebook cells
response_synthesizer = get_response_synthesizer(
    service_context=service_context,
    response_mode='compact',
)
response = response_synthesizer.synthesize(
    prompt,
    nodes=retrieved_nodes,
)
display_response(response)
```
Nothing gets printed by the logger. Could you please help me figure out what the issue is here?
hmm, pretty weird. That looks correct at first glance. I can take a deeper look in a bit
Thanks! It seems weird to me too. I also tried restarting the notebook kernel, but it didn't help.
Just in case it's relevant, the LLM in the service_context is a langchain.llms.Anthropic
Maybe try using our own Anthropic LLM integration? Although there should be more being printed than just LLM stuff πŸ€”
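For reference, a minimal sketch of what swapping in the native integration could look like, assuming a 0.7.x-era llama-index (the model name here is illustrative):

```
from llama_index import ServiceContext
from llama_index.llms import Anthropic

# Build the service context around llama-index's own Anthropic LLM
# instead of langchain.llms.Anthropic. Model name is illustrative.
llm = Anthropic(model="claude-2")
service_context = ServiceContext.from_defaults(llm=llm)
```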
I didn't know an Anthropic integration had been added to llama-index. Thanks for sharing! I'll give it a try and let you know.
Hmmm still nothing printed
[Attachment: image.png — screenshot of a direct LLM query producing no log output]
Oh you are querying the llm directly
There's nothing extra to print here actually? πŸ€”
I just tried the response_synthesizer call using llama-index's Anthropic integration, but still nothing got printed to the logger. I was just trying to see the actual prompt passed to Anthropic. Any suggestions? @Logan M
Yea, Anthropic just doesn't seem to log it. It would be a quick PR to add the debug log if you have time 😉 https://github.com/jerryjliu/llama_index/blob/main/llama_index/llm_predictor/base.py#L94

You can also programmatically access the prompts and completions if you use the token counter

https://gpt-index.readthedocs.io/en/latest/examples/callbacks/TokenCountingHandler.html#advanced-usage
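
A rough sketch of that advanced usage, assuming the 0.7.x-era callback API from the linked docs:

```
from llama_index import ServiceContext
from llama_index.callbacks import CallbackManager, TokenCountingHandler
from llama_index.llms import Anthropic

# The handler records each LLM event, including the raw prompt/completion strings.
token_counter = TokenCountingHandler()
service_context = ServiceContext.from_defaults(
    llm=Anthropic(),
    callback_manager=CallbackManager([token_counter]),
)

# ...run the synthesize() call from above, then inspect the recorded events:
for event in token_counter.llm_token_counts:
    print(event.prompt)      # exact prompt string sent to the LLM
    print(event.completion)  # raw completion returned by the LLM
```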
Cool, thanks! I will try to submit a PR if I get a chance.
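For anyone picking this up, the change would presumably be on the order of a one-line debug log where the formatted prompt is sent to the LLM; a hypothetical sketch (the actual code at the linked line is likely structured differently):

```
import logging

logger = logging.getLogger(__name__)

def predict(llm, prompt_template, **prompt_args):
    """Hypothetical stand-in for LLMPredictor.predict, showing where
    the debug log would go; the real method differs."""
    formatted_prompt = prompt_template.format(**prompt_args)
    logger.debug(formatted_prompt)  # surface the exact prompt at DEBUG level
    return llm.complete(formatted_prompt)
```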