Updated 3 months ago

When using the llama_index.llms import OpenAI how to set the log to "debug"?

For getting OpenAI logs, I've seen openai.log = "debug" suggested, but it doesn't work in my setup.

Here's a snippet:

Plain Text
...
from llama_index.llms import OpenAI
import openai

openai.log = "debug"
app = FastAPI()
loader = SitemapReader()

llm = OpenAI(temperature=0.1, model="gpt-3.5-turbo")
service_context = ServiceContext.from_defaults(llm=llm)


I followed the documentation from the basics: simply set an environment variable for the OpenAI API key, etc. It happens that I extended my setup to include the OpenAI class because I wanted to change the model; otherwise I wouldn't need it.

So when someone says to set openai.log = "debug", I wonder what they mean in a LlamaIndex context or setup?
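For context, the LlamaIndex docs themselves turn on verbose output through Python's standard logging module, independently of the openai-specific flag. A minimal sketch using only stdlib calls (the force=True flag is my addition, not part of the docs snippet):

```python
import logging
import sys

# Standard LlamaIndex debug setup: route all log output, including
# LlamaIndex's own, to stdout at DEBUG level.
# force=True replaces any handlers another framework may have installed.
logging.basicConfig(stream=sys.stdout, level=logging.DEBUG, force=True)
logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))
```

This won't reproduce the raw request dumps that openai.log = "debug" used to give, but it does surface LlamaIndex's own debug-level events.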
6 comments
Yeah, it seems it doesn't work with the latest version of LlamaIndex.

@WhiteFang_Jr Have you experienced the same with this?
Plain Text
openai.log = "debug"
@Teemu not at all. Thanks for checking.
Yeah @Teemu I'm at 0.8.66, just checked, and debugging did not work.
Dang, that sucks, I guess the new openai client actually did remove that πŸ€” there must be some new way to enable it though, those logs were helpful
Got something in the logs, but it isn't giving much information in the new openai client.
I had to set OPENAI_LOG in the environment at the top, before the OpenAI import.

Plain Text
import os
os.environ['OPENAI_LOG'] = "debug"

from llama_index.llms import OpenAI
[Attachment: image.png]
hmmm yea, not quite as helpful. Before it logged the entire request. But closer!
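If it helps: the v1 openai client emits its logs through the standard library's logging module, and its HTTP traffic goes through httpx, so raising both loggers to DEBUG is one way to get more per-request detail back. A sketch using only stdlib calls; the "openai" and "httpx" logger names are the ones those packages conventionally use:

```python
import logging

# Send log records to the console at DEBUG level.
# force=True replaces any handlers a framework may have installed.
logging.basicConfig(level=logging.DEBUG, force=True)

# The v1 openai client logs via the stdlib "openai" logger, and its HTTP
# layer is httpx, so both are raised to DEBUG to surface request detail.
logging.getLogger("openai").setLevel(logging.DEBUG)
logging.getLogger("httpx").setLevel(logging.DEBUG)
```

Setting the OPENAI_LOG environment variable before import (as above) and configuring these loggers should be equivalent ways of reaching the same machinery.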