I think I stumbled upon an issue

Looks like when llama_index.llms.OpenAI gets serialized, it also serializes the OpenAI API key... 😱
I'm using WandbCallbackHandler so that it logs the traces to wandb, and I can clearly see my OpenAI key up there (see screenshot).
Is that expected, or am I doing something wrong?
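For anyone who wants to reproduce this without wandb, here's a minimal sketch. It assumes llama_index ~0.8, where the LLM classes are (bridged) pydantic v1 models, so calling .dict() on the LLM approximates what the callback handlers serialize; the model name and dummy key are just placeholders:
```python
# Minimal repro sketch (assumption: llama_index ~0.8, pydantic v1-style LLM models).
from llama_index.llms import OpenAI

llm = OpenAI(model="gpt-3.5-turbo", api_key="sk-dummy-key-for-testing")

# .dict() comes from pydantic; it serializes every declared field,
# which is roughly the payload that callback handlers end up logging.
serialized = llm.dict()
print("api_key" in serialized)    # True before the fix
print(serialized.get("api_key"))  # the raw key, leaked into the trace payload
```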
Attachment: image.png
7 comments
hmmm the serialized thing should not be logged to wandb, oof
I changed this https://github.com/jerryjliu/llama_index/blob/main/llama_index/llms/openai.py#L54
to
api_key: str = Field(default=None, description="The OpenAI API key.", exclude=True)
I think this change kinda fixes the problem. I mean, not even thinking about WandB, why would one want to serialize the api key...
let me know if you want me to do a PR with this change
I mean I made the change locally and ran the wandb logging to check that it worked as expected
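For reference, here's what Field(exclude=True) does in isolation. This is a standalone pydantic v1 sketch with made-up class and field names, not the actual llama_index code:
```python
from pydantic import BaseModel, Field  # pydantic v1 semantics


class FakeLLM(BaseModel):
    """Illustrative stand-in for an LLM class; not llama_index code."""

    model: str = Field(description="Model name.")
    api_key: str = Field(default=None, description="The OpenAI API key.", exclude=True)


llm = FakeLLM(model="gpt-3.5-turbo", api_key="sk-secret")

print(llm.dict())   # {'model': 'gpt-3.5-turbo'} -- api_key is excluded
print(llm.json())   # '{"model": "gpt-3.5-turbo"}'
print(llm.api_key)  # still "sk-secret" in memory; only serialization drops it
```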
yea feel free to submit a PR -- maybe do a quick check across other llms / embedding models lol
OpenAI one is ready
https://github.com/jerryjliu/llama_index/pull/7920
> maybe do a quick check across other llms / embedding models
yup, I'll try to do that tomorrow, it's almost 1 am for me now 😆
I'll keep you posted
ha yea no worries, thanks!