With the latest llama-index 0.10.27, I see the following exception. Anything that I am missing?
Plain Text
from llama_index.llms.azure_openai import AzureOpenAI

File "/app/.venv/lib/python3.9/site-packages/llama_index/llms/azure_openai/__init__.py", line 1, in <module>
  from llama_index.llms.azure_openai.base import (
File "/app/.venv/lib/python3.9/site-packages/llama_index/llms/azure_openai/base.py", line 13, in <module>
  from llama_index.llms.openai import OpenAI
File "/app/.venv/lib/python3.9/site-packages/llama_index/llms/openai/__init__.py", line 1, in <module>
  from llama_index.llms.openai.base import AsyncOpenAI, OpenAI, SyncOpenAI, Tokenizer
File "/app/.venv/lib/python3.9/site-packages/llama_index/llms/openai/base.py", line 53, in <module>
  from llama_index.llms.openai.utils import (
File "/app/.venv/lib/python3.9/site-packages/llama_index/llms/openai/utils.py", line 24, in <module>
  from openai.types.chat.chat_completion_token_logprob import ChatCompletionTokenLogprob
ModuleNotFoundError: No module named 'openai.types.chat.chat_completion_token_logprob'
5 comments
I think you need to update openai
I checked the openai version that gets resolved; it is already 1.14.3.
Are you sure? Works fine for me

Plain Text
(venv) pip show openai
Name: openai
Version: 1.14.3
Summary: The official Python library for the openai API
Home-page: 
Author: 
Author-email: OpenAI <support@openai.com>
License: 
Location: /Users/loganmarkewich/giant_change/llama_index/venv/lib/python3.10/site-packages
Requires: anyio, distro, httpx, pydantic, sniffio, tqdm, typing-extensions
Required-by: llama-index-agent-openai, llama-index-core, llama-index-legacy
(venv) python
Python 3.10.12 | packaged by conda-forge | (main, Jun 23 2023, 22:41:52) [Clang 15.0.7 ] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> from openai.types.chat.chat_completion_token_logprob import ChatCompletionTokenLogprob
>>> 
If you are running in a notebook, you might have to restart it
@Logan M & @WhiteFang_Jr You are right. I took a deeper look: module A, which depends directly on llama-index, pulls in openai 1.14.3. However, the upstream service B that depends on A pins an older version of openai, hence the problem. Thank you for tipping me off on this!
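For anyone hitting the same conflict, here is a quick way to confirm which openai version an environment actually resolved and whether the module from the traceback is importable. This is a minimal stdlib-only sketch; the package and module names are just the ones from this thread, so substitute your own as needed:

```python
import importlib.metadata
import importlib.util


def diagnose(package: str, module: str) -> str:
    """Report the resolved version of `package` and whether `module` imports."""
    try:
        version = importlib.metadata.version(package)
    except importlib.metadata.PackageNotFoundError:
        return f"{package}: not installed"
    # find_spec locates the module without importing it (no side effects)
    spec = importlib.util.find_spec(module)
    status = "importable" if spec is not None else "MISSING"
    return f"{package} {version}: {module} is {status}"


# The module from this thread's ModuleNotFoundError:
print(diagnose("openai", "openai.types.chat.chat_completion_token_logprob"))
```

If this prints `MISSING` even though the version looks new enough, the import is probably resolving against a different environment (or a stale notebook kernel) than the one you checked with `pip show`.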