Hey there, I get this error when running this code (I'm trying to use LiteLLM so I can use models hosted on Together AI):
from llama_index.llms import ChatMessage, LiteLLM

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="Tell me a story"),
]
resp = LiteLLM("teknium/OpenHermes-2-Mistral-7B").chat(messages)

I got this from the docs: https://docs.llamaindex.ai/en/stable/examples/llm/litellm.html
The error:
Traceback (most recent call last):
  File "c:\Users\User1\Documents\AI AGENTS\llamaindex\starter.py", line 9, in <module>
    resp = LiteLLM("teknium/OpenHermes-2-Mistral-7B").chat(messages)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\User1\anaconda3\envs\llamaindexenv\Lib\site-packages\llama_index\llms\litellm.py", line 74, in __init__
    validate_litellm_api_key(api_key, api_type)
  File "C:\Users\User1\anaconda3\envs\llamaindexenv\Lib\site-packages\llama_index\llms\litellm_utils.py", line 191, in validate_litellm_api_key
    import litellm
  File "C:\Users\User1\anaconda3\envs\llamaindexenv\Lib\site-packages\litellm\__init__.py", line 332, in <module>
    from .timeout import timeout
  File "C:\Users\User1\anaconda3\envs\llamaindexenv\Lib\site-packages\litellm\timeout.py", line 20, in <module>
    from openai.error import Timeout
ModuleNotFoundError: No module named 'openai.error'
3 comments
I think you either need to update litellm, or litellm doesn't support the latest OpenAI client yet.
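For anyone hitting this: the `openai.error` module only exists in the pre-1.0 OpenAI SDK, and the 1.0 rewrite removed it, which is why older litellm releases fail at that import. A quick sketch of the version check (the helper name is mine, not part of either library):

```python
# The openai.error module was removed in the 1.0 rewrite of the OpenAI SDK,
# which is why older litellm releases fail to import it.
def sdk_has_error_module(openai_version: str) -> bool:
    """Return True if this OpenAI SDK version still ships openai.error."""
    major = int(openai_version.split(".")[0])
    return major < 1

# Pre-1.0 clients (e.g. 0.28.x) still expose openai.error;
# anything 1.x or newer does not.
print(sdk_has_error_module("0.28.1"))  # True
print(sdk_has_error_module("1.3.0"))   # False
```

Until litellm ships a compatible release, pinning the client with `pip install "openai<1.0"` is a common stopgap.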
I'm using the latest version of litellm, so it's probably the latter; I'll wait until it's fixed then.
Looks like you aren't the only one πŸ˜… https://github.com/BerriAI/litellm/issues/799