DannyBr
Joined September 25, 2024
Hey there, I get this error when running the code below (I'm trying to use LiteLLM so I can use models hosted on Together AI):
Python
from llama_index.llms import ChatMessage, LiteLLM

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="Tell me a story"),
]
resp = LiteLLM("teknium/OpenHermes-2-Mistral-7B").chat(messages)

Got this from the docs, link : https://docs.llamaindex.ai/en/stable/examples/llm/litellm.html
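As an aside, LiteLLM generally routes Together AI-hosted models through a `together_ai/` provider prefix on the model string, with the key read from the `TOGETHERAI_API_KEY` environment variable. A small sketch of that prefix convention (the helper name here is my own, not part of any library):

```python
def together_model(name: str) -> str:
    """Prefix a Together AI model name for LiteLLM's provider routing."""
    return name if name.startswith("together_ai/") else f"together_ai/{name}"

# e.g. pass the result to LiteLLM(model=...)
print(together_model("teknium/OpenHermes-2-Mistral-7B"))
# → together_ai/teknium/OpenHermes-2-Mistral-7B
```

Without that prefix, LiteLLM may try to route the bare model name to the wrong provider, so it's worth checking even once the import error below is fixed.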
Error:
Plain Text
Traceback (most recent call last):
  File "c:\Users\User1\Documents\AI AGENTS\llamaindex\starter.py", line 9, in <module>
    resp = LiteLLM("teknium/OpenHermes-2-Mistral-7B").chat(messages)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\User1\anaconda3\envs\llamaindexenv\Lib\site-packages\llama_index\llms\litellm.py", line 74, in __init__
    validate_litellm_api_key(api_key, api_type)
  File "C:\Users\User1\anaconda3\envs\llamaindexenv\Lib\site-packages\llama_index\llms\litellm_utils.py", line 191, in validate_litellm_api_key
    import litellm
  File "C:\Users\User1\anaconda3\envs\llamaindexenv\Lib\site-packages\litellm\__init__.py", line 332, in <module>
    from .timeout import timeout
  File "C:\Users\User1\anaconda3\envs\llamaindexenv\Lib\site-packages\litellm\timeout.py", line 20, in <module>
    from openai.error import Timeout
ModuleNotFoundError: No module named 'openai.error'
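The traceback bottoms out in litellm doing `from openai.error import Timeout`. The `openai.error` module was removed in the openai 1.0 release, so this usually means a newer openai package is installed alongside a litellm build that predates it. A minimal diagnostic sketch, assuming only that openai may or may not be installed:

```python
import importlib.util

def has_legacy_openai_error() -> bool:
    """True if the pre-1.0 `openai.error` module is importable."""
    try:
        return importlib.util.find_spec("openai.error") is not None
    except ModuleNotFoundError:
        return False  # openai itself is not installed

# If this prints False on an environment where `import openai` works,
# the installed openai is >= 1.0 and the old litellm import will fail.
print(has_legacy_openai_error())
```

If it comes back False, either upgrading litellm (and llama-index) to releases built against openai>=1.0, or pinning openai below 1.0 to match the older litellm, should clear the import error.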