llama_index/llama_index/agent/types.py a...

11 comments
Yeah not sure what's up lol
I don't even see anywhere in the code where extra_state is directly assigned -- it's always assigned/accessed like a dictionary
If you find an easy way to replicate, happy to debug
I also get an error:
[Attachment: image.png]
This is the code
It's easy to set up locally: I just need a litellm proxy and point it at Ollama instead of the actual OpenAI GPT-3.5
Plain Text
model_list:
  - model_name: gpt-4
    litellm_params:
      model: "ollama/mixtral:latest"
      api_base: "http://172.16.129.1:11434"
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: "ollama/solar:latest"
      api_base: "http://172.16.129.1:11434"
  - model_name: text-embedding-ada-002
    litellm_params:
      model: huggingface/jinaai/jina-embeddings-v2-base-en
      api_base: http://172.16.129.1:3000
litellm_settings:
  telemetry: False
  drop_params: True


this is my litellm config
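With that config, the proxy exposes OpenAI-compatible model aliases that map to Ollama models. As a hedged sketch (the proxy URL `http://localhost:4000` and endpoint path are assumptions, not from the thread), this is the kind of chat-completions request body you would POST to the proxy, using the `gpt-3.5-turbo` alias that the config maps to `ollama/solar:latest`:

```python
import json

# Hypothetical request body for POST http://localhost:4000/v1/chat/completions.
# "gpt-3.5-turbo" is the alias from model_list above; the proxy rewrites it
# to ollama/solar:latest before forwarding to the Ollama api_base.
payload = {
    "model": "gpt-3.5-turbo",  # alias defined in the litellm model_list
    "messages": [{"role": "user", "content": "hi"}],
}
body = json.dumps(payload)
print(body)
```

Because the proxy speaks the OpenAI wire format, any OpenAI-compatible client should work unchanged once its base URL points at the proxy.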
Plain Text
print(task)
task_id='94587537-69df-45e1-a60b-1eceabe1c44a' input='hi' memory=ChatMemoryBuffer(token_limit=3072, tokenizer_fn=functools.partial(<bound method Encoding.encode of <Encoding 'gpt2'>>, allowed_special='all'), chat_history=[ChatMessage(role=<MessageRole.USER: 'user'>, content='hi', additional_kwargs={})]) extra_state={'sources': [], 'n_function_calls': 0, 'new_memory': ChatMemoryBuffer(token_limit=3000, tokenizer_fn=functools.partial(<bound method Encoding.encode of <Encoding 'gpt2'>>, allowed_special='all'), chat_history=[])}
so it's because the chat history is empty, but the code tries to access it with [-1]
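The failure mode can be sketched in a few lines (variable names here are illustrative, not the actual llama_index internals): `new_memory` starts with `chat_history=[]`, and indexing an empty Python list with `[-1]` raises `IndexError`, so any code that assumes at least one message exists needs a guard.

```python
chat_history = []  # new_memory starts empty, as in the printed Task above

# Unguarded access -- this is what blows up:
try:
    last = chat_history[-1]
except IndexError:
    # Guarded fallback when no messages exist yet
    last = None

print(last)  # None
```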
Ah good find, thank you!