ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
llama-index-llms-ollama 0.3.2 requires llama-index-core<0.12.0,>=0.11.0, but you have llama-index-core 0.12.13 which is incompatible.
llama-index-llms-anthropic 0.3.1 requires anthropic[vertex]<0.29.0,>=0.26.2, but you have anthropic 0.44.0 which is incompatible.
llama-index-llms-anthropic 0.3.1 requires llama-index-core<0.12.0,>=0.11.0, but you have llama-index-core 0.12.13 which is incompatible.
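The conflict itself is just a version-specifier mismatch: 0.12.13 falls outside the `<0.12.0,>=0.11.0` range those integration packages pin. A quick stdlib sketch of the check pip is doing (simplified to plain dotted-integer versions, no pre-releases):

```python
def parse(version: str) -> tuple:
    # Simplified version parsing: handles dotted-integer versions only.
    return tuple(int(part) for part in version.split("."))

def satisfies(installed: str, lower: str, upper: str) -> bool:
    # Models a ">=lower,<upper" specifier like llama-index-core<0.12.0,>=0.11.0.
    return parse(lower) <= parse(installed) < parse(upper)

print(satisfies("0.12.13", "0.11.0", "0.12.0"))  # False: hence the conflict
```

So either pin llama-index-core back into the range, or upgrade the integration packages to releases that accept 0.12.x.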
llama_index.core.workflow.errors.WorkflowRuntimeError: Error in step 'prepare_chat_history': "Could not resolve authentication method. Expected either api_key or auth_token to be set. Or for one of the `X-Api-Key` or `Authorization` headers to be explicitly omitted"
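That error means the client found no credentials at construction time. As a loose stand-in (not the SDK's actual internals, just the resolution order the error message describes): an explicit api_key argument wins, then the ANTHROPIC_API_KEY environment variable, and with neither set you get the failure above.

```python
import os

def resolve_api_key(api_key=None, env=None):
    # Loose sketch of the credential resolution the error message describes:
    # explicit api_key argument first, then the ANTHROPIC_API_KEY env var.
    env = os.environ if env is None else env
    key = api_key or env.get("ANTHROPIC_API_KEY")
    if key is None:
        raise RuntimeError(
            "Could not resolve authentication method. "
            "Expected either api_key or auth_token to be set."
        )
    return key
```

The practical fix is to export the key (or pass api_key=...) before the workflow constructs the LLM, as below.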
os.environ["ANTHROPIC_API_KEY"] = "sk-ant-api03-ntK..."  # key truncated here
Anthropic().tokenizer
The bare Anthropic() call defaults to claude-2.1, so instantiate the LLM with the model you actually want before grabbing its tokenizer, e.g.
llm = Anthropic(model="claude-3-haiku-20240307", max_tokens=MAX_TOKENS)
tokenizer = llm.tokenizer
Settings.tokenizer = tokenizer
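Since this is hard to demonstrate without the packages installed, here is a minimal stand-in sketch of the pattern (FakeLLM and Settings are illustrative names, not llama-index's real classes): the tokenizer you get is tied to whichever model the LLM object was constructed with, so configure the model first and only then register the tokenizer globally.

```python
class FakeLLM:
    DEFAULT_MODEL = "claude-2.1"  # mirrors the default mentioned above

    def __init__(self, model=None, max_tokens=512):
        # If no model is passed, you silently get the old default.
        self.model = model or self.DEFAULT_MODEL
        self.max_tokens = max_tokens

    @property
    def tokenizer(self):
        # A real tokenizer is model-specific; here we just record the model
        # to show that the property depends on construction-time config.
        return ("tokenizer-for", self.model)

class Settings:
    # Stand-in for a global settings object holding the shared tokenizer.
    tokenizer = None

llm = FakeLLM(model="claude-3-haiku-20240307", max_tokens=512)
Settings.tokenizer = llm.tokenizer
```

Constructing with no model and then reading .tokenizer is exactly how you end up with a claude-2.1 tokenizer by accident.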
llama-index itself is really just a thin meta-package: a wrapper that pulls in llama-index-core plus a set of starter integrations, rather than a library in its own right.