
Updated 5 months ago

Replacing default ChatGPT with Anthropic LLM in tutorial still asks for OpenAI API key

At a glance

The community member is trying to replace the default ChatGPT with an Anthropic LLM in a tutorial, but is still being asked for an OpenAI API key, which they suspect is for embeddings. The community members suggest using a local embedding model instead of the default OpenAI embeddings, and defining the LLM globally with Settings.llm = llm to resolve the issue. The community members also discuss how models defined through Settings act as global defaults, while models passed directly to an object (e.g. an llm= argument) take precedence when provided.

I am replacing the default ChatGPT with an Anthropic LLM in this tutorial (https://docs.llamaindex.ai/en/stable/examples/index_structs/doc_summary/DocSummary/), but it is still asking me for an OpenAI API key. Is it for embeddings? If yes, how do I make the tutorial completely Anthropic-based?
Yes, the default embedding model is OpenAI too, so if you want to drift away from the defaults you'll need to use a local embedding model as well.

Which is pretty easy:

Plain Text
# install the LlamaIndex HuggingFace embeddings package
pip install llama-index-embeddings-huggingface

# now set a local embedding model globally
from llama_index.core import Settings
Settings.embed_model = "local:BAAI/bge-base-en-v1.5"
It is throwing the error in the attached screenshots.
Attachments
image.png
image.png
Ah, DocumentSummaryIndex uses OpenAI for generating summaries by default.
Just define your LLM globally with Settings.llm = llm and the error will go away.
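Putting the two fixes together, a fully Anthropic-based setup for the tutorial might look like this. This is a sketch, assuming the llama-index-llms-anthropic and llama-index-embeddings-huggingface packages are installed and an ANTHROPIC_API_KEY environment variable is set; the specific Claude model name is just one example:

```python
# pip install llama-index-llms-anthropic llama-index-embeddings-huggingface

from llama_index.core import Settings
from llama_index.llms.anthropic import Anthropic

# Global LLM: Anthropic instead of the default OpenAI model.
# (Model name is an example; pick any Claude model you have access to.)
Settings.llm = Anthropic(model="claude-3-5-sonnet-20241022")

# Global embeddings: a local HuggingFace model instead of OpenAI embeddings.
Settings.embed_model = "local:BAAI/bge-base-en-v1.5"

# DocumentSummaryIndex.from_documents(...) will now pick up both globals,
# so no OpenAI API key is requested anywhere.
```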
It's working now, thanks! Just out of curiosity: defining the LLM or any other model through Settings will interact with their local/abstraction/object-based definitions, i.e. DocumentSummaryIndex.from_documents(documents, llm=llm), right? Which one wins?
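For reference, LlamaIndex resolves models with local arguments taking precedence: Settings is a global fallback, and a model passed directly to an object (like llm= above) overrides it for that object only. A minimal pure-Python sketch of this resolution pattern (the names here are illustrative, not LlamaIndex internals):

```python
# Sketch of "local overrides global" model resolution, mirroring how
# LlamaIndex treats the global Settings vs. per-object arguments.
# (Names are illustrative, not LlamaIndex internals.)

class GlobalSettings:
    """Global defaults, analogous to llama_index.core.Settings."""
    llm = "openai-default"  # stand-in for the default OpenAI LLM

def resolve_llm(local_llm=None):
    """Return the locally supplied model if given, else the global default."""
    return local_llm if local_llm is not None else GlobalSettings.llm

# Global default is used when no local model is passed:
print(resolve_llm())                              # -> openai-default

# A local (object-level) model takes precedence over the global setting:
print(resolve_llm(local_llm="anthropic-claude"))  # -> anthropic-claude

# Changing the global changes the fallback for every call without a local model:
GlobalSettings.llm = "anthropic-claude"
print(resolve_llm())                              # -> anthropic-claude
```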