
When we set embed model to something

At a glance

The community member is using the HuggingFaceEmbedding class to set an embed model, and they expect the model to be downloaded into the $TRANSFORMERS_CACHE folder. However, they don't see the downloaded model in that folder. One of the comments suggests that the model is actually downloaded into the $LLAMA_INDEX_CACHE_DIR folder, but the community member doesn't have an environment variable with that name. There is no explicitly marked answer in the comments.

When we set the embed model to something from Huggingface, I assume that it will be downloaded into the $TRANSFORMERS_CACHE folder. For example,
"""
# Imports for the legacy (ServiceContext-era) llama_index API
from llama_index import ServiceContext
from llama_index.embeddings import HuggingFaceEmbedding

embed_model = HuggingFaceEmbedding(model_name='BAAI/bge-large-en-v1.5')
service_context = ServiceContext.from_defaults(
    chunk_size=1024,
    chunk_overlap=256,
    llm=llm,  # llm is defined elsewhere in my setup
    embed_model=embed_model,
)
"""
I would expect 'BAAI/bge-large-en-v1.5' to be downloaded and saved in the transformers cache folder, but I don't see it there. Am I missing something?
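
A minimal sketch for pinning the download location explicitly, assuming the installed HuggingFaceEmbedding exposes a cache_folder parameter (the path below is just an illustration):
"""
from llama_index.embeddings import HuggingFaceEmbedding

# Sketch: pass cache_folder so the model files land in a known directory,
# independent of $TRANSFORMERS_CACHE. cache_folder is assumed to be
# supported by the installed version; adjust the path as needed.
embed_model = HuggingFaceEmbedding(
    model_name='BAAI/bge-large-en-v1.5',
    cache_folder='/path/to/my/model_cache',
)
"""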
4 comments
It's downloaded into $LLAMA_INDEX_CACHE_DIR
I don't have an env variable like that. Is there a default location for that?
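
A minimal way to check what that default resolves to, assuming the installed (ServiceContext-era) llama_index still exposes get_cache_dir in llama_index.utils:
"""
import os
from llama_index.utils import get_cache_dir  # assumed available in this legacy version

# If $LLAMA_INDEX_CACHE_DIR is unset, get_cache_dir() falls back to a
# platform-specific default; printing both shows where downloads land.
print(os.environ.get("LLAMA_INDEX_CACHE_DIR"))  # None when the variable is unset
print(get_cache_dir())
"""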