I'm trying to reload a finetuned embedding model using one of the llama-index examples. https://github.com/run-llama/llama_index/blob/main/docs/docs/examples/finetuning/embeddings/finetune_embedding_adapter.ipynb
Running into an issue with an import, so I'm assuming the source has moved or the library name has changed.

Anyone know the current way to accomplish this:
from llama_index.core.embeddings import LinearAdapterEmbeddingModel
You must install the llama-index-embeddings-adapter package using pip inside your env:
pip install llama-index-embeddings-adapter

Then import it with:
from llama_index.embeddings.adapter import LinearAdapterEmbeddingModel

Note that LinearAdapterEmbeddingModel is the same thing as AdapterEmbeddingModel; they kept the LinearAdapterEmbeddingModel name just for backwards compatibility, as shown here:
https://github.com/run-llama/llama_index/blob/7849b1a851d88ee28e1bfd05d19f18e40d5b8e10/llama-index-integrations/embeddings/llama-index-embeddings-adapter/llama_index/embeddings/adapter/base.py#L115
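For example, to reload the finetuned model from that notebook, something along these lines should work (the base model name and the "model_output_test" adapter path are just the placeholders from the example, swap in whatever you actually finetuned with):

from llama_index.core.embeddings import resolve_embed_model
from llama_index.embeddings.adapter import LinearAdapterEmbeddingModel

# same base embedding model the adapter was finetuned on top of (placeholder)
base_embed_model = resolve_embed_model("local:BAAI/bge-small-en")

# directory where the finetune engine saved the adapter weights (placeholder)
embed_model = LinearAdapterEmbeddingModel(base_embed_model, "model_output_test")

# sanity check that the reloaded model produces embeddings
query_embedding = embed_model.get_query_embedding("example query")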

Usually you can just search GitHub for the module you're trying to import; the search will show you where the class or module currently lives.
Thank you! I find it difficult to keep up with all the changes in import syntax, on top of there being so many different llama-index-this-and-that packages to install. I had been searching the docs without success, so thanks for the tip to search GitHub directly for this kind of thing.