@kapa.ai
```python
rerank = SentenceTransformerRerank(
    model="cross-encoder/ms-marco-MiniLM-L-2-v2", top_n=3
)
```
gives error
```
AttributeError                            Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py in _get_module(self, module_name)
   1363         try:
-> 1364             return importlib.import_module("." + module_name, self.__name__)
   1365         except Exception as e:

18 frames

AttributeError: module 'torch._subclasses' has no attribute 'functional_tensor'

The above exception was the direct cause of the following exception:

RuntimeError                              Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py in _get_module(self, module_name)
   1364             return importlib.import_module("." + module_name, self.__name__)
   1365         except Exception as e:
-> 1366             raise RuntimeError(
   1367                 f"Failed to import {self.__name__}.{module_name} because of the following error (look up to see its"
   1368                 f" traceback):\n{e}"

RuntimeError: Failed to import transformers.models.bert.modeling_bert because of the following error (look up to see its traceback):
module 'torch._subclasses' has no attribute 'functional_tensor'
```
Please stand by @andysingal as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
Based on the information provided in the extracts, it seems like you're trying to use the SentenceTransformerRerank class from the llama_index.postprocessor module in LlamaIndex. However, the error you're encountering comes from the PyTorch stack, not directly from LlamaIndex.

The error message `module 'torch._subclasses' has no attribute 'functional_tensor'` suggests that there is an issue with your PyTorch installation or with the version of PyTorch you're using.

Unfortunately, the extracts do not provide specific information on how to resolve this error. I would recommend checking your PyTorch installation and making sure the version you're using is compatible with the rest of your environment.
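As a first diagnostic step (an illustrative sketch rather than something from the extracts), you can print the versions of torch and the packages built on top of it; a mismatch between torch and packages such as torchvision or transformers is a common cause of import errors like this one:

```python
# Illustrative version check (not from the extracts). Mismatched versions of
# torch and the packages built on top of it are a common cause of errors like
# "module 'torch._subclasses' has no attribute 'functional_tensor'".
import importlib

for pkg in ("torch", "torchvision", "transformers", "sentence_transformers"):
    try:
        mod = importlib.import_module(pkg)
        print(f"{pkg}: {getattr(mod, '__version__', 'unknown')}")
    except Exception as exc:  # a broken install may fail to import at all
        print(f"{pkg}: import failed ({exc.__class__.__name__})")

# If the versions look inconsistent, reinstalling matching releases usually
# resolves the import error, e.g. (adjust to your environment):
#   pip install --upgrade torch torchvision transformers sentence-transformers
```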

For more detailed help, you might want to refer to the official LlamaIndex documentation or the LlamaIndex GitHub repository.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
what version of torch do I need to use @kapa.ai
Let me check the knowledge base for you. One moment! :books:
----------------------
The extracts provided do not specify a particular version of PyTorch that is required for LlamaIndex. However, it's generally a good practice to use the latest stable version of PyTorch unless there are specific version requirements mentioned in the library's documentation or setup instructions.
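If you want to see what your installed packages themselves declare, one option (an assumed approach, not taken from the extracts) is to inspect their dependency metadata for torch-related constraints:

```python
# Illustrative sketch (assumed approach): list the torch-related requirements
# declared by the installed packages, to see which torch versions they accept.
from importlib.metadata import PackageNotFoundError, requires

for pkg in ("llama-index", "sentence-transformers", "transformers"):
    try:
        deps = requires(pkg) or []
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
        continue
    torch_deps = [d for d in deps if d.lower().startswith("torch")]
    print(f"{pkg}: {torch_deps or 'no explicit torch requirement'}")
```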

For the most accurate information, please refer to the official LlamaIndex documentation or the LlamaIndex GitHub repository.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai is SentenceTransformerRerank deprecated?
Let me check the knowledge base for you. One moment! :books:
----------------------
Based on the information provided in the extracts, there is no indication that the SentenceTransformerRerank class from LlamaIndex is deprecated. It is actively used in various examples and mentioned in the documentation.

For instance, in the LlamaIndex documentation, SentenceTransformerRerank is listed as one of the Node Postprocessor Modules, and it's used in examples provided in the LlamaIndex GitHub repository.
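For reference, typical usage looks roughly like the following sketch. The import path depends on your LlamaIndex version, and `index` is assumed to be an existing VectorStoreIndex:

```python
# Minimal sketch of SentenceTransformerRerank used as a node postprocessor.
# Assumes `index` is an existing VectorStoreIndex; in older LlamaIndex
# releases the import path is `llama_index.postprocessor` rather than
# `llama_index.core.postprocessor`.
from llama_index.core.postprocessor import SentenceTransformerRerank

rerank = SentenceTransformerRerank(
    model="cross-encoder/ms-marco-MiniLM-L-2-v2", top_n=3
)

# Retrieve a larger candidate set, then let the cross-encoder keep the top 3.
query_engine = index.as_query_engine(
    similarity_top_k=10, node_postprocessors=[rerank]
)
response = query_engine.query("What does the document say about X?")
print(response)
```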

However, for the most accurate and up-to-date information, it's always best to refer to the official LlamaIndex documentation or its GitHub repository.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------