hey all I think 0.7.24 broke something. I got:

Plain Text
bi@bi:~/ai/quiz-maker-be$ python3 main.py 
Traceback (most recent call last):
  File "/home/bi/ai/quiz-maker-be/main.py", line 5, in <module>
    from llama_index import SimpleDirectoryReader
  File "/home/bi/.local/lib/python3.10/site-packages/llama_index/__init__.py", line 12, in <module>
    from llama_index.data_structs.struct_type import IndexStructType
  File "/home/bi/.local/lib/python3.10/site-packages/llama_index/data_structs/__init__.py", line 3, in <module>
    from llama_index.data_structs.data_structs import (
  File "/home/bi/.local/lib/python3.10/site-packages/llama_index/data_structs/data_structs.py", line 14, in <module>
    from llama_index.schema import BaseNode, TextNode
  File "/home/bi/.local/lib/python3.10/site-packages/llama_index/schema.py", line 9, in <module>
    from llama_index.bridge.langchain import Document as LCDocument
  File "/home/bi/.local/lib/python3.10/site-packages/llama_index/bridge/langchain.py", line 21, in <module>
    from langchain.embeddings import HuggingFaceEmbeddings, HuggingFaceBgeEmbeddings
ImportError: cannot import name 'HuggingFaceBgeEmbeddings' from 'langchain.embeddings' (/home/bi/.local/lib/python3.10/site-packages/langchain/embeddings/__init__.py)


I then downgraded to 0.7.23 and it worked fine.
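
For reference, the downgrade is just a version pin (assuming llama-index was installed with pip):

Plain Text
pip install "llama-index==0.7.23"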
here's the code:

Plain Text
import os
import sys
sys.path.append(os.path.join(os.getcwd(), '..'))

from llama_index import SimpleDirectoryReader
# Make our printing look nice
from llama_index.schema import MetadataMode

def load_docs(filepath):
    loader = SimpleDirectoryReader(
        input_dir=filepath,
        recursive=True
    )

    return loader.load_data()

# load our documents from each folder.
# we keep them separate for now, in order to create separate indexes later

# load the recent provincial offences act
recent_act = load_docs("docs/recent_act")

# load the old provincial offences act
old_act = load_docs("docs/old_act")

from llama_index import ServiceContext, set_global_service_context
from llama_index.llms import OpenAI

# create a global service context
service_context = ServiceContext.from_defaults(llm=OpenAI(model="gpt-4", temperature=0))
set_global_service_context(service_context)

from llama_index import VectorStoreIndex, StorageContext, load_index_from_storage

# create a vector store index for each folder

# try and load the index if already made
try:
    recent_act_index = load_index_from_storage(StorageContext.from_defaults(persist_dir="./recent_act_index"))
    #old_act_index = load_index_from_storage(StorageContext.from_defaults(persist_dir="old_act_index/"))

except Exception:
    recent_act_index = VectorStoreIndex.from_documents(recent_act)
    recent_act_index.storage_context.persist(persist_dir="./recent_act_index")
Try updating langchain. This fixed it for me.

pip install --upgrade langchain
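
A quick sanity check after the upgrade (a minimal sketch; run it in the same environment that failed):

Plain Text
# The llama_index import failed above because langchain was missing HuggingFaceBgeEmbeddings.
# If both of these imports succeed, the upgraded langchain is the one being picked up.
from langchain.embeddings import HuggingFaceBgeEmbeddings
from llama_index import SimpleDirectoryReader

print("imports OK")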
Thank you thank you ser
Apologies for this! We actually just reverted 0.7.24 (and released 0.7.24.post1).

We need to do a 0.8.0 release I think, with some updated dependencies in the setup πŸ™‚
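
For anyone finding this later: since 0.7.24 was reverted, upgrading should pull a working release; the explicit pin is just one option.

Plain Text
pip install --upgrade llama-index
# or pin the reverted release explicitly
pip install "llama-index==0.7.24.post1"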