
Updated 5 months ago

As per the latest import statements, change these llamaindex imports: from llama_index.legacy

@kapa.ai as per latest import statements, change these llamaindex imports:
from llama_index.legacy import (
SimpleDirectoryReader, StorageContext, load_index_from_storage
)
from llama_index.legacy import Document
from llama_index.legacy import VectorStoreIndex
from llama_index.legacy import ServiceContext
from llama_index.legacy.llms import OpenAI
from llama_index.legacy.embeddings import OpenAIEmbedding
from llama_index.legacy.query_engine import FLAREInstructQueryEngine
5 comments
Please stand by @Shera as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
Based on the latest LlamaIndex documentation, the import statements should be updated as follows:

Plain Text
from llama_index.core import (
    SimpleDirectoryReader, StorageContext, load_index_from_storage, VectorStoreIndex
)
from llama_index.core import Document
from llama_index.llms.openai import OpenAI
from llama_index.embeddings.openai import OpenAIEmbedding

Please note that ServiceContext was deprecated in LlamaIndex v0.10.0 in favor of the global Settings object, so it should be removed from your code.

As for FLAREInstructQueryEngine, it's not mentioned in the provided knowledge sources, so I can't provide the updated import statement for it. You might want to check the latest LlamaIndex documentation for the correct import statement.

Sources:
  • Updating to v0.10.0
  • CitationQueryEngine Setup
  • Jaguar Vector Store Imports
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai how will these lines change as per latest code change:
service_context = ServiceContext.from_defaults(llm=llm)

also I want to use OpenAIEmbedding(), then in line:
Settings.embed_model = embed_model , what should be value of embed_model?
Plain Text
Settings.embed_model = OpenAIEmbedding(...)
Settings.llm = OpenAI(...)


These just change the global defaults. You can also override them locally:
Plain Text
index = VectorStoreIndex.from_documents(..., embed_model=embed_model)
index.as_query_engine(..., llm=llm)
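The global-default-with-local-override pattern above can be sketched in plain Python without llama_index installed. `Settings` and `build_engine` here are illustrative stand-ins, not the real library objects; the point is only how an explicit argument takes precedence over a module-level default:

Plain Text
```python
class Settings:
    """Stand-in for llama_index.core.Settings: holds global defaults."""
    llm = "gpt-3.5-turbo"
    embed_model = "text-embedding-ada-002"


def build_engine(llm=None):
    """Use the explicitly passed llm when given, else fall back to the global default."""
    return llm if llm is not None else Settings.llm


print(build_engine())             # -> gpt-3.5-turbo (global default)
print(build_engine(llm="gpt-4"))  # -> gpt-4 (local override wins)
```

This mirrors the real API's behavior: `Settings.llm` / `Settings.embed_model` set defaults once, and keyword arguments like `embed_model=` on `VectorStoreIndex.from_documents` or `llm=` on `as_query_engine` override them per call.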