@kapa.ai I want to use OpenAIEmbedding(), then in line:
Settings.embed_model = embed_model, what should the value of embed_model be?

also, how will these lines change as per the latest code changes:
service_context = ServiceContext.from_defaults(llm=llm)
Please stand by @Shera as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
The embed_model in the line Settings.embed_model = embed_model should be an instance of the OpenAIEmbedding class. You can initialize it with the model name you want to use. Here is an example:

Plain Text
from llama_index.embeddings.openai import OpenAIEmbedding

embed_model = OpenAIEmbedding(model="text-embedding-3-small")


Then set this embed_model on the global Settings object:

Plain Text
from llama_index.core import Settings

Settings.embed_model = embed_model
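
Once this global setting is in place, any index you build afterwards uses it to embed your documents. A minimal sketch, assuming your files live in a local "data" folder (a placeholder path):

Plain Text
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

# Embeds the documents with Settings.embed_model (the OpenAIEmbedding set above).
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)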


As for the ServiceContext, it has been deprecated in the v0.10.0 update of LlamaIndex. The preferred way of setting up the same options is now either globally in the Settings object or locally in the APIs that use certain modules. So, the line service_context = ServiceContext.from_defaults(llm=llm) would now look like this:

Plain Text
from llama_index.core import Settings
from llama_index.llms.openai import OpenAI

Settings.llm = OpenAI(model="gpt-3.5-turbo")
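
If you prefer not to set the LLM globally, the same option can instead be passed locally to the API that uses it. A minimal sketch, again assuming a placeholder "data" folder and the llama-index-llms-openai integration installed:

Plain Text
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms.openai import OpenAI

# Build an index, then pass the LLM directly to the query engine
# instead of setting Settings.llm globally.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine(llm=OpenAI(model="gpt-3.5-turbo"))
response = query_engine.query("What do these documents cover?")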


You can find more details about this migration in the LlamaIndex documentation.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai how to upgrade from LlamaIndex version 0.9 to version 1.x?
Let me check the knowledge base for you. One moment! :books:
----------------------
To upgrade from LlamaIndex version 0.9 to version 1.x, you need to follow the migration guide provided in the LlamaIndex documentation. Here are the steps based on the information available:

  1. Install the new version: You can do this by running pip install llama-index in your terminal. This will install the latest version of LlamaIndex.
  2. Update your imports: Many imports have changed in the new version. You can either update your imports manually or use the provided command-line tool to update your existing code and notebooks (see the before/after sketch after this list). Here's how you can do it:
  • Option 1: Use temporary legacy imports: You can update your imports from from llama_index import VectorStoreIndex to from llama_index.legacy import VectorStoreIndex and so on.
  • Option 2: Full migration: After installing v0.10.0, you can upgrade your existing imports automatically by running one of the following commands in your terminal:
Plain Text
     llamaindex-cli upgrade-file <file_path>
     # OR
     llamaindex-cli upgrade <folder_path>
     


Please note that the CLI tool updates files in place, so make sure your code is backed up in case you need to undo any changes.

  3. Update your code: The ServiceContext was deprecated in the new version. You should replace it with the new way of setting up options, either globally in the Settings object or locally in the APIs that use certain modules.
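
For reference, here is a before/after sketch of the kind of import changes the CLI tool makes (not exhaustive; the OpenAI LLM integration now lives in the separate llama-index-llms-openai package):

Plain Text
# Before (v0.9.x)
# from llama_index import VectorStoreIndex, SimpleDirectoryReader
# from llama_index.llms import OpenAI

# After (v0.10.x): core abstractions move to llama_index.core,
# and integrations move to their own pip packages.
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms.openai import OpenAI  # pip install llama-index-llms-openai
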
You can find more details about this migration in the LlamaIndex documentation. Please note that this guide is for upgrading to version 0.10.0. For upgrading to version 1.x, you should check the specific migration guide for that version if available.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai how to uninstall older versions and install newer ones?
Let me check the knowledge base for you. One moment! :books:
----------------------
To uninstall the older version of LlamaIndex, you can use the pip uninstall command:

Plain Text
pip uninstall llama-index -y


After uninstalling the older version, you can install the newer version using pip install:

Plain Text
pip install -U llama-index --no-cache-dir --force-reinstall


If you are upgrading from v0.9.x, it is recommended to uninstall your previous version first and then follow the migration guide.

For specific packages, you can also use pip to uninstall and install. For example, to install the llama-index-llms-deepinfra package, you would use:

Plain Text
pip install llama-index-llms-deepinfra


And to uninstall it:

Plain Text
pip uninstall llama-index-llms-deepinfra -y


Remember to replace llama-index-llms-deepinfra with the name of the package you want to install or uninstall.
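
After reinstalling, you can confirm which versions ended up active with a quick check from the standard library (a sketch; it assumes the llama-index meta-package and llama-index-core are both installed):

Plain Text
from importlib.metadata import version

# Print the installed versions of the meta-package and the core library.
print(version("llama-index"))
print(version("llama-index-core"))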

Sources:
  • LlamaIndex Documentation
  • LlamaIndex Migration Guide
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------