
Updated 8 months ago
Hi all, I used to use VectorStoreIndex.from_documents to create an index and save it into the PGVectorStore. When creating the index, I used the ServiceContext to configure the chunk size and chunk overlap. However, the ServiceContext is deprecated. Does anyone have any idea how I can configure the chunk size now?
2 comments
Plain Text
from llama_index.core import Settings
from llama_index.core.node_parser import SentenceSplitter

Settings.node_parser = SentenceSplitter(chunk_size=512, chunk_overlap=20)

Here is the full guide: https://docs.llamaindex.ai/en/stable/module_guides/supporting_modules/service_context_migration/?h=settings
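For intuition, chunk_size and chunk_overlap control how a document is cut into nodes: each chunk holds up to chunk_size tokens, and consecutive chunks share chunk_overlap tokens so context is not lost at the boundaries. Here is a minimal sketch of fixed-size splitting with overlap, illustrating the idea only; the actual SentenceSplitter counts tokens and prefers sentence boundaries, so its chunks will differ:

```python
def split_with_overlap(words, chunk_size, chunk_overlap):
    """Illustrative splitter: cut a word list into chunks of up to
    chunk_size words, where consecutive chunks share chunk_overlap words."""
    step = chunk_size - chunk_overlap  # how far the window advances each time
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(words[start:start + chunk_size])
        if start + chunk_size >= len(words):
            break  # last chunk already reaches the end of the input
    return chunks

# 10 words, chunks of 4 with an overlap of 1 -> windows start at 0, 3, 6
words = [f"w{i}" for i in range(10)]
chunks = split_with_overlap(words, chunk_size=4, chunk_overlap=1)
# chunks -> [['w0','w1','w2','w3'], ['w3','w4','w5','w6'], ['w6','w7','w8','w9']]
```

Note that Settings is a global default for the whole process; the migration guide linked above also covers overriding components locally instead.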
Thanks a lot!!!!