Handling Larger Text Inputs with LlamaIndex

At a glance

The community member asks what happens if they give a chunk of text larger than the 512-token input size of their embedding model, and how LlamaIndex handles it. Another community member replies that the HuggingFaceEmbedding class simply truncates the input text to the 512-token limit.

If my embedding model has a 512-token input size, what happens if I give it a bigger chunk of text to embed? How does LlamaIndex handle it?
1 comment
It depends on the embed_model class you use. HuggingFaceEmbedding will just truncate the input.
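Conceptually, truncation means anything past the model's token limit is silently dropped before embedding, so only the first 512 tokens influence the vector. A minimal pure-Python sketch of that behavior (the whitespace tokenizer and function name are illustrative only; real models use subword tokenizers, and LlamaIndex handles this internally):

```python
MAX_TOKENS = 512  # typical input limit for many embedding models

def truncate_to_limit(text: str, max_tokens: int = MAX_TOKENS) -> str:
    """Keep only the first max_tokens tokens; the rest never reaches the model."""
    tokens = text.split()  # toy whitespace tokenizer for illustration
    return " ".join(tokens[:max_tokens])

long_text = "word " * 1000            # ~1000 tokens, well over the limit
kept = truncate_to_limit(long_text)
print(len(kept.split()))              # 512 -- the overflow is discarded
```

If the tail of your text matters, split documents into chunks of at most 512 tokens before embedding (e.g. with a node parser / text splitter) rather than relying on truncation.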