```
ValueError: Metadata length (407) is longer than chunk size (128). Consider increasing the chunk size or decreasing the size of your metadata to avoid this.
```
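For context, here is a minimal sketch of the kind of setup that triggers this error — the document text, metadata, and sizes below are placeholders, not my exact code:

```python
from llama_index.core import Document
from llama_index.core.node_parser import SentenceSplitter

# SentenceSplitter is metadata-aware: each node's metadata string is
# injected into every chunk, so its token length is subtracted from
# chunk_size. If the metadata alone out-tokenizes chunk_size
# (407 > 128 here), get_nodes_from_documents raises the ValueError above.
doc = Document(
    text="Some reasonably long body text. " * 40,
    # placeholder metadata, long enough to exceed chunk_size=128 in tokens
    metadata={"summary": "a very long generated summary sentence. " * 20},
)

splitter = SentenceSplitter(chunk_size=128, chunk_overlap=16)
nodes = splitter.get_nodes_from_documents([doc])  # -> raises the ValueError
```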
To work around this, I tried passing in a `tokenizer` arg, but doing so simply returned the input text without chunking it at all. Using the `SentenceSplitter` as-is, without passing in any tokenizer, works as expected.
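For reference, this is roughly how I passed the tokenizer — the HuggingFace model name is a placeholder, and as I understand it the splitter accepts any callable that maps a string to a token list:

```python
from transformers import AutoTokenizer  # assumption: a HF tokenizer was used

from llama_index.core.node_parser import SentenceSplitter

hf_tok = AutoTokenizer.from_pretrained("bert-base-uncased")  # placeholder model

# SentenceSplitter's `tokenizer` param takes a callable used for counting
# tokens; `encode` returns a list of token ids, so len() over its output
# should give a sensible count.
splitter = SentenceSplitter(
    chunk_size=128,
    chunk_overlap=16,
    tokenizer=hf_tok.encode,
)
chunks = splitter.split_text("Some reasonably long body text. " * 40)
```

One possibility I haven't ruled out: if the callable returns something whose length under-counts the real number of tokens, the splitter could conclude everything fits in a single chunk, which would match the unchunked output I'm seeing.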