The community member is experiencing an issue where the TextNode they are using is truncated with a "..." by a specific code path in the llama_index library. Because the full text is no longer available, their response synthesis breaks. The community members discuss workarounds, such as reading node.text or node.get_content() directly instead of relying on the default behavior, and building a custom component that returns strings rather than nodes (see the sketch below). They also suggest reporting the issue and potentially submitting a pull request to fix the bug.
My TextNode is being truncated, so my response synthesis cannot use the full text response. Has anyone hit this issue, or am I doing something wrong? Thanks!
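The thread does not include the original code, but the first suggested workaround, pulling the full text out of each retrieved node via node.text or node.get_content() instead of relying on the node's printed representation, might look roughly like the sketch below. The import paths assume a recent llama_index (0.10+), and the data directory and query string are placeholders, not taken from the thread.

```python
# Minimal sketch of the suggested workaround; directory name and query
# string are hypothetical placeholders.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
retriever = index.as_retriever()

nodes_with_scores = retriever.retrieve("What does the document say about X?")

# Read the full text explicitly via node.get_content() (or node.text)
# rather than the node's string representation, which is where the
# "..." truncation appears.
full_texts = [n.node.get_content() for n in nodes_with_scores]
context = "\n\n".join(full_texts)
print(context)
```

The same idea applies if the nodes are fed into a custom synthesis step: pass the extracted strings onward rather than the node objects themselves, so no display-side truncation can affect the text used for synthesis.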