
Hi there, I have an issue where the TextNode I am using gets truncated with a "..." because of https://github.com/run-llama/llama_index/blob/main/llama-index-core/llama_index/core/schema.py#L329-L336

Because of that truncation, my response synthesis cannot use the full text of the node. Has anyone hit this issue, or am I doing something wrong? Thanks!
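To make the failure mode concrete, here is a minimal, self-contained sketch (not the actual llama_index code) of the pattern the linked schema.py lines implement: the node's __str__ returns a truncated preview ending in "...", while an accessor like get_content() keeps the full text. The class names and the TRUNCATE_LENGTH value below are illustrative stand-ins, not the real llama_index definitions.

```python
TRUNCATE_LENGTH = 100  # hypothetical; llama_index uses its own constant


def truncate_text(text: str, max_length: int) -> str:
    """Return a preview ending in '...' if text exceeds max_length."""
    if len(text) <= max_length:
        return text
    return text[: max_length - 3] + "..."


class TextNodeSketch:
    """Illustrative stand-in for a text node with a lossy __str__."""

    def __init__(self, text: str) -> None:
        self.text = text

    def get_content(self) -> str:
        # Full text: safe to feed into response synthesis.
        return self.text

    def __str__(self) -> str:
        # Truncated preview: intended for logging/debugging only.
        return truncate_text(self.text, TRUNCATE_LENGTH)


node = TextNodeSketch("x" * 500)
assert len(node.get_content()) == 500  # full content preserved
assert str(node).endswith("...")       # str() silently truncates
```

Any downstream code that stringifies the node instead of calling the content accessor gets the shortened preview.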
What code are you running?

All the internal code should be using node.text or node.get_content()
sql_retriever = SQLRetriever(sql_database)

The output of this ends up calling the __str__ method of the text node, which is what incorrectly gets passed into the context. When debugging, I traced it to validate and convert stringable in query.py: https://github.com/run-llama/llama_index/blob/main/llama-index-core/llama_index/core/base/query_pipeline/query.py#L61-L62

Because NodeWithScore counts as a stringable input, I see this getting called.
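A hedged sketch of that coercion path, under the assumption that the pipeline's stringable validation simply converts non-string inputs with str(). The function and class names below are illustrative stand-ins for the real query.py code, not its actual implementation.

```python
def validate_and_convert_stringable_sketch(value):
    """Illustrative stand-in for the pipeline's stringable coercion.

    The real implementation checks inputs against a stringable type;
    here we simply coerce with str(), which is where the lossy
    __str__ preview sneaks into the pipeline.
    """
    return value if isinstance(value, str) else str(value)


class PreviewNode:
    """Stand-in node whose __str__ returns a truncated preview."""

    def __init__(self, text: str) -> None:
        self.text = text

    def __str__(self) -> str:
        return self.text[:50] + "..." if len(self.text) > 50 else self.text


coerced = validate_and_convert_stringable_sketch(PreviewNode("z" * 200))
assert coerced.endswith("...")  # full text lost at the pipeline boundary
```

The coercion itself is reasonable for genuinely stringable inputs; the trouble is that a node's __str__ is a debug preview, so coercing nodes this way silently drops content.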
Interesting, seems like a bug
Workaround would be a custom component that returns strings instead of nodes
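A minimal sketch of that workaround idea: convert retrieved nodes to full-text strings explicitly before they reach the pipeline's stringable coercion, so str() is never called on a node. The NodeWithScore-style classes here are illustrative stand-ins, not the real llama_index classes.

```python
from dataclasses import dataclass


@dataclass
class NodeSketch:
    """Stand-in node: full text via get_content(), preview via __str__."""

    text: str

    def get_content(self) -> str:
        return self.text

    def __str__(self) -> str:
        return self.text[:100] + "..." if len(self.text) > 100 else self.text


@dataclass
class NodeWithScoreSketch:
    """Stand-in for a retrieved node paired with its relevance score."""

    node: NodeSketch
    score: float


def nodes_to_strings(nodes: list) -> list:
    """Return each node's full text, bypassing the lossy __str__ path."""
    return [n.node.get_content() for n in nodes]


retrieved = [NodeWithScoreSketch(NodeSketch("y" * 300), 0.9)]
full_texts = nodes_to_strings(retrieved)
assert len(full_texts[0]) == 300        # no truncation
assert not full_texts[0].endswith("...")
```

Wrapping this conversion in a custom pipeline component means every downstream consumer receives plain strings that the stringable check passes through unchanged.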
yeah - tried this and it works (overriding it locally also works)
@Logan M should I report an issue, or?
yea probably worth it. Although if you can think of a fix, a PR would be even better ❀️
πŸŽ‰ Will review soon!
Any comments? Should be a super quick fix
bit of a one-man show in the reviews -- will get to it at some point today πŸ˜…