vLLM error: KeyError 'text' in get_response

Hi all, I wanted to use vLLM for my RAG pipeline, but this error showed up:

, line 9, in get_response
return data["text"]
KeyError: 'text'

Does this have a solution?
Can you share the full traceback and show how you are using vLLM?

A vLLM implementation example is given here: https://docs.llamaindex.ai/en/stable/examples/llm/vllm/
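Until the full traceback is available, one common cause of this KeyError is a payload-shape mismatch: vLLM's legacy demo server (`/generate`) returns a JSON object with a `"text"` key, while its OpenAI-compatible server returns a `"choices"` list instead. A minimal defensive sketch of the accessor (the `get_response` name and both payload shapes are assumptions based on the snippet above, not confirmed from the asker's code):

```python
def get_response(data: dict) -> str:
    """Defensively extract generated text from a vLLM-style JSON response.

    Handles both the legacy /generate payload ({"text": [...]}) and the
    OpenAI-compatible payload ({"choices": [{"text": ...}]}); a mismatch
    between these two shapes is a plausible source of KeyError: 'text'.
    """
    if "text" in data:
        # Legacy /generate endpoint: "text" may be a list of completions.
        out = data["text"]
        return out[0] if isinstance(out, list) else out
    if "choices" in data:
        # OpenAI-compatible endpoint: completions live under "choices".
        return data["choices"][0]["text"]
    # Surface the actual keys instead of a bare KeyError to aid debugging.
    raise ValueError(f"unexpected response payload, keys: {sorted(data)}")
```

Printing `data.keys()` (or the raw response body) just before the failing line would confirm which shape the server is actually returning.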