A community member is trying to use HuggingFaceHub from LangChain together with LlamaIndex, but the solution another community member shared on GitHub is not working for them: they get an "Empty Response" output and ask for help. Other community members suggest checking the prompt helper settings, specifically the context window and chunk size, since the Falcon model may have limits on these values; they recommend a context window of 2048 and a chunk size of 512.
@Logan M I'm trying to use the HuggingFaceHub from LangChain along with LlamaIndex. I tried the solution you shared on GitHub, but somehow it's not working for me. Can you please help me out?