Does anyone else have a problem running Perplexity with llama_index? I always get an HTTP 400 error.
7 comments
Did you follow this tutorial to set up: https://docs.llamaindex.ai/en/latest/examples/llm/perplexity/?h=perpl ?

If you could share a traceback, that would help us understand the issue better.

You get a 400 when your request is missing required items.
There could be multiple reasons for this:
  • Perplexity may have updated their API, and the change may not yet be reflected in llama-index.
  • You may be missing a step in your setup.
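The reply above notes that a 400 usually means the request is missing required fields. A minimal sketch of that idea, using the OpenAI-compatible payload shape Perplexity's chat endpoint expects; the field list and the validation helper are illustrative assumptions, not code taken from llama-index:

```python
# Sketch: the Perplexity chat completions API (OpenAI-compatible schema)
# rejects payloads missing required fields with HTTP 400.
# REQUIRED_FIELDS and missing_fields() are hypothetical illustrations.

REQUIRED_FIELDS = ("model", "messages")

def missing_fields(payload: dict) -> list:
    """Return the required top-level fields absent from a request payload."""
    return [field for field in REQUIRED_FIELDS if field not in payload]

# A payload without a "model" key would typically be rejected with a 400:
bad_payload = {"messages": [{"role": "user", "content": "Hello"}]}
print(missing_fields(bad_payload))  # ['model']
```

Checking the request body llama-index actually sends against the current Perplexity API docs is one quick way to spot a schema mismatch like this.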
Yeah, I followed exactly this
The last update to the pplx module was 3 months ago.
Then there is a high chance that pplx made some changes in their API.
Feel free to look into it if you want and we welcome a PR πŸ™Œ
Hey @WhiteFang_Jr
I have made the required changes
How can I test it? I have verified that the llm.complete and llm.chat functions work.