Nitish
Joined September 25, 2024
Hi, Jerry asked me to post the issue here for tracking purposes. The issue I'm facing: I have a RAG POC built with the LlamaIndex SubQuestionQueryEngine that I'm supposed to present to 500 or so people tomorrow to promote LlamaIndex, as I absolutely love it. But today I started getting the error "You tried to access openai.ChatCompletion, but this is no longer supported" whenever I call the chat function. Is there a fix in LlamaIndex that I can code to overcome this?

Here is the snippet:
s_engine = SubQuestionQueryEngine.from_defaults(query_engine_tools=query_engine_tools, use_async=False)
response = s_engine.chat("Question")

I guess this is due to a versioning conflict between LangChain and LlamaIndex over the OpenAI SDK version. I changed the LLM from LangChain to LlamaIndex's OpenAI, and downgraded llama-index to 0.8.62 and openai to 0.28, but then the PDFReader part breaks (attached here).
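For reference, the downgrade I tried corresponds to pinning both packages together, since pre-1.0 openai SDKs only work with llama-index releases from that era (a sketch of my pins; exact compatible versions may differ in other setups):

```
# requirements fragment: pin a pre-1.0 openai SDK with a matching
# llama-index release (these are the versions from my attempt;
# compatibility of other combinations is an assumption)
llama-index==0.8.62
openai==0.28
```

Alternatively, upgrading llama-index to a release built against the 1.x openai SDK may avoid the downgrade entirely.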

Can you please suggest a fix so I can present it and show what a great library LlamaIndex is?