
Hi, Jerry asked me to post the issue here for tracking purposes. The issue I'm facing is that I have a RAG POC system built using the LlamaIndex SubQuestionQueryEngine, which I'm supposed to present to 500 or so people tomorrow to promote LlamaIndex, as I absolutely love it. But today I noticed that I just started getting the error "You tried to access openai.ChatCompletion, but this is no longer supported". Is there a fix in LlamaIndex that I can code to overcome this? It happens when I call the chat function.

Here is the snippet:
s_engine = SubQuestionQueryEngine.from_defaults(
    query_engine_tools=query_engine_tools, use_async=False
)
response = s_engine.chat("Question")

I guess this is due to a versioning conflict between langchain and LlamaIndex over the OpenAI client version. I changed the LLM from langchain to LlamaIndex's OpenAI and downgraded llama-index to 0.8.62 and openai to 0.28, but then the PDFReader part breaks (attached here).

Can you please suggest a fix so I can present it and show what a great library LlamaIndex is?
Attachment: image.png
24 comments
You can downgrade openai and llama-index as a quick fix

Plain Text
pip install llama-index==0.8.62
pip install openai==0.28


Otherwise, it sounds like maybe an issue with the LLM you are using (if you are using langchain LLMs, then they will break for now)

We are removing langchain as a dependency later this week lol
Oh wait, I misread
now a different error lol
from llama_index.readers import PDFReader
instead of using download_loader
I think that issue was fixed in 0.8.63
let me know if that helps @Nitish
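(For reference, the direct-import fix suggested above might look roughly like this sketch. It assumes llama-index 0.8.x is installed; `paper.pdf` is a placeholder path, and the import is guarded so the snippet degrades gracefully if the library or that module layout is absent:)

```python
from pathlib import Path

# Guarded import: llama-index 0.8.x exposes PDFReader as a class,
# so download_loader is not needed.
try:
    from llama_index.readers import PDFReader
except ImportError:
    PDFReader = None  # llama-index not installed, or a later reorganized version

def load_pdf(path: str):
    """Return parsed documents for a PDF, or None if PDFReader/file is unavailable."""
    if PDFReader is None or not Path(path).exists():
        return None
    reader = PDFReader()
    return reader.load_data(file=Path(path))

docs = load_pdf("paper.pdf")  # placeholder path; returns None if it does not exist
```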
Thanks @Logan M, I tried with 0.8.63 but then get the following error when importing llama-index
Attachment: image.png
ah I guess 0.8.63 requires the new openai client (which was causing issues for you)
stick to 0.8.62
and use the import I gave above for the PDFReader
got it, will do. Can you please let me know when the fix is supposed to go out? @Logan M
The PDF reader thing is already fixed. But I think your original issue was due to some incompatibility between llama-index and langchain?
yeah, the original issue was due to LlamaIndex relying on openai >= 1.0 while also depending on Langchain, which still expects openai < 1.0
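(In other words, the clash comes down to the openai 1.0 API break: `openai.ChatCompletion` only exists below 1.0. A tiny version check, hypothetical and not part of either library, illustrates which client a given version ships:)

```python
def has_legacy_chat_completion(openai_version: str) -> bool:
    """True if this openai version still ships the legacy openai.ChatCompletion
    API (removed in the 1.0 rewrite), i.e. major version < 1."""
    major = int(openai_version.split(".")[0])
    return major < 1

# openai 0.28.x keeps the old API; 1.x requires the new client objects.
print(has_legacy_chat_completion("0.28"))   # True
print(has_legacy_chat_completion("1.2.4"))  # False
```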
Are you using any specific features from langchain? Chances are we have equivalent features
I am using the OpenAI library, but it used to throw an error when I call the SubQuestionQueryEngine's query: "You tried to access openai.ChatCompletion, but this is no longer supported"
You are using from llama_index.llms import OpenAI ? With gpt-3.5 or gpt-4?

You might need to completely uninstall and re-install llama-index, it sounds like some old version of the library is cached
I can run the subquestion query engine just fine on the latest version πŸ€”
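(A minimal sketch of the setup suggested above, wiring LlamaIndex's own OpenAI wrapper instead of a langchain LLM. It assumes the 0.8.x API where ServiceContext exists, and is guarded so it degrades if the library or that layout is absent:)

```python
# Guarded imports: this layout matches llama-index 0.8.x.
try:
    from llama_index import ServiceContext
    from llama_index.llms import OpenAI
except ImportError:
    ServiceContext = OpenAI = None  # llama-index 0.8.x not installed

def build_service_context(model: str = "gpt-3.5-turbo"):
    """Build a ServiceContext using LlamaIndex's own OpenAI LLM,
    or return None when the 0.8.x API is unavailable."""
    if OpenAI is None:
        return None
    llm = OpenAI(model=model)
    return ServiceContext.from_defaults(llm=llm)
```

The resulting ServiceContext can then be passed into index construction so the SubQuestionQueryEngine never touches a langchain LLM.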
I was using gpt-3.5-turbo, let me try and create a fresh environment and see if that helps, but you're suggesting to use the latest version of llama_index, yeah?
yea, like, it shouuuuld be working πŸ™
alright, lemme try again with openai 1.2.4, which gets installed through LlamaIndex, yeah?
That seems to work; I guess the issue was the dependency on Langchain
thank you @Logan M