You'll need to downgrade both llama-index and openai:
pip install llama-index==0.8.62 "openai<1.0.0"
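To double-check that the pins actually took in your environment (after restarting the kernel), something like:

from importlib.metadata import version

print(version("llama-index"))  # expect 0.8.62
print(version("openai"))       # should be below 1.0.0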
if I downgrade to a version that's too old, I get this:
ImportError: cannot import name 'AzureOpenAIEmbedding' from 'llama_index.embeddings'
yeah, just found this too, but now I'm getting this error: openai.error.InvalidRequestError: Resource not found
did you set a global service context?
from llama_index import ServiceContext, set_global_service_context

# llm, prompt_helper and embed_model are your Azure-configured components
service_context = ServiceContext.from_defaults(
    llm=llm,
    prompt_helper=prompt_helper,
    embed_model=embed_model,
)
set_global_service_context(service_context)
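(here llm and embed_model are the Azure wrappers; with llama-index 0.8.62 and openai<1.0.0 the setup looks roughly like this, as a sketch only: the deployment names are placeholders and constructor kwargs can differ between 0.8.x releases)

import openai

# With openai<1.0.0, Azure credentials are configured on the openai module itself.
openai.api_type = "azure"
openai.api_base = "https://<your-resource>.openai.azure.com/"  # placeholder
openai.api_version = "2023-05-15"                              # placeholder
openai.api_key = "<your-azure-openai-key>"                     # placeholder

from llama_index.llms import AzureOpenAI
from llama_index.embeddings import AzureOpenAIEmbedding

# engine / deployment_name are the names of your Azure deployments (placeholders here)
llm = AzureOpenAI(engine="<chat-deployment>", model="gpt-35-turbo")
embed_model = AzureOpenAIEmbedding(
    model="text-embedding-ada-002",
    deployment_name="<embedding-deployment>",
)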
you did both and still get that error?
yeah, but I just restarted the Jupyter kernel (even though I had already done that after the downgrades), and it seems to work now
do you know when the Azure handling in llama_index will be updated (or will it be written in the changelog?)
it's not up to us, it will be when Azure updates their API version to be compatible with the new openai client
btw if you need Azure access to debug anything, don't hesitate to ask me
azure is lowkey the worst library to support hahaha
Someone on our team has access too. I should probably get the details from them at some point lol
also, it seems the chat engine here doesn't take the query engine into account
from llama_index.chat_engine import SimpleChatEngine
from llama_index.composability.joint_qa_summary import QASummaryQueryEngineBuilder

# Build a combined QA/summary query engine over the documents.
query_engine_builder = QASummaryQueryEngineBuilder(
    qa_text=qa_text,
    summary_text=summary_text,
    service_context=service_context,
)
query_engine = query_engine_builder.build_from_documents(documents)

chat_history = []
chat_engine = SimpleChatEngine.from_defaults(
    query_engine=query_engine,
    chat_history=chat_history,
    verbose=True,
)
chat_engine.chat_repl()
I'm using this code, but the chat engine just doesn't answer the question using the context
Example: Human: What is the context about?
Assistant: I'm sorry, but I need more information to provide you with the context. Could you please provide more details or specify what you are referring to?
SimpleChatEngine doesn't use the query engine (it just talks to the LLM by itself)
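if you want a chat engine that actually goes through the query engine, CondenseQuestionChatEngine should do it, roughly like this (a sketch for llama_index 0.8.x; kwargs may vary):

from llama_index.chat_engine import CondenseQuestionChatEngine

# Each user message is condensed into a standalone question and answered
# through query_engine, so the underlying index is actually used.
chat_engine = CondenseQuestionChatEngine.from_defaults(
    query_engine=query_engine,
    chat_history=chat_history,
    verbose=True,
)
chat_engine.chat_repl()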
finally, using the summary index / the vector store index directly gives good answers compared to using the QASummaryQueryEngineBuilder
Like when I ask for names, the former answers properly, whereas the latter tells me it doesn't have the answer
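to be concrete, by "directly" I mean something like this (a sketch; the exact index construction may differ):

from llama_index import VectorStoreIndex

# Query the vector store index directly, without the QA/summary router.
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
query_engine = index.as_query_engine()
print(query_engine.query("What names are mentioned in the documents?"))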
is the node system breaking everything?
and also, with Azure and French requests, I often get: openai.error.InvalidRequestError: The response was filtered due to the prompt triggering Azure OpenAI's content management policy. Please modify your prompt and retry. To learn more about our content filtering policies please read our documentation:
https://go.microsoft.com/fwlink/?linkid=2198766
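one way to keep the loop alive when that happens, as a sketch assuming openai<1.0.0 (where the content filter surfaces as openai.error.InvalidRequestError):

import openai

def safe_query(query_engine, question):
    """Run a query and surface Azure content-filter rejections instead of crashing."""
    try:
        return query_engine.query(question)
    except openai.error.InvalidRequestError as exc:
        if "content management policy" in str(exc):
            # Azure filtered the prompt: reword the question or adjust the
            # content filter configuration on the deployment, then retry.
            print(f"Prompt was filtered by Azure: {exc}")
            return None
        raise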