Updated 2 months ago

Downgrade

sorry for disturbing again, but with Azure OpenAI I'm encountering an issue like this one: https://discord.com/channels/1059199217496772688/1172580616999534593 but for the CondenseQuestion chat engine, and downgrading is not sufficient
33 comments
You'll need to downgrade both llama index and openai

pip install llama-index==0.8.62 "openai<1.0.0"
if the downgrade is too old it shows me this:
ImportError: cannot import name 'AzureOpenAIEmbedding' from 'llama_index.embeddings'
right because the way azure embeddings work changed.

Here's a link to the docs at that version
https://gpt-index.readthedocs.io/en/v0.8.62/examples/customization/llms/AzureOpenAI.html
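For reference, the setup at that version looks roughly like this (a sketch based on the linked docs; the endpoint, key, and deployment names are placeholders you'd replace with your own):

```python
# Sketch of Azure OpenAI setup on llama-index 0.8.62 with openai<1.0.0.
# All endpoint/key/deployment values below are placeholders.
import openai
from llama_index.llms import AzureOpenAI
from llama_index.embeddings import OpenAIEmbedding
from llama_index import ServiceContext, set_global_service_context

openai.api_type = "azure"
openai.api_base = "https://<your-resource>.openai.azure.com/"
openai.api_version = "2023-07-01-preview"
openai.api_key = "<your-api-key>"

llm = AzureOpenAI(
    engine="<your-llm-deployment>",  # the Azure deployment name, not the model name
    model="gpt-35-turbo",
    temperature=0.0,
)
embed_model = OpenAIEmbedding(
    deployment_name="<your-embedding-deployment>",
    model="text-embedding-ada-002",
)

service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)
set_global_service_context(service_context)
```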
yeah just found this too, but now I encounter this error: openai.error.InvalidRequestError: Resource not found
did you set a global service context?
Plain Text
from llama_index import ServiceContext

service_context = ServiceContext.from_defaults(
    llm=llm,
    prompt_helper=prompt_helper,
    embed_model=embed_model,
)
I've done this
Plain Text
from llama_index import set_global_service_context

set_global_service_context(service_context)
and then set_global_service_context(service_context)
you do both and get that error?
yeah, but I just restarted the Jupyter kernel (even though I had already done it after the downgrades), and it seems to work now
do you know when the azure handling in llama_index will be updated (or will it be written in the changelog?)
it's not up to us, it will be when azure updates their api version to be compatible with the new openai client πŸ€”
ow they are not lmao
btw if you need azure access to debug anything, don't hesitate to ask me
azure is lowkey the worst library to support hahaha

Someone on our team has access too. I should probably get the details from them at some point lol
also, here it seems the chat engine doesn't use the query engine
Plain Text
query_engine_builder = QASummaryQueryEngineBuilder(
    qa_text=qa_text,
    summary_text=summary_text,
    service_context=service_context,
)
query_engine = query_engine_builder.build_from_documents(documents)

chat_history = []
chat_engine = SimpleChatEngine.from_defaults(
    query_engine=query_engine,
    chat_history=chat_history,
    verbose=True,
)
chat_engine.chat_repl()
I use this code, but the chat engine just doesn't answer questions using the context
Example: Human: What is the context about?
Assistant: I'm sorry, but I need more information to provide you with the context. Could you please provide more details or specify what you are referring to?
Simple chat engine doesn't use the query engine (it's just talking to the LLM by itself)
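If you want the chat engine to actually route through the query engine, something like CondenseQuestionChatEngine should work at that version (a sketch; it assumes the `query_engine` built in the snippet above):

```python
# Sketch: wrap an existing query engine in a chat engine that uses it
# (llama-index 0.8.x). Assumes `query_engine` was built as shown above.
from llama_index.chat_engine import CondenseQuestionChatEngine

chat_engine = CondenseQuestionChatEngine.from_defaults(
    query_engine=query_engine,
    verbose=True,
)
chat_engine.chat_repl()
```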
Ok, I understand better now
finally, using the summary index / the vector store index directly
gives good answers compared to using the QASummaryQueryEngineBuilder
Like when I ask for names, the first one answers properly, whereas the second one tells me it doesn't have the answer
what is different?
is the node system breaking everything?
and also, with Azure and French requests, I often get openai.error.InvalidRequestError: The response was filtered due to the prompt triggering Azure OpenAI's content management policy. Please modify your prompt and retry. To learn more about our content filtering policies please read our documentation: https://go.microsoft.com/fwlink/?linkid=2198766
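There's no llama_index switch for that filter; a common workaround is to catch the rejection and retry (or rephrase). A minimal, library-agnostic sketch, where the exception matching on the message text is an assumption you'd adapt to the real exception type your openai version raises:

```python
# Minimal retry helper for calls that may be rejected by Azure's content filter.
# `is_content_filter_error` matches on the message text from the error above;
# adapt it to the concrete exception class your openai version raises.
import time

def is_content_filter_error(exc: Exception) -> bool:
    return "content management policy" in str(exc)

def call_with_retries(fn, *args, retries=3, delay=0.0, **kwargs):
    """Call fn; on a content-filter rejection, retry up to `retries` times."""
    last_exc = None
    for _attempt in range(retries):
        try:
            return fn(*args, **kwargs)
        except Exception as exc:
            if not is_content_filter_error(exc):
                raise  # unrelated error: surface it immediately
            last_exc = exc
            time.sleep(delay)
    raise last_exc
```

In practice, rephrasing the prompt (or adjusting the content-filter configuration on the Azure resource) is more reliable than blind retries, since the same prompt usually trips the same filter.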