I get a 404 error or an `APIConnectionError`, depending on how I query the query engine (or when it's wrapped by a Context Augmented Agent). I've attached my code in a text file because I don't think it will fit here; the traceback is included in the code as well. Exception: `APIConnectionError: Connection error.`
I use the `AzureOpenAI` class/wrapper via LangChain, and it works fine on its own when I create the object and prompt it in a simple notebook. But when it's wrapped in an index/engine, it starts to have connection issues, as shown in my code/traceback:

```python
context_agent = ContextRetrieverOpenAIAgent.from_tools_and_retriever(
    query_engine_tools,
    context_index.as_retriever(similarity_top_k=1),
    verbose=True,
)
```
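A common cause of this symptom is that the index/engine falls back to the default (non-Azure) OpenAI client when the Azure LLM is not passed in explicitly, so the agent's internal calls hit the wrong endpoint. A minimal sketch of wiring the Azure LLM through explicitly, assuming the legacy `llama_index` `ServiceContext` API; the deployment name, endpoint, and API version below are placeholders, not values from the original report:

```python
from llama_index import ServiceContext, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms import AzureOpenAI

# All credential values here are hypothetical placeholders.
llm = AzureOpenAI(
    engine="my-deployment",  # the Azure *deployment* name, not the model name
    api_key="...",
    azure_endpoint="https://my-resource.openai.azure.com/",
    api_version="2023-07-01-preview",
)

# Pass the Azure LLM explicitly so the query engine does not fall back
# to the default OpenAI client (which would 404 against Azure).
service_context = ServiceContext.from_defaults(llm=llm)

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
query_engine = index.as_query_engine()
```

Note that the embedding model defaults to the standard OpenAI endpoint as well, so if embeddings are involved it may also need an Azure-configured embedding model in the same `ServiceContext`.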
```python
if not isinstance(llm, OpenAI):
    raise ValueError("llm must be a OpenAI instance")
```

The same `isinstance(llm, OpenAI)` check appears again, this time in rags/core/agent_builder/utils.py. The result is that, when I was trying to use this very convenient RAGs app with a local LLM (via Ollama in this case), it started using the ReActAgent protocol instead of the function protocol that OpenAI supports. But other LLMs support the function protocol too.

```python
>>> from llama_index.llms import OpenAI, AzureOpenAI
>>> openai = OpenAI()
>>> azure = AzureOpenAI(engine="fake")
>>> isinstance(openai, OpenAI)
True
>>> isinstance(azure, OpenAI)
True
```
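The REPL session above works only because `AzureOpenAI` happens to subclass `OpenAI`; any other function-calling LLM class fails the gate. A self-contained sketch with hypothetical stand-in classes (not the real `llama_index` ones) showing the difference between the class-based gate and a capability-based check; the `is_function_calling_model` flag here is an illustrative attribute, loosely modeled on the capability metadata the library exposes:

```python
# Stand-ins for the real classes, to illustrate why isinstance(llm, OpenAI)
# admits AzureOpenAI (a subclass) but rejects other function-calling LLMs.

class OpenAI:
    is_function_calling_model = True

class AzureOpenAI(OpenAI):
    # Subclass of OpenAI, so the isinstance gate happens to pass.
    pass

class Ollama:
    # Not in the OpenAI class hierarchy, so the isinstance gate fails,
    # even if the underlying model supports the function protocol.
    is_function_calling_model = True

def function_protocol_by_class(llm) -> bool:
    # The gate quoted above: only OpenAI and its subclasses qualify.
    return isinstance(llm, OpenAI)

def function_protocol_by_capability(llm) -> bool:
    # Duck-typed alternative: ask the instance, not its class.
    return getattr(llm, "is_function_calling_model", False)

print(function_protocol_by_class(AzureOpenAI()))      # True
print(function_protocol_by_class(Ollama()))           # False
print(function_protocol_by_capability(Ollama()))      # True
```

A capability check like this would let non-OpenAI LLMs that support function calling use the function-agent path instead of being forced onto `ReActAgent`.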