I am facing the following error every time I use tools (query_engine_tools) in OpenAIAssistantAgent

At a glance

The community member hits an APIConnectionError when using query_engine_tools with OpenAIAssistantAgent; the underlying error is an illegal header value (an empty Bearer token). Suggested fixes include making sure the OpenAI API key is set in the environment, or passing the key directly to the OpenAIEmbedding model. The error turns out to come from the embeddings call, and setting the API key directly in code resolves it.

I am facing the following error every time I use tools (query_engine_tools) in OpenAIAssistantAgent

LocalProtocolError Traceback (most recent call last)
LocalProtocolError: Illegal header value b'Bearer '

The above exception was the direct cause of the following exception:

APIConnectionError Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/openai/_base_client.py in _request(self, cast_to, options, remaining_retries, stream, stream_cls)
895 stream_cls=stream_cls,
896 )
--> 897 raise APIConnectionError(request=request) from err
898
899 return self._process_response(

APIConnectionError: Connection error.
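The b'Bearer ' value in the traceback above is the key clue: the HTTP client built the Authorization header from an empty API key. A minimal sketch of how that happens (this is an illustration, not the actual openai/httpx code):

```python
def auth_header(api_key: str) -> bytes:
    # Sketch of how an HTTP client encodes the Authorization header.
    return f"Bearer {api_key}".encode()

# With a missing/empty OPENAI_API_KEY the value degenerates to
# b'Bearer ' (note the trailing space), which httpx rejects as an
# illegal header value -- matching the LocalProtocolError above.
print(auth_header(""))
```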
12 comments
Did you set your openai key in your env?
Yes, openai_tools=[{"type": "retrieval"}] works, but custom tools are showing this error
which line of code is causing the error?
like, where does the traceback start
When I run agent.chat("anything")
do you have the full traceback? It can be useful to inspect the full call stack. So far I have no idea lol
I haven't run into this
Here are a few lines that appear before the error

=== Calling Function ===
Calling function: owasp_asvs with args: {"input": "Here is my input"}
WARNING:llama_index.llms.openai_utils:Retrying llama_index.embeddings.openai.get_embedding in 0.2524790043089481 seconds as it raised APIConnectionError: Connection error..
ah so it's embeddings, cool

you are sure you did export OPENAI_API_KEY="sk-..." ? Or os.environ["OPENAI_API_KEY"] = "sk-..." ? Maybe do both?
I am doing the 2nd one
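The in-process approach only works if it runs early enough. A minimal sketch (placeholder key value) showing the equivalent of the export, done in Python:

```python
import os

# Equivalent of `export OPENAI_API_KEY="sk-..."`, done in-process.
# This must run before any llama_index / openai client is constructed,
# since a client created earlier may have already captured an empty key.
os.environ["OPENAI_API_KEY"] = "sk-placeholder"  # hypothetical value

assert os.environ["OPENAI_API_KEY"], "key is set and non-empty"
```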
could also try passing the key directly

Python
from llama_index.embeddings import OpenAIEmbedding
from llama_index import ServiceContext, set_global_service_context

embed_model = OpenAIEmbedding(api_key=...)
service_context = ServiceContext.from_defaults(embed_model=embed_model)

set_global_service_context(service_context)


But that feels weird; this works fine for me on the latest version of llama_index
Thank you so much Logan 😁, it's working now