
Run the example.ts, I got

At a glance

The community member is experiencing an APIConnectionError when running the example.ts file, which they believe is due to being behind a proxy server. They note that they can access ChatGPT in their browser with the proxy server configured, and they think LlamaIndex should also be able to connect to the OpenAI API via a proxy.

The community member then investigates the llm/openai.ts file and finds that the getOpenAISession function can be passed options, which are then passed to the OpenAI package. They also know that the OpenAI package supports the httpAgent option, which can be used to specify a proxy.

The community member's question is how to enable a proxy in LlamaIndex.

In the comments, another community member responds that they haven't had a chance to test behind a proxy, but in theory it should work if an option is passed in. That member then says they see the issue and will cut a new version that day.

In a later comment, that member says that version 0.0.20 has been published and that the original poster should be able to pass the httpProxy/httpsProxy settings using additionalSessionOptions.

Run the example.ts, I got:

Plain Text
APIConnectionError: Connection error.
    at OpenAI.makeRequest (/Users/lucas/dev/llamaindex/node_modules/.pnpm/openai@4.0.1/node_modules/openai/src/core.ts:462:5)
    at processTicksAndRejections (node:internal/process/task_queues:95:5) {
  ...
  cause: FetchError: request to https://api.openai.com/v1/embeddings failed, reason: connect ETIMEDOUT 108.160.161.20:443
     ...
    code: 'ETIMEDOUT'
  }


See "cause: FetchError: request to https://api.openai.com/v1/embeddings failed, reason: connect ETIMEDOUT 108.160.161.20:443", I realize that's because I am behind a proxy server. I could access ChatGPT in browser with proxy server configured. So I thought maybe LlamaIndex TS should also be able to connect to openai API via proxy. Then I dig into llm/openai.ts , I found that getOpenAISession can be passed in a options. this option will then pass to openai.
Plain Text
export function getOpenAISession(
  options: ClientOptions & { azure?: boolean } = {}
) {
...
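    // options are passed straight through to the openai package's client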
    session = new OpenAISession(options);
...
  return session;
}

And I know the openai@4.0.1 package does support httpAgent as an option, which can be used to specify a proxy.
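For reference, here is a minimal sketch of doing that with the openai package directly. It assumes the https-proxy-agent package and a hypothetical local proxy at http://127.0.0.1:7890, neither of which is part of this thread:

Plain Text
import OpenAI from "openai";
import { HttpsProxyAgent } from "https-proxy-agent";

// httpAgent is a standard openai ClientOptions field; every request made by
// this client goes through the agent, and therefore through the proxy.
const client = new OpenAI({
  httpAgent: new HttpsProxyAgent("http://127.0.0.1:7890"), // hypothetical proxy URL
});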
So my question is: how do I enable a proxy in LlamaIndex?

Thanks a lot.
5 comments
@Yi Ding is llamaindex.ts able to support proxies on openai?
I haven’t had a chance to test behind a proxy. In theory it should work but we might need to pass in an option.
I see the issue. Will cut a new version today.
@lucaskhliu Just published 0.0.20. You should be able to pass in the httpProxy/httpsProxy using the additionalSessionOptions.
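A minimal sketch of what that might look like, assuming the OpenAI class in llamaindex forwards additionalSessionOptions to the underlying openai client, and again using the https-proxy-agent package with a hypothetical proxy URL:

Plain Text
import { OpenAI } from "llamaindex";
import { HttpsProxyAgent } from "https-proxy-agent";

// additionalSessionOptions is handed to getOpenAISession, which passes the
// options on to the openai client, so httpAgent should route all API
// traffic through the proxy.
const llm = new OpenAI({
  additionalSessionOptions: {
    httpAgent: new HttpsProxyAgent("http://127.0.0.1:7890"), // hypothetical proxy URL
  },
});

The embeddings path that produced the original ETIMEDOUT presumably accepts the same option, since it also goes through getOpenAISession.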