Running example.ts, I got:
APIConnectionError: Connection error.
at OpenAI.makeRequest (/Users/lucas/dev/llamaindex/node_modules/.pnpm/openai@4.0.1/node_modules/openai/src/core.ts:462:5)
at processTicksAndRejections (node:internal/process/task_queues:95:5) {
...
cause: FetchError: request to https://api.openai.com/v1/embeddings failed, reason: connect ETIMEDOUT 108.160.161.20:443
...
code: 'ETIMEDOUT'
}
Seeing "cause: FetchError: request to
https://api.openai.com/v1/embeddings failed, reason: connect ETIMEDOUT 108.160.161.20:443", I realized this is because I am behind a proxy server. I can access ChatGPT in the browser with the proxy configured, so I thought LlamaIndex TS should also be able to reach the OpenAI API via a proxy. Digging into llm/openai.ts, I found that getOpenAISession accepts an options argument, which is then passed through to the openai client:
export function getOpenAISession(
options: ClientOptions & { azure?: boolean } = {}
) {
...
session = new OpenAISession(options);
...
return session;
}
And I know the openai@4.0.1 package supports an httpAgent option, which can be used to specify a proxy.
So my question is: how do I enable a proxy in LlamaIndex?
Thanks a lot.