I'm stuck!
Does anyone know how to route the default LlamaIndex requests to my Azure proxy service? I tried to do it as follows:
const serviceContext = serviceContextFromDefaults({
  llm: new OpenAI({
    session: new OpenAISession({ baseURL: 'http://azureopenaiproxy.service/handler' }),
    temperature: 0.1,
  }),
});
I have an account with 1000 tokens, but it seems like requests are still being routed to the default OpenAI API.
The error is:

429 Rate limit reached for text-embedding-ada-002 in organization org-******** on requests per min (RPM): Limit 3, Used 3, Requested 1. Please try again in 20s. Visit https://platform.openai.com/account/rate-limits to learn more. You can increase your rate limit by adding a payment method to your account at https://platform.openai.com/account/billing.
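Rereading the error, it mentions text-embedding-ada-002, so maybe only my LLM calls go through the proxy while the embedding model still hits api.openai.com? I wonder if I also need to pass the session to the embed model, something like this (untested sketch — I'm not sure `OpenAIEmbedding` accepts a `session` option the same way `OpenAI` does):

```typescript
import {
  OpenAI,
  OpenAIEmbedding,
  OpenAISession,
  serviceContextFromDefaults,
} from "llamaindex";

// One shared session pointing at the Azure proxy (same baseURL as above)
const session = new OpenAISession({
  baseURL: "http://azureopenaiproxy.service/handler",
});

const serviceContext = serviceContextFromDefaults({
  llm: new OpenAI({ session, temperature: 0.1 }),
  // Assumption: wiring the same session into the embedding model
  // would route text-embedding-ada-002 calls through the proxy too
  embedModel: new OpenAIEmbedding({ session }),
});
```

If that's not supported, is there another way to point the embedding requests at the proxy?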