Updated last year

[Bug]: can't set extra request headers w...

At a glance

A community member is having trouble using LlamaIndex with Helicone.ai or other services that proxy requests to OpenAI, and has recorded their unsuccessful attempts in a GitHub issue. Another community member suggests using the OpenAILike or LocalAI classes from the LlamaIndex library as a workaround. The original poster's issue has since been resolved by a merged fix, and a new release of LlamaIndex is forthcoming.

Hello! Has anyone tried using LlamaIndex with Helicone.ai or any other service that proxies requests to OpenAI?

I've recorded my unsuccessful attempts at https://github.com/run-llama/llama_index/issues/9082 (I suspect there's a bug in LlamaIndex).
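For context, here is a minimal sketch of the kind of setup being attempted: routing LlamaIndex's OpenAI-compatible LLM through a proxy by overriding the base URL. This is not code from the issue; the OpenAILike import path reflects a llama_index 0.9.x-era install, and the Helicone endpoint is a placeholder, so adjust both for your version and provider.

```python
import os

from llama_index.llms import OpenAILike

# Point the OpenAI-compatible client at a proxy instead of api.openai.com.
llm = OpenAILike(
    model="gpt-3.5-turbo",
    api_base="https://oai.helicone.ai/v1",  # proxy endpoint (placeholder URL)
    api_key=os.environ["OPENAI_API_KEY"],
    is_chat_model=True,
)

print(llm.complete("Hello through the proxy"))
```

At the time of the thread there was no supported way to attach extra request headers (such as a proxy's auth header) to these calls, which is the bug the issue reports.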
3 comments
@gkk I just merged a fix for this (and also commented on your ticket there)

Cutting a new release soon πŸ™‚
@Logan M wow, that was fast! Thank you!
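With the fix merged, passing the proxy's extra headers directly to the LLM should look roughly like the sketch below. This is a hedged illustration: the default_headers parameter name and the Helicone URL are assumptions, so confirm them against the release notes and the linked issue.

```python
import os

from llama_index.llms import OpenAI

llm = OpenAI(
    model="gpt-3.5-turbo",
    api_base="https://oai.helicone.ai/v1",  # proxy in front of OpenAI (placeholder URL)
    api_key=os.environ["OPENAI_API_KEY"],
    default_headers={  # extra request headers enabled by the fix (parameter name assumed)
        "Helicone-Auth": "Bearer " + os.environ["HELICONE_API_KEY"],
    },
)

print(llm.complete("Hello via Helicone"))
```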