@Logan M / @ravitheja / @Jerry Liu Our org has wrapped Azure OpenAI and exposes the completion and embedding capabilities via REST endpoints and keys. Is there a way to integrate that with LlamaIndex so that it uses these as the LLM and embedding model?
That's kind of beyond the scope of llama-index and the docs. You would just use requests.get(...) or similar to ping your REST API and get responses.
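Something like this minimal sketch, for example -- the URL, auth header name, payload fields, and response shape are all placeholders here, so swap them for whatever your org's gateway actually expects:

```python
import requests

# Hypothetical values -- replace with your org's actual REST URL and key.
API_URL = "https://your-org-gateway.example.com/v1/completions"
API_KEY = "your-key-here"

def get_completion(prompt: str) -> str:
    # POST the prompt to the wrapped completion endpoint.
    resp = requests.post(
        API_URL,
        headers={"api-key": API_KEY},
        json={"prompt": prompt, "max_tokens": 256},
        timeout=30,
    )
    resp.raise_for_status()
    # Assumes an OpenAI-style response body; adjust to your gateway's schema.
    return resp.json()["choices"][0]["text"]

print(get_completion("Hello!"))
```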