I have deployed my custom LLM model on Azure

I have deployed my custom LLM on Azure. Now I want to use llama_index for indexing, Q&A, etc. Can you help me with how to do this? Thanks.
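For reference, once a custom LLM is wired into llama_index, indexing and Q&A follow the usual pattern. A minimal sketch, assuming the llama_index.core package layout and a placeholder ./data folder:

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex, Settings

# Settings.llm should point at the custom Azure-hosted model once it has been
# wrapped as a llama_index LLM (see the sketches further down the thread).
# Settings.llm = my_azure_llm  # hypothetical LLM object

documents = SimpleDirectoryReader("./data").load_data()  # "./data" is a placeholder path
index = VectorStoreIndex.from_documents(documents)       # embeds and indexes the documents

query_engine = index.as_query_engine()
response = query_engine.query("What does the document say about X?")
print(response)
```

Note that indexing also needs an embedding model (Settings.embed_model); a model deployed only for text generation does not cover that step.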
I don't know if you wanted to do it yourself, but that's my implementation.
Thanks for the solution, but I have a specific requirement: I have an open-source model, Flan-T5-Large, and I have Azure cloud. What I want to do is deploy that model on Azure and use its API for inference. So I want to integrate its API.
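One way to integrate an arbitrary inference API is llama_index's CustomLLM base class. The sketch below assumes a hypothetical Azure ML online endpoint serving Flan-T5-Large; the scoring URL, key, and the JSON payload/response keys depend entirely on your scoring script and are placeholders, not a known API:

```python
import requests
from typing import Any

from llama_index.core.llms import (
    CustomLLM,
    CompletionResponse,
    CompletionResponseGen,
    LLMMetadata,
)
from llama_index.core.llms.callbacks import llm_completion_callback


class AzureFlanT5(CustomLLM):
    """Calls a hypothetical Flan-T5 scoring endpoint over HTTP."""

    scoring_url: str = "https://<your-endpoint>/score"  # placeholder
    api_key: str = "<endpoint-key>"                     # placeholder
    context_window: int = 2048
    num_output: int = 256
    model_name: str = "flan-t5-large"

    @property
    def metadata(self) -> LLMMetadata:
        return LLMMetadata(
            context_window=self.context_window,
            num_output=self.num_output,
            model_name=self.model_name,
        )

    @llm_completion_callback()
    def complete(self, prompt: str, **kwargs: Any) -> CompletionResponse:
        resp = requests.post(
            self.scoring_url,
            headers={"Authorization": f"Bearer {self.api_key}"},
            json={"prompt": prompt},  # payload shape depends on your scoring script
            timeout=60,
        )
        resp.raise_for_status()
        text = resp.json()["generated_text"]  # response key is an assumption
        return CompletionResponse(text=text)

    @llm_completion_callback()
    def stream_complete(self, prompt: str, **kwargs: Any) -> CompletionResponseGen:
        # This sketch does not stream from the endpoint; it yields the full answer once.
        yield self.complete(prompt, **kwargs)
```

Once the class is defined, `Settings.llm = AzureFlanT5()` (from llama_index.core) makes it the default LLM for indexes and query engines.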
I have just integrated the Azure API too; you just need to change the deployment name that is called in the LLM setup, assuming your LLM is one of the deployments in Azure.
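In code, that deployment-name swap looks roughly like this, assuming the model is exposed as an Azure OpenAI deployment (it does not cover a custom Flan-T5 endpoint). The module path follows the llama-index-llms-azure-openai package; the endpoint, key, and names are placeholders:

```python
from llama_index.llms.azure_openai import AzureOpenAI

llm = AzureOpenAI(
    engine="my-deployment-name",  # the Azure deployment name you would swap out
    model="gpt-35-turbo",         # underlying model of that deployment
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    api_key="<azure-openai-key>",
    api_version="2023-07-01-preview",
)

print(llm.complete("Hello"))
```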
I am thinking of using Docker or Azure ML.
I am a beginner in Azure; can you tell me the best option possible? Thanks for the help.