
Beginner's guide to building an API for a question-answering model

Hello Team,

I am a beginner and I'm looking to create an API for a model that can answer questions based on proprietary materials, like my own notes or books. My aim is for the model to provide answers directly from these sources. Could you please guide me on the right approach and the most suitable machine learning techniques for this? I would also appreciate recommendations on hosting platforms for the API, dataset storage, and deployment, along with an understanding of the associated costs.
If you want the model to answer from your notes and books, then you should try RAG (retrieval-augmented generation): https://docs.llamaindex.ai/en/stable/getting_started/starter_example/
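To make the RAG idea concrete, here is a toy sketch of just the retrieval step, with no external dependencies: split your notes into chunks, score each chunk against the question, and keep the best matches. A real setup (e.g. LlamaIndex from the link above) uses embeddings and an LLM instead of word overlap, but the retrieve-then-answer flow is the same. The function names and the sample text are made up for illustration.

```python
# Toy RAG retrieval: chunk the notes, rank chunks by word overlap with
# the question, return the top matches to use as LLM context.

def chunk_text(text, size=50):
    """Split a document into fixed-size word chunks."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def retrieve(question, chunks, top_k=2):
    """Rank chunks by how many question words they share."""
    q_words = set(question.lower().split())
    return sorted(chunks,
                  key=lambda c: len(q_words & set(c.lower().split())),
                  reverse=True)[:top_k]

notes = ("Photosynthesis converts light energy into chemical energy. "
         "Mitochondria are the powerhouse of the cell.")
chunks = chunk_text(notes, size=8)
context = retrieve("What does photosynthesis convert?", chunks)
# The retrieved chunks are then passed to an LLM as context, so the
# answer comes from your own material rather than the model's memory.
```

In production the chunking, embedding, and prompting are all handled by the framework; this only shows why no training is needed: you index your documents instead of changing the model.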

You don't need to train any LLM for this.
Also, for hosting, it totally depends on your needs.
I have to use that model's API further. Which platform is best for deployment?
It depends on your requirements. For instance, AWS, Azure, and GCP all work.
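Whichever platform you pick, the deployable artifact is just an HTTP endpoint that accepts a question and returns an answer. A minimal stdlib-only sketch of that layer (the `answer_question` stub stands in for your RAG pipeline; in practice a framework like FastAPI is more convenient, but the shape is the same on AWS, Azure, or GCP):

```python
# Minimal question-answering HTTP endpoint, Python stdlib only.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def answer_question(question: str) -> dict:
    """Placeholder for the real pipeline: retrieve chunks, ask the LLM."""
    return {"question": question, "answer": "stub answer"}

class QAHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON body, e.g. {"question": "..."}
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps(answer_question(payload.get("question", ""))).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def main():
    # POST {"question": "..."} to http://localhost:8000/
    HTTPServer(("0.0.0.0", 8000), QAHandler).serve_forever()
```

Deployment then just means running this process (or its FastAPI equivalent) in a container or serverless function on the platform you choose; the cost depends mostly on the LLM calls, not the web layer.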
How about Hugging Face? Will it be secure?