Updated 3 months ago

Hey, do you guys know any ways businesses can try to protect their confidential information when using RAG? Is it possible to do this?
4 comments
Yes, you can use local LLMs and embedding models so data never leaves your infrastructure.
Plus, you can use PII masking to keep sensitive information from being sent to the LLM:
https://docs.llamaindex.ai/en/stable/examples/node_postprocessor/PII.html
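The linked LlamaIndex postprocessor uses an LLM or NER model to detect and mask PII in retrieved nodes before they reach the main LLM. As a minimal, self-contained sketch of the same idea (not the LlamaIndex implementation), here is a regex-based masker for a few common PII patterns; the pattern names and placeholder format are illustrative choices:

```python
import re

# Hypothetical minimal PII masker. Real systems (e.g. the LlamaIndex
# postprocessor above) use an LLM or NER model for much broader coverage;
# regexes only catch well-structured identifiers like these.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def mask_pii(text: str) -> tuple[str, dict[str, str]]:
    """Replace PII matches with placeholders; return the masked text and a
    mapping so originals can be restored after the LLM call returns."""
    mapping: dict[str, str] = {}
    for label, pattern in PII_PATTERNS.items():
        for i, match in enumerate(pattern.findall(text)):
            placeholder = f"[{label}_{i}]"
            mapping[placeholder] = match
            text = text.replace(match, placeholder)
    return text, mapping

masked, mapping = mask_pii("Contact Jane at jane@example.com or 555-123-4567.")
print(masked)  # Contact Jane at [EMAIL_0] or [PHONE_0].
```

The mapping lets you de-mask the LLM's answer locally, so the provider only ever sees placeholders.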
Another option I've heard companies consider is Microsoft's private Azure OpenAI service. Instead of sending LLM calls to the public endpoints, a business can register and spin up its own private LLM endpoints.

It really depends on the business use case and the risk analysis conducted on it. For example, one may want to employ differential privacy (on embeddings, on prompts, or elsewhere) to protect the privacy of data subjects.
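To make the differential-privacy idea concrete, here is a toy sketch that perturbs an embedding vector with Gaussian noise before it is stored or sent out. This is an illustration of the mechanism only: a formal (epsilon, delta)-DP guarantee requires calibrating the noise scale to the query's actual sensitivity, and the function name and parameters here are hypothetical:

```python
import random

def privatize_embedding(embedding: list[float],
                        epsilon: float,
                        sensitivity: float = 1.0) -> list[float]:
    """Toy Gaussian-noise perturbation of an embedding vector.

    Smaller epsilon -> larger noise -> more privacy, less utility.
    NOT a calibrated DP mechanism; a real one must derive sigma from
    the sensitivity, epsilon, and delta of the release.
    """
    sigma = sensitivity / epsilon  # illustrative scale, not a DP proof
    return [x + random.gauss(0.0, sigma) for x in embedding]

random.seed(0)  # for reproducibility of the demo
vec = [0.1, -0.4, 0.25]
noisy = privatize_embedding(vec, epsilon=2.0)
```

The trade-off is the usual one: noisier embeddings leak less about any individual document but degrade retrieval quality, so epsilon has to be tuned against the risk analysis.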
Ahh, very interesting, thank you!