Hi guys, my understanding was that all data and docs were kept private and not sent to OpenAI. However, when I enable logging I can see the entire document being sent for embeddings, and then excerpts of it being sent for querying. Is this normal?
4 comments
Yeah, to generate a response based on the docs, LlamaIndex needs to send the context and the query to OpenAI. You can try open-source models from Hugging Face instead; in that case no data will leave your environment (see the sketch below).
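For example, pointing LlamaIndex at local Hugging Face models for both embeddings and generation looks roughly like this. This is a minimal sketch assuming the llama-index 0.10+ package layout with the Hugging Face embedding and LLM integrations installed; the model names and the "data" directory are just placeholders.

```python
# Minimal sketch: fully local indexing and querying with LlamaIndex + Hugging Face.
# Assumes llama-index >= 0.10 plus the llama-index-embeddings-huggingface and
# llama-index-llms-huggingface packages; model names below are examples only.
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.huggingface import HuggingFaceLLM

# Local embedding model: documents are embedded on your own machine.
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# Local LLM for answering queries (practically needs a GPU to be usably fast).
Settings.llm = HuggingFaceLLM(
    model_name="HuggingFaceH4/zephyr-7b-beta",
    tokenizer_name="HuggingFaceH4/zephyr-7b-beta",
)

# Build the index and run a query without any OpenAI calls.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
print(index.as_query_engine().query("What does the document say about X?"))
```

With both the embedding model and the LLM overridden like this, neither indexing nor querying should need to contact OpenAI, which are the two places the default setup sends your data.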
Great, thanks. Would that generally be quite slow on just a standard VPS?
You would need a GPU server; otherwise it will be very slow.
legend cheers