GPU

It looks like this has been asked previously, but there's no clear answer and the docs aren't super clear either -- how do I make the embedding/indexing use the GPU? Or, how do I know if it used / might use the GPU?
6 comments
You can set device="cuda" in the constructor, but it should automatically use the GPU if CUDA is available.
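(For reference, a minimal sketch of what that looks like, assuming the embedding class in question is something like LlamaIndex's HuggingFaceEmbedding, which accepts a device argument; the exact import path and constructor may differ depending on your library and version.)

```python
# Hedged sketch: assumes a LlamaIndex-style HuggingFaceEmbedding.
# Import path shown is for llama-index >= 0.10; older versions differ.
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Explicitly request the GPU; if device is omitted, the model is typically
# placed on CUDA automatically when torch.cuda.is_available() is True.
embed_model = HuggingFaceEmbedding(
    model_name="BAAI/bge-small-en-v1.5",  # example model, not from the thread
    device="cuda",
)
```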

You can run nvidia-smi in your terminal to see whether a Python process is using GPU memory.
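(A quick in-process check as well, assuming the embeddings run on PyTorch; this snippet is illustrative and not from the thread.)

```python
# Hedged sketch: confirm CUDA is visible to the process before indexing.
import torch

print(torch.cuda.is_available())           # False means embeddings fall back to CPU
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))   # e.g. the name of the attached GPU

# Then watch `nvidia-smi` (or `watch -n 1 nvidia-smi`) while indexing runs to
# confirm the Python process appears and GPU memory usage climbs.
```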
I thought that was the case.
Waiting for this environment to restart. I was getting OOM-killed because I think it wasn't picking up the GPU.
Hmm, interesting.

I know you linked a page on ONNX/Optimum embeddings earlier. Using the GPU for those is a little trickier (tbh, I never got it to work).
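(For completeness, a hedged sketch of what the Optimum/ONNX Runtime route is generally supposed to look like, based on the Optimum docs; as noted above it wasn't verified in this thread, and it requires the onnxruntime-gpu package rather than plain onnxruntime.)

```python
# Hedged sketch of GPU-backed ONNX embeddings via Optimum; untested here.
from optimum.onnxruntime import ORTModelForFeatureExtraction
from transformers import AutoTokenizer

model_id = "BAAI/bge-small-en-v1.5"  # example model, not from the thread
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = ORTModelForFeatureExtraction.from_pretrained(
    model_id,
    export=True,                          # export the model to ONNX on the fly
    provider="CUDAExecutionProvider",     # requires onnxruntime-gpu
)
```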
Yeah, no, not doing the Optimum stuff.
Just the regular embeddings.