Updated last year

looking at https://docs.llamaindex.ai/en/stable/examples/vector_stores/SimpleIndexDemoLlama-Local.html if I don't import torch or set the torch kwargs, does it default to using CPU, or will it automatically use GPU regardless?
9 comments
It will use the GPU automatically, if it's available 👍
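A minimal sketch of that fallback logic (the `pick_device` helper is hypothetical, not part of llama_index; the actual check torch exposes is `torch.cuda.is_available()`):

```python
def pick_device() -> str:
    """Return "cuda" when torch is installed and sees a GPU, else "cpu"."""
    try:
        import torch  # optional dependency: only needed for the GPU check
        if torch.cuda.is_available():
            return "cuda"
    except ImportError:
        pass
    return "cpu"

print(pick_device())
```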

In this case, it looks like torch was just being used to set the tensor type to 16-bit (although I'm not sure that's even needed, since load_in_8bit is also set to true)
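For reference, the kwargs in question look roughly like this (an assumed sketch, not verbatim from the notebook; check the current docs for the exact shape). With load_in_8bit=True, bitsandbytes quantizes the model weights to int8, so the float16 dtype mainly matters for tensors that aren't quantized:

```python
# Rough sketch of the model kwargs from the linked notebook (assumed, not verbatim).
model_kwargs = {
    "torch_dtype": "float16",  # half precision for non-quantized tensors
    "load_in_8bit": True,      # int8 weights via bitsandbytes; needs a GPU
}
print(model_kwargs)
```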
@ilan_pinto was having some performance issues and it wasn't clear if his system was actually accessing the GPU or not
we're gonna do some troubleshooting
I think he was following some of your evaluation examples and it was taking a really long time
it wasn't clear which part was the hangup - embedding, generation, whatever
will let you know
hmm, should be able to run nvidia-smi to check GPU usage

Evals can be slow though lol
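One way to do that check from a second terminal while the eval runs (a one-shot snapshot; the fallback branch is just for machines without the NVIDIA driver installed):

```shell
# Snapshot of GPU utilization and memory; rerun (or wrap in `watch`)
# while the workload executes to see whether the GPU is actually busy.
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi --query-gpu=utilization.gpu,memory.used --format=csv
else
    echo "nvidia-smi not found (no NVIDIA driver on PATH)"
fi
```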
Sounds good!