Anyone have a working version using Hugging Face embeddings and an LLM that I can look at to see how you did it? I'm looking to load PDF files from a dir.
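For context, here's a minimal sketch of what that usually looks like in llama_index. This assumes llama-index ≥ 0.10 with the `llama-index-embeddings-huggingface` integration installed; the directory path and the `BAAI/bge-small-en-v1.5` model name are just example choices, not anything from this thread.

```python
def build_query_engine(pdf_dir="./data"):
    """Sketch: index a directory of PDFs with a local Hugging Face embedding model."""
    # Imports are deferred so the sketch can be read without the packages installed.
    from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
    from llama_index.embeddings.huggingface import HuggingFaceEmbedding

    # Use a local Hugging Face model for embeddings instead of the OpenAI default.
    Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

    # SimpleDirectoryReader picks up PDFs (and other readable files) from the dir.
    documents = SimpleDirectoryReader(pdf_dir).load_data()

    index = VectorStoreIndex.from_documents(documents)
    return index.as_query_engine()
```

Note that the LLM side is separate from the embeddings: unless you also set `Settings.llm` (e.g. to a `HuggingFaceLLM` from `llama_index.llms.huggingface`), querying will still fall back to the default (OpenAI) LLM.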
how does this thing work? lol
the google colab thing...
Oh, Google Colab is just code that runs on a free (or paid) computer from Google haha

It runs as a notebook in your browser. If that sounds like gibberish, try Googling for what a Python notebook is 😅

In Google Colab, you can set your runtime type to use a GPU, so that you can try things out even if you don't have a GPU yourself

That specific example might require a paid tier 😅 but it illustrates how things work
I just had a scary thought lol
It involves a retired crypto mining rig lol, but that's for later. First I have to get this fixed...
In that notebook, are all those code blocks in one .py file, or is each its own file?
Also, if I were to split mine into 2 separate operations... like one to embed and add to a DB, and the other to just read from the DB. Do you think it would run lighter?
So notebooks allow you to run your code in steps/chunks

So any variables in one block will be available in any other block after you run it

It's super helpful for iterating and debugging small pieces of code
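A tiny illustration of that cell-to-cell behavior (the cell markers are just comments; in a real notebook each chunk would be its own cell sharing the same kernel):

```python
# Cell 1: define some data. Variables live in the notebook kernel, not the cell.
docs = ["report1.pdf", "report2.pdf"]  # hypothetical file names

# Cell 2: a later cell can use `docs` directly without re-defining it.
count = len(docs)
print(count)
```

So you can load your documents once in one cell and re-run the indexing or querying cells below it as often as you like while iterating.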
It would run a tiny bit lighter, but tbh the embedding model is already quite small. But maybe worth a shot
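The two-operation split mentioned above can be sketched with llama_index's built-in persistence. This is a hedged sketch, not anything posted in the thread: function names and the `./storage` path are made up, and it assumes llama-index ≥ 0.10.

```python
# Script 1: embed the documents once and persist the index to disk.
def embed_and_store(pdf_dir="./data", persist_dir="./storage"):
    from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

    documents = SimpleDirectoryReader(pdf_dir).load_data()
    index = VectorStoreIndex.from_documents(documents)
    # Writes the vectors and docstore to persist_dir for later runs.
    index.storage_context.persist(persist_dir=persist_dir)


# Script 2: later runs just reload the stored index; no re-embedding of old docs.
def load_and_query(question, persist_dir="./storage"):
    from llama_index.core import StorageContext, load_index_from_storage

    storage_context = StorageContext.from_defaults(persist_dir=persist_dir)
    index = load_index_from_storage(storage_context)
    return index.as_query_engine().query(question)
```

One caveat on "running lighter": the query script still has to load the same embedding model to embed the incoming question, so the savings are mostly in skipping the per-document embedding work, not the model load itself.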
I'm going to look into that.
I apologize if I am bugging you or being annoying. I just see a possible path to build something great and really useful using llama_index... I'm a passionate person. Lol