Is there a nice example for using huggingface models in RAG? I think I'm still doing something wrong, maybe I can spot what I'm doing differently
7 comments
there are some extensive notebooks in the docs

This one is probably a good example
https://colab.research.google.com/drive/1UoPcoiA5EOBghxWKWduQhChliMHxla7U?usp=sharing

But also, huggingface can be a bit annoying to set up. If you have limited resources, I also like using ollama for the LLM and huggingface for embeddings:

Python
from llama_index import ServiceContext
from llama_index.llms import Ollama
from llama_index.embeddings import HuggingFaceEmbedding

# Ollama runs the LLM locally; huggingface provides the embedding model
service_context = ServiceContext.from_defaults(
  llm=Ollama(model="mistral", request_timeout=300.0),
  embed_model=HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5"),
)
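
And if you want to see the full RAG flow with that service_context, a minimal sketch could look like this (the ./data folder and the question are just placeholders):

Python
from llama_index import SimpleDirectoryReader, VectorStoreIndex

# load whatever documents you have locally (./data is just an example path)
documents = SimpleDirectoryReader("./data").load_data()

# build the index using the service_context defined above
index = VectorStoreIndex.from_documents(documents, service_context=service_context)

# ask a question against the indexed documents
query_engine = index.as_query_engine()
print(query_engine.query("What are these documents about?"))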
oh that's great stuff thank you so much!!!
I'd really like to run GGUF or GPTQ models, but I haven't figured out how to do that yet
and now that you've done it, I'd like to see a sample for an ollama RAG too 😄
oh I just saw you already put ollama code there, so I have to learn to read more patiently
yea! Ollama is so easy to use 🙂
great, and again thank you so much. I'm sure I'll come back with new questions, but for today I have to finish.. it's been 13 hours at the desk already