Find answers from the community

Drewzy
Are the llama packs designed to be combined? For example, if I wanted to set up an app using Ollama, a hybrid retriever, and a FAISS db, would I combine, for instance, the Ollama pack and the hybrid fusion pack? Or would I modify one of them to include the parameters? Sorry for the newb question; I've looked through both and found myself stuck.
3 comments
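The packs are mostly independent starting points, and the fusion step at the heart of a hybrid retriever is essentially reciprocal rank fusion, which doesn't care which LLM or vector store produced the rankings. A minimal pure-Python sketch of that step (document IDs are made up for illustration):

```python
def reciprocal_rank_fusion(rankings, k=60):
    """Merge several ranked lists of doc IDs into one.

    Each document scores the sum over lists of 1 / (k + rank),
    so items ranked highly by several retrievers bubble up.
    """
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Example: fuse a vector-retriever ranking with a BM25 ranking.
vector_hits = ["doc_a", "doc_b", "doc_c"]
bm25_hits = ["doc_b", "doc_d", "doc_a"]
fused = reciprocal_rank_fusion([vector_hits, bm25_hits])
```

Because the fusion only sees ranked lists, swapping in FAISS as the vector store or Ollama as the LLM shouldn't change this part at all.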
LlamaIndex always wants to use OpenAI, despite my specifying not to use it in my app. I'm assuming that I'm calling the models incorrectly. Can somebody look at my code and let me know what I'm doing wrong? https://pastebin.com/9ddUR9mb As of now, the only way I can get it to work is by modifying both llms/utils.py and embeddings/utils.py within the llama_index module.
5 comments
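LlamaIndex falls back to OpenAI whenever no LLM or embedding model has been configured, so patching utils.py shouldn't be necessary. In recent versions (0.10+) the usual fix is to set the global defaults before building any index; a sketch, where the specific model names are just assumptions:

```python
from llama_index.core import Settings
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Set the global defaults once, up front; with both of these set,
# LlamaIndex has no reason to fall back to OpenAI.
Settings.llm = Ollama(model="llama3", request_timeout=120.0)
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")
```

If only one of the two is set, the other still defaults to OpenAI, which is a common source of this symptom.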
For some reason, VectorStoreIndex isn't working for me in 0.8.51.
1 comment
Is there any reason to use both Settings and ServiceContext?
2 comments
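As far as I know, no: Settings was introduced in v0.10 as the replacement for ServiceContext, which is deprecated, so new code should use one or the other rather than both. A hedged sketch of the newer style:

```python
from llama_index.core import Settings

# Global defaults now live on Settings (the old ServiceContext is deprecated).
Settings.chunk_size = 512

# Per-call overrides are passed directly to the constructor instead of
# bundling them into a ServiceContext, e.g. (assuming docs and a model exist):
# index = VectorStoreIndex.from_documents(docs, embed_model=my_embed_model)
```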
I cannot for the life of me find where to allow remote code for the embedding model. Can anybody point me in a good direction?
8 comments
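If the embedding model comes from Hugging Face, the flag you're after is most likely trust_remote_code, which HuggingFaceEmbedding forwards to the underlying transformers loader. A sketch, where the model name is just an example of a repo that ships custom code:

```python
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# trust_remote_code lets transformers run the custom modeling code from
# the model's repository; only enable it for repos you trust.
embed_model = HuggingFaceEmbedding(
    model_name="nomic-ai/nomic-embed-text-v1",
    trust_remote_code=True,
)
```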