giulio · last year
Hi! I'm trying to run LlamaIndex fully locally, using Ollama as the embedding model. Has anyone managed to do this?
WhiteFang_Jr · last year
LlamaIndex currently supports only these embedding platforms, and sadly Ollama isn't among them:
https://docs.llamaindex.ai/en/stable/module_guides/models/embeddings.html#list-of-supported-embeddings
You can write your own custom embedding class to wrap Ollama's embeddings:
https://docs.llamaindex.ai/en/stable/examples/embeddings/custom_embeddings.html#custom-embeddings
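For anyone landing here later, a minimal sketch of what that wrapper might look like, following the custom-embeddings pattern from the second link. The model name (nomic-embed-text), the server URL, and the exact BaseEmbedding import path are assumptions here; they may differ with your Ollama setup and LlamaIndex version.

```python
from typing import List

import requests
from llama_index.embeddings.base import BaseEmbedding  # import path may vary by version


class OllamaEmbedding(BaseEmbedding):
    """Sketch of a custom embedding class that calls a local Ollama server."""

    model_name: str = "nomic-embed-text"  # assumed model; pull it in Ollama first
    base_url: str = "http://localhost:11434"  # Ollama's default local address

    def _embed(self, text: str) -> List[float]:
        # Ollama's embeddings endpoint takes a model and a prompt
        # and returns {"embedding": [...]}.
        resp = requests.post(
            f"{self.base_url}/api/embeddings",
            json={"model": self.model_name, "prompt": text},
        )
        resp.raise_for_status()
        return resp.json()["embedding"]

    def _get_query_embedding(self, query: str) -> List[float]:
        return self._embed(query)

    def _get_text_embedding(self, text: str) -> List[float]:
        return self._embed(text)

    def _get_text_embeddings(self, texts: List[str]) -> List[List[float]]:
        # One request per text; fine for small batches against a local server.
        return [self._embed(t) for t in texts]

    async def _aget_query_embedding(self, query: str) -> List[float]:
        return self._get_query_embedding(query)

    async def _aget_text_embedding(self, text: str) -> List[float]:
        return self._get_text_embedding(text)
```

You'd then pass an instance as your embed_model (e.g. ServiceContext.from_defaults(embed_model=OllamaEmbedding())) so all embedding calls stay local.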
giulio · last year
OK, I'll use the custom embeddings then. Thank you!