southVpaw
last year
Would anyone be able to help me set up LlamaIndex entirely locally? I tried to follow the examples in the docs, but it still asks me for an OpenAI API key from the example ServiceContext line.
WhiteFang_Jr
last year
You'll need to pass both the llm and embed_model so that LlamaIndex doesn't fall back to the OpenAI defaults.
This notebook will help you set up LlamaIndex locally:
https://colab.research.google.com/drive/16Ygf2IyGNkb725ZqtRmFQjwWBuzFX_kl?usp=sharing