Avoid OpenAI
jorgebt
last year
Hi all, is there any way to use VectorStoreIndex.fromDocuments() without needing an OpenAI key? Curious to know if I can generate the text embeddings from the llama model I'm running locally.
4 comments
Logan M
last year
Definitely! But you'll need two things:
1. An LLM
- Local HuggingFace: https://gpt-index.readthedocs.io/en/latest/core_modules/model_modules/llms/usage_custom.html#example-using-a-huggingface-llm
- Custom: https://gpt-index.readthedocs.io/en/latest/core_modules/model_modules/llms/usage_custom.html#example-using-a-custom-llm-model-advanced
2. An embed model
- Local HuggingFace: https://gpt-index.readthedocs.io/en/latest/core_modules/model_modules/embeddings/usage_pattern.html#embedding-model-integrations
- Custom: https://gpt-index.readthedocs.io/en/latest/core_modules/model_modules/embeddings/usage_pattern.html#custom-embedding-model
Both need to be set in the service context.
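Putting the two pieces together might look like the sketch below, assuming the legacy llama_index 0.x Python API that the linked docs describe (the asker's `fromDocuments()` spelling suggests the TypeScript port, where the same idea applies but the API differs). The model names and size limits here are illustrative assumptions, not from the thread:

```python
# Sketch for the legacy llama_index 0.x API from the linked docs.
# Model names, context_window, and max_new_tokens are illustrative.
from llama_index import ServiceContext
from llama_index.llms import HuggingFaceLLM
from llama_index.embeddings import HuggingFaceEmbedding

# A local LLM loaded through HuggingFace transformers
llm = HuggingFaceLLM(
    model_name="meta-llama/Llama-2-7b-chat-hf",
    tokenizer_name="meta-llama/Llama-2-7b-chat-hf",
    context_window=2048,
    max_new_tokens=256,
)

# A local embedding model, replacing the default OpenAI embeddings
embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# Both must be set here so the index never falls back to OpenAI
service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)
```

With this `service_context` passed to the index constructor, no OpenAI key is needed for either generation or embedding.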
Logan M
last year
And if you're setting a service context, it's usually easiest to set it as the global service context and be done.
https://gpt-index.readthedocs.io/en/latest/core_modules/supporting_modules/service_context.html#setting-global-configuration
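The global setup from the linked docs can be sketched like this, again assuming the legacy llama_index 0.x API; the `"local"` shortcuts load default local models and are an assumption about that API version, not something stated in the thread:

```python
# Hypothetical global setup for the legacy llama_index 0.x API.
from llama_index import ServiceContext, set_global_service_context

service_context = ServiceContext.from_defaults(
    llm="local",          # "local" shortcut: load a default local LLM
    embed_model="local",  # default local HuggingFace embedding model
)

# Every index created afterwards uses this context automatically,
# so nothing needs to be passed per-index.
set_global_service_context(service_context)
```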
jorgebt
last year
Thanks! I was able to use this + Llama 2 7B to create a PDF reader locally.
Granted, the results are pretty... bad... lol, but it worked.
Logan M
last year
Actually, with Llama 2, it's super important that the prompt is formatted correctly.
See this message
https://discord.com/channels/1059199217496772688/1131796873670295722/1131812802194051122
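The linked Discord message isn't recoverable here, but the formatting issue refers to Llama 2's chat template from Meta's reference implementation: the model was fine-tuned to expect `[INST]`/`[/INST]` and `<<SYS>>` markers, and outputs degrade noticeably without them. A minimal sketch (the helper name is hypothetical):

```python
# Llama-2 chat markers from Meta's reference implementation.
B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"


def format_llama2_prompt(system_prompt: str, user_message: str) -> str:
    """Wrap a system prompt and user message in Llama-2's expected markers."""
    return f"{B_INST} {B_SYS}{system_prompt}{E_SYS}{user_message} {E_INST}"


prompt = format_llama2_prompt(
    "You are a helpful assistant that answers questions about PDFs.",
    "Summarize the document.",
)
print(prompt)
```

Feeding prompts in this shape (rather than raw text) is often the difference between the "pretty bad" results mentioned above and usable output from the 7B chat model.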