
Avoid OpenAI

Hi all, is there any way to use VectorStoreIndex.from_documents() without needing an OpenAI key? Curious to know if I can generate the text embeddings from the Llama model I'm running locally.
And if you are setting a service context, it's usually easiest to set it as the global service context and be done with it:

https://gpt-index.readthedocs.io/en/latest/core_modules/supporting_modules/service_context.html#setting-global-configuration
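For reference, a minimal sketch of that setup (assuming a llama_index version that still has ServiceContext and the LlamaCPP integration, with llama-cpp-python and sentence-transformers installed; the model path is just a placeholder):

```python
from llama_index import ServiceContext, set_global_service_context
from llama_index.llms import LlamaCPP

# Local Llama 2 model via llama.cpp -- no OpenAI key needed for generation.
llm = LlamaCPP(model_path="./models/llama-2-7b-chat.gguf")  # placeholder path

# embed_model="local" pulls down a small HuggingFace embedding model,
# so embeddings are computed locally as well.
service_context = ServiceContext.from_defaults(
    llm=llm,
    embed_model="local",
)

# Set it globally so VectorStoreIndex.from_documents() picks it up
# without passing service_context explicitly every time.
set_global_service_context(service_context)
```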
Thanks!

I was able to use this + Llama 2 7B to create a PDF reader locally.

Granted, the results are pretty... bad... lol, but it worked.
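Something along these lines works for a local PDF reader (a sketch, not the exact code from that experiment; the file path is a placeholder, and reading PDFs with SimpleDirectoryReader needs pypdf installed):

```python
from llama_index import SimpleDirectoryReader, VectorStoreIndex

# Load the PDF and build the index -- uses the global service context set above.
documents = SimpleDirectoryReader(input_files=["./my_paper.pdf"]).load_data()
index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine()
print(query_engine.query("What is this document about?"))
```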
Actually, with Llama 2, it's super important that the prompt is formatted correctly.

See this message
https://discord.com/channels/1059199217496772688/1131796873670295722/1131812802194051122
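For context, the Llama 2 chat models were fine-tuned on a specific [INST] / <<SYS>> template, and free-form prompts tend to produce much worse output. A rough illustration of that format (the helper function is just for demonstration):

```python
def llama2_chat_prompt(user_message: str,
                       system_prompt: str = "You are a helpful assistant.") -> str:
    # Llama 2 chat template: system prompt wrapped in <<SYS>> tags,
    # user turn wrapped in [INST] ... [/INST].
    return (
        "<s>[INST] <<SYS>>\n"
        f"{system_prompt}\n"
        "<</SYS>>\n\n"
        f"{user_message} [/INST]"
    )

print(llama2_chat_prompt("Summarize this PDF in one sentence."))
```

If you're going through the LlamaCPP integration, it also exposes messages_to_prompt / completion_to_prompt hooks, and I believe llama_index ships Llama-2-specific helpers for them in llama_index.llms.llama_utils (worth double-checking for your version).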