Updated 2 years ago

Embed model error

Hey @Logan M, here is what I have done: I created the embeddings using HuggingFaceEmbeddings and then passed them to the OpenAI LLM for querying, but it's throwing an error. Is there a supported scenario where we can create the embeddings, save them into index.json, and then pass that index to the OpenAI LLM for querying?
What kind of error are you getting?
Oh, you need to pass the service context into the vector index:

.from_documents(documents, service_context=service_context)

And also when loading:

.load_from_disk("index.json", service_context=service_context)
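The two fragments above fit together roughly like this. This is a minimal sketch, assuming an older llama_index release (the one that exposed `GPTSimpleVectorIndex.save_to_disk` / `load_from_disk`) together with LangChain's `HuggingFaceEmbeddings`; the helper names `build_index` and `load_index` are illustrative, not from the thread.

```python
def build_index(documents, index_path="index.json"):
    """Build a vector index with HuggingFace embeddings and save it to disk."""
    # Imports are deferred so the sketch can be read without llama_index installed.
    from llama_index import GPTSimpleVectorIndex, ServiceContext
    from llama_index.embeddings.langchain import LangchainEmbedding
    from langchain.embeddings import HuggingFaceEmbeddings

    # Wrap the HuggingFace model so llama_index can use it for embeddings.
    embed_model = LangchainEmbedding(HuggingFaceEmbeddings())
    # The service context carries the embed model; the LLM used for answering
    # queries defaults to OpenAI.
    service_context = ServiceContext.from_defaults(embed_model=embed_model)

    # Pass the service context when building the index ...
    index = GPTSimpleVectorIndex.from_documents(
        documents, service_context=service_context
    )
    index.save_to_disk(index_path)
    return index


def load_index(index_path="index.json"):
    """Reload the saved index with the same service context."""
    from llama_index import GPTSimpleVectorIndex, ServiceContext
    from llama_index.embeddings.langchain import LangchainEmbedding
    from langchain.embeddings import HuggingFaceEmbeddings

    embed_model = LangchainEmbedding(HuggingFaceEmbeddings())
    service_context = ServiceContext.from_defaults(embed_model=embed_model)
    # ... and again when loading, otherwise queries fall back to the default
    # (OpenAI) embedding model and will not match the stored vectors.
    return GPTSimpleVectorIndex.load_from_disk(
        index_path, service_context=service_context
    )
```

The key point is that the service context must be supplied in both places; supplying it only at build time reproduces the mismatch error from the question.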