Updated 2 years ago

Hi, people. Is it possible to combine the HuggingFaceEmbeddings service context with the gpt-3.5-turbo model service context? I tried to test an implementation, but when loading the index and making the search query, I get this error: {"status":{"error":"Wrong input: Vector inserting error: expected dim: 768, got 1536"}}
6 comments
Definitely possible! But if you change embeddings, you'll need to start with a new index. Also, don't forget to pass the service context when loading from disk:

Python
from langchain.chat_models import ChatOpenAI
from langchain.embeddings import HuggingFaceEmbeddings
from llama_index import GPTSimpleVectorIndex, LLMPredictor, LangchainEmbedding, ServiceContext

llm_predictor = LLMPredictor(llm=ChatOpenAI(temperature=0, model_name='gpt-3.5-turbo'))
embed_model = LangchainEmbedding(HuggingFaceEmbeddings())
service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor, embed_model=embed_model)

# Build a fresh index with the new embed model (old vectors have a different dimension)
index = GPTSimpleVectorIndex.from_documents(documents, service_context=service_context)
...
# load_from_disk is a classmethod; pass the same service context when reloading
index = GPTSimpleVectorIndex.load_from_disk("index.json", service_context=service_context)
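The original error is a dimensionality mismatch: the HuggingFaceEmbeddings default model produces 768-dimensional vectors, while OpenAI's text-embedding-ada-002 produces 1536-dimensional ones, so vectors from one model can't be inserted into an index built with the other. A minimal sketch of that check in plain Python (`EMBED_DIMS` and `check_compatible` are hypothetical illustrations, not llama_index API):

```python
# Hypothetical sketch: why an index built with one embedding model
# rejects vectors from another.
EMBED_DIMS = {
    "sentence-transformers/all-mpnet-base-v2": 768,   # HuggingFaceEmbeddings default
    "text-embedding-ada-002": 1536,                   # OpenAI embeddings
}

def check_compatible(index_model: str, query_model: str) -> None:
    """Raise if the query embedding dimension differs from the index's."""
    expected = EMBED_DIMS[index_model]
    got = EMBED_DIMS[query_model]
    if expected != got:
        raise ValueError(f"Wrong input: expected dim: {expected}, got {got}")

# Same model on both sides is fine:
check_compatible("text-embedding-ada-002", "text-embedding-ada-002")

# Mixing models reproduces the error from the question:
try:
    check_compatible("sentence-transformers/all-mpnet-base-v2",
                     "text-embedding-ada-002")
except ValueError as e:
    print(e)  # Wrong input: expected dim: 768, got 1536
```

This is why changing embedding models means rebuilding the index from scratch rather than reusing the one on disk.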
Thank you, Logan. This is great. Does HuggingFaceEmbeddings() use all-mpnet-base-v2 by default?
It does! You can also pass in the name of any model from Hugging Face as well.
Results change a lot 🀯
Definitely, text-embedding-ada-002 is the best one.
Yea, OpenAI embeddings are hard to beat. Plus, they are pretty cheap haha