Hello here πŸ™‚ What vectorization method

Hello here πŸ™‚ What vectorization method do you guys use? I tried BERT, RoBERTa, Doc2Vec, and Word2Vec; all were mediocre.
On another note: is it possible to use LLAMA to query only the vectors and get the retrieved output without asking GPT? I would like to have full control over how I create my GPT prompt.
5 comments
By default, llama-index uses text-embedding-ada-002 from OpenAI, which is honestly pretty good.
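For context, here is a minimal sketch of setting that embedding model explicitly and building an index (my own illustration, not from the thread; it assumes llama-index >= 0.10, where the Settings object replaced ServiceContext, and "./data" is a placeholder path):

from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.openai import OpenAIEmbedding

# text-embedding-ada-002 is also what llama-index picks by default for OpenAI
Settings.embed_model = OpenAIEmbedding(model="text-embedding-ada-002")

# Load documents and embed them into a vector index
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)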
You can get just the retrieved nodes using a retriever:

retriever = index.as_retriever(similarity_top_k=2)
nodes = retriever.retrieve("my query")
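To answer the original question about skipping the LLM call and building your own GPT prompt, here is a rough end-to-end sketch (my own expansion, not Logan's code; it assumes `index` already exists as above, uses the openai >= 1.0 client with an API key configured, and the model name and prompt wording are purely illustrative):

from openai import OpenAI

retriever = index.as_retriever(similarity_top_k=2)
nodes = retriever.retrieve("my query")

# Assemble the context yourself so you keep full control over the prompt
context = "\n\n".join(n.node.get_content() for n in nodes)
prompt = f"Answer using only this context:\n\n{context}\n\nQuestion: my query"

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)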
@Logan M Thank you again, man. Used it (text-embedding-ada-002) and it is f-ing AMAZING. It's so much better. Yeah, it's slower because you need an API call for each vector, but it's not ultra slow, and it's only slow when you convert the data; it didn't affect read speed that much. This is AMAZING! πŸ™‚