
I am using Ollama embeddings and was wondering if there is a field for query_instruction similar to HuggingFaceEmbedding:
Plain Text
from llama_index.embeddings.ollama import OllamaEmbedding

EMBED_MODEL_NAME = "mxbai-embed-large:latest"

EMBED_MODEL = OllamaEmbedding(
    model_name=EMBED_MODEL_NAME,
    base_url="http://localhost:11435",
    ollama_additional_kwargs={"mirostat": 0},
    # ? add query_instruction here?
)

For mxbai-embed-large:latest, the model card asks that queries be prefixed with "Represent this sentence for searching relevant passages: ".

If there isn't a built-in field, do I just need to manually add the prompt to the queries at retrieval time?
Plain Text
retriever.retrieve(f"Represent this sentence for searching relevant passages: {query}")
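If manual prefixing is the way to go, this is roughly what I'm picturing so the prefix lives in one place (sketch only -- `index` here stands for whatever VectorStoreIndex I built with EMBED_MODEL):
Plain Text
QUERY_INSTRUCTION = "Represent this sentence for searching relevant passages: "

def retrieve_with_instruction(retriever, query: str):
    # Prepend the instruction the model card asks for, so callers
    # don't have to remember it at every call site.
    return retriever.retrieve(f"{QUERY_INSTRUCTION}{query}")

retriever = index.as_retriever(similarity_top_k=5)  # index: hypothetical VectorStoreIndex built with EMBED_MODEL
nodes = retrieve_with_instruction(retriever, "how are the embeddings configured?")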
11 comments
It looks like you're trying to figure out how to incorporate a query instruction with the Ollama embeddings, similar to what you might find in HuggingFaceEmbedding. Unfortunately, there doesn't seem to be a built-in field for query_instruction in the OllamaEmbedding class. However, you can definitely handle this by manually adding the prompt to your queries at retrieval time, like you suggested: retriever.retrieve(f"Represent this sentence for searching relevant passages: {query}"). This way, you can ensure that your queries are formatted correctly for the model. If you need any help with this implementation, I’m here to assist!
@FatherMonkey Are you a bot? πŸ‘€
You are responding to many issues (old and new) with a very gpt-like response πŸ˜…
@jayjoshy No param for this that I'm aware of -- your solution makes sense to me
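If you want to avoid sprinkling the prefix across call sites, subclassing should also work -- rough, untested sketch, assuming OllamaEmbedding implements the standard _get_query_embedding / _aget_query_embedding hooks from BaseEmbedding:
Plain Text
from llama_index.embeddings.ollama import OllamaEmbedding

QUERY_INSTRUCTION = "Represent this sentence for searching relevant passages: "

class InstructedOllamaEmbedding(OllamaEmbedding):
    # Prepends the mxbai query instruction to queries only;
    # document/text embeddings are left untouched.
    def _get_query_embedding(self, query: str) -> list[float]:
        return super()._get_query_embedding(f"{QUERY_INSTRUCTION}{query}")

    async def _aget_query_embedding(self, query: str) -> list[float]:
        return await super()._aget_query_embedding(f"{QUERY_INSTRUCTION}{query}")

EMBED_MODEL = InstructedOllamaEmbedding(
    model_name="mxbai-embed-large:latest",
    base_url="http://localhost:11435",
    ollama_additional_kwargs={"mirostat": 0},
)
Only the queries get the prefix there; documents stay as-is, which I believe is what mxbai expects.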