Find answers from the community

eden
Joined September 25, 2024
If I only ever plan on using pre-computed embeddings, why does setting Settings.embed_model = None give me an error about a missing OpenAI key? Is there a way to set embed_model to None, like there is with Settings.llm?
12 comments
When I call .retrieve() I get NodeWithScore objects, but the embedding is always None. How can I return the embedding in the NodeWithScore object as well?
7 comments
Definitely seems like a LlamaIndex issue; LangChain works without a problem.
3 comments
Anyone know when the system_prompt argument was removed from the Ollama integration? And how do I add one in the new implementation?
6 comments
It seems like CitationQueryEngine is just a fancy prompt? Is that correct?
3 comments
@kapa.ai When I call .retrieve() I get NodeWithScore objects, but the embedding is always None. How can I return the embedding in the NodeWithScore object as well?
2 comments
@kapa.ai How can I print out the intermediate prompts used in TreeSummarizer?
2 comments
@kapa.ai Is there a way to query an index using an embedding?
16 comments