unbittable
Offline, last seen 3 months ago
Joined September 25, 2024
OK, running into a problem here. When I create a DecomposeQueryTransform and a TransformQueryEngine, and then query it with a string (as per the examples[0]), I get the error "Invalid Prompt Type". This seems to be because by default the DecomposeQueryTransform initializes itself with a DecomposeQueryTransformPrompt, which by default has prompt_type=PromptType.CUSTOM, which is not in the list of prompt types checked in MockLLMPredictor._predict(). What am I doing wrong here? How do I fix this?

[0] https://gpt-index.readthedocs.io/en/v0.6.0/how_to/query/query_transformations.html#single-step-query-decomposition
6 comments
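As a minimal, self-contained sketch of the failure mode described above (this is illustrative Python, not LlamaIndex source; `HANDLED_PROMPT_TYPES` and `mock_predict` are hypothetical stand-ins for the check inside `MockLLMPredictor._predict`): the predictor dispatches on the prompt's `prompt_type`, and a type not in its allow-list raises the "Invalid Prompt Type" error, which is what happens when the default `DecomposeQueryTransformPrompt` carries `PromptType.CUSTOM`.

```python
from enum import Enum


class PromptType(Enum):
    # illustrative subset of prompt types; CUSTOM mirrors the default
    # prompt_type on DecomposeQueryTransformPrompt described above
    SUMMARY = "summary"
    QUESTION_ANSWER = "question_answer"
    CUSTOM = "custom"


# hypothetical stand-in for the allow-list checked in MockLLMPredictor._predict
HANDLED_PROMPT_TYPES = {PromptType.SUMMARY, PromptType.QUESTION_ANSWER}


def mock_predict(prompt_type: PromptType, query: str) -> str:
    if prompt_type not in HANDLED_PROMPT_TYPES:
        # the failure mode from the question: CUSTOM is not in the checked list
        raise ValueError("Invalid Prompt Type")
    return f"mock answer for: {query}"
```

Given that dispatch pattern, the usual fixes are either to query with a real (non-mock) LLM predictor, or to construct the transform with a prompt whose `prompt_type` the mock predictor recognizes.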
Also, where are retriever_modes covered? I'm looking for the possible values and what each one does. Kapa sends me to docs about response modes when I ask it.
7 comments
Is there a list somewhere of response modes and what they do? The only one I see referenced at all in the docs is "tree_summarize".
4 comments
Definitely not just an interface change. That facility is completely gone, and I'm digging into the code to find a way to serialize to a string.
31 comments
so does the query string get passed to the model which will re-calculate the embeddings anyway, then? Or... since (to my understanding) the model doesn't understand the meaning of the string, only the tokens it's parsed into, how does the model use the raw string to synthesize the answer without having to re-embed it anyway?
11 comments
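A minimal sketch of the usual pattern behind that question (illustrative only, not LlamaIndex internals; `embed`, `cosine`, and `answer` are hypothetical stand-ins with a toy character-count embedding): the query is embedded once, for retrieval only; synthesis interpolates the raw query string into the prompt, which the LLM merely tokenizes, so no second embedding of the query is needed.

```python
def embed(text: str) -> list[float]:
    # toy stand-in embedding: character-frequency vector over a tiny alphabet
    return [float(text.lower().count(c)) for c in "abcdefghij"]


def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0


def answer(query: str, nodes: list[str]) -> str:
    q_vec = embed(query)  # the query is embedded once, here, for retrieval
    best = max(nodes, key=lambda n: cosine(q_vec, embed(n)))
    # synthesis: the raw query string goes into the prompt as plain text;
    # the LLM tokenizes it, and its embedding is never needed again
    return f"Context: {best}\nQuestion: {query}\nAnswer:"
```

In a real pipeline the returned prompt would be sent to the LLM for completion; the point is only that retrieval uses the embedding while synthesis uses the raw string.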