SlapDrone
Bit of a hail mary here, just throwing this out there in case any devs know what might be going on:

In short: the validate_model function on the custom LlamaIndex BaseModel (in the deepest, darkest depths of the codebase) has recently, somehow, started destroying perfectly good Pydantic output responses (FYI: from a RetrieverQueryEngine with a fairly complex output_cls) at the eleventh hour. I can see that the output is there from the LLM before it hits the validation function; it even gets registered correctly in the event callback, and I see the correct outputs in Arize Phoenix. But the response from the query engine is just a blank BaseModel.

I can't say right now exactly which version this cropped up in, but it's certainly somewhere between 0.9 and 0.10. It's a bit of a labyrinth in there; I'll write up a proper bug report and create an MWE when I have the time, but I'm taking a shot at fixing it in the meantime as I'm working to a tight deadline.
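For context, the setup is roughly this shape (the data path and output model here are placeholders, not the real, much more complex ones):

from typing import List
from pydantic import BaseModel
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

class Song(BaseModel):
    title: str
    length_seconds: int

class Album(BaseModel):
    name: str
    artist: str
    songs: List[Song]

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# builds a RetrieverQueryEngine under the hood
query_engine = index.as_query_engine(output_cls=Album, response_mode="compact")

response = query_engine.query("Summarise the album described in these documents.")
# expected: a populated Album; what I'm seeing: an Album with blank/default fields
print(response.response)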
SlapDrone

Few shot

Since functions are apparently prompts now: if this is useful to anyone, I just made a cheeky wee few_shot decorator that lets you define examples which get injected into your function's docstring: https://github.com/SlapDrone/few-shot

pip install few-shot

Feedback/contribs welcome. Tested it a bit with Marvin. Next: adapt it to work with the likes of OpenAIPydanticProgram.
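The gist, as a toy sketch (not the library's actual API, check the repo for the real interface), is just gluing rendered examples onto the docstring:

from typing import Callable, Sequence

def few_shot_sketch(examples: Sequence[str]) -> Callable:
    # toy version: append the rendered examples to the function's docstring,
    # so anything that builds a prompt from the docstring picks them up
    def decorator(func: Callable) -> Callable:
        rendered = "\n".join(f"    {ex}" for ex in examples)
        func.__doc__ = (func.__doc__ or "").rstrip() + "\n\nExamples:\n" + rendered
        return func
    return decorator

@few_shot_sketch([
    "classify('loved it') -> 'positive'",
    "classify('never again') -> 'negative'",
])
def classify(text: str) -> str:
    """Classify the sentiment of a piece of text."""
    ...

print(classify.__doc__)  # docstring now carries the examples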