WillFuks
Offline, last seen 3 months ago
Joined September 25, 2024
Contributions
WillFuks · last year
Hi guys, do you know if it'd be possible to use the option response_format from the openai client (https://github.com/openai/openai-python/blob/main/src/openai/resources/chat/completions.py#L69) in llama_index?
From what I understood of the code, no kwargs are being sent to the client, but I'm not sure whether there's some other way around it.
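The question is about whether extra options like response_format ever reach the underlying OpenAI client. A minimal sketch of that kwargs-forwarding idea is below, with a stub standing in for the real OpenAI client; the names StubOpenAIClient and LLMWrapper are illustrative, not llama_index's actual API.

```python
# Sketch of the kwargs-forwarding idea: an option like response_format only
# reaches the OpenAI client if the wrapper merges it into the API call.
# StubOpenAIClient stands in for the real openai chat-completions client.

class StubOpenAIClient:
    """Records the kwargs it receives, as the real create() call would."""

    def create(self, *, model, messages, **kwargs):
        # The real client would send these to the API; we just echo them back.
        return {"model": model, "forwarded_kwargs": kwargs}


class LLMWrapper:
    """A wrapper that stores extra kwargs and forwards them on every call."""

    def __init__(self, client, **additional_kwargs):
        self.client = client
        self.additional_kwargs = additional_kwargs  # e.g. response_format

    def chat(self, messages, model="gpt-3.5-turbo"):
        # Merging the stored kwargs into the call is what exposes
        # response_format to the client; without this line it is dropped.
        return self.client.create(
            model=model, messages=messages, **self.additional_kwargs
        )


llm = LLMWrapper(
    StubOpenAIClient(), response_format={"type": "json_object"}
)
result = llm.chat([{"role": "user", "content": "Return JSON"}])
print(result["forwarded_kwargs"])  # {'response_format': {'type': 'json_object'}}
```

If the wrapper never merges its stored kwargs into the call (as the question suspects of the code it links), the option silently disappears before reaching the client.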
1 comment
WillFuks · last year
Hi everyone, just wondering: is it possible to pipeline query execution across different indices?
The use case would be to first send a query, then use the result as the input query to another index, i.e. a pipelining of queries and indices.
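The chaining described above can be sketched with plain Python: the first index's answer becomes the query for the second. The dict-backed "indices" and the query_index helper are stand-ins for real query engines, which would retrieve nodes and synthesize an answer with an LLM.

```python
# Minimal sketch of pipelining queries across two indices: each index's
# answer is fed to the next index as its query. Dicts stand in for real
# index/query-engine objects; this is the control flow, not a real API.

def query_index(index, query):
    """Toy retrieval: look the query up in a dict."""
    return index.get(query, "no answer")


# First index maps a question to an intermediate result...
index_a = {"Who wrote the report?": "Alice"}
# ...and the second index maps that intermediate result to a final answer.
index_b = {"Alice": "Alice works in the data team."}


def pipeline(query, indices):
    """Thread the running result through each index in turn."""
    result = query
    for index in indices:
        result = query_index(index, result)
    return result


answer = pipeline("Who wrote the report?", [index_a, index_b])
print(answer)  # Alice works in the data team.
```

The same shape generalizes to any number of stages, since each step only needs the previous step's output as its query.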
4 comments