WillFuks
Joined September 25, 2024
Hi guys, do you know if it's possible to use the `response_format` option from the OpenAI client (https://github.com/openai/openai-python/blob/main/src/openai/resources/chat/completions.py#L69) in llama_index?

From what I understood from the code, no extra kwargs are forwarded to the client, but I'm not sure if there's some other way around it.
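One pattern that some llama_index wrappers support is an `additional_kwargs` dict merged into each underlying API call; whether a given version forwards `response_format` this way should be checked against its docs. Below is a minimal, library-free sketch of that kwarg-forwarding idea — `build_completion_kwargs` is a hypothetical helper standing in for the wrapper's internal call construction, not real llama_index code:

```python
# Hypothetical sketch of how a wrapper could forward extra client options
# (e.g. response_format) into the underlying chat.completions.create call.
def build_completion_kwargs(model, messages, additional_kwargs=None):
    """Merge user-supplied extra kwargs into the base request payload."""
    kwargs = {"model": model, "messages": messages}
    kwargs.update(additional_kwargs or {})  # extra options win
    return kwargs

kwargs = build_completion_kwargs(
    "gpt-4o-mini",
    [{"role": "user", "content": "Reply in JSON."}],
    additional_kwargs={"response_format": {"type": "json_object"}},
)
# kwargs now carries response_format alongside model and messages
```

If the wrapper you use exposes such a dict at construction time, passing `{"response_format": {"type": "json_object"}}` there would reach the client without any code changes; otherwise, subclassing or patching the request-building step would be needed.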
1 comment
Hi everyone, just wondering: is it possible to pipeline query execution across different indices?

The use case would be to first send a query to one index, then use its result as input to a query against another index, i.e. a pipeline of queries and indices.
4 comments