Can we get a stream response from the OpenAI API during a query?

Please stand by @V as I search the knowledge base. We are experiencing especially high traffic right now and might take longer than usual. :wave:
----------------------
Yes, you can get a streaming response from a GPT index query by setting the streaming parameter to True in the index.query() call. When streaming=True, the query returns a StreamingResponse object instead of a standard Response object.

Here's the relevant extract from the documentation:

Plain Text
class gpt_index.response.schema.StreamingResponse(response_gen: Optional[Generator], source_nodes: List[NodeWithScore] = <factory>, extra_info: Optional[Dict[str, Any]] = None, response_txt: Optional[str] = None)


The StreamingResponse object has a response_gen attribute, which is a generator that yields the response as it is produced. You can also use the print_response_stream() method to print the streamed response as it arrives.

Source: GPT Index Documentation - Response (https://gpt-index.readthedocs.io/en/latest/reference/response.html)
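
For illustration, here is a minimal sketch of how this looks in practice (it assumes index is an already-built GPT index, and the query text is a placeholder):

Python
# Assumes `index` is an existing GPT index (e.g. a GPTSimpleVectorIndex)
streaming_response = index.query("Summarize the documents.", streaming=True)

# Print the response as tokens arrive
streaming_response.print_response_stream()

# Alternatively, consume the underlying generator yourself
for text in streaming_response.response_gen:
    print(text, end="")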
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai I got this error: "ValueError: stream is only supported for OpenAI LLMs". I am using llm_predictor = LLMPredictor(llm=ChatOpenAI(temperature=temperature, model_name="gpt-3.5-turbo")). I think this is an OpenAI LLM; please tell me the list of OpenAI LLMs you support.
Let me check the knowledge base for you. One moment! :books:
----------------------
I apologize for the confusion. Based on the information extracts provided, I cannot find an exact list of OpenAI LLMs that support streaming. However, I can provide the list of available models mentioned in one of the extracts:

  • text-davinci-003
  • text-davinci-002
  • text-curie-001
  • text-babbage-001
  • text-ada-001
  • code-davinci-002
  • code-cushman-001
Source: Custom LLMs How-To (https://gpt-index.readthedocs.io/en/latest/how_to/customization/custom_llms.html)

Please note that this list may not be exhaustive, and it's possible that not all of these models support streaming. You may need to refer to the OpenAI API documentation or the GPT index documentation for more information on streaming support for specific models.
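
As a hedged sketch of one possible workaround (this is an assumption, not something confirmed by the extracts above): the streaming check in older gpt_index versions only accepted LangChain's completion-style OpenAI class, so switching the predictor from ChatOpenAI to OpenAI with a completion model such as text-davinci-003 may avoid the error:

Python
from langchain.llms import OpenAI
from gpt_index import LLMPredictor

# Assumption: use LangChain's completion-style OpenAI class (not ChatOpenAI),
# which older gpt_index versions accept when streaming is requested
llm_predictor = LLMPredictor(
    llm=OpenAI(temperature=0, model_name="text-davinci-003", streaming=True)
)
# Pass llm_predictor to your index the same way as in your current setup,
# then call index.query(..., streaming=True) as before.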
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------