

Potential bug in instance check in llm_predictor. Prevents streaming on ChatOpenAI model

Hey friend, I think I found a smol bug. The streaming flag in LlamaIndex's base query class checks that the model is an OpenAI model. This is intentional, but it breaks when a ChatOpenAI instance is used. LangChain supports streaming for ChatOpenAI, but it gets blocked by the instance check at line 246 of llama_index/llm_predictor/base.py:

Plain Text
 ~/.local/lib/python3.9/site-packages/llama_index/llm_predictor/base.py in stream(self, prompt, **prompt_args)
    245         """
    246         if not isinstance(self._llm, OpenAI):
--> 247             raise ValueError("stream is only supported for OpenAI LLMs")
    248         formatted_prompt = prompt.format(llm=self._llm, **prompt_args)
    249         raw_response_gen = self._llm.stream(formatted_prompt)

ValueError: stream is only supported for OpenAI LLMs
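For context on why the check trips: ChatOpenAI is a chat model, not a subclass of LangChain's OpenAI LLM class, so the isinstance test rejects it even though the underlying model streams fine. A quick illustration (not from the codebase, just to show the failure mode):

Python
from langchain.chat_models import ChatOpenAI
from langchain.llms import OpenAI

# assumes OPENAI_API_KEY is set in the environment
llm = ChatOpenAI(streaming=True)

# ChatOpenAI does not inherit from the OpenAI LLM class, so this prints False,
# which is exactly what makes LLMPredictor.stream raise the ValueError above.
print(isinstance(llm, OpenAI))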
Hey @jerryjliu0, it looks like you're tracking stuff via threads, so I figured I'd make one to save you the hassle.
Ah, it looks like gpt_index is using a different pattern than the streaming functionality in LangChain:
https://python.langchain.com/en/harrison-docs-refactor-3-24/modules/models/llms/examples/streaming_llm.html
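For reference, the pattern in those docs looks roughly like this (import paths are from the LangChain version of that era, so treat it as a sketch): tokens get pushed through a callback handler as they arrive, instead of coming back from a generator the way llm_predictor's stream() expects.

Python
from langchain.callbacks.base import CallbackManager
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

# assumes OPENAI_API_KEY is set in the environment
chat = ChatOpenAI(
    streaming=True,
    callback_manager=CallbackManager([StreamingStdOutCallbackHandler()]),
    verbose=True,
    temperature=0,
)

# tokens are printed to stdout by the callback handler as they stream in
chat([HumanMessage(content="Write me a song about sparkling water.")])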
Hi @confused_skelly, just seeing this. Yeah, this has popped up from a few users; hopefully we'll have a fix for this soon.
Heya, thank you for responding!
I got it working on my end with callbacks and a threaded task, but I'd love to see that abstracted away in the query response class.
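In case it helps anyone, here's a rough sketch of that workaround (not my exact code; QueueCallbackHandler and the direct chat call are just illustrative): a custom callback handler drops tokens onto a queue while the blocking call runs in a background thread, and the main thread consumes the queue as a stream.

Python
import queue
import threading

from langchain.callbacks.base import CallbackManager
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage


class QueueCallbackHandler(StreamingStdOutCallbackHandler):
    """Push streamed tokens onto a queue instead of printing them."""

    def __init__(self, token_queue: queue.Queue):
        super().__init__()
        self.token_queue = token_queue

    def on_llm_new_token(self, token: str, **kwargs) -> None:
        self.token_queue.put(token)

    def on_llm_end(self, response, **kwargs) -> None:
        self.token_queue.put(None)  # sentinel: generation finished


token_queue: queue.Queue = queue.Queue()

# assumes OPENAI_API_KEY is set in the environment
llm = ChatOpenAI(
    streaming=True,
    callback_manager=CallbackManager([QueueCallbackHandler(token_queue)]),
    temperature=0,
)


def run_query():
    # In practice this would be whatever issues the LLM call (e.g. a query
    # against an index built on `llm`); calling the chat model directly here
    # just keeps the sketch self-contained.
    llm([HumanMessage(content="Write me a song about sparkling water.")])


# run the blocking call in a background thread and consume tokens as they arrive
threading.Thread(target=run_query, daemon=True).start()
for token in iter(token_queue.get, None):
    print(token, end="", flush=True)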
@confused_skelly Hi, I'm running into the same issue here... Can you tell me how you solved it? I'm also trying to use 'stream' with 'ChatOpenAI'.
Jerry implemented this functionality in a new pull request.
I forget the exact number,
but go to the GitHub issues; there's an open issue on this that has the PR number.
@confused_skelly Wow, thanks! If you recall more details (i.e. the issue number or keywords), please let me know! Thanks a lot, I'll look it up myself too!
It hasn't had the chance to land quite yet, and it's also a bit hacky. Feel free to check out the branch in git, take a look at the example notebook, and let me know what your thoughts are.