Updated 8 months ago

Hi everyone, I'm getting `AttributeError: 'Response' object has no attribute 'response_gen'` for this code:

In service:

Plain Text
        ...
        llm = OpenAI(
            model=request.model.value,
            temperature=request.temperature,
            max_tokens=NUM_OUTPUTS,
        )

        service_context = ServiceContext.from_defaults(llm=llm)

        query_engine = index.as_query_engine(
            streaming=True,
            service_context=service_context,
            similarity_top_k=1,
        )

        response_stream = query_engine.query(input_text)
        def _stream_chat(generator):
            for chunk in generator:
                yield chunk

        return _stream_chat(response_stream.response_gen)


I'm calling that service from:
Plain Text
        return StreamingResponse(
            content=IDPService().query(body),
            status_code=status.HTTP_200_OK,
            media_type="text/html",
        )


Any ideas? I'm following the documentation for returning a StreamingResponse to the frontend/user. πŸ™
1 comment
This works for me πŸ€·β€β™‚οΈ

(I'm not sure if you are using v0.10.x yet, but I used the slightly newer syntax)

Plain Text
from llama_index.llms.openai import OpenAI
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.core import Document, VectorStoreIndex

index = VectorStoreIndex.from_documents([Document.example()], embed_model=OpenAIEmbedding())

query_engine = index.as_query_engine(llm=OpenAI(), streaming=True)

response = query_engine.query("Tell me about LLMs.")
for token in response.response_gen:
    print(token, end="", flush=True)