from llama_index.core.response import StreamingResponse

I can't import this for some reason.
Can you try with a fresh venv?

In a new terminal
pip uninstall llama-index  # remove any potential global install
python -m venv venv
source venv/bin/activate
pip install llama-index
No bueno 😦
StreamingResponse
Actually, never mind, it's not there.
Looks like it got moved, and the backwards compatibility import got missed during the v0.10.x explosion
from llama_index.core.base.response.schema import StreamingResponse
that will work
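For reference, here's a minimal sketch of consuming the stream once the corrected import works. The query-engine setup is assumed, and `drain_stream` is a hypothetical helper for illustration, not a llama-index API:

```python
def drain_stream(streaming_response) -> str:
    # Works with llama_index.core.base.response.schema.StreamingResponse,
    # whose .response_gen yields text chunks as the LLM produces them.
    parts = []
    for token in streaming_response.response_gen:
        print(token, end="", flush=True)  # show tokens as they stream in
        parts.append(token)
    return "".join(parts)
```

The helper only relies on the `response_gen` attribute, so it duck-types against anything that exposes a token generator.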
I'm currently using LangChain LLMs. Does LlamaIndex have a Hugging Face text-generation wrapper?
I see this, but I have my own dedicated server with a URL:
https://docs.llamaindex.ai/en/stable/examples/llm/huggingface.html#hugging-face-llms
Are you using the OpenAI-compatible API version of TGI?
If so, you can use the OpenAILike class.
Otherwise, you can wrap TGI with the LangChainLLM class.
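Both options can be sketched roughly as below. This assumes llama-index v0.10+ with the `llama-index-llms-openai-like` and `llama-index-llms-langchain` integration packages installed; the host, port, and helper function names are placeholders for illustration, and the llama-index imports are deferred inside each function so the sketch loads without the packages:

```python
def tgi_openai_base(host: str, port: int = 8080) -> str:
    """Build the OpenAI-compatible base URL that TGI exposes under /v1."""
    return f"http://{host}:{port}/v1"

def llm_via_openai_like(host: str):
    # Option 1: TGI's OpenAI-compatible API through the OpenAILike class.
    from llama_index.llms.openai_like import OpenAILike
    return OpenAILike(
        model="tgi",                     # TGI serves one model; the name is informational
        api_base=tgi_openai_base(host),
        api_key="dummy",                 # TGI does not check the key by default
        is_chat_model=True,
    )

def llm_via_langchain(host: str, port: int = 8080):
    # Option 2: wrap a LangChain TGI client with LangChainLLM
    # (also needs langchain-community installed).
    from langchain_community.llms import HuggingFaceTextGenInference
    from llama_index.llms.langchain import LangChainLLM
    lc_llm = HuggingFaceTextGenInference(
        inference_server_url=f"http://{host}:{port}/",
    )
    return LangChainLLM(llm=lc_llm)
```

If the server speaks the OpenAI protocol, OpenAILike is the lighter-weight choice; the LangChainLLM wrapper is handy when you already have a working LangChain client configured.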