Find answers from the community

ChuanYue
Offline, last seen 4 months ago
Joined September 25, 2024
I built a graph on top of an index and I want to use a streaming response, but it doesn't seem to be available. What should I do?
3 comments
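For context, query engines in llama-index generally accept a streaming flag, and for a graph the usual route is to pass a streaming-enabled engine for the root index via custom_query_engines. A minimal sketch, assuming the legacy ComposableGraph API (attribute names such as root_id and root_index may differ between versions):

Plain Text
# Sketch: streaming on a ComposableGraph query engine (legacy llama_index,
# ~0.6-0.8; root_id / root_index are assumptions and may vary by version).
from llama_index import Document, VectorStoreIndex, ListIndex
from llama_index.indices.composability import ComposableGraph

index1 = VectorStoreIndex.from_documents([Document(text="first document")])
index2 = VectorStoreIndex.from_documents([Document(text="second document")])

graph = ComposableGraph.from_indices(
    ListIndex, [index1, index2], index_summaries=["about doc one", "about doc two"]
)

# Request streaming on the root index's query engine and hand it to the
# graph through custom_query_engines, keyed by the root index id.
custom_query_engines = {
    graph.root_id: graph.root_index.as_query_engine(
        streaming=True, response_mode="tree_summarize"
    )
}
query_engine = graph.as_query_engine(custom_query_engines=custom_query_engines)

streaming_response = query_engine.query("Summarize the documents")
streaming_response.print_response_stream()  # prints tokens as they arrive

If print_response_stream() is not available on the returned object, the response synthesizer used for that index type may not support streaming in your version.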
How do I customize the text splitter to use SpacyTextSplitter?
11 comments
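One portable way to do this, sketched below, is to split the raw text with LangChain's SpacyTextSplitter first and hand the resulting chunks to llama-index as separate Documents; whether your llama_index version accepts a LangChain splitter directly inside its node parser varies, so pre-splitting avoids that dependency. The file name my_long_file.txt is just a placeholder:

Plain Text
# Sketch: pre-split with LangChain's SpacyTextSplitter, then index the chunks.
# Requires: pip install spacy langchain && python -m spacy download en_core_web_sm
from langchain.text_splitter import SpacyTextSplitter
from llama_index import Document, VectorStoreIndex

raw_text = open("my_long_file.txt", encoding="utf-8").read()  # placeholder input

splitter = SpacyTextSplitter(chunk_size=1000, chunk_overlap=100)
chunks = splitter.split_text(raw_text)  # sentence-aware splitting via spaCy

documents = [Document(text=chunk) for chunk in chunks]
index = VectorStoreIndex.from_documents(documents)
print(index.as_query_engine().query("What is this file about?"))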
As of today, I have been having problems with llama-index, but calling OpenAI directly works fine. What could be the reason?
Plain Text
res = OpenAI().complete('hi')
print(res)
res = VectorStoreIndex([]).as_query_engine().query('Hi')
print(res)

Hello! How can I assist you today?
Traceback (most recent call last):
  File "E:\demo\CUITCCA\backend\venv\lib\site-packages\openai\api_requestor.py", line 753, in _interpret_response_line
    data = json.loads(rbody)
  File "D:\DevelopmentEnvironment\Python\Python310\lib\json\__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "D:\DevelopmentEnvironment\Python\Python310\lib\json\decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "D:\DevelopmentEnvironment\Python\Python310\lib\json\decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "E:\demo\CUITCCA\backend\venv\lib\site-packages\tenacity\__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
  File "E:\demo\CUITCCA\backend\venv\lib\site-packages\llama_index\embeddings\openai.py", line 122, in get_embedding
    return openai.Embedding.create(input=[text], model=engine, **kwargs)["data"][0][
  File "E:\demo\CUITCCA\backend\venv\lib\site-packages\openai\api_resources\embedding.py", line 33, in create
    response = super().create(*args, **kwargs)
  File "E:\demo\CUITCCA\backend\venv\lib\site-packages\openai\api_resources\abstract\engine_api_resource.py", line 153, in create
    response, _, api_key = requestor.request(...
4 comments
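Reading the traceback, the direct OpenAI().complete call succeeds, while the failure happens inside openai.Embedding.create, which llama-index calls to embed the query; "Expecting value: line 1 column 1" means the embeddings endpoint returned something that is not JSON (commonly an HTML error page from a proxy or a mismatched api_base). A small check that mirrors the failing call with the same legacy openai 0.x client can confirm whether the problem is the endpoint rather than llama-index:

Plain Text
# Sketch: call the embeddings endpoint directly with the legacy openai 0.x
# client seen in the traceback, to isolate the failure from llama_index.
import openai

print("api_base:", openai.api_base)  # confirm completions and embeddings hit the same endpoint

resp = openai.Embedding.create(
    input=["hi"],
    model="text-embedding-ada-002",
)
print(len(resp["data"][0]["embedding"]))  # 1536 if the endpoint responds normally

If this direct embedding call fails with the same JSONDecodeError, the issue lies with the embeddings endpoint or network configuration rather than with llama-index itself.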
ChuanYue

Output

The Router Query Engine outputs the completed query into the response. Is that a problem on my end?
2 comments
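To see what the Router Query Engine actually puts in the response text versus what it records about the routing decision, a minimal sketch like the following can help (legacy llama_index module paths; whether the selector result appears in response.metadata depends on the version):

Plain Text
# Sketch: build a RouterQueryEngine and inspect response text vs. metadata.
from llama_index import Document, VectorStoreIndex, ListIndex
from llama_index.query_engine import RouterQueryEngine
from llama_index.selectors.llm_selectors import LLMSingleSelector
from llama_index.tools.query_engine import QueryEngineTool

docs = [Document(text="LlamaIndex supports vector and list indices.")]
vector_tool = QueryEngineTool.from_defaults(
    query_engine=VectorStoreIndex.from_documents(docs).as_query_engine(),
    description="Useful for specific factual questions.",
)
summary_tool = QueryEngineTool.from_defaults(
    query_engine=ListIndex.from_documents(docs).as_query_engine(),
    description="Useful for summarization questions.",
)

query_engine = RouterQueryEngine(
    selector=LLMSingleSelector.from_defaults(),
    query_engine_tools=[vector_tool, summary_tool],
)

response = query_engine.query("Summarize what LlamaIndex supports.")
print(str(response))      # the answer text returned to the caller
print(response.metadata)  # routing details, if your version records them here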
I built a graph on top of an index and I want to use a streaming response, but it doesn't seem to be available. What should I do?
1 comment
I am using gptListIndex and would like to ask whether the cost is normal. If there will be a large number of queries, what kind of index can I use to reduce cost and improve query speed?
2 comments
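A list index sends every stored node to the LLM on each query, so cost grows with the size of the data; a vector index retrieves only the top-k most similar chunks, which usually reduces both token cost and latency. A minimal sketch of the switch (legacy naming: GPTListIndex and GPTVectorStoreIndex were later renamed ListIndex and VectorStoreIndex):

Plain Text
# Sketch: use a vector index with a small similarity_top_k so each query
# only sends the most relevant chunks to the LLM instead of the whole corpus.
from llama_index import Document, VectorStoreIndex

documents = [Document(text=f"Section {i} of my data...") for i in range(100)]

# A list index would iterate over all 100 nodes on every query.
# A vector index embeds them once at build time...
index = VectorStoreIndex.from_documents(documents)

# ...and at query time only the 2 most similar chunks go to the LLM.
query_engine = index.as_query_engine(similarity_top_k=2)
response = query_engine.query("What does section 42 say?")
print(response)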