Updated 2 years ago

Error after upgrade

Happened after I upgraded to 0.5.x
22 comments
V0.5.0 had some breaking changes. Are you loading an old index from disk?
Nope, I'm querying an existing Opensearch vector index directly.
@Logan M I downgraded for now but would love to get it figured out
How were you building/querying the index when you got the error? I'll see if I can spot what needs to be changed
Something like this
Plain Text
client = ElasticsearchVectorClient(account_id=account_id,
                                   app_id=app_id)
index = GPTOpensearchIndex([], client=client)

query_kwargs = {"similarity_top_k": 2}

response = index.query(q, **query_kwargs)
ElasticsearchVectorClient is an extended client that I built myself, but I don't believe it's related to the error because I've tried with the regular client as well
The lack of documentation and the number of issues here are killing me :/ I can't even do the simplest stuff
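(For reference, a minimal sketch of the stock-client path for comparison. The import paths and constructor arguments follow the Opensearch demo of that era and may differ between versions; endpoint, index name, and dimension are placeholders.)
Plain Text
# Rough sketch of querying with the regular OpensearchVectorClient.
# Import paths and constructor arguments are assumed from the 0.5.x-era
# Opensearch demo and may differ in your installed version.
from llama_index import GPTOpensearchIndex                 # import path assumed
from llama_index.vector_stores import OpensearchVectorClient  # import path assumed

client = OpensearchVectorClient(
    "http://localhost:9200",  # placeholder Opensearch endpoint
    "gpt_index_demo",         # placeholder index name
    1536,                     # embedding dimension (e.g. text-embedding-ada-002)
)
index = GPTOpensearchIndex([], client=client)
response = index.query("example question", similarity_top_k=2)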
The docs did get a bit of a revamp in the latest versions, if you haven't checked them out yet. If there's anything specific you think would be helpful to add, I can work on getting that in there.

Do you mind sharing the full stack trace you were getting before? I don't really see anything obviously wrong with the code
Sure, here it is

Plain Text
KeyError                                  Traceback (most recent call last)
Cell In[7], line 1
----> 1 response = index.query("I'd like to connect my Facebook channel to Rasayel", **query_kwargs)
      3 display(Markdown(f"{response}</b>"))

File ~/.local/lib/python3.8/site-packages/llama_index/indices/base.py:244, in BaseGPTIndex.query(self, query_str, mode, query_transform, use_async, **query_kwargs)
    230 query_config = QueryConfig(
    231     index_struct_type=self._index_struct.get_type(),
    232     query_mode=mode_enum,
    233     query_kwargs=query_kwargs,
    234 )
    235 query_runner = QueryRunner(
    236     index_struct=self._index_struct,
    237     service_context=self._service_context,
   (...)
    242     use_async=use_async,
    243 )
--> 244 return query_runner.query(query_str)

File ~/.local/lib/python3.8/site-packages/llama_index/indices/query/query_runner.py:341, in QueryRunner.query(self, query_str_or_bundle, index_id, level)
    323 """Run query.
    324 
    325 NOTE: Relies on mutual recursion between
   (...)
...
    169     docstore=self._docstore,
    170     **query_kwargs,
    171 )

KeyError: 


If you would like, we can jump on a call and I can show it to you as well
(Can't jump on a call, also technically at work as well LOL)

It's a little annoying that the stack trace isn't showing the exact line with the error 😅 And you had updated to the latest version (v0.5.3, I think) when running this?
Plain Text
Traceback (most recent call last):
  File "hcaa/hcaa.py", line 30, in <module>
    main()
  File "/home/youssef/.local/lib/python3.8/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/home/youssef/.local/lib/python3.8/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/home/youssef/.local/lib/python3.8/site-packages/click/core.py", line 1657, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/youssef/.local/lib/python3.8/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/youssef/.local/lib/python3.8/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "hcaa/hcaa.py", line 25, in query
    result = LlamaIndexService().query(account, app, query)
  File "/home/youssef/octopods/rasayel-core/hcaa/lib/llama_index_service.py", line 23, in query
    response = index.query(q, **query_kwargs)
  File "/home/youssef/.local/lib/python3.8/site-packages/llama_index/indices/base.py", line 244, in query
    return query_runner.query(query_str)
  File "/home/youssef/.local/lib/python3.8/site-packages/llama_index/indices/query/query_runner.py", line 341, in query
    return query_combiner.run(query_bundle, level)
  File "/home/youssef/.local/lib/python3.8/site-packages/llama_index/indices/query/query_combiner/base.py", line 66, in run
    return self._query_runner.query_transformed(
  File "/home/youssef/.local/lib/python3.8/site-packages/llama_index/indices/query/query_runner.py", line 182, in query_transformed
    query_obj = self._get_query_obj(index_struct)
  File "/home/youssef/.local/lib/python3.8/site-packages/llama_index/indices/query/query_runner.py", line 165, in _get_query_obj
    query_cls = INDEX_STRUT_TYPE_TO_QUERY_MAP[index_struct_type][mode]
KeyError: <IndexStructType.OPENSEARCH: 'opensearch'>
Here's a more complete stack trace
my notebook was borked
Aha! Well, there's the problem. Opensearch must have gotten lost from this dict during the refactor :PSadge: Apologies for that!
Attachment: image.png
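(For anyone hitting the same thing later: the failure is just a missing key in a lookup table. Below is a toy reproduction of the pattern; only INDEX_STRUT_TYPE_TO_QUERY_MAP and IndexStructType.OPENSEARCH come from the traceback above, everything else is a placeholder rather than real llama_index internals.)
Plain Text
from enum import Enum

# Toy reproduction of the KeyError -- NOT llama_index code. Only the dict name
# and the OPENSEARCH member mirror the traceback; the other names are placeholders.

class IndexStructType(str, Enum):
    SIMPLE_DICT = "simple_dict"   # placeholder member
    OPENSEARCH = "opensearch"     # the member that triggers the KeyError

class QueryMode(str, Enum):
    DEFAULT = "default"

class SimpleVectorQuery: ...      # placeholder query classes
class OpensearchVectorQuery: ...

INDEX_STRUT_TYPE_TO_QUERY_MAP = {
    IndexStructType.SIMPLE_DICT: {QueryMode.DEFAULT: SimpleVectorQuery},
    # The OPENSEARCH entry went missing in the 0.5.x refactor, so the lookup
    # below raises KeyError: <IndexStructType.OPENSEARCH: 'opensearch'>.
}

try:
    INDEX_STRUT_TYPE_TO_QUERY_MAP[IndexStructType.OPENSEARCH][QueryMode.DEFAULT]
except KeyError as e:
    print("KeyError:", e)

# The fix is simply to restore the missing mapping:
INDEX_STRUT_TYPE_TO_QUERY_MAP[IndexStructType.OPENSEARCH] = {
    QueryMode.DEFAULT: OpensearchVectorQuery,
}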
Alright thanks! I guess it'll be fixed in a few days

I have another question, if you have the capacity:
I can't figure out how to customize my prompts for ChatGPT. Do you have any idea where to look?
There are a few prompts specific to ChatGPT defined here. If a prompt isn't in there, it just gets set as a single "Human" message
https://github.com/jerryjliu/llama_index/blob/main/gpt_index/prompts/chat_prompts.py

You should be able to follow that to create your own prompts
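(A rough sketch of that pattern follows, assuming 0.5.x-era import paths and the text_qa_template query kwarg; adjust the names to whatever your installed version actually exposes.)
Plain Text
# Hedged sketch of a custom chat-style QA prompt, following the pattern in
# chat_prompts.py. The llama_index import path and the text_qa_template kwarg
# are assumptions based on the 0.5.x-era API and may differ in other versions.
from langchain.prompts.chat import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    SystemMessagePromptTemplate,
)
from llama_index.prompts.prompts import QuestionAnswerPrompt  # path assumed

messages = [
    # System message sets the assistant's persona.
    SystemMessagePromptTemplate.from_template(
        "You are a helpful support assistant. Answer using only the provided context."
    ),
    # Human message carries the {context_str}/{query_str} variables that
    # llama_index fills in at query time.
    HumanMessagePromptTemplate.from_template(
        "Context information is below.\n"
        "---------------------\n"
        "{context_str}\n"
        "---------------------\n"
        "Given the context, answer the question: {query_str}\n"
    ),
]

CUSTOM_QA_PROMPT = QuestionAnswerPrompt.from_langchain_prompt(
    ChatPromptTemplate.from_messages(messages)
)

# Then pass it through the query kwargs (kwarg name assumed):
# response = index.query(q, text_qa_template=CUSTOM_QA_PROMPT, similarity_top_k=2)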
I'll remember to ping you when a fix is in πŸ™‚
Awesome! thanks a lot @Logan M ❀️
@jerryjliu0 just pinging for visibility, haven't had a chance to fix it yet lol
@walid sorry about that!
will put out a fix soon
Thanks for the amazing work!