
Hi,
I created a list index and a vector index over a few .txt files, and then built query engines for them.
list_tool = QueryEngineTool.from_defaults(
    query_engine=list_query_engine,
    description="Useful for summarization of the podcast",
)
vector_tool = QueryEngineTool.from_defaults(
    query_engine=chat_engine,
    description="Useful for retrieving specific context related to the podcast topic",
)

I'm using a RouterQueryEngine with a PydanticSingleSelector to switch between them:

query_engine = RouterQueryEngine(
    selector=PydanticSingleSelector.from_defaults(),
    query_engine_tools=[
        list_tool,
        vector_tool,
    ],
)

A query about specific content executes fine, but the query asking for a summary results in: RuntimeError: asyncio.run() cannot be called from a running event loop

What could cause this?
If I run the first query again, it returns the final response.
Are you running in a notebook? You'll need to do this:

import nest_asyncio
nest_asyncio.apply()
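For context, a minimal stdlib-only sketch of why that error appears: Jupyter already runs an event loop in each kernel, and asyncio.run() refuses to start inside a running loop, which is exactly what a synchronous query path can trip over. nest_asyncio patches the loop to allow that re-entry.

```python
import asyncio

async def main():
    # Inside an already-running loop (as in a Jupyter cell), any code path
    # that calls asyncio.run() hits this RuntimeError.
    coro = asyncio.sleep(0)
    try:
        asyncio.run(coro)
    except RuntimeError as exc:
        coro.close()  # avoid a "never awaited" warning
        return str(exc)

message = asyncio.run(main())
print(message)  # asyncio.run() cannot be called from a running event loop
```

After nest_asyncio.apply(), the nested asyncio.run() call succeeds instead of raising.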
Yes, I just added those lines. Now I get IndexError: list index out of range... looks like something is wrong with the list index.
What's the full error? I doubt it's the list index 😉
IndexError Traceback (most recent call last)
Cell In[19], line 1
----> 1 response = query_engine.query("What is the summary of the Sleep toolkit tools for optimizing sleep & sleep wake Podcast?")
2 pprint_response(response, show_source=True)
3 print(response)

File ~/work/LlamaIndexLangChain/jupyter/myenv/lib/python3.9/site-packages/llama_index/indices/query/base.py:23, in BaseQueryEngine.query(self, str_or_query_bundle)
21 if isinstance(str_or_query_bundle, str):
22 str_or_query_bundle = QueryBundle(str_or_query_bundle)
---> 23 response = self._query(str_or_query_bundle)
24 return response

File ~/work/LlamaIndexLangChain/jupyter/myenv/lib/python3.9/site-packages/llama_index/query_engine/router_query_engine.py:118, in RouterQueryEngine._query(self, query_bundle)
117 def _query(self, query_bundle: QueryBundle) -> RESPONSE_TYPE:
--> 118 with self.callback_manager.event(
119 CBEventType.QUERY, payload={EventPayload.QUERY_STR: query_bundle.query_str}
120 ) as query_event:
121 result = self._selector.select(self._metadatas, query_bundle)
123 if len(result.inds) > 1:
File /Applications/Xcode.app/Contents/Developer/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/contextlib.py:117, in _GeneratorContextManager.__enter__(self)
115 del self.args, self.kwds, self.func
116 try:
--> 117 return next(self.gen)
118 except StopIteration:
119 raise RuntimeError("generator didn't yield") from None

File ~/work/LlamaIndexLangChain/jupyter/myenv/lib/python3.9/site-packages/llama_index/callbacks/base.py:140, in CallbackManager.event(self, event_type, payload, event_id)
138 # create event context wrapper
139 event = EventContext(self, event_type, event_id=event_id)
--> 140 event.on_start(payload=payload)
142 yield event
144 # ensure event is ended

File ~/work/LlamaIndexLangChain/jupyter/myenv/lib/python3.9/site-packages/llama_index/callbacks/base.py:213, in EventContext.on_start(self, payload, **kwargs)
    211 if not self.started:
    212     self.started = True
--> 213     self._callback_manager.on_event_start(
    214         self._event_type, payload=payload, event_id=self._event_id, **kwargs
    215     )
216 else:
217 logger.warning(
218 f"Event {str(self._event_type)}: {self._event_id} already started!"
219 )

File ~/work/LlamaIndexLangChain/jupyter/myenv/lib/python3.9/site-packages/llama_index/callbacks/base.py:76, in CallbackManager.on_event_start(self, event_type, payload, event_id, **kwargs)
73 """Run handlers when an event starts and return id of event."""
74 event_id = event_id or str(uuid.uuid4())
---> 76 parent_id = global_stack_trace.get()[-1]
77 self._trace_map[parent_id].append(event_id)
78 for handler in self.handlers:

IndexError: list index out of range
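The failing frame is parent_id = global_stack_trace.get()[-1]: the callback manager expects at least one trace id on its stack, and [-1] on an empty list raises exactly this error. A minimal, stdlib-only reproduction of that failure mode:

```python
# The callback manager keeps a stack of trace ids and reads the last one.
# If that stack is empty (e.g. no trace was ever started for this query),
# indexing with [-1] raises the IndexError seen in the traceback above.
trace_stack = []  # what global_stack_trace.get() returned here
try:
    parent_id = trace_stack[-1]
except IndexError as exc:
    error_message = str(exc)

print(error_message)  # list index out of range
```

So the "list index" in the message refers to Python list indexing inside the callback machinery, not to the ListIndex.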
If it is not the list index, then why do I get IndexError: list index out of range?
that's an error with the callback manager πŸ€” What version of llama-index do you have?
pip show llama-index
Name: llama-index
Version: 0.7.22
Summary: Interface between LLMs and your data
Home-page: https://github.com/jerryjliu/llama_index
Author: Jerry Liu
Author-email:
License: MIT
Location: /Users/jana/work/LlamaIndexLangChain/jupyter/myenv/lib/python3.9/site-packages
Requires: beautifulsoup4, dataclasses-json, fsspec, langchain, nest-asyncio, numpy, openai, pandas, sqlalchemy, tenacity, tiktoken, typing-extensions, typing-inspect, urllib3
Required-by:
I create it like this:
from llama_index import (
    ListIndex,
    ServiceContext,
    SimpleDirectoryReader,
    StorageContext,
)
from llama_index.tools.query_engine import QueryEngineTool

documents = SimpleDirectoryReader('assets/AndrewHuberman/sleep', filename_as_id=True).load_data()


service_context = ServiceContext.from_defaults(chunk_size=1024)
nodes = service_context.node_parser.get_nodes_from_documents(documents)

storage_context = StorageContext.from_defaults()
storage_context.docstore.add_documents(nodes)

list_index = ListIndex(nodes, storage_context=storage_context)
But you are using a router query engine, right? Let me test this locally to make sure something is not too broken lol
Yes, this is the code:

query_engine = RouterQueryEngine(
    selector=PydanticSingleSelector.from_defaults(),
    query_engine_tools=[
        list_tool,
        vector_tool,
    ],
)
hmmm, I was not able to replicate the error πŸ€”
I was following the example notebook from the docs
looks like you are as well
Yes, hm strange
Not sure what to do
Does it work if you run the code in a normal python file, rather than jupyter?
I rebuilt the index and now it works.
looks like something went wrong while building the index
When would it be better to use the PydanticSingleSelector vs the LLMSingleSelector? I'm not quite sure.
Pydantic will always be better if you are using OpenAI (it only works with OpenAI, but is much more reliable)
Thank you for clarifying πŸ™‚
One more πŸ™‚ In a response synthesiser, when is it better to use tree_summarize and when compact mode?
Tree summarize is just best for generating summaries πŸ™‚
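To make the difference concrete, here is a conceptual, stdlib-only sketch (not the llama_index implementation; llm_summarize is a hypothetical stand-in for an LLM call): tree_summarize reduces chunks level by level until one summary remains, while compact packs as many chunks as fit into each call.

```python
def llm_summarize(texts):
    # Hypothetical stand-in for an LLM summarization call.
    return " | ".join(texts)

def tree_summarize(chunks, fanout=2):
    # Merge groups of `fanout` chunks, level by level, until one remains:
    # every chunk contributes, so the result is a genuinely global summary.
    while len(chunks) > 1:
        chunks = [llm_summarize(chunks[i:i + fanout])
                  for i in range(0, len(chunks), fanout)]
    return chunks[0]

def compact(chunks, max_chars=40):
    # Greedily pack chunks into as few "prompts" as possible, then make
    # one call per packed prompt -- fewer calls, but no hierarchy.
    prompts, current = [], ""
    for chunk in chunks:
        if current and len(current) + len(chunk) > max_chars:
            prompts.append(current)
            current = ""
        current += chunk
    prompts.append(current)
    return [llm_summarize([p]) for p in prompts]

summary = tree_summarize(["a", "b", "c", "d"])  # "a | b | c | d"
```

Roughly: tree_summarize spends more LLM calls to cover every chunk, which suits summarization; compact minimizes calls, which suits specific questions answered by a few retrieved chunks.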
Oh, time to get away from the computer, hehe, thank you πŸ˜‰
hahaha no worries πŸ€”