Updated 5 months ago

At a glance

The community members are discussing how to make the provided code asynchronous. A community member suggests running the transformations manually and asynchronously, but notes it may be "hacky". They provide sample code using the arun_transformations function and a custom DummyTransform class.

Other community members raise issues with the suggested approach, such as the index becoming a coroutine object and not being able to access certain functions. They also encounter errors when trying to use the asynchronous code with FastAPI.

The discussion concludes that this problem is currently unsolved, as Python constructors cannot be asynchronous, and the library does not have an asynchronous insert function. Potential solutions suggested include setting the event loop type to asyncio or running the ingestion in a separate thread.

Do we have any way to make the below code an asynchronous call?
Plain Text
index = PropertyGraphIndex.from_documents(
    documents=[document],
    llm=llm_model,
    embed_model=embedding_model,
    kg_extractors=[
        ImplicitPathExtractor(),
        LLM_extractor,
    ],
    show_progress=True,
)
17 comments
Python constructors can't be async. But you could run what it's doing manually and async. I think it's a tad hacky at the moment though

Plain Text
from llama_index.core.ingestion import arun_transformations
from llama_index.core.node_parser import SentenceSplitter
from llama_index.core.indices.property_graph import ImplicitPathExtractor

# this will insert the kg nodes into the metadata of each text node
nodes = await arun_transformations(
    [document],
    transformations=[SentenceSplitter(), ImplicitPathExtractor(), LLM_extractor],
)

# need some dummy extractor I think, but then throw it into the index
from llama_index.core.schema import TransformComponent

class DummyTransform(TransformComponent):
    def __call__(self, nodes, **kwargs):
        return nodes

index = PropertyGraphIndex(nodes=nodes, kg_extractors=[DummyTransform()], ...)
@Logan M it seems that the index becomes the type of
Plain Text
Coroutine[Any, Any, PropertyGraphIndex]
and I can't call some other functions on it, such as index.property_graph_store (it says
Plain Text
'coroutine' object has no attribute 'property_graph_store'
). Is there any way to keep most of the original functionality?
And I don't see how to specify the embedding_model there...?
what, how 😅 What line of code makes it become this?

And you can specify the embedding model in the constructor
@Logan M Actually, nvm on that 😅 Just, even after implementing this, I got the
Plain Text
asyncio.run() cannot be called from a running event loop
error, and if I use nest_asyncio, the error becomes
Plain Text
this event loop is already running
. Do we have any methods to resolve this?
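(Not suggested in the thread itself, but one common workaround for the "running event loop" error is to push the blocking constructor into a worker thread with asyncio.to_thread, available in Python 3.9+; build_index and endpoint below are hypothetical stand-ins for the real calls.)

```python
import asyncio

def build_index():
    # hypothetical stand-in for the blocking call, e.g.
    # PropertyGraphIndex.from_documents(documents=[document], ...)
    return "index"

async def endpoint():
    # asyncio.to_thread runs the blocking constructor in a worker
    # thread; any asyncio.run() inside it gets a fresh loop there,
    # so it never clashes with the server's running loop
    return await asyncio.to_thread(build_index)

print(asyncio.run(endpoint()))
```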
What version of llama-index-core do you have? Pretty sure I fixed that
@Logan M I have llama-index-core 0.10.57 currently. In which version was this fixed?
Hmm pretty sure that should have had it fixed 🤔 I'm guessing you are running in FastAPI or some async server?
@Logan M Yeah, I'm calling the above with FastAPI through an async function... 😅 Has anyone had this case solved?
I think it probably requires some change to the library
Although longer term we really shouldn't be nesting async
But Python constructors can't be async 😅
So there needs to be an async insert function (there currently isn't)
So you mean this problem is currently unsolved...? Or is there any way I can change that...?
Unsolved (although tbh I've never had an issue either so unsure actually)

Some possible solutions
  • set the loop type to asyncio if you haven't already: uvicorn.run(..., loop="asyncio")
  • run the ingestion in a separate Python thread so that it can use its own async loop
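(The second suggestion, a separate thread with its own event loop, could be sketched roughly like this; ingest is a hypothetical wrapper around the real ingestion code, with a string standing in for the built index.)

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor

def ingest():
    # this worker thread has no running event loop, so asyncio.run()
    # (ours or the library's internal one) starts a fresh loop here
    # instead of clashing with the async server's loop
    async def _build():
        return "ingested"  # stand-in for the real index build
    return asyncio.run(_build())

# submit the ingestion from the server's thread and wait for it
with ThreadPoolExecutor(max_workers=1) as pool:
    result = pool.submit(ingest).result()
print(result)
```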
Got it. Thank you!