Callbacks

Does AzureAISearchVectorStore support a callback manager?
I only get the message
Plain Text
**********
Trace: index_construction
**********

and no other events

My code is as follows:
Plain Text
vector_store = AzureAISearchVectorStore(
    search_or_index_client=search_client,
    filterable_metadata_field_keys=metadata_fields,
    index_management=IndexManagement.VALIDATE_INDEX,
    id_field_key="id",
    chunk_field_key="chunk",
    embedding_field_key="embedding",
    embedding_dimensionality=1536,
    metadata_string_field_key="metadata",
    doc_id_field_key="doc_id",
)

storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(
    [],
    storage_context=storage_context,
    callback_manager=callback_manager
)
14 comments
I don't think any vector store supports callbacks. They all happen around/outside of it
But as per your docs you add a callback manager when creating an index right?
Yea. And in your code, there won't be anything to trace, because there are no nodes (that last from_documents call is essentially a no-op)
Ah okay so I should be using method "from nodes" not "from documents"?
Or as per your first comment it can't be done in llama index unless I have access to azure ai search sub-processes
You can use from_documents or the constructor itself for nodes i.e. VectorStoreIndex(nodes=nodes, ...), but the point is since you are passing in an empty list, there is nothing to trace 👀
there are no documents or nodes to process
Yeah I was confused about that also, I grabbed it from the notebook tutorial.
I assumed the storage context had the docs already (stored already on Azure) hence the empty array?
Maybe? Depends which docs page lol
So adding a callback manager like:
Plain Text
callback_manager = CallbackManager([LlamaDebugHandler()])

to
Plain Text
# Load documents
documents = SimpleDirectoryReader("../data/paul_graham/").load_data()
storage_context = StorageContext.from_defaults(vector_store=vector_store)

Settings.llm = llm
Settings.embed_model = embed_model
index = VectorStoreIndex.from_documents(
    documents,
    storage_context=storage_context,
    callback_manager=callback_manager,
)

doesn't work; no events are emitted besides Trace: index_construction
Does Azure emit events to llama index that I'm unaware of? If it doesn't at all then just let me know.. lol
that should work, but since you manually set up the llm and embedding model, you'll need to attach it there as well
How do I do that? Any hints pls 🙏
For example:
Plain Text
llm = OpenAI(..., callback_manager=callback_manager)
embed_model = OpenAIEmbedding(..., callback_manager=callback_manager)