
Icksir
Hey! I am currently facing issues with the use of memory inside a workflow, and I don't know where else to ask. I am building a chatbot to chat with multiple documents, and my workflow now looks like the image attached to this message. The "ingest" path just creates the top agent to retrieve the documents, and the "ask" path is meant to query the LLM with the indexes.
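
In case the image doesn't come through, the workflow is roughly this shape (ChatWorkflow and the step bodies here are just placeholders, not my real code):

Python
from llama_index.core.workflow import StartEvent, StopEvent, Workflow, step

class ChatWorkflow(Workflow):
    @step
    async def ingest(self, ev: StartEvent) -> StopEvent | None:
        # Builds the per-document agents and the top-level object index;
        # returns None when this run isn't an "ingest" run.
        ...

    @step
    async def ask(self, ev: StartEvent) -> StopEvent | None:
        # Queries the LLM through the top agent (full body below);
        # returns None when this run isn't an "ask" run.
        ...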

My ask step looks like this, but the chat store just overwrites itself after the top agent call. It doesn't remember the chat history, and I don't know if I am doing something wrong or simply shouldn't use SimpleChatStore (I just wanted to do a proof of concept). I've put a couple of sanity checks after the code below.

Any advice is welcome.

Python
import os

from llama_index.agent.openai import OpenAIAgent
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.core.storage.chat_store import SimpleChatStore
from llama_index.core.workflow import StartEvent, StopEvent, step

@step
async def ask(self, ev: StartEvent) -> StopEvent | None:
    obj_index = ev.get("obj_index")
    query = ev.get("query")
    user = ev.get("user")
    if not obj_index or not query:
        return None

    user_file = f"./conversations/{user}.json"

    # Load this user's previous conversation from disk, or start a fresh store
    if not os.path.exists(user_file):
        chat_store = SimpleChatStore()
    else:
        chat_store = SimpleChatStore.from_persist_path(persist_path=user_file)

    chat_memory = ChatMemoryBuffer.from_defaults(
        token_limit=3000,
        chat_store=chat_store,
        chat_store_key=user,
    )

    top_agent = OpenAIAgent.from_tools(
        tool_retriever=obj_index.as_retriever(similarity_top_k=3),
        system_prompt=PROMPT,
        memory=chat_memory,
        verbose=True,
    )

    # After this call, the persisted store only ever holds this one exchange
    response = top_agent.query(query)
    chat_store.persist(persist_path=user_file)

    return StopEvent(result={"response": response, "source_nodes": response.source_nodes})
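
For what it's worth, I don't think persistence itself is the problem; as far as I can tell from the docs, a bare round trip outside the workflow should work fine ("alice" and the path are just placeholders):

Python
from llama_index.core.llms import ChatMessage
from llama_index.core.storage.chat_store import SimpleChatStore

# Write one message under a user key and persist the store to disk
store = SimpleChatStore()
store.add_message("alice", ChatMessage(role="user", content="hello"))
store.persist(persist_path="./conversations/alice.json")

# Reload from disk: the message should survive the round trip
reloaded = SimpleChatStore.from_persist_path(persist_path="./conversations/alice.json")
print(reloaded.get_messages("alice"))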
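
The only lead I have so far: if I'm reading the LlamaIndex source right, the agent's query() entry point wraps chat() with an empty chat_history, which would reset the memory on every call, while chat() appends to the existing buffer. If that's true, the fix might be as small as this (untested guess on my side):

Python
# Untested: swap query() for chat() so turns accumulate in chat_memory.
# query() appears to reset chat_history internally; chat() should append.
response = top_agent.chat(query)
chat_store.persist(persist_path=user_file)

return StopEvent(result={"response": response, "source_nodes": response.source_nodes})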