I think the best solution here would be to convert the Elasticsearch reader into a function tool, and then use that tool with an agent.
Small example (note that the docstring is used as the tool description; this is how the agent decides whether or not to call the tool):
from llama_index.tools import FunctionTool
from llama_index.agent import OpenAIAgent
from llama_index.llms import OpenAI

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result integer."""
    return a * b

# Wrap the function as a tool; the docstring becomes the tool description
multiply_tool = FunctionTool.from_defaults(fn=multiply)

llm = OpenAI(model="gpt-3.5-turbo")
agent = OpenAIAgent.from_tools([multiply_tool], llm=llm, verbose=True)
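Applying the same pattern to your case would look roughly like the sketch below. This is just a minimal illustration, assuming the legacy ElasticsearchReader import path and its load_data(field, query=...) signature; the endpoint, index, and field names are placeholders, and the exact query shape may differ depending on your reader version:

from llama_index.readers import ElasticsearchReader
from llama_index.tools import FunctionTool
from llama_index.agent import OpenAIAgent
from llama_index.llms import OpenAI

# Placeholder endpoint/index -- replace with your own cluster details
reader = ElasticsearchReader(endpoint="http://localhost:9200", index="my-index")

def search_elasticsearch(query: str) -> str:
    """Search the Elasticsearch index and return the matching documents as text."""
    # "content" is an assumed text field; the query dict follows the ES DSL,
    # but check your reader version for the exact shape it expects
    docs = reader.load_data(
        field="content",
        query={"query": {"match": {"content": query}}},
    )
    return "\n\n".join(doc.text for doc in docs)

es_tool = FunctionTool.from_defaults(fn=search_elasticsearch)

llm = OpenAI(model="gpt-3.5-turbo")
agent = OpenAIAgent.from_tools([es_tool], llm=llm, verbose=True)
response = agent.chat("What do the indexed documents say about X?")

The agent will read the docstring of search_elasticsearch to decide when to call it, so it's worth making that description specific to what your index actually contains.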