The community member has created a chatbot using LlamaIndex with various text data sources and the OpenAI language model. They have set up vector indices and query engines for each data source, but when querying the chatbot, it provides a general response indicating it does not have the relevant information, even though the data is chunked in the indices. The community member is looking for ways to improve the performance and fetch appropriate data from the indices based on the query.
In the comments, another community member suggests trying to add a system prompt to the agent to provide more context about the information it has access to, or writing better tool descriptions.
Hello, I have created a chatbot using LlamaIndex with various data sources in text format, using an OpenAI LLM. We created a vector index for each data source and a query engine for each index. We then use a SubQuestionQueryEngine/RouterQueryEngine, wrap it in a QueryEngineTool, and use that with an OpenAIAgent. When I ask the chatbot a question, it doesn't fetch the data even though the data is chunked in the indices; instead it replies with a general answer like "I don't have info... kindly visit the official website." Is there any way to improve performance and fetch the appropriate data from the indices according to the query?