
Hello, I have created a chatbot using LlamaIndex

Hello, I have created a chatbot using LlamaIndex with several different data sources, all in text format, and we are using an OpenAI LLM. We built a vector index for each data source and created a query engine for each index. We then use a SubQuestionQueryEngine/RouterQueryEngine, wrap it in a QueryEngineTool, and pass that to an OpenAIAgent. When I ask the chatbot a question, it doesn't fetch the data even though the data is chunked into the indices; it just replies with a generic answer like "I don't have info... kindly visit the official website." Is there any other way to improve the performance and fetch the appropriate data from the indices according to the query? Roughly, the setup looks like the sketch below.
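A minimal sketch of the setup (the source names and paths are placeholders, and the exact import paths depend on the llama-index version in use):

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.tools import QueryEngineTool, ToolMetadata
from llama_index.core.query_engine import SubQuestionQueryEngine
from llama_index.agent.openai import OpenAIAgent

# one vector index + query engine per text data source (placeholder paths)
sources = {"faq": "./data/faq", "products": "./data/products"}
tools = []
for name, path in sources.items():
    docs = SimpleDirectoryReader(path).load_data()
    index = VectorStoreIndex.from_documents(docs)
    tools.append(
        QueryEngineTool(
            query_engine=index.as_query_engine(),
            metadata=ToolMetadata(name=name, description=f"Answers questions about {name}."),
        )
    )

# sub-question engine over all per-source engines, wrapped as one tool for the agent
sub_engine = SubQuestionQueryEngine.from_defaults(query_engine_tools=tools)
docs_tool = QueryEngineTool(
    query_engine=sub_engine,
    metadata=ToolMetadata(name="docs_search", description="Searches all data sources."),
)

agent = OpenAIAgent.from_tools([docs_tool], verbose=True)
print(agent.chat("What plans do you offer?"))
```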
1 comment
Try adding a system prompt to the agent to give it more context about what info it has access to, or write better tool descriptions, for example along the lines of the sketch below.
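A rough sketch of both suggestions (the data path, tool name, and prompt wording are only placeholders, and exact import paths depend on your llama-index version):

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.tools import QueryEngineTool, ToolMetadata
from llama_index.agent.openai import OpenAIAgent

# build one of the per-source query engines (placeholder path)
pricing_docs = SimpleDirectoryReader("./data/pricing").load_data()
pricing_engine = VectorStoreIndex.from_documents(pricing_docs).as_query_engine()

# 1) a descriptive tool description so the agent knows when to call this engine
pricing_tool = QueryEngineTool(
    query_engine=pricing_engine,
    metadata=ToolMetadata(
        name="pricing_docs",
        description=(
            "Contains the official pricing and subscription plans. "
            "Use this tool for any question about cost, plans, or billing."
        ),
    ),
)

# 2) a system prompt that tells the agent to rely on its tools, not general knowledge
agent = OpenAIAgent.from_tools(
    [pricing_tool],
    system_prompt=(
        "You are a support assistant with access to query tools over the company's "
        "documentation. Always use the tools to look up an answer before responding; "
        "do not answer from general knowledge."
    ),
    verbose=True,
)

print(agent.chat("How much does the premium plan cost?"))
```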