Hi, I'd like to preprocess my documents (lemmatization and stopword removal) before creating the index. At query time, I want to use the processed text to find related sources, but send the original question to gpt-3.5. These are my steps:
- Process the documents and build the vector store index
- Process the question
- Find the top-k related sources using the processed question
- Send the original question, together with the retrieved (processed) source context, to GPT to answer the question
Is it possible to do this in LlamaIndex (maybe using the QuestionAnswerPrompt template)?
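To make it concrete, here is a rough, untested sketch of what I'm imagining. The NLTK preprocessing, the placeholder document/question strings, and the raw openai ChatCompletion call at the end are just examples I picked, not requirements, and depending on the llama_index version the imports might need to come from `llama_index.core` instead:

```python
import os

import nltk
import openai
from nltk.corpus import stopwords
from nltk.stem import WordNetLemmatizer
from nltk.tokenize import word_tokenize

from llama_index import Document, VectorStoreIndex

# One-time NLTK downloads (assumption: using NLTK for lemmatization/stopwords)
nltk.download("punkt")
nltk.download("wordnet")
nltk.download("stopwords")

lemmatizer = WordNetLemmatizer()
stop_words = set(stopwords.words("english"))

def preprocess(text: str) -> str:
    """Lemmatize and drop stopwords."""
    tokens = word_tokenize(text.lower())
    kept = [lemmatizer.lemmatize(t) for t in tokens
            if t.isalnum() and t not in stop_words]
    return " ".join(kept)

# 1. Process the documents and build the vector store index over the processed text
raw_docs = ["...original document text...", "...another document..."]  # placeholders
documents = [Document(text=preprocess(d)) for d in raw_docs]
index = VectorStoreIndex.from_documents(documents)

# 2-3. Process the question and retrieve the top-k sources with the processed version
original_question = "What does the report say about revenue growth?"  # example
processed_question = preprocess(original_question)

retriever = index.as_retriever(similarity_top_k=3)
nodes = retriever.retrieve(processed_question)
context_str = "\n\n".join(n.node.get_content() for n in nodes)

# 4. Send the ORIGINAL question plus the retrieved context to gpt-3.5
# (pre-1.0 openai ChatCompletion API here; swap in whatever client you use)
openai.api_key = os.environ["OPENAI_API_KEY"]
prompt = (
    "Context information is below.\n"
    "---------------------\n"
    f"{context_str}\n"
    "---------------------\n"
    f"Given the context information, answer the question: {original_question}\n"
)
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}],
)
print(response["choices"][0]["message"]["content"])
```

What I can't figure out is whether QuestionAnswerPrompt (or a custom text_qa_template) supports retrieving with one string while filling `{query_str}` with a different one, or whether manually driving the retriever as above is the intended way to do this.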