Hi! When I am using a custom LLM provider from LangChain's list of AI providers, how can I send custom parameters via the retrieve method? For example:
```python
from llama_index import ServiceContext, VectorStoreIndex

service_context = ServiceContext.from_defaults(llm=llm)
index = VectorStoreIndex.from_documents(all_docs, service_context=service_context)
retriever = index.as_retriever()
retriever.retrieve("what is the pricing")[0].node.text
```
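For context on where such parameters usually go: in LlamaIndex the common place to pass retrieval options (e.g. `similarity_top_k`) is `as_retriever(**kwargs)`, not the `retrieve()` call itself. Below is a self-contained toy sketch of that kwargs-forwarding pattern; the `ToyIndex`/`ToyRetriever` classes and the word-overlap scoring are stand-ins I made up for illustration, not the real LlamaIndex API:

```python
# Toy sketch: extra keyword arguments are captured when the retriever is
# constructed via as_retriever(**kwargs), then applied on every retrieve() call.
# Class names and scoring logic are hypothetical, for illustration only.

class ToyRetriever:
    def __init__(self, docs, similarity_top_k=2):
        self.docs = docs
        self.similarity_top_k = similarity_top_k

    def retrieve(self, query):
        # Pretend scoring: rank docs by how many words they share with the query.
        q = set(query.lower().split())
        scored = sorted(self.docs, key=lambda d: -len(q & set(d.lower().split())))
        return scored[: self.similarity_top_k]


class ToyIndex:
    def __init__(self, docs):
        self.docs = docs

    def as_retriever(self, **kwargs):
        # Custom parameters are forwarded here, not to retrieve().
        return ToyRetriever(self.docs, **kwargs)


index = ToyIndex(["pricing starts at ten dollars", "docs about setup", "pricing tiers"])
retriever = index.as_retriever(similarity_top_k=1)
print(retriever.retrieve("what is the pricing"))
```

So in the real snippet above, the analogous move would be something like `index.as_retriever(similarity_top_k=3)`.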