Hi everyone. I would like to build a question-answering app that retrieves relevant documents from a vector store (via embedding similarity) and uses them as context in the prompt to answer a question. In this app I am not using OPENAI_API_KEY, since my LLM comes from the Hugging Face Hub. Specifically, I created my LLM instance ("llm") with HuggingFacePipeline and provided it to the following:
from llama_index import LLMPredictor, ServiceContext, GPTVectorStoreIndex

llm_predictor = LLMPredictor(llm=llm)
service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor)
index = GPTVectorStoreIndex.from_documents(documents, service_context=service_context)
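For completeness, the "llm" above was created along these lines (the model id and task here are placeholders, not my exact setup):

```python
from langchain.llms import HuggingFacePipeline

# Placeholder model id and task -- my real model differs,
# but the construction has this shape. No OpenAI key is involved here.
llm = HuggingFacePipeline.from_model_id(
    model_id="google/flan-t5-base",
    task="text2text-generation",
)
```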
However, GPTVectorStoreIndex.from_documents throws:
AuthenticationError: No API key provided.
Could anyone help me implement a vector store index without OPENAI_API_KEY? (Or is a vector store index even necessary if I am going to use an external vector store like FAISS or Pinecone?) Thank you in advance :)