I am trying to use Groq with llama_index. Below is the code for the same. But while executing the code, I am getting: openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Invalid API Key'
My concern is not the invalid OpenAI API key itself. It's more about why it is referring to the OpenAI key at all when I am not using it anywhere. My idea was to build the application on an open source model. Does llama_index need to refer to OpenAI for its internal working even though we specify open source models? Please suggest @Logan M
```python
# pip install llama-index-llms-groq
from llama_index.llms.groq import Groq

# pip install python-dotenv
from dotenv import load_dotenv

from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core import PromptTemplate, Settings
from llama_index.core.embeddings import resolve_embed_model


def groq_ingest_load(query):
    # only load PDF files
    required_exts = [".pdf"]
```
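For context, is something like the following what is needed to keep llama_index off the OpenAI defaults? This is only a sketch of what I think might work, not something I have verified: the `llama3-70b-8192` model name and the `local:BAAI/bge-small-en-v1.5` embedding string are guesses on my part.

```python
# Sketch only, not tested. The model name and the "local:..." string are my guesses.
from llama_index.core import Settings
from llama_index.core.embeddings import resolve_embed_model
from llama_index.llms.groq import Groq

# Point the global LLM at Groq instead of the default OpenAI LLM
Settings.llm = Groq(model="llama3-70b-8192", api_key="YOUR_GROQ_API_KEY")

# Point embeddings at a local HuggingFace model instead of OpenAI embeddings
# (I believe this needs: pip install llama-index-embeddings-huggingface)
Settings.embed_model = resolve_embed_model("local:BAAI/bge-small-en-v1.5")
```

If the embedding model really does default to OpenAI unless it is overridden like this, that would explain the 401, but I'd like confirmation.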