I'm using BAAI/bge-small-en for the embeddings model and llama2-chat-13b for the LLM. Is query_engine = index.as_query_engine(service_context=service_context) the right way to apply that? I'm sorry, I'm still learning Python, and I was trying to follow the docs here: https://docs.llamaindex.ai/en/stable/getting_started/customization.html
You can either set it globally:

from llama_index import set_global_service_context
set_global_service_context(service_context)

# OR pass it directly when building the index:
index = VectorStoreIndex.from_documents(documents, service_context=service_context)