Hey... I'm using LlamaIndex, but I've launched llama-cpp-python via its Docker image. Now I have the LLM API endpoints, but how do I use them in LlamaIndex?
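The llama-cpp-python server exposes an OpenAI-compatible API, so one approach is to point an OpenAI-style LlamaIndex LLM at the local endpoint. A minimal sketch, assuming the container serves at `http://localhost:8000` and the `llama-index-llms-openai-like` package is installed (the model alias and port are illustrative, check your container's settings):

```python
# Sketch: route LlamaIndex LLM calls to a local llama-cpp-python server.
# Assumes the server is reachable at http://localhost:8000 and speaks the
# OpenAI-compatible API that llama-cpp-python's server mode provides.
from llama_index.core import Settings
from llama_index.llms.openai_like import OpenAILike

llm = OpenAILike(
    model="llama-2-13b-chat",             # illustrative: whatever alias your server uses
    api_base="http://localhost:8000/v1",  # the OpenAI-compatible endpoint of the container
    api_key="not-needed",                 # local server ignores the key, but the client wants one
    is_chat_model=True,                   # llama2-13b-chat is a chat-tuned model
)

Settings.llm = llm  # make all LlamaIndex components use the local server

print(llm.complete("Hello"))
```

With `Settings.llm` set, query engines built afterwards should call your local server instead of the OpenAI API.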
Hey... another quick question. Is it possible to use the SubQuestionQueryEngine with LlamaCPP? Even though llama2-13b-chat is initialised, I'm still getting an OpenAI key error.
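The usual cause of that key error is that two components still default to OpenAI even when your LLM is set: the sub-question generator and the embedding model. A sketch of wiring everything to local models, assuming a llama-index >= 0.10 package layout with the llama-cpp and HuggingFace extras installed (paths, model names, and the `./data` directory are illustrative):

```python
# Sketch: run SubQuestionQueryEngine fully locally so nothing falls back to OpenAI.
# Assumes a local llama2-13b-chat GGUF file; the path and model names are examples.
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.query_engine import SubQuestionQueryEngine
from llama_index.core.question_gen import LLMQuestionGenerator
from llama_index.core.tools import QueryEngineTool, ToolMetadata
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.llama_cpp import LlamaCPP

llm = LlamaCPP(model_path="./llama-2-13b-chat.Q4_K_M.gguf")  # illustrative path
Settings.llm = llm
# The default embed model is OpenAI's, which also raises a key error; use a local one.
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

index = VectorStoreIndex.from_documents(SimpleDirectoryReader("./data").load_data())
tool = QueryEngineTool(
    query_engine=index.as_query_engine(),
    metadata=ToolMetadata(name="docs", description="Project documents"),
)

# Depending on your version, the default question generator is OpenAI-backed,
# so pass an LLM-backed one explicitly to keep everything on LlamaCPP.
engine = SubQuestionQueryEngine.from_defaults(
    query_engine_tools=[tool],
    question_gen=LLMQuestionGenerator.from_defaults(llm=llm),
)
print(engine.query("What do the documents say?"))
```

The key detail is the explicit `question_gen` argument: without it, the sub-question step can try to construct its own OpenAI client regardless of `Settings.llm`.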