Hi. Having issues using Llama-3 with NLSQLTableQueryEngine. It generates a response but then starts hallucinating: inside the response it begins creating imaginary questions and answers. I suspect I need to set completion_to_prompt and messages_to_prompt, but I'm struggling to figure out how.
@Logan M I am using vLLM and connecting to it with "from llama_index.llms.openai_like import OpenAILike". I cannot use Ollama; I am running my processes on an HPC and setting up Ollama there was impossible.
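This is roughly what I'm trying to piece together — a minimal sketch assuming the vLLM completions endpoint (so the Llama-3 chat template has to be applied client-side); the model name, endpoint URL, and stop token are placeholders for my setup:

```python
from llama_index.core.llms import ChatMessage
from llama_index.llms.openai_like import OpenAILike


def messages_to_prompt(messages: list[ChatMessage]) -> str:
    # Render chat messages with the Llama-3 instruct template.
    prompt = "<|begin_of_text|>"
    for m in messages:
        prompt += (
            f"<|start_header_id|>{m.role.value}<|end_header_id|>\n\n"
            f"{m.content}<|eot_id|>"
        )
    # Leave the prompt open at the assistant turn so the model completes it.
    prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt


def completion_to_prompt(completion: str) -> str:
    # Wrap a bare completion prompt as a single user turn.
    return messages_to_prompt([ChatMessage(role="user", content=completion)])


llm = OpenAILike(
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # placeholder: served model name
    api_base="http://localhost:8000/v1",          # placeholder: vLLM endpoint
    api_key="fake",
    is_chat_model=False,                # completions endpoint -> format prompts ourselves
    messages_to_prompt=messages_to_prompt,
    completion_to_prompt=completion_to_prompt,
    additional_kwargs={"stop": ["<|eot_id|>"]},  # guess: stop token to cut off the made-up Q&A
)
```

Is this roughly the right shape, or should I be using is_chat_model=True and letting vLLM apply the chat template server-side instead?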