Hi. Having issues with Llama-3 with NLSQLTableQueryEngine. It generates a response but then starts hallucinating. Inside the response starts to create imaginary questions and answers. I suspect that I need to set completion_to_prompt, messages_to_prompt but struggle to figure out how.
how are you running llama3?
I would just use ollama tbh lol it handles all the prompting for you
it does have some tendency to hallucinate like that in my experience though
@Logan M I am using vLLM and connecting to it with OpenAILike ("from llama_index.llms.openai_like import OpenAILike"). I cannot use Ollama; I am running my processes on an HPC and setting up Ollama there was impossible.
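
For reference, this is roughly how I'm wiring it up (a minimal sketch; the endpoint, model name, and database/table names are placeholders for my actual setup):

```python
from sqlalchemy import create_engine
from llama_index.core import SQLDatabase, Settings
from llama_index.core.query_engine import NLSQLTableQueryEngine
from llama_index.llms.openai_like import OpenAILike

# vLLM exposes an OpenAI-compatible endpoint; URL and model name are placeholders
llm = OpenAILike(
    model="meta-llama/Meta-Llama-3-8B-Instruct",
    api_base="http://localhost:8000/v1",
    api_key="fake",
)
Settings.llm = llm

# Point the query engine at the SQL tables (placeholder database/table)
engine = create_engine("sqlite:///example.db")
sql_database = SQLDatabase(engine, include_tables=["example_table"])
query_engine = NLSQLTableQueryEngine(sql_database=sql_database, llm=llm)

response = query_engine.query("How many rows are in example_table?")
print(response)
```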
that's fair. it's probably the prompt then, the llama3 prompt format is terribly complicated

you can set messages_to_prompt and completion_to_prompt, and maybe use the huggingface tokenizer's apply_chat_template to make it easier?
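
something like this, for example (an untested sketch — it assumes you can pull the meta-llama/Meta-Llama-3-8B-Instruct tokenizer from the hub and that the endpoint/model names match your vLLM server):

```python
from transformers import AutoTokenizer
from llama_index.llms.openai_like import OpenAILike

# Assumes access to the Llama-3 instruct tokenizer (gated repo on the HF hub)
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-8B-Instruct")

def messages_to_prompt(messages):
    # Convert llama-index ChatMessage objects into the Llama-3 chat format
    chat = [{"role": m.role.value, "content": m.content} for m in messages]
    return tokenizer.apply_chat_template(
        chat, tokenize=False, add_generation_prompt=True
    )

def completion_to_prompt(completion):
    # Wrap a bare completion prompt as a single user turn
    chat = [{"role": "user", "content": completion}]
    return tokenizer.apply_chat_template(
        chat, tokenize=False, add_generation_prompt=True
    )

llm = OpenAILike(
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # as served by vLLM (placeholder)
    api_base="http://localhost:8000/v1",          # placeholder endpoint
    api_key="fake",
    is_chat_model=False,  # use the completion path so the template above is applied
    messages_to_prompt=messages_to_prompt,
    completion_to_prompt=completion_to_prompt,
)
```

if it still runs on and invents extra question/answer turns after that, it might also be worth adding <|eot_id|> as a stop token on the vLLM side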