patrasq
Has anyone faced issues with llama3 using the Ollama integration in LlamaIndex?
Same prompt, no custom parameters, but different responses from `ollama run llama3` and `llm.complete`.

The model feels noticeably dumber when called through `llm.complete`.

I want llama3 to fix a JSON document. With `ollama run llama3` it returns perfect JSON, while `llm.complete` mangles the structure and returns only 10–20% of the string.
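One likely source of the difference: `ollama run` picks up the sampling options baked into the model's Modelfile, while an API client only gets those defaults it doesn't override, so the two paths can end up sampling differently. A way to make the comparison fair is to pin the options explicitly in the request. Below is a minimal sketch of the raw payload for Ollama's `/api/generate` endpoint; the prompt and option values are illustrative, not Ollama's actual defaults.

```python
import json

# Raw request body for POST /api/generate on a local Ollama server.
# Pinning "options" explicitly removes one variable when comparing
# `ollama run llama3` against a library wrapper like llm.complete.
payload = {
    "model": "llama3",
    "prompt": 'Fix this JSON: {"a": 1,}',  # illustrative prompt
    "stream": False,
    "options": {
        "temperature": 0.0,  # deterministic-ish sampling for JSON repair
        "num_ctx": 8192,     # illustrative context window
    },
}
body = json.dumps(payload)
print(body)
```

If the wrapper lets you pass these same options through (e.g. a temperature argument or an extra-kwargs dict on the LlamaIndex `Ollama` class), setting them on both sides should tell you whether the gap is parameters or something else, like prompt templating.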
4 comments