Updated 3 months ago
Gabig · 7 months ago
The JSON is a big object, and in the explanation the Ollama model doesn't read it correctly and gives wrong values. Meta-Llama-8B does it perfectly. Could it be the quantization?
3 comments
WhiteFang_Jr · 7 months ago
Yes, maybe.
Gabig · 7 months ago
The Ollama model is quantized to 4-bit. I think it is not possible to run the original Meta-Llama 3 model in Ollama.
Logan M · 7 months ago
Could also be different sampling params (temperature, etc.)
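One way to rule out sampling differences is to pin the decoding options when calling Ollama. Below is a minimal sketch that builds a request for Ollama's `/api/chat` endpoint with greedy decoding and JSON-constrained output; the `llama3` model tag and the prompt are placeholders, not anything confirmed in this thread:

```python
def build_ollama_request(prompt: str, model: str = "llama3") -> dict:
    """Build a chat request payload for Ollama's /api/chat endpoint
    with sampling variance minimized."""
    return {
        "model": model,  # assumed model tag; substitute your own
        "messages": [{"role": "user", "content": prompt}],
        "format": "json",  # ask Ollama to constrain output to valid JSON
        "stream": False,
        "options": {
            "temperature": 0,  # greedy decoding: removes sampling randomness
            "seed": 42,        # fixed seed for reproducibility
        },
    }

payload = build_ollama_request("Extract the requested fields from this JSON: ...")
# POST this payload to http://localhost:11434/api/chat with any HTTP client.
```

With `temperature` at 0 and a fixed `seed`, any remaining difference between the 4-bit Ollama model and the full-precision Meta-Llama-3 run is more likely attributable to quantization than to sampling.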