
HerryC
But which local LLM model can I use? Could I deploy ChatGLM locally and still get good performance?
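
For reference, a minimal local-deployment sketch, assuming the THUDM/chatglm3-6b checkpoint on Hugging Face and the chat() helper documented in the ChatGLM repo; the model id and the CUDA/half-precision setup here are illustrative and can be swapped for a CPU or quantized variant if VRAM is limited:

    # Minimal local ChatGLM sketch (assumes the THUDM/chatglm3-6b weights and a CUDA GPU;
    # swap the model id or drop .cuda() for a CPU/quantized setup).
    from transformers import AutoModel, AutoTokenizer

    MODEL_ID = "THUDM/chatglm3-6b"  # illustrative choice; any locally downloaded ChatGLM checkpoint works

    # trust_remote_code=True is required because ChatGLM ships its own modeling code.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
    model = AutoModel.from_pretrained(MODEL_ID, trust_remote_code=True).half().cuda()
    model = model.eval()

    # ChatGLM exposes a chat() helper that carries multi-turn history for you.
    response, history = model.chat(tokenizer, "Hello, can you run fully offline?", history=[])
    print(response)

Whether this performs well depends mostly on available GPU memory and whether you run the full fp16 weights or a quantized variant.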
3 comments