The only problem is that the American models are super fracking dumb. Arcee Thinking Large (398B) is orders of magnitude worse than even Qwen 3.5 35B, getting stuck in thinking loops on incredibly basic questions that Google could answer in 500ms.