More like 80–90 °C before and around 50 °C afterward.

This is a boring NVR workload with a bit of GPU usage; total system utilization is around 10% with turbo off. Apparently the default behavior is to turbo from the normal ~3 GHz up to 5.4 GHz, and I don't know why the results were quite so poor.

This is an i9-13900H machine (Minisforum MS-01), so maybe it has some weird tuning for gaming workloads? It still seems a bit pathetic. I haven't tried monitoring the voltages with turbo on and off to understand exactly why it's performing so inefficiently.

If it was running at over 80 °C, it was not lightly loaded: it was pegging one or two cores and raising the clocks as high as they would go. That's what gives the best instantaneous performance. It's possible that in your particular case that doesn't give the best instructions/J, because you're limited by real-world constraints (such as the capture rate of the camera), but the data comes in fast enough that the CPU never gets time to drop back to a lower power state. Or it's also possible that the CPU did manage to reach a lower power state, but the dinky cooling solution couldn't make up the difference. I'd monitor the power usage at each setting.
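One rough way to compare the two settings on Linux is to sample the package energy counter exposed by Intel RAPL (`/sys/class/powercap/intel-rapl:0/energy_uj`, in microjoules) before and after a fixed interval. A minimal sketch of the arithmetic, with made-up sample values and ignoring counter wraparound:

```shell
#!/bin/sh
# Sketch: average package power from two RAPL energy readings.
# In real use, e0 and e1 would come from reading
# /sys/class/powercap/intel-rapl:0/energy_uj before and after sleeping
# for $secs seconds (the counter is in microjoules and wraps at
# max_energy_range_uj, which this ignores for brevity).
avg_watts() {
    e0=$1; e1=$2; secs=$3
    echo $(( (e1 - e0) / secs / 1000000 ))
}

# Hypothetical numbers: 50 J consumed over 5 s -> 10 W average.
avg_watts 0 50000000 5
```

Run it once per setting; with the intel_pstate driver, turbo can be toggled via `/sys/devices/system/cpu/intel_pstate/no_turbo` (write 1 to disable, 0 to enable).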