Also, `#fff` is ambiguous -- if you mean device colour, then there's no brightness (nits) specified at all; it may be 200, 2,000 or 10,000. If sRGB is implied, as in `#fff in sRGB colour space`, then the standard specifies 80 nits, so if you say you don't want to go brighter than that, you can't have much HDR, since sRGB precludes HDR by definition (it can't go brighter than 80 nits for the "white point", aka white).
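To put numbers on that, here's a quick sketch (the helper names are mine, nothing standard) of what `#fff` works out to if you take the sRGB reference display literally: decode the code values to linear light, take the relative luminance, and scale by the 80 cd/m² reference white.

```typescript
// Expand "#fff" / "#ffffff" into three 8-bit channel values.
function hexToRgb8(hex: string): number[] {
  const h = hex.replace("#", "");
  const full = h.length === 3 ? h.split("").map(c => c + c).join("") : h; // "fff" -> "ffffff"
  return [0, 2, 4].map(i => parseInt(full.slice(i, i + 2), 16));
}

// sRGB decoding: 8-bit code value -> linear light in [0, 1].
function srgbToLinear(c8: number): number {
  const c = c8 / 255;
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance (Rec. 709 primaries) scaled to the sRGB reference display.
function nits(hex: string): number {
  const [r, g, b] = hexToRgb8(hex).map(srgbToLinear);
  const Y = 0.2126 * r + 0.7152 * g + 0.0722 * b; // relative luminance, 0..1
  return Y * 80; // sRGB reference display white: 80 cd/m²
}

console.log(nits("#fff"));    // 80   -> as bright as sRGB ever gets
console.log(nits("#808080")); // ~17  -> mid grey on the same display
```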
I think if you want HDR you need a different colour space entirely -- either one with a different peak brightness, or one where brightness is specified in addition to the R, G and B primaries. But here my HDR knowledge is weak -- perhaps someone else can chime in. I just find colour science fascinating; sorry to go on a tangent here.
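For illustration, and with the caveat that this is exactly where my knowledge gets thin: PQ (SMPTE ST 2084, used by Rec. 2100) is one encoding where the code value maps to an absolute luminance rather than to a fraction of some display-dependent white. A sketch of its EOTF, constants straight from the spec:

```typescript
// PQ (SMPTE ST 2084) EOTF: non-linear code value in [0, 1] -> absolute cd/m² (nits).
const m1 = 2610 / 16384;      // 0.1593017578125
const m2 = 2523 / 4096 * 128; // 78.84375
const c1 = 3424 / 4096;       // 0.8359375
const c2 = 2413 / 4096 * 32;  // 18.8515625
const c3 = 2392 / 4096 * 32;  // 18.6875

function pqToNits(e: number): number {
  const p = Math.pow(e, 1 / m2);
  return 10000 * Math.pow(Math.max(p - c1, 0) / (c2 - c3 * p), 1 / m1);
}

console.log(pqToNits(1.0));  // 10000 -> peak of the format, far beyond sRGB's 80
console.log(pqToNits(0.58)); // ~202  -> roughly the 203-nit HDR reference white (BT.2408)
```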
And no, `#fff` is not a "device color". The syntax originates from the web, where sRGB has been implied ever since we got displays brighter than that.
`#fff` is device colour: it's short for `#ffffff`, which is 24-bit RGB that predates sRGB, as does true-colour device support. I was sending 24-bit RGB to VESA-compliant graphics cards before sRGB became a thing. `#fff` was supported by Photoshop and Macromedia products as a straightforward device-colour format before sRGB was adopted by at least the latter, mind you. Its use by CSS is coincidental; that's not where the format was introduced.
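To spell out what "device colour" means here -- a tiny sketch, variable names are just mine:

```typescript
// What a true-colour device actually receives: three bytes, nothing else.
// "#fff" is just doubled-digit shorthand for "#ffffff"; there's no colour
// space or luminance field anywhere in the format itself.
const white = 0xffffff;             // "#ffffff" packed as 0xRRGGBB
const red   = (white >> 16) & 0xff; // 255
const green = (white >> 8) & 0xff;  // 255
const blue  =  white & 0xff;        // 255
console.log(red, green, blue);      // 255 255 255 -- full drive on each channel,
                                    // at whatever brightness the device happens to have
```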