Of course, if you know the format there are better ways to view the data, but that goes beyond the scope of a hex editor, though the most advanced ones do support things like template files and can display structured data, disassembly, etc.
Most of us internalized the relationship between the digits 0-9 a very long time ago. Adding 6 more glyphs after that is quite easy (and they're fairly well known too), and after a while you stop thinking about the glyphs consciously anyway. A hex 'C' intuitively means to me '4 from the end'. A hex 'F' intuitively means to me 'all 4 bits set to 1'. I don't see any advantage in switching to a different glyph set for this base, other than disruption for disruption's sake.
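That intuition is just a fixed lookup table: each hex digit corresponds to exactly one 4-bit pattern. A quick sketch (mine, not from the thread) that prints the table:

```python
# Each hex digit maps to a fixed 4-bit pattern, which is what makes the
# digit -> bits association easy to internalize.
for digit in '0123456789ABCDEF':
    print(digit, format(int(digit, 16), '04b'))
# e.g. 'C' -> 1100 (12, i.e. "4 from the end" of 16)
#      'F' -> 1111 (all 4 bits set)
```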
> Or even beyond the hexadecimal grouping, wouldn't it be more relevant to render something that is intuitively far easier to grasp, without several layers of internalized interpretation through acculturation?
Modern computers deal with 8-bit bytes, and their word sizes are multiples of bytes - unless you're dealing with bit-packed data, which is comparatively rare (the closest common case is bit twiddling of MMIO registers, which is when you sometimes switch to binary; although even for a 4-bit hex nibble you can learn which combination of bits on/off maps to each value).
This means you can group the 8 bits of a byte as 1 digit of 8 bits (alphabet too large to be useful), 2 digits of 4 bits (hex), 4 digits of 2 bits (alphabet too small to give a benefit over binary), or 8 digits of 1 bit (binary). Hex just works really well as a practical middle ground.
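The four groupings above can be sketched in a few lines; `group_bits` is a hypothetical helper name, not anything from a real library:

```python
def group_bits(value, bits_per_digit, total_bits=8):
    """Split `value` into digits of `bits_per_digit` bits each, MSB first."""
    n_digits = total_bits // bits_per_digit
    mask = (1 << bits_per_digit) - 1
    return [(value >> (bits_per_digit * (n_digits - 1 - i))) & mask
            for i in range(n_digits)]

byte = 0b10111100                  # 0xBC
print(group_bits(byte, 8))         # 1 digit of 8 bits  -> [188]
print(group_bits(byte, 4))         # 2 hex nibbles      -> [11, 12], i.e. 0xB, 0xC
print(group_bits(byte, 2))         # 4 digits of 2 bits -> [2, 3, 3, 0]
print(group_bits(byte, 1))         # 8 binary digits    -> [1, 0, 1, 1, 1, 1, 0, 0]
```

The digit count and alphabet size trade off against each other: 1 digit needs 256 glyphs, 2 digits need 16, which is exactly the hex sweet spot.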
Back when computers used 12-bit words (the PDP-8 and friends), octal (4 digits of 3 bits, represented in the 0-7 alphabet) was more popular.
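A quick sketch of why octal fit those machines: 12 is divisible by 3, so a 12-bit word splits cleanly into 4 octal digits (and, incidentally, also into 3 hex digits):

```python
# A 12-bit PDP-8-style word falls neatly into 4 octal digits of 3 bits each,
# the same way an 8-bit byte falls into 2 hex nibbles.
word = 0b101_110_011_001          # 12-bit value, grouped 3 bits at a time
print(format(word, '04o'))        # octal: 5631
print(format(word, '03x'))        # hex (12 bits = 3 hex digits): b99
```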
Then the byte was represented as a 16x16 matrix, where each 4x4 area held the lower digit's pattern, and these areas were arranged in the shape of the higher digit.
But at the end of the day, it wasn't really more readable.