However, higher DC voltage is riskier, and it's not at all standard for electrical and building code reasons. In particular, breaking a DC circuit is harder because there's no zero crossing to naturally extinguish an arc, and 170V (the peak of US 120VAC) or 340V (the peak of European 240VAC) is enough to sustain a substantial arc under the right circumstances.
Unfortunately for your lighting, it's also both simple and efficient to stack enough LEDs in series that their combined forward voltage drop approximately matches the rectified peak (i.e. targeting that 170/340V figure). That means the bulb needs only one series string of LEDs with no parallel balancing, which makes the rest of the circuitry (including voltage regulation, which would still be necessary in a DC world) simpler.
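The arithmetic behind that is straightforward. A rough sketch (the 3.0V forward drop is an assumed typical value for a white LED; real parts vary):

```python
import math

def peak_voltage(v_rms: float) -> float:
    """Peak of a sine wave given its RMS value."""
    return v_rms * math.sqrt(2)

def leds_in_string(v_peak: float, v_forward: float = 3.0) -> int:
    """Roughly how many series LEDs match the rectified peak.
    v_forward = 3.0 V is an assumed typical white-LED drop."""
    return round(v_peak / v_forward)

for v_rms in (120, 240):
    vp = peak_voltage(v_rms)
    print(f"{v_rms} VAC -> peak {vp:.0f} V, ~{leds_in_string(vp)} LEDs in series")
```

So a 240VAC bulb wants on the order of a hundred LED junctions in one string, which is exactly what cheap "filament" bulbs do.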
IEEE 802.3bt can deliver up to ~71W at the powered device: just pull Cat 5/6 everywhere.
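That 71W figure is what survives cable loss. A sketch of the budget, using the 802.3bt Type 4 numbers (90W injected at the 52V worst-case source voltage; the 6.25Ω effective loop resistance assumes all four pairs over a 100m channel):

```python
def pd_power(p_pse: float, v_pse: float, loop_ohms: float) -> float:
    """Power left at the powered device after I^2*R cable loss.
    Approximation: current is computed at the source end."""
    i = p_pse / v_pse          # total current into the cable
    return p_pse - i * i * loop_ohms

# 802.3bt Type 4 worst case: 90 W at 52 V minimum, four pairs,
# ~6.25 ohm loop over a 100 m channel (assumed figures).
print(f"{pd_power(90, 52.0, 6.25):.1f} W at the PD")
```

About 19W of the 90W goes into heating the cable, which is why the standard guarantees only 71.3W at the far end.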
* https://en.wikipedia.org/wiki/Power_over_Ethernet#Standard_i...
In the commercial/industrial space this may be worth it: how long do these bulbs last? How much (per hour, or equivalent) do you pay your facilities folks? How much time does it take for employees or tenants to report an outage, and for your folks to get a ladder (or scissor lift) to change the bulb?
The part that would genuinely be cheaper is avoiding problematic flicker. It takes a reasonably high quality LED driver to avoid 120Hz flicker, but a DC-supplied driver could be simpler and cheaper.
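To see why cheap AC drivers flicker, consider the smoothing capacitor a full-wave-rectified supply needs. A rough sizing sketch using the standard ripple approximation dV = I / (2·f·C) (the 20mA string current and 4.7uF cap are assumed example values, not from any particular bulb):

```python
def ripple_voltage(i_load: float, c_farads: float, f_line: float = 60.0) -> float:
    """Peak-to-peak ripple on a full-wave-rectified supply,
    using the approximation dV = I / (2 * f * C)."""
    return i_load / (2 * f_line * c_farads)

# Assumed example: a 20 mA LED string on a 4.7 uF smoothing cap, 60 Hz mains
print(f"{ripple_voltage(0.020, 4.7e-6):.1f} V p-p ripple at 120 Hz")
```

Tens of volts of ripple across a ~170V string means the LED current (and thus brightness) is modulated heavily at 120Hz; suppressing it takes a much larger cap or active regulation, both of which cost money. Feed the driver clean DC and the problem disappears.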
The gain from DC-DC converters is small, and DC devices are a small part of usage compared to appliances. There's no way it would pay back the cost of replacing all the appliances.
(Am I just showing my age here? How many of you have ever bought incandescent globes for house lighting? I vaguely recall it may be illegal to sell them here in .au these days. I really like quartz halogen globes, and use them in 4 or 5 desk lamps I have, but these days I need to get globes for em out of China instead of being able to pick them up from the supermarket like I could 10 or 20 years ago.)