We installed 120 LED ceiling lights in our home circa 2020, all of which were run with high-voltage (Romex) wiring and accompanied by 120 little transformer boxes mounted inside the ceiling next to them.
Later ...
We installed outdoor lighting with low-voltage, outdoor-rated wiring, powered by a 12V transformer[1], and I felt the same way you did: why did we use a mile of Romex and install all of those little mini transformers when we could have powered the same lights with 12V and low-voltage wire?
I then learned that the energy draw of running the low-voltage transformer all the time - especially one large enough to supply an entire house of lighting - would more than cancel out the energy savings from powering lower-voltage fixtures.
You don't have this problem with outdoor lighting because the entire transformer is on a switch leg and is off most of the time.
So ... I like the idea of removing a lot of unnecessary high voltage wire but it's not as simple as "just put all of your lights behind a transformer".
[1] https://residential.vistapro.com/lex-cms/product/262396-es-s...
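To put rough numbers on that idle-draw tradeoff, here's a quick sketch; every wattage and duty cycle below is an assumption, not a measurement:

```python
# Back-of-envelope comparison of an always-on central 12 V supply vs. 120
# small per-fixture drivers that only draw power while the lights are on.
# All figures are assumed round numbers.
HOURS_PER_YEAR = 24 * 365

# One big always-on supply, sized for the whole house, idling at ~20 W:
central_idle_kwh = 20.0 * HOURS_PER_YEAR / 1000          # ~175 kWh/yr

# 120 per-fixture drivers losing ~0.5 W each, but only while the lights
# are on (say 4 h/day, switched like the outdoor transformer):
driver_kwh = 120 * 0.5 * (4 * 365) / 1000                # ~88 kWh/yr

print(f"central supply idle loss: {central_idle_kwh:.0f} kWh/yr")
print(f"120 switched drivers:     {driver_kwh:.0f} kWh/yr")
# With these assumptions the always-on supply wastes roughly twice as much.
```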
That's not a constraint of physics; you can absolutely build a DC power supply that is efficient across a wide load range. (Worst case, it might involve paralleling and switching between multiple PSUs that target different load ranges.) But of course something like that is more expensive...
More expensive than an inefficient unit, but it should still be a lot cheaper than 120 separate units, right?
And I expect one big fat unit to do a better job of smoothing out voltage and avoiding flicker than a bunch of single-light units. Especially since the output capacitors would be sized for the entire system, while you'll rarely have all the lights on at the same time.
Though for efficiency I'd think you'd want 48V and not 12V.
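That intuition is easy to sanity-check: wire loss is I²R, and for a fixed power the current scales as 1/V, so 4x the voltage means 1/16th the resistive loss in the same copper. A sketch, where the load and wire resistance are made-up round numbers:

```python
# Resistive loss in a wire is I^2 * R; for a fixed delivered power,
# current scales as 1/V, so loss scales as 1/V^2 for the same copper.

def wire_loss_watts(power_w: float, volts: float, wire_ohms: float) -> float:
    """I^2 * R loss for delivering `power_w` at `volts` over `wire_ohms`."""
    current = power_w / volts
    return current ** 2 * wire_ohms

LOAD_W = 100.0       # e.g. a string of LED fixtures (assumed)
WIRE_OHMS = 0.5      # round-trip resistance of a long, thin run (assumed)

loss_12v = wire_loss_watts(LOAD_W, 12.0, WIRE_OHMS)   # ~34.7 W lost
loss_48v = wire_loss_watts(LOAD_W, 48.0, WIRE_OHMS)   # ~2.2 W lost

print(f"12V loss: {loss_12v:.1f} W, 48V loss: {loss_48v:.1f} W "
      f"(ratio {loss_12v / loss_48v:.0f}x)")
```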
With double-conversion, generally yes.
I recently ran across the (patented?) concept of a delta conversion/transformer UPS that seems to eliminate/reduce the inefficiencies:
* https://dc.mynetworkinsights.com/what-are-the-different-type...
* a bit technical: https://www.youtube.com/watch?v=nn_ydJemqCk
* Figures 6 to 8 [pdf]: https://www.totalpowersolutions.ie/wp-content/uploads/WP1-Di...
The double-conversion only occurs when there's a 'hiccup' from utility power, otherwise if power is clean the double-conversion is not done at all so the inefficiencies don't kick in.
USB-C could be that connector, using USB-PD instead of PoE. Though I'm not sure I'd want to need that much smarts for every single power outlet.
I find it a little hard to imagine that those devices outnumber things like stoves, dishwashers, washers/dryers, kettles, hair dryers... by 4:1.
Unsure why PoE would be better for LED lighting than the standard approach of screwing a bulb directly into AC, either. How many lumens do you get out of strip lights these days? And you still have AC-DC conversion for whatever's sourcing power onto the Ethernet link.
In practice PoE will have lower efficiency than mains power, since it'll usually involve at least double conversion, often three converters in series, plus the losses of the thin network wires and the relatively high idle losses / poor low-load efficiency of the necessarily over-dimensioned PSE.
Efficiency isn't as straightforward either. You're still being fed by 120V/230V AC, so you're going to need some kind of centralized rectifier and down converter. It'll need to be specced for peak use, but in practice it'll usually operate at a fraction of that load - which means it'll have a pretty poor efficiency. A per-device PSU can be designed exactly for the expected load, which means it'll operate at its peak efficiency.
We also don't use 5V DC grids because the wire losses would be horrible, so a domestic DC grid should probably operate at pretty close to regular AC voltage as well. In practice this means the most sensible option would be to have a centralized rectifier and a grid operating at whatever voltage it outputs - but what would be the point?
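For a sense of just how horrible those 5V wire losses get, here's a rough sketch; the run length, conductor size, and load are all assumed round numbers:

```python
# Why a 5 V house grid doesn't work: voltage drop over a realistic run.
# Assumptions: 10 m one-way run of 1.5 mm^2 copper, a 60 W load.
RHO_COPPER = 1.68e-8          # resistivity of copper, ohm*m
LENGTH_M = 10.0               # one-way run; current flows out and back
AREA_M2 = 1.5e-6              # 1.5 mm^2 conductor
R = RHO_COPPER * 2 * LENGTH_M / AREA_M2   # round-trip resistance, ~0.224 ohm

def drop_fraction(power_w: float, volts: float) -> float:
    """Fraction of the supply voltage lost in the wire itself."""
    return (power_w / volts) * R / volts

print(f"60 W at   5 V: {drop_fraction(60, 5):.0%} of the voltage lost in the wire")
print(f"60 W at 230 V: {drop_fraction(60, 230):.2%} lost")
```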
As to PoE: I personally really like the idea, but I don't believe it'll have a bright future. For its traditional use the main issue is that there doesn't seem to be a future for twisted-pair beyond 10Gbps. 25GBASE-T might exist as a standard on paper, but the hardware never took off due to complete disinterest from the datacenter market, and it is too limited to be of use in offices and homes. I fully expect that 25G will arrive in the home and office as some form of fiber-optic interconnect - with fiber+copper hybrid for things like access points.
On the other hand, for a lot of IoT applications PoE seems to be too complicated and too expensive. It makes sense for things like cameras, but individual lights, or things like smoke sensors are probably better served in office/industrial applications by either a regular AC supply or a local DC one, plus something like KNX, X10, CAN, or Modbus for comms: just being able to be wired as a bus rather than a star topology is already a massive advantage. And for domestic use the whole "has a wire" thing is of course a massive drawback - most consumers strongly prefer using Wifi over running a dedicated wire to every single little doodad.
Even if we were to standardize a low (<50V) voltage for DC distribution within homes, we'd still need ~120/240VAC to power big stuff, or we'd instead need even larger conductors (more copper) than we use today to do the same work at low voltage.
But, sure -- we can play it out. So let's say we have an in-home 48VDC distribution standard and decide that this is the path forward and we enshrine it in law.
We need to convert whatever the solar system has available to 48VDC. Then, we need to distribute that 48VDC using a completely separate network of cabling. Finally, we still need to convert 48VDC to whatever it is that devices can actually use.
That's not representative of a reduction in steps, or an increase in efficiency.
That is instead just an increase in installed infrastructure expense, and a decrease in device compatibility. It takes what we have, which is simply universal (at least within any given geographical area) and adds complexity.
And for what? What's the perceived benefit?
Also, you'll need wires that are 5 times thicker. Instead of a reasonable 1mm^2 for a normal 16A line, you'll need 5mm^2 for the same power.
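Sketching the sizing math behind that (this uses a crude constant-current-density model; real ampacity tables scale somewhat differently, but the rough factor holds):

```python
# For the same power at a lower voltage, current scales as V_old / V_new.
# If allowable current per mm^2 is held roughly constant, the conductor
# cross-section has to scale the same way. Crude model, assumed figures.
V_AC, V_DC = 230.0, 48.0
I_AC = 16.0                      # the "normal 16A line" from the comment
POWER_W = V_AC * I_AC            # 3680 W

I_DC = POWER_W / V_DC                     # ~76.7 A at 48 V for the same power
AREA_AC_MM2 = 1.0                         # per the comment
AREA_DC_MM2 = AREA_AC_MM2 * I_DC / I_AC   # ~4.8 mm^2, i.e. roughly 5x

print(f"{I_DC:.1f} A at 48 V -> ~{AREA_DC_MM2:.1f} mm^2 vs {AREA_AC_MM2} mm^2 at 230 V")
```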
DC infrastructure makes sense in highly specialised environments... like new gigawatt AI farms.
I think it's highly unlikely we'll see mass scale retrofits, but if enough momentum builds up, I can see it as a great bonus feature for new builds.
I got lucky with my house and every room has a dedicated phone line meeting at a distribution panel (a couple of 2x4s with screw terminals) built in the 50s. I'm in the process of converting it to light-duty DC power. The wiring is only good for an amp or two, but at 48V that's still significant power transmission.
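For reference, even the low end of that rating works out to useful power (the 1-2A figure is the parent comment's own estimate for old phone wiring):

```python
# Power available over repurposed phone wiring at 48 V, per the comment's
# estimated 1-2 A rating.
VOLTS = 48.0
watts_at = {amps: VOLTS * amps for amps in (1.0, 2.0)}
print(watts_at)   # 48 W runs a lot of LED lighting; 96 W rivals a laptop charger
```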
I imagine rooftop solar could also source DC for the house directly (or via a battery), before hitting the inverter... ?
The main problem I see is educating consumers. Maybe that starts with a standard for DC outlets and plugs that can't be confused with AC... ?
(Now I'm imagining desktop computers with much simpler power supplies; but you'd presumably have to wire for dozens of amps incoming...)
It's super nice because you only need to put the UPS/ATS at the PoE switch and then you get power redundancy everywhere you have ethernet running (i.e. the phones don't go down).
1. One of these is simplicity. With AC, one single home run of cabling (eg, Romex) can feed a whole room full of stuff, like a bedroom or a living room. At one end of the run is a circuit breaker (a fairly simple electromechanical device) and at the other end is a series of outlets (which are physically daisy-chained, but are functionally just wired in parallel with each other).
Since one single run of cable can feed many devices, it is easy to accomplish.
2. Another advantage is that it is universal. Anything can plug into these outlets. Whatever a person brings into the home to use, they can plug it into an outlet and it works. It works this same way in every home.
3. And there's quite a lot of power available: A common 20A 120V branch circuit cabled up with 12AWG Romex is stated to supply up to 16A continuously, or 1920W. For intermittent loads, it can supply 20A -- or 2400W. That's tiny by European standards, but it's still quite a lot of power. It's plenty to run a space heater when Grandma visits and she complains about the guest room being cold (even as you start to sweat when you cross the threshold to investigate) and a big TV and a whole world of table lamps, all at once. And you can plug this stuff into any outlets in a room, and it Just Works.
4. But, sure: Lots of devices want DC, not AC. So there's a necessary conversion step that is either integral to the device being plugged in, or in the form of the external wall warts we all know very well.
So let's compare to power-over-ethernet.
1. It's also simple, but only tangentially so. One home-run cable per outlet, whether that outlet is used or not, is something that can be rationalized as being a simple topology. A PoE switch at the head-end instead of a central box with circuit breakers is a simple-enough thing to transition to. And a lot more individual cables are required, but they're relatively small and are generally easier to install.
2. It's standardized, but it's not universal at all. I've got a few PoE widgets around the house, but I'm pretty friggin' weird when it comes to what I do with electricity. I can't go to Wal-Mart and buy more PoE widgets to use at home, and when people visit they aren't bringing PoE adapters to charge their phones and other electronics. My computer monitor doesn't have a PoE input. I can easily imagine a table lamp or a fan that connects to PoE, and also uses it as a network connection for automation, and that sounds pretty sweet in ways that tickle my automation bones in the most filthy of fashions... but that's getting even further into the weeds compared to how regular people expect to do regular things.
3. There isn't a lot of power available. 802.3bt Type 4 is the highest spec. And within that spec: While switch ports can output up to 100W, a device being powered is limited to drawing no more than 71.3W. Now, sure, that's 71.3W per port, but in a room with 10 ports that's still only ~700W -- at most -- in that room. And Grandma's space heater won't run on 71.3W, nor her electric blanket. My laptop wants more than this. The list of useful, portable things that we casually plug into a wall that draw less than 71.3W is pretty short, and most don't benefit from the main advantage of PoE, which is a combination of [some] power alongside high-speed Ethernet data.
4. We still need wall warts since PoE is nominally ~48VDC. For example: Phones use less than 71.3W while charging, but they don't run on 48V. That means 120V AC comes in from the grid, gets shifted to 48VDC for distribution within the dwelling, and then gets shifted yet again to produce the power (5, 9, 15, and 20V are common-enough in USB PD world) that devices actually want. That's more lossy conversion steps, not fewer -- and we still get to keep the extra conversion (wall warts) as punishment for our great ideas. This is not the path towards increased energy efficiency.
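Putting rough numbers on points 3 and 4 (the 802.3bt figures are as quoted above; the port count and per-stage converter efficiencies are my own assumptions):

```python
# Point 3: a room's total PoE power budget vs. common appliances.
PD_MAX_W = 71.3                          # max draw for one powered device
room_budget_w = PD_MAX_W * 10            # ten ports in a room: ~713 W total
print(f"room budget on 10 ports: {room_budget_w:.0f} W (one space heater: 1500 W)")

# Point 4: conversion chain 120 VAC -> 48 VDC (PoE) -> 5 VDC (USB PD),
# vs. a single wall wart. Per-stage efficiencies are assumed typical values.
poe_chain = 0.92 * 0.90                  # two stages in series, ~83% end to end
wall_wart = 0.90                         # one AC -> 5 VDC stage
print(f"efficiency via PoE: {poe_chain:.0%} vs direct wall wart: {wall_wart:.0%}")
```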
---
PoE is great for the things we use it for today. A camera, a wireless access point -- you know, fixed-location stuff that uses networked data as its primary function and also requires power.
Installed PoE light fixtures (like, say, task lights in a kitchen) also sounds neat -- unless they die prematurely and no PoE replacements are to be found. (Now, you have not just one or two problems, but many: The lights aren't working in that space and they can't be replaced with a trip to Lowes because the Romex that would normally have been installed was deliberately deleted from the plan. It could have been a 20-minute DIY fix that costs less than $100, but now it involves drywall and paint and retrofitting new cabling. Or maybe PoE replacements do exist, but it's now 2035 and the new ones don't talk the same network protocols as the old ones did.)
But there are other upsides: I've got an 8-port PoE-powered network switch that works a treat. It's a dandy little thing. And it sure would be neat to plug my streaming box in with PoE and kill two birds with one cable; I would like that very much.
But most people? Most people don't give a damn about ethernet (PoE, or not!) these days, or streaming boxes, and that trend is increasing. They just plug their lamp into the regular outlet on the wall like they always have, and deal with whatever terrible UI is built into their smart TV, and use wifi for anything that needs data.
And when they buy a home that is filled with someone else's smart infrastructure, their first task (more often than not) is to figure out who to call to erase those parts completely and put it back to being normal and boring.