
DC (direct current) power network

As we all know, AC won the "War of the Currents". The main reason is that AC voltage is easy to step up and down with just a ring of iron and two coils, and high voltage lets us transmit power over longer distances with less loss.

Now, the War of the Currents happened around the 1890s, and our technology has improved a lot since then. We have useful diodes and transistors now, we have microcontrollers and buck/boost converters. We can convert DC voltages efficiently today.

Additionally, photovoltaics produces DC naturally, whereas a traditional generator has an easier time producing AC. Photovoltaic plants have to convert their output to AC, which, if I understand correctly, incurs a significant loss.

And then there's the issue of stabilizing the frequency. When you have one big producer (one big hydro-electric dam or coal power plant), then stabilizing the frequency is trivial, because you only have to talk to yourself. When you have 100000 small producers (assume everyone in a bigger area has photovoltaics on their roof), then suddenly stabilizing the frequency becomes more challenging, because everybody has to work in exactly the same rhythm.

I wonder: would it make sense to change our power grid from AC to DC today? I know it would obviously be a lot of work, since every consuming device would have to change what power it accepts from the grid. But in the long run, could it be worth it? Also, what about insular (island) networks; would it make sense there? Thanks for taking the time to read this. I'm willing to go into the maths if that's relevant to the discussion.

68 comments
  • I heard it said many years ago that if DC had won the battle, we'd have power stations every 10 miles and power lines as thick as your wrist.

    Converting local power is fairly easy, with AC inverters added for universal compatibility.

    But take note of how many DC voltages you use in your house. Devices in mine range from 3 V to 25 V, plus some weird ones like 19 V for a laptop. You'd still have adapters all over the place.

  • DC is used for long-range transmission in high-voltage DC (HVDC) transmission lines today.

    https://en.wikipedia.org/wiki/High-voltage_direct_current

    A high-voltage direct current (HVDC) electric power transmission system uses direct current (DC) for electric power transmission, in contrast with the more common alternating current (AC) transmission systems. Most HVDC links use voltages between 100 kV and 800 kV.

    HVDC lines are commonly used for long-distance power transmission, since they require fewer conductors and incur less power loss than equivalent AC lines. HVDC also allows power transmission between AC transmission systems that are not synchronized. Since the power flow through an HVDC link can be controlled independently of the phase angle between source and load, it can stabilize a network against disturbances due to rapid changes in power. HVDC also allows the transfer of power between grid systems running at different frequencies, such as 50 and 60 Hz. This improves the stability and economy of each grid, by allowing the exchange of power between previously incompatible networks.

    However, since the grids themselves are AC, HVDC links today are mostly point-to-point: they just move power into one AC grid or pull it from another.
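
    As a rough back-of-the-envelope sketch of why HVDC loses less per conductor (all numbers below are assumptions; skin effect and reactive power are ignored, and they would favour DC further): for the same insulation rated to the same peak voltage, DC can run at the full peak, while single-phase AC only gets V_peak/√2 of RMS voltage, so it needs more current and dissipates twice the I²R loss for the same delivered power.

    ```python
    import math

    P = 1_000e6          # power to deliver: 1 GW (assumed)
    V_PEAK = 800e3       # insulation limit: 800 kV peak (assumed)
    R_LINE = 10.0        # conductor resistance in ohms (assumed)

    # DC: the full peak voltage is usable
    i_dc = P / V_PEAK
    loss_dc = i_dc**2 * R_LINE

    # Single-phase AC: only V_peak / sqrt(2) of RMS voltage is usable
    v_rms = V_PEAK / math.sqrt(2)
    i_ac = P / v_rms
    loss_ac = i_ac**2 * R_LINE

    print(f"DC current {i_dc:,.0f} A, loss {loss_dc / 1e6:.1f} MW")
    print(f"AC current {i_ac:,.0f} A, loss {loss_ac / 1e6:.1f} MW")
    print(f"AC loses {loss_ac / loss_dc:.1f}x more in the conductor")
    ```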

    We also do have some increasingly beefy DC in individual households in some forms:

    • You mention solar PV systems, but more generally, the 12 V systems used in vehicles (and the related 24 V and 48 V systems sometimes used to push more power) are more common, especially now that lithium batteries, which can handle many more charge cycles than lead-acid, are available.
    • USB PD can now negotiate up to 240 W at 48 V, which is a fair bit.
  • Well, almost all DC generators these days are actually AC alternators with the output rectified, because alternators can be run a lot more efficiently. So you're already losing some efficiency there.

    You need to consider the consumer side as well. Dinky residential loads like your computer would be fine on DC. But most of the world, especially heavy industry, runs on synchronous or induction AC motors, big ones. Big huge tens-of-megawatts motors that often run upwards of 97% line efficiency, which is insane for any industrial process.
    The best you could replace those with would be modern brushless DC motors, which require really expensive inverter controls that die frequently due to magnetic transients, and still top out at around 90% efficiency if you're lucky. That swap would incur huge costs that just aren't worth it.
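
    To make that efficiency gap concrete, here's a quick sketch with assumed numbers: a 20 MW shaft load running year-round at an assumed $0.07/kWh industrial rate.

    ```python
    SHAFT_MW = 20.0        # mechanical power delivered (assumed)
    HOURS_PER_YEAR = 8760
    PRICE_PER_KWH = 0.07   # assumed industrial rate, $/kWh

    def yearly_cost(efficiency: float) -> float:
        """Electricity cost per year to deliver SHAFT_MW of mechanical power."""
        electrical_kw = SHAFT_MW / efficiency * 1000
        return electrical_kw * HOURS_PER_YEAR * PRICE_PER_KWH

    cost_ac = yearly_cost(0.97)   # big synchronous/induction AC motor
    cost_dc = yearly_cost(0.90)   # optimistic brushless-DC replacement

    print(f"AC motor: ${cost_ac:,.0f}/year")
    print(f"BLDC:     ${cost_dc:,.0f}/year")
    print(f"Penalty:  ${cost_dc - cost_ac:,.0f}/year per motor")  # ~ $1M/year
    ```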

  • PV inverters often have around 1-2% losses. This is not very significant. You also need to convert the voltage anyway because PV output voltage varies with light level.

    Buck/boost converters work by converting the DC to (messy) AC, then back to DC. If you want an isolating converter (necessary in most applications for safety reasons), that converter needs to handle the full power. If it's non-isolating, it only needs to handle power proportional to the voltage step.
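
    In the ideal case the input/output relations are simple duty-cycle arithmetic; a minimal sketch, assuming lossless converters in continuous conduction (real ones lose a few percent to switching and conduction losses):

    ```python
    def buck_vout(v_in: float, duty: float) -> float:
        """Buck (step-down): the switch chops the input into a square wave,
        and the LC filter averages it back to DC. Vout = D * Vin."""
        assert 0.0 < duty < 1.0
        return v_in * duty

    def boost_vout(v_in: float, duty: float) -> float:
        """Boost (step-up): energy stored in the inductor while the switch
        is on is released at a higher voltage. Vout = Vin / (1 - D)."""
        assert 0.0 < duty < 1.0
        return v_in / (1.0 - duty)

    print(buck_vout(48.0, 0.25))   # 48 V bus -> 12.0 V rail
    print(boost_vout(12.0, 0.75))  # 12 V battery -> 48.0 V bus
    ```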

    Frequency provides a somewhat convenient way for all parties to know whether the grid is over- or under-supplied on a sub-second basis. Operating solely on voltage is more prone to oscillation and requires compensating for voltage drop, and the information is typically lost at buck/boost sites. A DC grid would likely require much more robust and faster real-time comms.
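
    A minimal sketch of how that signal gets used, assuming classic frequency droop control with made-up parameters: each generator measures frequency locally and leans against the error, so under-supply (sagging frequency) raises output everywhere at once with no comms channel needed.

    ```python
    F_NOMINAL = 50.0   # Hz
    DROOP = 0.05       # 5% droop: a 5% frequency drop -> +100% of rated output

    def droop_power(f_measured: float, p_rated: float, p_setpoint: float) -> float:
        """Output power command derived from a local frequency measurement alone."""
        freq_error = (F_NOMINAL - f_measured) / F_NOMINAL
        p = p_setpoint + (freq_error / DROOP) * p_rated
        return min(max(p, 0.0), p_rated)   # clamp to the machine's limits

    # Grid slightly under-supplied: frequency has sagged to 49.9 Hz.
    print(droop_power(49.9, p_rated=100.0, p_setpoint=50.0))  # -> 54.0 MW
    ```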

    The AC grid relies on significant (>10x overcurrent) short-term (<5 s) overload capability. Inrush and motor starting require small, short overloads (though still significant ones). Faults are detected and cleared primarily through the excess current drawn. Fuses/breakers in series will all see the same current from the same fault, but we want only the device closest to the fault to operate, to minimise disruption. That's achieved (it's called discrimination, coordination, or selectivity) by having each successive upstream device take progressively more time to trip on a fault of a given size, and trip at a progressively higher fault current, so that upstream devices still detect a severe fault quickly.
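
    A sketch of that grading using the IEC 60255 "standard inverse" time-current curve, t = TMS * 0.14 / ((I/I_pickup)^0.02 - 1). The relay settings below are assumptions for illustration, not a real grading calculation.

    ```python
    def trip_time(i_fault: float, i_pickup: float, tms: float) -> float:
        """Seconds until an inverse-time device trips for a given fault current."""
        ratio = i_fault / i_pickup
        assert ratio > 1.0, "device only operates above its pickup current"
        return tms * 0.14 / (ratio**0.02 - 1.0)

    FAULT = 2000.0  # amps; both series devices see the SAME fault current

    downstream = trip_time(FAULT, i_pickup=100.0, tms=0.1)  # closest to the fault
    upstream   = trip_time(FAULT, i_pickup=400.0, tms=0.3)  # feeder breaker

    print(f"downstream trips in {downstream:.2f} s")   # ~0.23 s: clears the fault
    print(f"upstream would trip in {upstream:.2f} s")  # ~1.28 s: stays closed
    ```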

    RCDs/GFCIs don't coordinate well, because there isn't enough room between the smallest fault that must be detected and the maximum allowed disconnection time to fit a chain of increasingly less sensitive devices.

    Generators are perfectly able to provide this extra fault current through short-term temperature rise and inertia. Inverters cannot provide a 5-fold overcurrent without being significantly oversized. We even install synchronous condensers (effectively a generator without any energy source) in areas far from actual generators to provide local inertia.
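
    The value of that inertia can be put in numbers with the swing equation, which gives the initial rate of change of frequency (RoCoF) after a sudden imbalance: df/dt = ΔP · f0 / (2 · H · S). The system figures below are assumptions.

    ```python
    def rocof(delta_p_mw: float, h_seconds: float, s_rated_mva: float,
              f0: float = 50.0) -> float:
        """Initial frequency slew rate (Hz/s) after losing delta_p_mw of supply.
        H is the inertia constant: seconds' worth of rated energy stored in
        the rotating mass."""
        return delta_p_mw * f0 / (2.0 * h_seconds * s_rated_mva)

    # Lose a 1000 MW unit on a 50 GVA system:
    print(rocof(1000, h_seconds=5.0, s_rated_mva=50_000))  # heavy rotating plant: 0.1 Hz/s
    print(rocof(1000, h_seconds=1.0, s_rated_mva=50_000))  # inverter-heavy grid: 0.5 Hz/s
    ```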

    AC arcs inherently self-extinguish in most cases. DC arcs do not.
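
    The mechanism in numbers, assuming a 50 Hz grid: the current passes through zero every half cycle, giving the arc a natural chance to extinguish as the breaker contacts part. DC current never does.

    ```python
    F_GRID = 50.0  # Hz (assumed)
    zero_crossings_per_second = 2 * F_GRID
    ms_between_crossings = 1000.0 / zero_crossings_per_second

    print(f"{zero_crossings_per_second:.0f} zero crossings/s, "
          f"one every {ms_between_crossings:.0f} ms")  # 100/s, every 10 ms
    ```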

    This means that breakers and expulsion-type fuses have to be significantly larger and more expensive. It also means more protection is needed against arcs caused by poor connections, cable clashes, and insulation damage.

    Solid state breakers alleviate this somewhat, but it's going to take 20+ years to improve cost, size, and power loss to acceptable levels.

    I expect that any 'next generation' system will be required to deliver a step change in safety, not merely match the existing performance. I suspect that will require a 100%-coverage fibre comms network running parallel to the power conductors, and, in accessible areas, possibly fully screened cable and isolated supplies.

    EVs and PV arrays get away with DC networks because they're willing to shut down the whole system in the event of a fault. You don't want a whole neighborhood to go dark because your neighbour's cat gnawed on a laptop charger.

  • When you have one big producer (one big hydro-electric dam or coal power plant), then stabilizing the frequency is trivial, because you only have to talk to yourself.

    Your frequency is still influenced by a million and more consumers. And that's before reactive power comes into play.

    When you have 100000 small producers (assume everyone in a bigger area has photovoltaics on their roof), then suddenly stabilizing the frequency becomes more challenging, because everybody has to work in exactly the same rhythm.

    ...and? Everyone gets the current frequency via the network, everyone knows what it should be, and everyone can do their own small part in speeding it up or slowing it down.

    The actual issue is that big turbines have lots of rotating inertia, which naturally stabilises the frequency. On the flip side, inverters (like those on solar installations) can regulate the frequency actively; what's iffy is smaller AC generators like wind turbines. But then there are also battery and capacitor banks.

    It's something network engineers have to worry about, but it's not an insurmountable problem. We're already doing it. Insular networks have been doing it for ages; e.g. West Berlin's network operated as an electrical island, not connected to the surrounding East German grid, and it always used things like capacitor banks to stabilise itself.


    All that aside: yes, in the future there's probably going to be a high-voltage DC network in Europe. Less so for private consumers, at least not in the foreseeable future, but to connect up large DC consumers, i.e. industry, with DC power sources. If you're smelting aluminium with solar power, going via AC is just pure conversion loss.
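
    A toy chain-efficiency comparison for that smelter example (all per-stage efficiencies below are assumptions, purely illustrative):

    ```python
    pv_to_ac_to_dc = 0.98 * 0.97   # PV -> grid inverter -> rectifier at the smelter
    pv_dc_direct   = 0.985         # PV -> one DC-DC stage straight to the potline
    print(f"via AC: {pv_to_ac_to_dc:.1%}   direct DC: {pv_dc_direct:.1%}")
    ```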

  • When you have 100000 small producers (assume everyone in a bigger area has photovoltaics on their roof), then suddenly stabilizing the frequency becomes more challenging, because everybody has to work in exactly the same rhythm.

    That's why you have standards and codes that ensure everybody's equipment is capable of syncing to the grid properly before it's allowed to connect. It's not that hard for an inverter to do. Then you have constant background supply, like battery farms and other energy-storage technologies, to stabilize it, and a bunch of capacitor banks to correct power-factor issues.
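
    For the curious, "syncing to the grid" for an inverter roughly means tracking the grid's phase with a phase-locked loop, then generating output in lockstep. Here's a bare-bones toy PLL (the loop gains and sample rate are assumptions, and this is a sketch, not a production synchronization algorithm):

    ```python
    import math

    DT = 1.0 / 10_000          # 10 kHz control loop (assumed)
    KP, KI = 50.0, 1000.0      # PI loop gains (assumed, loosely tuned)
    W_NOMINAL = 2 * math.pi * 50.0

    def grid_voltage(t: float) -> float:
        """The waveform we must lock onto: 50 Hz with an unknown phase offset."""
        return math.sin(W_NOMINAL * t + 0.7)

    theta = 0.0                # running estimate of grid phase
    integ = 0.0                # integral term of the loop filter
    t = 0.0
    for _ in range(20_000):    # 2 simulated seconds
        # Phase detector: multiplying the grid signal by cos(theta) yields a
        # low-frequency component proportional to sin(phase error).
        err = grid_voltage(t) * math.cos(theta)
        integ += KI * err * DT
        omega = W_NOMINAL + KP * err + integ   # loop filter + feed-forward
        theta = (theta + omega * DT) % (2 * math.pi)
        t += DT

    # Once locked, the inverter can close its contactor and feed in phase.
    true_phase = (W_NOMINAL * t + 0.7) % (2 * math.pi)
    error = ((true_phase - theta + math.pi) % (2 * math.pi)) - math.pi
    print(f"phase error after 2 s: {error:+.3f} rad")  # small ripple remains
    ```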

    I don't think we're getting away from centralized production anytime soon, even with the move to wind and solar, though I think nuclear should be included in that mix.

  • I'm thinking there would be a transition period where some devices would accept AC and others DC.

    There would be two different types of power socket outlet, and two kinds of converter to move between them: rectifiers (AC to DC) and inverters (DC to AC).
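
    A toy numeric sketch of the rectifier half of that pair, with assumed component values: full-wave rectify 230 V RMS mains, then let a smoothing capacitor ride through the 100 Hz valleys.

    ```python
    import math

    F, V_PEAK = 50.0, 325.0     # 230 V RMS mains -> ~325 V peak (assumed)
    R_LOAD, C = 100.0, 470e-6   # load resistance and smoothing cap (assumed)
    DT = 1e-5                   # simulation step: 10 us

    v_cap = 0.0
    v_min, v_max = float("inf"), 0.0
    t = 0.0
    for _ in range(int(0.2 / DT)):               # simulate 200 ms
        v_rect = abs(V_PEAK * math.sin(2 * math.pi * F * t))  # full-wave rectified AC
        if v_rect > v_cap:
            v_cap = v_rect                       # diodes conduct: cap charges to the peak
        else:
            v_cap -= v_cap / (R_LOAD * C) * DT   # diodes blocked: cap feeds the load
        if t > 0.1:                              # measure ripple after settling
            v_min, v_max = min(v_min, v_cap), max(v_max, v_cap)
        t += DT

    print(f"DC output: {v_min:.0f}-{v_max:.0f} V (ripple {v_max - v_min:.0f} V)")
    ```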

  • Basically everything runs off 3.7–12 V DC at the end of the day.

    It does make sense to eventually phase out AC in most home and commercial applications.
