Edison lost the war of the currents to Westinghouse and Tesla because his DC couldn't be transformed to the higher voltages required for long-range transmission. Alternating current, however, is more difficult to use once it reaches its destination. A single-phase AC motor was a difficult engineering problem, and to this day it is less efficient than either a DC or a three-phase AC motor.
I think the issue wasn’t converting it up (you could just generate it at a higher voltage), but converting it down to the voltages needed at the consumer.
But DC power can now be converted to higher or lower voltages without transformers. Solid-state converters have become ubiquitous and relatively cheap. For example, fast charging in your smartphone is accomplished by raising the voltage in the cable, allowing more power to be transferred without exceeding the current capacity of those small wires.
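The arithmetic behind that is just P = V × I: at a fixed cable current limit, power scales with voltage. A quick sketch, where the 5 V / 20 V / 3 A figures are illustrative USB-style numbers I'm assuming, not quotes from any spec:

```python
# Power delivered over a cable with a fixed current limit: P = V * I.
# The voltages and the 3 A limit are illustrative assumptions.
def power_watts(volts: float, amps: float) -> float:
    return volts * amps

CABLE_LIMIT_A = 3.0  # same thin wires, same current cap

legacy = power_watts(5.0, CABLE_LIMIT_A)   # old-style 5 V charging
fast = power_watts(20.0, CABLE_LIMIT_A)    # raised-voltage fast charging

print(f"5 V: {legacy:.0f} W, 20 V: {fast:.0f} W")  # 15 W vs 60 W
```

Same wire, four times the voltage, four times the power.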
Right. And this alone should explain why moving to even lower voltages would be stupid: for the same power, the current goes up, so the wires have to get thicker and thicker.
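To put rough numbers on it (the 2 kW load and the allowable current density here are my own illustrative assumptions, not figures from the thread):

```python
# For a fixed power draw, current scales inversely with voltage,
# and the required copper cross-section scales with current
# (assuming the same allowable current density in the conductor).
POWER_W = 2000.0       # e.g. a space heater (illustrative assumption)
CURRENT_DENSITY = 4.0  # A/mm^2, rule-of-thumb figure (assumption)

def required_copper_mm2(volts: float) -> float:
    amps = POWER_W / volts
    return amps / CURRENT_DENSITY

for v in (230.0, 30.0):
    amps = POWER_W / v
    print(f"{v:>5.0f} V -> {amps:5.1f} A -> {required_copper_mm2(v):5.1f} mm^2 of copper")
```

At 230 V that load needs about 2.2 mm² of copper; at 30 V it needs about 16.7 mm², roughly 7.7× as much, for every circuit in the house.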
Heck, we are at the point where DC-DC converters are so commonly placed right at the load on a PCB that newer PC motherboards no longer receive anything but a 12V input. No more 5V and 3.3V rails, with their correspondingly higher currents.
In other words: we are now better positioned than ever to use HIGHER voltages and convert them on the spot.
And indeed that’s exactly what we’ve been seeing, like USB going from 5V to now all the way to 48V.
Your 30V DC fever dream is never going to happen, because we’d need insane amounts of copper (and no, going to silver wouldn’t make this problem go away) and absurdly chunky switches and circuit protection devices.
All to get rid of AC, which isn’t even a problem. You know that converting AC to DC is a problem we solved over a century ago, right? Taking AC mains and turning it into a safe (=galvanically isolated) low-voltage DC output is no harder than doing it from DC, and in fact can be easier.