Forget the concept of "0V" or ground or earth, since they are all fictions that have some use in limited instances.
After reading that, I felt like asking if there is such a thing as a standard voltage (or standard ground), kinda like we have a GPS frequency standard and some way of determining absolute power (dBm). But then what you are saying suggests that V is merely relative, like dB. Yet, V seems to be a fundamental component of power.
If such a thing were needed, the earth would be the absolute 0.
If only that was true, it would make engineering certain systems much simpler.
Start by considering why, in a thunderstorm, you are advised to keep your two feet together (amongst other advice). Consider what happens when there is a lightning strike nearby and large currents flow through the ground you are standing on. Hint: you don't want a potential difference between your feet large enough that some of the current goes through your body.
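To put rough numbers on that advice: if you model the strike as a current injected into uniform soil, the surface potential falls off as ρI/(2πr), so two points one step apart can differ by kilovolts. A minimal sketch, with illustrative (assumed) values for soil resistivity, strike current, distance, and stride:

```python
import math

def surface_potential(rho, current, r):
    """Potential (V) at distance r (m) from a hemispherical current
    injection point, carrying `current` (A) into soil of resistivity
    `rho` (ohm*m): V = rho * I / (2 * pi * r)."""
    return rho * current / (2 * math.pi * r)

def step_voltage(rho, current, r, step):
    """Voltage between two feet at distances r and r + step from the strike."""
    return (surface_potential(rho, current, r)
            - surface_potential(rho, current, r + step))

# Assumed, illustrative values: 30 kA strike, 100 ohm*m soil,
# standing 10 m away with feet 0.6 m apart.
v = step_voltage(rho=100, current=30e3, r=10, step=0.6)
print(f"step voltage ~ {v:.0f} V")  # a couple of kilovolts between your feet
```

Feet together (step → 0) drives that voltage toward zero, which is exactly why the advice works.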
If you don't like high current engineering, then consider antennas. Start by considering the physical processes by which monopole antennas "appear" to be twice their physical length. Hint: it is because currents flowing in the earth create a virtual image of the real physical antenna.
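To make the image effect concrete: a quarter-wave monopole is physically λ/4 tall, but together with its image in the ground it behaves like a λ/2 dipole. A small sketch (the 1 MHz frequency is an arbitrary, assumed example):

```python
# Physical height of a quarter-wave monopole vs. the effective
# half-wave structure formed by the antenna plus its ground image.
C = 299_792_458  # speed of light, m/s

def monopole_height(freq_hz):
    """Physical height (m) of a quarter-wave monopole: lambda / 4."""
    return C / freq_hz / 4

def effective_length(freq_hz):
    """Electrical length (m) of monopole + ground image: a half-wave."""
    return 2 * monopole_height(freq_hz)

f = 1e6  # 1 MHz, AM broadcast band, chosen for illustration
print(f"physical mast : {monopole_height(f):.1f} m")   # ~75 m
print(f"with image    : {effective_length(f):.1f} m")  # ~150 m, a half-wave
```

The "missing" half of the antenna is supplied by currents flowing in the earth, which only works because the earth near the mast is emphatically not an equipotential.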
You are considering extreme conditions. But even in those conditions, the lightning strike just flows into the physical ground, because the earth doesn't have any potential and the current always flows toward the lowest potential along the path of least resistance. So I don't see your point...
And also, I don't see any significant simplification in treating the earth as an absolute 0V. Like some guys said here, voltage is just a matter of potential difference, and the earth's potential is a bit irrelevant.
In what way, exactly, do antennas represent "extreme conditions"?
If you want something "closer to home", consider the current flows in ground planes on PCBs. The naive think that, because copper is a good conductor, every point on a PCB ground plane is at the same potential. Unfortunately not, and the more experienced hardware engineers know it. Start by realising that if there is a signal conductor between a transmitter and a receiver, the return current doesn't go back directly from the receiver to the transmitter. Instead, at high frequencies, it is concentrated underneath the signal conductor and follows it. And current flow is intimately related to potential differences, and vice versa.
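That concentration can be made quantitative. For a trace at height h over a wide ground plane, a standard approximation for the return current density is J(d) ∝ 1/(1 + (d/h)²), where d is the lateral distance from the trace; integrating it shows most of the return current stays within a few trace-heights of the signal path. A sketch using that closed form (the geometry values are assumed):

```python
import math

def return_fraction(half_width, h):
    """Fraction of high-frequency return current flowing within
    +/- half_width of the trace, for a trace at height h above the
    plane, using the classic J(d) ~ 1/(1 + (d/h)^2) approximation.
    The integral has the closed form (2/pi) * atan(half_width / h)."""
    return (2 / math.pi) * math.atan(half_width / h)

h = 0.2e-3  # 0.2 mm dielectric thickness, assumed for illustration
for n in (1, 3, 10):
    frac = return_fraction(n * h, h)
    print(f"within +/-{n}h of the trace: {frac:.0%}")  # ~50%, ~80%, ~94%
```

So roughly 80% of the return current squeezes into a strip only six trace-heights wide, and the ground plane under that strip is decidedly not at the same potential as ground copper elsewhere on the board.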
Exactly analogous phenomena occur in electrical distribution networks, except that the frequencies tend to be lower and the currents higher.
So no, the phenomena I allude to are real and important, not theoretical and esoteric.