Have I been living under a misconception? I thought that when demand dropped (so that supply momentarily exceeded it), the main consequence was:
-- Reduced current flowing through the generators at the plant,
-- This reduced current means there's less electromagnetic torque holding back the turbines, so the resulting power imbalance causes the rotor's speed to start increasing (assuming no control system intervenes),
-- This higher rotor speed manifests as a higher frequency in our power sockets at home, not a higher voltage (the voltage at the plant may well move too, but there's so much voltage regulation/autotransformers between there and your house that no correlation survives).
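That imbalance-accelerates-the-rotor chain is the classic swing equation. A minimal toy sketch, with assumed illustrative numbers (a 50 Hz system, inertia constant H = 5 s), not real plant data:

```python
# Toy single-machine swing equation: a per-unit power imbalance
# accelerates the rotor, so frequency drifts until something intervenes.
#   (2H / f0) * df/dt = p_mech - p_elec
F0 = 50.0   # nominal grid frequency, Hz (assumed 50 Hz system)
H = 5.0     # inertia constant, seconds (assumed typical value)
DT = 0.01   # integration step, s

def simulate(p_mech, p_elec, seconds):
    """Euler-integrate rotor frequency under a constant power imbalance."""
    f = F0
    for _ in range(int(seconds / DT)):
        f += DT * (p_mech - p_elec) * F0 / (2 * H)
    return f

# Demand suddenly drops 2% and no governor reacts: frequency rises 0.1 Hz/s.
print(simulate(p_mech=1.00, p_elec=0.98, seconds=1.0))  # ~50.1 Hz
```

Flip the sign of the imbalance (lose a generator instead of some load) and the same model shows the frequency sagging at the same rate.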
Thus, in a very real sense, all the generators in the grid are rotating in lockstep, as if connected by some sort of mystical invisible (albeit slightly stretchy) belt, with all the steam turbines trying to spin the generators faster and all the loads trying to slow the rotors down (extracting power in the process). Hence, the problem of matching supply to demand reduces to regulating the steam fed to a turbine in order to regulate its rotational speed (in practice very complicated, with so many different plants involved). Of course, the grid operators also have control over many of our hot water cylinders, which make an excellent place to dump pulses of excess generation (if it helps, think of it as a "brake" for the turbines), in addition to pumped hydro, battery storage, etc.
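In its simplest form, that "regulate steam to regulate speed" loop is proportional "droop" control: each governor opens the valve a little more as frequency sags. A hedged sketch reusing the toy swing model above, with an assumed 4% droop setting, showing the frequency settling slightly off-nominal after a 2% load drop:

```python
# Droop speed governing: mechanical power follows the frequency error,
# so after a load change the frequency settles at a new, slightly
# off-nominal steady value (illustrative numbers, not a real plant).
F0, H, DT = 50.0, 5.0, 0.01  # assumed 50 Hz system, inertia H = 5 s
R = 0.04                     # 4% droop (assumed typical setting)

def run(p_load, seconds):
    f = F0
    for _ in range(int(seconds / DT)):
        p_mech = 1.0 + (F0 - f) / (R * F0)          # valve opens as f sags
        f += DT * (p_mech - p_load) * F0 / (2 * H)  # swing equation
    return f

# Load drops from 1.00 to 0.98 per-unit; frequency settles ~0.04 Hz high.
print(run(p_load=0.98, seconds=5.0))  # ~50.04 Hz
```

Droop alone leaves that small steady-state offset; a slower centralized control layer is what nudges the frequency back to exactly nominal.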
Hence, the short-term imbalance storage capacity of the grid is in the combined mechanical inertia of all the generators in the grid, not (traditionally) capacitors. This manifests as the mains frequency shifting over time, as shown here. The cool part is that many grid operators guarantee that although the frequency may drift in the short term, the total number of cycles per day or per week is tightly regulated to be correct; so you can use the mains as a pretty decent time source for a clock (and indeed, many clocks used to work this way).
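The cycle-count guarantee can be illustrated with an invented frequency trace: a mains clock just counts cycles and divides by the nominal frequency, so a sag that the operator later compensates leaves no net time error (the numbers below are hypothetical, for illustration only):

```python
# A mains-synchronous clock: count cycles, divide by nominal frequency.
F0 = 50.0  # nominal frequency, Hz (assumed 50 Hz system)

# Hypothetical trace: an hour at 49.95 Hz, then the operator runs an
# hour at 50.05 Hz to restore the cycle count ("time error correction").
trace = [(49.95, 3600.0), (50.05, 3600.0)]  # (frequency Hz, duration s)

cycles = sum(f * t for f, t in trace)
clock_seconds = cycles / F0
real_seconds = sum(t for _, t in trace)

# After the first hour the clock reads 3596.4 s (3.6 s slow), but over
# the whole trace the error cancels:
print(clock_seconds - real_seconds)  # ~0.0
```

This is why such clocks keep excellent time over a week despite visibly wandering frequency over an hour.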