You don't measure power with two leads; you need three or four. That's because power = voltage × current, so you have to measure voltage and current independently.
A (large?) nuclear reactor generates approximately 1.21 GW.
Assuming DC power and a maximum voltage of 600 VDC, we'd be dealing with roughly 2 MA (megaamps): 1.21 GW / 600 V ≈ 2.02 MA. Now clearly the meter cannot handle 2 MA, but let's assume the shunt in the meter is 10 milliohms. If we put a 10 nanoohm resistor in parallel with that shunt, we'd have about 2 A running through the meter while the 10 nanoohm shunt deals with the remaining ~2 MA, and it'll scale accordingly. The display on the meter would simply be "off" by a factor of 1,000,000: place a sticker over the 10 A switch labelled "10 MA" and read the meter accordingly, and you're good to go.
The 10 nanoohm shunt will dissipate about 40 kW. Good quality cooling would be encouraged.
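A quick sanity check of the 600 VDC scenario, assuming (as above) a 10 mΩ internal shunt and aiming for about 2 A through the meter:

```python
# Back-of-the-envelope numbers for the hypothetical 600 VDC case.
P = 1.21e9           # total power, W
V = 600.0            # bus voltage, V
I_total = P / V      # total current, A (about 2.02 MA)

R_meter = 10e-3      # assumed internal shunt of the meter, ohms
I_meter = 2.0        # current we want flowing through the meter, A

# The external shunt carries everything the meter doesn't.
I_shunt = I_total - I_meter
# Parallel resistances share current inversely, so:
R_shunt = R_meter * I_meter / I_shunt    # about 10 nanoohms

# Heat dissipated in that external shunt.
P_shunt = I_shunt**2 * R_shunt           # about 40 kW

print(f"I_total = {I_total:.3g} A")
print(f"R_shunt = {R_shunt:.3g} ohm")
print(f"P_shunt = {P_shunt:.3g} W")
```

The 10 mΩ internal shunt and 2 A target are assumptions for illustration; real meters vary.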
If you ever encounter 1.21 GW, it's much more likely to be 121 kV × 10 kA. That would actually be "quite straightforward" to measure, provided you could arrange a suitable voltage divider as well as a current "splitter/divider". The shunt, at 2 microohms, would dissipate much less heat: a completely manageable 200 W.
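The same arithmetic for the 121 kV / 10 kA case, again assuming a 10 mΩ internal meter shunt and 2 A through the meter; the 2 µΩ shunt and the ~200 W figure fall straight out:

```python
V = 121e3            # volts
I_total = 10e3       # amps
print(V * I_total)   # 1.21e9 W, i.e. the full 1.21 GW

R_meter = 10e-3      # assumed internal shunt of the meter, ohms
I_meter = 2.0        # desired current through the meter, A

I_shunt = I_total - I_meter
R_shunt = R_meter * I_meter / I_shunt    # about 2 microohms
P_shunt = I_shunt**2 * R_shunt           # about 200 W

print(f"R_shunt = {R_shunt:.3g} ohm, P_shunt = {P_shunt:.3g} W")
```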
You'd need to measure voltage and current separately; the 121GW probably doesn't have a power measurement mode where it multiplies the two for you. A Metrahit-whatever-it's-called could do that job, though, with suitable stickers redefining what the voltage, current and power ranges mean.
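The sticker bookkeeping is just two fixed multipliers. A sketch with hypothetical ratios for the 121 kV / 10 kA case (a 1000:1 voltage divider, and the ~5000:1 current split that a 2 µΩ shunt gives against an assumed 10 mΩ meter shunt):

```python
# Hypothetical scale factors; the divider ratios are assumptions.
V_RATIO = 1000.0     # 1000:1 voltage divider: 121 kV reads as 121 V
I_RATIO = 5001.0     # split ratio of a 2 uOhm shunt vs a 10 mOhm meter shunt

v_reading = 121.0    # what the (relabelled) voltage range shows
i_reading = 2.0      # what the (relabelled) current range shows

# What the stickers say the readings actually mean:
true_voltage = v_reading * V_RATIO           # 121 kV
true_current = i_reading * I_RATIO           # ~10 kA
true_power = true_voltage * true_current     # ~1.21 GW

print(f"{true_power / 1e9:.2f} GW")
```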