MOVs would give up if a similar overvoltage were applied for too long - the circuitry could probably withstand 1.1~1.2kV, but the MOVs would suffer thermal stress. All in all, I don't think any meter would be too different in this regard (unless it used higher-voltage MOVs).
I've had people suggest similar things but it really makes no sense to me. Maybe you could explain why you feel this way. What is causing all of this thermal stress you mention?
If a MOV rated (loose definition of "rated" here) for 1kV is subjected to, say, 1.2kV, it will try to clamp that voltage differential using the only means it has: dissipating the surplus energy through its body. But that you already know.
In the scenario above, assuming the output impedance of the source is low enough, the amount of surplus energy that needs to be dissipated depends on the waveform. For a single-pulse transient, the extra energy is easily absorbed, with minor (if any) stress to the MOV. On the other hand, 1.2kV AC at 50 or 60 Hz forces the MOV to continuously dissipate the energy contained in the peaks of every upper and lower half-cycle of the sinewave, with only the time between peaks to cool down. By the same logic, and as reported in the thread I mentioned, 1.2kV DC is the worst scenario, as there is no time for the MOV to cool at all.
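As a back-of-the-envelope illustration of why sustained AC is so much harsher than a single transient, here is a toy model. All the numbers are assumptions for the sketch (an idealized hard clamp at 1kV, a 1.2kV-peak sinewave, a hypothetical 50 ohm source impedance); real MOV V-I curves are soft, so treat the results as order-of-magnitude only:

```python
import math

V_PEAK = 1200.0   # assumed peak of the applied AC waveform, volts
V_CLAMP = 1000.0  # idealized hard clamping voltage of the MOV, volts
R_SOURCE = 50.0   # assumed source output impedance, ohms

def conduction_duty_cycle(v_peak, v_clamp):
    """Fraction of each cycle the MOV spends clamping, i.e. |v(t)| > v_clamp."""
    return 1.0 - (2.0 / math.pi) * math.asin(v_clamp / v_peak)

def average_mov_power(v_peak, v_clamp, r_source, steps=100_000):
    """Numerically average, over one full cycle, the power burned in the MOV."""
    total = 0.0
    for k in range(steps):
        v = v_peak * math.sin(2.0 * math.pi * k / steps)
        excess = abs(v) - v_clamp
        if excess > 0.0:
            i = excess / r_source   # clamping current through the MOV
            total += v_clamp * i    # power dissipated in the MOV body
    return total / steps

duty = conduction_duty_cycle(V_PEAK, V_CLAMP)
power = average_mov_power(V_PEAK, V_CLAMP, R_SOURCE)
print(f"MOV conducts ~{duty:.0%} of every cycle")
print(f"Average power dissipated in the MOV: ~{power:.0f} W")
```

With these made-up numbers the MOV ends up conducting for a sizeable chunk of every single cycle and dissipating a continuous average power of hundreds of watts, versus the bounded handful of joules a one-off transient delivers. That continuous heating, repeated 50/60 times a second with no rest, is the thermal stress I was referring to.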
In the AC scenario (and to a much lesser extent DC), the survivability of the MOV is highly dependent on its physical characteristics, as well as its environment (temperature and humidity) and the heatsinking ability of the surrounding PCB (large copper areas, clearance, etc.). That is why I said that most (if not all) DMMs would have the same outcome as reported in the linked thread.