Hi, I'm in the market for a new multimeter and I'm curious how much true-RMS bandwidth one should have. I honestly can't remember ever using the AC voltage setting on my current multimeter at home for anything other than checking whether an outlet is live. At work I've used meters to measure the overall power consumption of a rack of equipment when sizing power supplies, so it has all been plain 60 Hz stuff.
From what I can tell, here are the bandwidths of three nice $100-class meters:
BK Precision BK2709B: 500 Hz
Amprobe AM270: 20 kHz
Brymen BM257: 400 Hz (per Lightages' correction, thanks)
and for reference, the venerable Fluke 87-V is listed at 20 kHz.
So the Amprobe clearly stands out from the $100 bunch, but what is that extra bandwidth actually good for? My guess is that true RMS is most often used together with crest (peak) mode, so you get the RMS value of the AC waveform alongside its maximum and can judge how much distortion is present. I can see that being very useful in an industrial setting with large inductive loads like motors. So 50/60 Hz is obviously important, and since distortion shows up as higher harmonics you'd want bandwidth out to several multiples of the fundamental. But is anything past a couple of kilohertz needed very often?
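To put a rough number on it, here's a quick Python sketch (my own back-of-the-envelope model, not anything from the meters' datasheets): it treats a heavily distorted 60 Hz waveform as a 1/n series of odd harmonics, rolls each harmonic off through an assumed single-pole low-pass at the quoted bandwidth, and compares the RMS each "meter" would report against the true value.

```python
# Rough model of why true-RMS bandwidth matters on distorted 50/60 Hz waveforms.
# The waveform is approximated as a 60 Hz fundamental plus odd harmonics with
# 1/n amplitudes (square-wave-like); each "meter" is modelled as a single-pole
# low-pass at its quoted bandwidth. Both are assumptions for illustration only.
import numpy as np

f0 = 60.0                          # fundamental frequency, Hz
n = np.arange(1, 41, 2)            # odd harmonics 1, 3, ..., 39
amps = 1.0 / n                     # peak amplitude of each harmonic

def rms_with_bandwidth(fc):
    """RMS reported after a first-order low-pass with -3 dB point at fc (Hz)."""
    gains = 1.0 / np.sqrt(1.0 + (n * f0 / fc) ** 2)
    return np.sqrt(np.sum((amps * gains) ** 2) / 2.0)

true_rms = np.sqrt(np.sum(amps ** 2) / 2.0)   # RMS with unlimited bandwidth
for label, bw in [("400 Hz", 400.0), ("500 Hz", 500.0), ("20 kHz", 20e3)]:
    reading = rms_with_bandwidth(bw)
    error = 100.0 * (reading - true_rms) / true_rms
    print(f"{label:>7} meter: {reading:.4f} vs true {true_rms:.4f} ({error:+.1f} %)")
```

For this particular spectrum the narrow-band meters only under-read by a few percent, since the 60 Hz fundamental carries most of the energy; waveforms with more energy pushed up high (short current spikes from switch-mode supplies, chopped triac waveforms) would widen the gap, which I'd guess is where the 20 kHz figure starts to earn its keep.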
So what do you use AC voltage mode for, and how much bandwidth do you actually need?