I thought I understood this, but apparently I don't.
With frequency counters, I thought gate time determined the resolution. Say I have a 10 MHz clock and I apply a 10 MHz signal to the input. If I set the gate time to 1 second, 10,000,000 cycles pass through while the gate is open, so I can read down to 1 Hz. I thought it would be impossible to resolve 0.1 Hz or below, for example. To get 0.1 Hz resolution, I thought I'd need a 10 second gate time.
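For reference, here's the back-of-envelope arithmetic I'm using (a minimal sketch of a plain gate-and-count counter, which apparently isn't what the 53131A actually does):

```python
# Hypothetical sketch of my mental model: a simple direct-count
# (gate-and-count) counter, NOT necessarily how the 53131A works.

def direct_count(f_in_hz: float, gate_s: float):
    """Whole input cycles counted during the gate, plus the
    +/-1-count resolution (in Hz) that implies."""
    cycles = int(f_in_hz * gate_s)   # e.g. 10 MHz * 1 s = 10,000,000 cycles
    resolution_hz = 1.0 / gate_s     # one-count ambiguity spread over the gate
    return cycles, resolution_hz

print(direct_count(10e6, 1.0))      # (10000000, 1.0)  -> 1 Hz steps
print(direct_count(10e6, 10.0))     # (100000000, 0.1) -> 10 s gate for 0.1 Hz
print(direct_count(10e6, 1000.0))   # -> 1000 s gate for 1 mHz, or so I thought
```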
I have an HP 53131A. If I set the gate time to 1 second, it will happily display down to milli-hertz. How is this possible? Shouldn't I need a 1000 second gate to do that?
For this discussion, please disregard the accuracy/precision of the reference clock. Let's say it's accurate enough.