They don't count per second. They count per ref. clock cycle. Phase noise and jitter are random effects and wouldn't change the result.
No. The counter at hand has a 10 MHz ref clock. If it counted "per ref. clock cycle," any frequency below 10 MHz would count as 0 or 1 per cycle. Only very old instruments, or el cheapos, rely on simply counting during the gate time. Most modern "counters" might more properly be called frequency meters.
While calibrating the time base in the video, the counter appeared to be using a 100 ms gate time, yet had a resolution (not accuracy) of 0.1 Hz. With a 10 MHz input, simple counting over a 100 ms gate yields only 1,000,000 cycles, i.e. 7 digits, yet the display resolves 9 digits. That's because the counter interpolates the signal rather than simply counting. Interpolation typically takes the whole-cycle count within the gate time and adjusts it with a measurement of how long it takes for the next "count" to occur. And that fractional interval isn't measured in whole ref clock cycles - it's often done with a
ramp interpolator. The same interpolation is applied at the start of the gate, since the signal cannot be assumed to be synchronous with the ref clock.
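To make the difference concrete, here's a rough Python sketch of the two schemes. All the specifics (1 ns effective interpolator resolution, the arbitrary input phase) are illustrative assumptions, not any particular instrument's design:

```python
import math

GATE = 0.100        # 100 ms gate time
INTERP_RES = 1e-9   # assumed effective time resolution of the ramp interpolators

def simple_count(f_in, gate=GATE):
    """Old-style counter: count whole input cycles during the gate.
    Resolution is limited to +/-1 count, i.e. 1/gate = 10 Hz here."""
    n = math.floor(f_in * gate)
    return n / gate

def interpolating_count(f_in, gate=GATE):
    """Interpolating (reciprocal) counter: count whole input cycles
    between the first and last input edges inside the gate, and time
    that interval with the ref clock plus interpolators at each end."""
    period = 1.0 / f_in
    phase = 0.3 * period  # arbitrary input phase, unknown to the counter
    first_edge = phase
    n_cycles = math.floor((gate - first_edge) / period)
    last_edge = first_edge + n_cycles * period
    # The measured interval is quantized by the interpolator resolution,
    # not by whole ref clock cycles.
    measured = round((last_edge - first_edge) / INTERP_RES) * INTERP_RES
    return n_cycles / measured
```

With a 1.2345678 MHz input, the simple count is off by several Hz, while the interpolated result lands within a few mHz - roughly the 9-digit behavior seen in the video.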
Interpolation depends on accurately locating a point on the waveform - and that location is disturbed by both phase noise (movement of that point in the time domain) and input comparator jitter. Measuring a sine wave will be less accurate than measuring a square wave, because the comparator must locate its trip point on a relatively shallow slope, where voltage noise translates into larger timing error.
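The slope effect is easy to put numbers on: the RMS timing jitter is roughly the input-referred voltage noise divided by the slew rate at the trip point. Using assumed illustrative values (1 mV RMS noise, a 1 V amplitude 10 MHz sine, and a square wave with a 2 V swing and 2 ns edges):

```python
import math

sigma_v = 1e-3  # assumed 1 mV RMS noise at the comparator input

# Sine wave: slew rate at the zero crossing is 2*pi*f*A
slew_sine = 2 * math.pi * 10e6 * 1.0   # ~6.3e7 V/s
# Square wave: 2 V swing over a 2 ns edge
slew_square = 2.0 / 2e-9               # 1e9 V/s

jitter_sine = sigma_v / slew_sine      # ~16 ps RMS
jitter_square = sigma_v / slew_square  # ~1 ps RMS
```

So even with identical noise, the shallow sine slope costs an order of magnitude in timing accuracy at the trigger point.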