The "frame grabbers" widely available in the 1990s had problems with this.
A test card, or a frame sourced originally from cine film would "grab" perfectly, but video originated by a TV camera would blur (or perhaps "smear" is a better term) on rapid movements.
So basically equivalent time sampling?
Maybe it's just another way of looking at things, as scanning does "sample" the scene presented to the camera, & each tiny increment of movement is a change from line to line, so the motion can perhaps be regarded as a signal higher in frequency than the "sample" rate.
But, as an analog person, I tend to think of it in terms of the fact that analog TV doesn't actually present 25 (or whatever) frames per second as individual still pictures. It presents a sequence of lines which take finite time to scan, during which motion is happening in the live scene the original camera is seeing.
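To make that concrete, here's a rough illustrative sketch (not any real grabber's code, and the numbers are made up): each scan line is captured at a slightly later instant, so an object moving during the scan ends up skewed/smeared across the lines of the grabbed frame.

```python
# Toy simulation of line-by-line scanning of a moving object.
# A vertical bar moves horizontally while the "camera" scans top to bottom;
# because each line is captured at a later instant, the bar comes out slanted.

WIDTH, LINES = 20, 10   # arbitrary frame dimensions for illustration
SPEED = 1               # object moves 1 pixel per line-scan time (assumed)

def grab_frame(start_x):
    """Scan line by line; the object keeps moving between lines."""
    frame = []
    for line in range(LINES):
        # Object position at the instant this particular line is scanned:
        x = start_x + SPEED * line
        row = "".join("#" if col == x else "." for col in range(WIDTH))
        frame.append(row)
    return frame

for row in grab_frame(3):
    print(row)
# The printed "bar" appears as a diagonal, even though the object is
# vertical at any single instant -- the skew is the scan-time artifact.
```

A stationary object (SPEED = 0) comes out as a clean vertical bar, which matches the observation below that test cards and motionless scenery "grab" fine.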
As with many other things in analog TV, it depends upon human physiology.
(Dogs, with their different vision characteristics, did not recognise their own kind on analog TV -- they can with digital TV.)
"Persistence of vision" prevents us from seeing this happening as other than a smooth movement over multiple frames, but the "grabber" can see it, & in endeavouring to record it, tries to capture the xhange over multiple lines.
I'm afraid I never went into how the cheaper "frame grabbers" work.
Broadcast-standard equipment seems able to do this without problems -- the ones with the smearing trouble were the cheaper units we used mainly to grab, & in turn source, announcement "slides" as part of the emergency stuff at the transmitter.
Scenery & such, without movement, also "grabs" OK with these.