Maybe Dave can (and should) tell us what he wants to measure. Anyway, thank you Dave! While I'm not happy as an engineer (I was told that measuring is not guessing), I think you created a really interesting topic that I hope will keep our minds busy.
I placed a stopwatch in front of my scope's display. Connected to an RF generator set to sweep from 5 MHz to 100 MHz. Very small data set. Norm trigger. All processing turned off. Camera set to 1000 fps. Recorded for 1 s. Manually counted each screen update. I measured a pathetic 30 fps. It's an 18-year-old PC, and based on how poorly it handles the X-Y music, I'm not surprised.
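If anyone wants to skip the manual counting, here's roughly how I'd automate it with a frame-differencing script. This is only a sketch against synthetic frames (the 64-pixel frames, the update-every-33-frames pattern, and the diff threshold are all invented for illustration, not measured), but the same idea should work on real frames extracted from the camera footage:

```python
import random

def count_updates(frames, threshold=1.0):
    """Count frame-to-frame transitions where the mean absolute pixel
    difference exceeds the threshold, i.e. the display actually redrew."""
    updates = 0
    for prev, cur in zip(frames, frames[1:]):
        diff = sum(abs(a - b) for a, b in zip(prev, cur)) / len(prev)
        if diff > threshold:
            updates += 1
    return updates

# Synthetic stand-in for 1 s of 1000 fps footage: the "display" redraws
# with new random content every 33 capture frames (i.e. roughly 30 Hz).
random.seed(0)
new_frame = lambda: [random.randrange(256) for _ in range(64)]
frames, content = [], new_frame()
for i in range(1000):
    if i > 0 and i % 33 == 0:
        content = new_frame()
    frames.append(content)

print(count_updates(frames))  # 30 updates in ~1 s of footage -> ~30 Hz
```

On real footage you'd load each extracted frame as a flat list of grayscale values and set the threshold a bit above the camera's noise floor.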
I placed a stopwatch in front of my scope's display. Connected to an RF generator set to sweep from 5 MHz to 100 MHz. Very small data set. Norm trigger. All processing turned off. Camera set to 1000 fps. Recorded for 1 s. Manually counted each screen update. I measured a pathetic 30 fps. It's an 18-year-old PC, and based on how poorly it handles the X-Y music, I'm not surprised.

You may also be able to extract that info from the OS display refresh rate settings if you can stop the boot before the scope app loads.
I like pdenisowski's "pixels changing" definition, so any test source needs to have pixels changing at a fast enough rate to make the measurement possible.
You'd just have to ensure that the waveform rate was higher than the screen rate so that you'd expect new information to be available at every screen refresh.
What is a bit confusing about the video I posted is that the waveform update is set at 60/s (using line trigger), and the 60 fps video shows a display update (pixel change) on each and every frame, yet there are multiple traces showing. ...
I placed a stopwatch in front of my scope's display. Connected to an RF generator set to sweep from 5 MHz to 100 MHz. Very small data set. Norm trigger. All processing turned off. Camera set to 1000 fps. Recorded for 1 s. Manually counted each screen update. I measured a pathetic 30 fps. It's an 18-year-old PC, and based on how poorly it handles the X-Y music, I'm not surprised.
Your video requires some codec I don't have.
It seems to stutter or stall at times and then plays catch-up. I had turned off all the readouts and such and set the scope software to the highest screen priority, but with Windows not being an RTOS, I am not too surprised.
I think it will be hard to determine the refresh rate from a high-speed video capture of the display's pixel changes, because the display has a limited pixel response speed. If there is a way to attach an external display, it would be better to analyze the video signal.
Your video requires some codec I don't have.
That's Windows pissing in your ear and telling you that it is raining. HEVC (h265) has been around for quite a while now, but MS still wants a fee to let you use it. Download VideoLan if you want a decent video viewer.
https://www.videolan.org/vlc/download-windows.html
It seems to stutter or stall at times and then plays catch-up. I had turned off all the readouts and such and set the scope software to the highest screen priority, but with Windows not being an RTOS, I am not too surprised.
Those stutters and stalls are the main thing that makes a display annoying and not 'live'. I think a consistent 30 fps might be marginally sufficient for most people in most cases, just like video, but throw in 5 consecutive missing frames every two seconds and everyone will hate it.
I placed a stopwatch in front of my scope's display. Connected to an RF generator set to sweep from 5 MHz to 100 MHz. Very small data set. Norm trigger. All processing turned off. Camera set to 1000 fps. Recorded for 1 s. Manually counted each screen update. I measured a pathetic 30 fps. It's an 18-year-old PC, and based on how poorly it handles the X-Y music, I'm not surprised.
Not long ago (10 years?), 30 fps was pretty standard for most applications on a PC. These days it has increased to 60-120 fps for dynamic and responsive applications. Nevertheless, 30 fps is still common; for example, a lot of video games still run at 30 fps.
I see between 2 and 4 lines on the screen, indicating a 15 to 30 Hz screen refresh rate.
What is a bit confusing about the video I posted is that the waveform update is set at 60/s (using line trigger), and the 60 fps video shows a display update (pixel change) on each and every frame, yet there are multiple traces showing. This is why I was saying that 'screen rate' or 'screen update' may not be simple, clear-cut concepts.
I wouldn't be surprised if some of the scopes shown in the music thread are running at 120 Hz. They are very impressive compared with my scope running Windows XP. That's why I brought up the refresh rate when I asked about buying a new DSO.
Odd, it seemed easy enough to detect. The only problem was manually going through the data.
It will be easy if the LCD/LED matrix is rated for a much higher refresh rate than is actually used (very much higher). But usually it is close to the actual refresh rate, or even worse. In that case you cannot see a pixel on/off event immediately; the transition will be very smooth, and it will be hard to determine the exact frame in which a pixel appears or disappears.
If the video camera uses a lossy compression format, it will be even harder, because compression will remove very small changes in pixel brightness between frames.
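To put a number on that, here's a toy model of a slow pixel. The exponential response and the time constant are assumptions, not measurements of any real panel, but they show how many capture frames can end up in an ambiguous brightness band where you can't say "this is the frame it changed":

```python
import math

tau = 5.0            # assumed pixel response time constant, in capture frames
switch_frame = 10    # frame at which the pixel is commanded from off to on

# Measured brightness rises exponentially after the commanded switch.
brightness = [0.0 if f < switch_frame
              else 1.0 - math.exp(-(f - switch_frame) / tau)
              for f in range(30)]

# Count capture frames sitting in the ambiguous 10%-90% band: during all
# of them the pixel is neither clearly off nor clearly on.
ambiguous = sum(1 for b in brightness if 0.1 < b < 0.9)
print(ambiguous)  # 11 capture frames of ambiguity
```

With a faster panel (smaller tau relative to the capture frame time) that ambiguous window shrinks toward a single frame, which is why the ratio of pixel response speed to actual refresh rate matters so much here.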
Your video requires some codec I don't have.
That's Windows pissing in your ear and telling you that it is raining. HEVC (h265) has been around for quite a while now, but MS still wants a fee to let you use it. Download VideoLan if you want a decent video viewer.
https://www.videolan.org/vlc/download-windows.html
I'll try it with mine.
What is a bit confusing about the video I posted is that the waveform update is set at 60/s (using line trigger), and the 60 fps video shows a display update (pixel change) on each and every frame, yet there are multiple traces showing. This is why I was saying that 'screen rate' or 'screen update' may not be simple, clear-cut concepts.
They are not. I'm sure there are scopes that manipulate the data to provide a more "analog-like" display.
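For what it's worth, here's a minimal sketch of one such manipulation: persistence. This is purely an assumed model, not any vendor's actual algorithm, and the decay factor and screen geometry are invented. Each refresh fades the old image and draws the newest trace on top, so every refresh changes pixels while several past traces stay visible, which would explain seeing multiple traces in a 60 fps video even at a 60/s waveform update rate:

```python
def refresh(screen, trace_row, decay=0.6):
    """One display refresh: fade the whole image, then draw the newest
    trace at full intensity on its own row."""
    faded = [[px * decay for px in row] for row in screen]
    for x in range(len(faded[trace_row])):
        faded[trace_row][x] = 1.0
    return faded

# A tiny 8x16 "screen"; four successive waveform updates, each landing
# on a different row so we can count how many remain visible.
screen = [[0.0] * 16 for _ in range(8)]
for n in range(4):
    screen = refresh(screen, trace_row=n * 2)

visible = sum(1 for row in screen if max(row) > 0.1)
print(visible)  # all four traces still visible at once
```

Every call to refresh() changes pixels on the whole screen (the fade), so a frame-differencing test would report an update on every refresh, yet the faded older traces linger for several refreshes before dropping below visibility.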