Hello all,
I've attached a couple of tables containing the update rate (measured with the AUX Out configured as trigger out and connected to an HP 5334B counter) for a 10 MHz square wave. My scope is an RTB2004-304M + PK1.
Both tables contain the data at 10 kSa memory depth and in auto memory, as well as the corresponding sampling frequencies.
The settings were:
Normal mode, Acquisition = Sample; decoders, math menu, FFT, counter, meter, apps, etc. = off; Channel 1 only; holdoff = off; persistence = off; measurements = off; cursors = off; everything else = off.
In the first table I used the classic sin(x)/x interpolation with dot mode disabled.
The second table is in sample-hold mode with dot mode enabled.
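For anyone unfamiliar with what the scope's sin(x)/x mode actually computes, here is a minimal Python sketch of Whittaker-Shannon (sinc) interpolation. This is only an illustration of the general technique, not the R&S firmware implementation; the function name and the example signal are my own:

```python
import numpy as np

def sinc_interpolate(samples, fs, t):
    """Reconstruct a band-limited signal at times t from uniform
    samples taken at rate fs (Whittaker-Shannon / sin(x)/x)."""
    n = np.arange(len(samples))
    # np.sinc is the normalized sinc: sin(pi*x)/(pi*x), so each sample
    # contributes 1 at its own instant and 0 at every other sample instant
    return np.sum(samples * np.sinc(fs * t[:, None] - n), axis=1)

# Example: a 1 kHz sine sampled at 10 kHz, redrawn on a 10x finer grid
fs = 10_000.0
ts = np.arange(64) / fs
x = np.sin(2 * np.pi * 1_000 * ts)
t_fine = np.linspace(ts[0], ts[-1], 631)
y = sinc_interpolate(x, fs, t_fine)
```

In dot/sample-hold mode the scope skips this reconstruction and just plots the raw samples, which is why the two tables are worth comparing: the interpolation is extra per-waveform work that can plausibly affect update rate.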
Something bizarre seems to happen between 500 us/div and 5 us/div, particularly when sin(x)/x interpolation is enabled: significant instability in the measured waveforms per second, and a very low rate as well.
Furthermore, for some unknown reason the 5-2-1 sequence is not respected: instead of 100 ns/div we get 80 ns/div, and instead of 50 ns/div we get 40 ns/div.
What is happening here?
Best,
0xfede