I recently did some small work with a DS1074Z (same hardware as the DS1054Z and DS1104Z). The signals I was looking at were very different from the ones shown here in the test images. At first I did not notice the signal level problem, because it was a continuously varying live signal, but when I made some measurements I found that part of the signal level was too low.
After a short check I found that, with everything else unchanged and ONLY the timebase changed, the level is constant at timebases slower than 100 ns/div, but at 100 ns/div and faster the level changes. There is a severe level step that depends on the signal details. This design error alone drops this equipment into the "toy" class. Worse, in most cases the user cannot even turn this function off. Why? Is it because entry-level hobbyists want a "nice image" instead of trustworthy measurements? This kind of "smoothing" filter is named Sin(x)/x. The bad part is that it flushes all the true sample points down the toilet and replaces reality with invented data. If an analog scope in the old days had made this kind of error, the responsible people would have put an adhesive sticker over the display saying "out of order" or just "do not use", so that people in the lab would not accidentally use it before it was repaired.
After that I took another scope (because after this finding I did not know what I could trust on this Rigol), continued my work, and later ran some small tests to see what is going on.
Some small examples picked from the tests: part I
Tested here with a steady, sufficiently good-quality sine wave for this kind of test.
Signal source: HP 8657B. Signal: 70 MHz sine wave. Level set so that the signal is about 6 divisions high peak to peak with the scope at 50 mV/div. The scope end is terminated with a Tek 50 Ω feed-through; a 50 Ω cable runs from the HP to the Rigol.
Oscilloscope display mode: Dots
Sampling mode: Normal. Input coupling: DC. Trigger coupling: DC, trigger on rising edge.
All 4 channels on. Sample rate: 250 MSa/s
First 3 images:
200ns/div ; Sin(x)/x OFF: the scope shows the normal level (as it also does at slower speeds)
100ns/div ; Sin(x)/x OFF:
the level drops, and then stays at that same lower level from this speed all the way down to 5 ns/div
005ns/div ; Sin(x)/x OFF:
the level has dropped
Next 3 images:
200ns/div ; Sin(x)/x ON: level OK (as also at slower speeds)
100ns/div ; Sin(x)/x ON: level OK
005ns/div ; Sin(x)/x ON: level OK
Also, considering the typical criterion for sin(x)/x interpolation: the sample rate is 250 MSa/s and the input is a 70 MHz sine wave. Sample rate / input frequency = 3.57, which is OK. Even with a 100 MHz input it would still be in the acceptable range (2.5).
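To show what that ratio means in practice, here is a small sketch (Python/NumPy, my own illustration, not anything from Rigol) of classic Whittaker-Shannon sin(x)/x reconstruction of a 70 MHz sine sampled at 250 MSa/s. Even at only ~3.57 samples per cycle, the interpolated trace recovers the peaks between the raw dots:

```python
import numpy as np

fs = 250e6                 # sample rate from the setup above
f0 = 70e6                  # input sine frequency
n = np.arange(64)
x = np.sin(2 * np.pi * f0 * n / fs)      # raw dots, fs/f0 ~ 3.57 per cycle

# Whittaker-Shannon reconstruction: each output point is a sum of the
# samples weighted by sin(x)/x (np.sinc(t) is sin(pi*t)/(pi*t)).
t = np.arange(16.0, 48.0, 0.1)           # interior points, in sample units
y = np.array([np.sum(x * np.sinc(ti - n)) for ti in t])

print(round(fs / f0, 2))                 # 3.57
print(y.max())                           # close to the true peak of 1.0
```

This is what the interpolation is supposed to do; the small residual error here comes only from truncating the sinc sum to a finite record.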
But what is this problem?
Note that it happens when Sin(x)/x interpolation is selected OFF! Even more: there are no vectors at all, because the display mode is Dots, not Vectors!
(Yes, I have also run the same test with only the display mode changed to Vectors. The result is exactly the same.)
So, what is this machine doing?
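A small numeric sketch (Python/NumPy, my own check, under the assumption of ideal sampling) suggests plain undersampling of the dots cannot explain a steady level drop: because 70/250 = 7/25, the sample phases repeat every 25 samples and are spread evenly over the cycle, so over any longer record the raw dots still land within about 1 % of the true peaks.

```python
import numpy as np

fs = 250e6
f0 = 70e6                      # 70/250 = 7/25: 25 distinct sample phases
n = np.arange(1200)            # about 336 cycles of raw dots
x = np.sin(2 * np.pi * f0 * n / fs)

# The 25 phases are evenly spread over the cycle, so even plain dots
# reach within ~1 % of the true peak over a record this long.
print(x.max())
```

So if the displayed dots show a clearly lower level, something in the scope must be processing the samples before display.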
But I also have a question, for someone who knows the real truth and not only a guess.
If I save the sampled data as a .CSV file, are those sample points the true native raw data from the ADC (only scaled with the known vertical settings and/or other exactly known parameters), or are the saved points the output of some undefined "Rigol make-up" process? Is there any way to get the real raw ADC data for later external analysis? Analyzing data that has been manipulated in an unknown way is nearly a waste of time: garbage in, analysis, garbage out.
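I do not know the answer either, but one plausibility check can be run on the file itself. A sketch (Python/NumPy; the column layout and scale factor below are my assumptions for illustration, not Rigol documentation): an 8-bit ADC can only produce 256 distinct codes, so a voltage column that is raw data scaled by exact factors can show at most 256 distinct levels per channel. More distinct levels than that would mean some processing happened between the ADC and the file.

```python
import numpy as np

def count_levels(volts):
    """Count distinct voltage levels (rounded to kill float noise)."""
    return len(np.unique(np.round(volts, 9)))

# Synthetic stand-in for a CSV voltage column, quantized like an
# 8-bit ADC (codes -127..127) and scaled to a hypothetical 0.15 V range:
volts = np.round(np.sin(np.linspace(0, 40, 5000)) * 127) / 127 * 0.15
print(count_levels(volts))     # at most 255 distinct levels here
```

With a real saved .CSV one would apply `count_levels` to the voltage column; this only tests consistency with raw ADC codes, it cannot prove the data is untouched.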
Later, when I have time to dig out more of my test notes and arrange the data and images, there will be more in part II, which raises further questions related to this, to the Vectors/Dots display modes, and to Sin(x)/x ON/OFF.
Some more reading:
http://cdn.teledynelecroy.com/files/whitepapers/wp_interpolation_102203.pdf
http://m.eet.com/media/1051226/Sin(x)x_Agilent.pdf