If you have the budget to go higher than the SDS2000X Plus, then compare it with the SDS2000X HD. Either one would be a great scope for your needs.
Many thanks for the info. My use case is design verification: using a DSO to inspect signal lines to confirm there are no abnormal glitches or invalid signals. Is the below correct?
1. The MCU and external chips run at a 10MHz SPI clock. These devices are likely built on tens-of-nm nodes and will not respond to a glitch that is many times narrower than the wanted 10MHz signal.
2. The SDS2000X Plus samples at 2GSa/s in 2-channel mode or 1GSa/s in 4-channel mode, so I get 100 to 200 samples per period of the 10MHz signal. The spec also lists 1ns peak detection. Presumably that refers to a best-case signal at full swing (3.3 volts), and presumably the scope will still detect a lower-voltage glitch of, say, 1 volt if it is a few ns long.
3. The SDS2000X Plus will do the job, right? If the scope does not see any glitch, then the glitch energy (voltage multiplied by time) should be too weak to cause the chip to respond, right?
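The samples-per-period arithmetic in point 2 can be checked with a quick sketch (a back-of-envelope calculation using the figures quoted above, not official Siglent specs):

```python
# Samples captured per SPI clock period for the quoted SDS2000X Plus rates.
spi_clock = 10e6  # Hz, the 10 MHz SPI clock between MCU and external chips

sample_rates = {
    "2-channel mode": 2e9,  # Sa/s, as quoted above
    "4-channel mode": 1e9,
}

for mode, fs in sample_rates.items():
    samples_per_period = fs / spi_clock
    print(f"{mode}: {fs / 1e9:.0f} GSa/s -> "
          f"{samples_per_period:.0f} samples per 10 MHz period")
```

This confirms the 100-to-200-samples figure; whether a few-ns, 1 V glitch is reliably caught also depends on analog bandwidth and the trigger/peak-detect path, not just sample rate.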
The modern DSO is pretty powerful at finding stuff you might not even think is present. It's all about using the available features to see that you might have a problem, then applying the toolset to capture it.
This ^^^ screenshot is a good example: Persistence shows something is present, then we can narrow in and see if it's a one-off or repetitive.
In this post I did a similar exercise with the older SDS1104X-E, which gives some idea of using a few of the scope's features:
https://www.eevblog.com/forum/testgear/siglent-sds1204x-e-released-for-domestic-markets-in-china/msg1370717/#msg1370717
Comparing the SDS1104X-E and SDS2104X Plus: does the cheaper unit have the same tools/functions (trigger, search, measure) for design verification, i.e. hunting for abnormal signals in the 10MHz SPI link between the MCU and external chips?
Are there more differences than what I found so far from scanning the manuals:
500MSa/s vs 1000MSa/s in 4-channel mode (MISO, MOSI, Clock, nChipSelect); 500MSa/s gives 50 samples per 10MHz clock period. Enough to see abnormal glitches, runts, overshoot, undershoot?
7Mpts/CH vs 100Mpts; 7Mpts captures 140,000 pulses at 50 samples per pulse.
No histogram, but it still has StdDev to quantify signal jitter.
7-inch, non-touch screen. Needs a few more human seconds to use the knobs to activate a function, right?
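The memory-depth comparison in the list above can be sketched the same way (the depths and rates are the figures quoted in the list, not taken from the datasheets directly):

```python
# How much real time, and how many 10 MHz clock periods, fit in one capture.
spi_clock = 10e6  # Hz

scopes = {
    "SDS1104X-E": {"depth": 7e6, "rate": 500e6},     # 7 Mpts/ch at 500 MSa/s (4ch)
    "SDS2104X Plus": {"depth": 100e6, "rate": 1e9},  # 100 Mpts at 1 GSa/s
}

for name, s in scopes.items():
    capture_time = s["depth"] / s["rate"]   # seconds of real time in memory
    periods = capture_time * spi_clock      # 10 MHz periods per capture
    samples_per_period = s["rate"] / spi_clock
    print(f"{name}: {capture_time * 1e3:.0f} ms capture, "
          f"{periods:.0f} periods at {samples_per_period:.0f} samples/period")
```

So the cheaper scope holds about 14 ms (140,000 periods) per capture versus roughly 100 ms (a million periods) for the Plus, which matters when hunting rare events.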
TBH, today, if you want a lower-cost solution and are happy with a 7" display, the SDS814X HD is where it's at.
Many thanks. From the SDS814X HD user manual, it has a new trigger mode, among many, that seems to fit my specific SPI signal hunting. Being new to modern MSOs, please kindly advise if the below understanding is correct:
1. Is the trigger setup ">, <, in-range and out-of-range" against a numeric value for the setup time and hold time respectively?
2. If I set the trigger to "setup time < datasheet value" and let the MCU run overnight, and the scope did not trigger, then there were no glitches or rare abnormal events violating "setup time must be longer than the datasheet value".
3. If I capture a random frame of data to the scope's full memory (tens of ms of real MCU run time at 1GSa/s), I can use SEARCH with "setup_time > 1ns" and should get a 100% event hit rate. Slowly increase the threshold to, say, 100ns per the datasheet; if I still get 100% hits, this verifies that the data frame meets the datasheet value. Further increase the time until the hit rate drops to 50%; that threshold will be the median value of the circuit's setup time, right?
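The logic in point 3 can be illustrated offline. This is a hypothetical sketch, assuming a list of per-edge setup-time measurements (e.g. exported from the scope); the synthetic Gaussian data here stands in for real measurements:

```python
import random
import statistics

# Synthetic setup-time measurements standing in for scope data:
# nominal 120 ns with 5 ns of jitter (hypothetical values).
random.seed(1)
setup_times_ns = [random.gauss(120, 5) for _ in range(10_000)]

def hit_rate(threshold_ns):
    """Fraction of events whose setup time exceeds the search threshold,
    mimicking a SEARCH for setup_time > threshold."""
    hits = sum(t > threshold_ns for t in setup_times_ns)
    return hits / len(setup_times_ns)

print(f"hit rate at   1 ns: {hit_rate(1):.0%}")    # every event qualifies
print(f"hit rate at 100 ns: {hit_rate(100):.0%}")  # datasheet-margin check
median = statistics.median(setup_times_ns)
print(f"median setup time: {median:.1f} ns, hit rate there: {hit_rate(median):.0%}")
```

As the sweep shows, the threshold where the hit rate crosses 50% is indeed the median setup time. One caveat: a 100% hit rate within one captured frame only verifies that frame; rare violations still need the overnight trigger approach from point 2.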