SPI Decode Issue (or idiosyncrasy):
Hoping this is on the user side of the screen, as they say....
I can only get a reliable SPI decode when the timescale is above a particular threshold (which appears to depend on word size).
I need >=100 µs/div (for 8- and 16-bit words) and >=200 µs/div (for 32-bit words) for reliable decoding.
Configuration:
SDS2104X Plus
Firmware 1.3.5R10
8-bit (or 10-bit) vertical resolution mode
CS, CLK, MOSI thresholds all set to 2.0V (MISO inactive)
Analog channels used for inputs
CLK rising edge, CS active low
Signals driven directly from Total Phase SPI generator (no PCB or other circuits involved)
Issue repeated with:
8-bit, 16-bit, and 32-bit messages
Every possible acquisition length
Both SPI triggering and edge triggering (on CS)
Failure always occurs on last nibble or byte.
It also appears to fail on odd data values but not on even ones.
It is not intermittent. It either works every time or fails every time, based on timescale.
The only timescales that decode correctly are so large that the signals are visually impossible to recognize.
My inexpensive "brand" name scope at work does not have this decode limitation.
Now, if this really is the scope (and not me), could somebody explain the technical reason why this is so?
It is actually very counter-intuitive: I would expect more horizontal resolution to equate to a better measurement.
That certainly holds vertically, at least; I've seen more accurate automatic voltage measurements with greater vertical resolution (on other scopes, anyway).
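To put numbers on that intuition: at every timescale in question the scope should be massively oversampling the SPI clock, so undersampling alone shouldn't explain the failures. A rough sketch (the maximum sample rate, memory depth, and SCLK figures below are my assumptions, not measured values; check the acquisition menu for the real ones):

```python
# Rough samples-per-SCLK-period estimate versus timescale.
# MAX_RATE and MEM_DEPTH are assumed ballpark figures for an
# SDS2104X Plus-class scope; SCLK_HZ is illustrative only.

MAX_RATE = 2e9      # Sa/s (assumed)
MEM_DEPTH = 200e6   # points (assumed)
DIVISIONS = 10
SCLK_HZ = 20_000    # assumed SPI clock

def samples_per_clock(s_per_div):
    """Estimated samples captured per SPI clock period."""
    window_s = DIVISIONS * s_per_div
    # The scope runs at max rate until memory depth forces it lower.
    rate = min(MAX_RATE, MEM_DEPTH / window_s)
    return rate / SCLK_HZ

for s_per_div in (50e-6, 100e-6, 200e-6):
    print(f"{s_per_div*1e6:.0f} us/div: "
          f"{samples_per_clock(s_per_div):.0f} samples/clock")
```

Even at 200 µs/div this gives on the order of 100,000 samples per clock period, so sampling resolution really shouldn't be the limiting factor either way.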
Maybe the worst aspect of this "issue?" is that the scope does not alert the user to the bad data. If decoding is simply not possible at the current timescale, the firmware should not present any data at all; showing wrong values just serves to mislead the user. It should instead display a message that decoding is not possible at the current resolution (or something to that effect).
This is not for a hobby and I'm terrified of taking bad data for a customer.
I've attached good and bad plots at 8, 16, and 32-bit decodes (those ending in "00" are bad).
Just look at the last byte decoded in each - and look at the timescale.
Anybody else see this? Please tell me this is not normal operation and that I have a setting incorrect.
I don't see any mention in the user manual of timescale dependent decoding (but I may have overlooked it - in which case my apologies for posting).