Here I uploaded two more samples, this time good waveforms.
- One sample in SPI mode, 12 Mbit/s, same settings as before.
- The other sample in RS232 mode, 2 Mbit/s, 8 data bits, no parity, 1 stop bit, LSB first.
Yes, those are much cleaner. I'm still confused about two things.
1) Why do you think the scope should be able to properly decode a bitstream when it's not triggering on that protocol as well? In both cases, you're triggering on alternate sources. Just because you draw lines on the screen where you think bytes should begin and end, how is the scope supposed to know that? The green boxes indicate, for better or worse, where the scope decided each frame should start and stop. When those don't match your expectations, you can be pretty sure you've got one or more settings wrong.
2) On the RS232 decode, the second green box indicates a data value of 0, yet I see no way the bitstream could be interpreted in such a fashion. Regardless of MSB/LSB settings, etc., that's not a zero. So I'm curious how the scope made that determination.
One clue is that every byte is flagged with a red error marker at the end. Since there's no parity defined, that means the start/stop bits are not lining up where the scope thinks they should be. It's saying, "Here's what I'm decoding, based on the info I have, and it all looks wrong to me."
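To make the framing-error point concrete, here's a rough sketch of what a UART decoder does internally: after a start edge, it samples the line once per bit period, assembles eight data bits LSB-first, and then checks that the stop-bit position is high. This is an illustrative model, not any real scope's implementation; the function name and sample layout are my own invention.

```python
def decode_8n1_lsb_first(samples):
    """Decode one 8-N-1 UART frame from per-bit-period samples.

    samples: list of line levels (0/1), one per bit period,
    in wire order: [start, d0..d7, stop].
    Returns (data, status) where status flags a framing error.
    """
    if samples[0] != 0:
        return None, "no start bit"          # start bit must be low
    data = 0
    for i, bit in enumerate(samples[1:9]):   # d0 arrives first (LSB-first)
        data |= bit << i
    framing_error = samples[9] != 1          # stop bit must be high
    return data, "framing error" if framing_error else "ok"

# 0x41 ('A') sent LSB-first with a valid stop bit
print(decode_8n1_lsb_first([0, 1, 0, 0, 0, 0, 0, 1, 0, 1]))  # (65, 'ok')
```

Note what happens if the baud-rate setting is off and the sampling points drift onto the wrong edges: the stop-bit check fails (the red marker), and if the mis-framed sample positions all happen to land on low levels, the decoder dutifully reports a data value of 0 alongside the error. That could plausibly be where your anomalous zero comes from.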
I understand you want/need to trigger on some other conditions, but is there some reason you're not willing to even try triggering on SPI and UART, to see what happens?