But even then you have to ask yourself what the actual value of such a feature is. I have an Agilent MSO, and the decoded serial data usually changes so fast that the only thing you notice is the numbers changing. You can't tell whether the numbers are right or wrong (and just forget about spotting an anomaly).
How about seeing live feedback and checking for correlation between the real world and the measurements on screen -- gyroscopes, accelerometers, light sensors, ADCs reading DC values, USB lines in response to different keystrokes, low-data-rate applications like clap sensors. Really, just try to contrive any situation in which an SPI line produces small amounts of sporadic data, or even large amounts of highly repetitive, structured data. Also, set up a glitch filter/trigger of some sort and manipulate the device to see exactly when glitches occur (a rough sketch of the idea is below).
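If you want to play with that last idea offline, here's a minimal sketch of a glitch filter in Python. It assumes a capture exported as (timestamp, level) transition pairs; the format, the find_glitches name, and the min_width_us threshold are all made up for illustration, not any particular instrument's API.

```python
# Hypothetical sketch: flag pulses narrower than a threshold in a logic
# capture. `transitions` is assumed to be a list of (timestamp_us, level)
# pairs, one entry per edge, as you might export from a logic analyzer.

def find_glitches(transitions, min_width_us=1.0):
    """Return (start, width, level) for every pulse shorter than min_width_us."""
    glitches = []
    # Pair each transition with the next one; the gap between them is how
    # long the line held that level. (The final level has no end time, so
    # it can't be judged and is skipped.)
    for (t0, level), (t1, _) in zip(transitions, transitions[1:]):
        width = t1 - t0
        if width < min_width_us:
            glitches.append((t0, width, level))
    return glitches

# Example: a mostly well-behaved line with one 0.2 us runt pulse at t=30.
capture = [(0.0, 0), (10.0, 1), (20.0, 0), (30.0, 1), (30.2, 0), (50.0, 1)]
for t, w, lvl in find_glitches(capture):
    print(f"glitch at {t} us: level {lvl} held for only {w:.1f} us")
```

A hardware glitch trigger does the same comparison on the fly, of course; the point of the sketch is just that "pulse narrower than X" is a trivially checkable condition, so correlating it with what you're doing to the device is exactly the kind of live feedback being argued for.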
It may well be true that the serial lines you interact with are carrying huge numbers of bytes per second, or unstructured data that requires deep interpretation. But since when were test instruments about focusing on one use case to the exclusion of all others? It's crazy to take a discussion about whether LAs should have this feature and flip it into an assertion that even MSOs shouldn't, especially when most MSOs provide it and some even provide ASIC implementations for even more speed. Seems like an awful lot of bother to go to for a feature of no "actual value".