I have proposed computing running statistics so that the scope can trigger on a "trace outside x.x sigma bound" condition, and even on histograms. This is not a "start of sweep" trigger, but a data event trigger. I've given careful thought to the resource requirements and it seems quite tractable to me for an UltraScale implementation.
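To illustrate the idea in software (this is my own sketch, not a worked FPGA design; the class name and the k-sigma threshold are made up for the example), per-sample running statistics accumulated over recent acquisitions can flag any new trace that strays outside the band:

```python
class SigmaBandTrigger:
    """Per-sample mean/variance over past traces (Welford's online
    algorithm), used to flag a trace outside mean +/- k*sigma.
    Illustrative only; a real scope would do this in fabric."""

    def __init__(self, trace_len, k=3.0):
        self.k = k
        self.n = 0                      # traces accumulated so far
        self.mean = [0.0] * trace_len   # running per-sample mean
        self.m2 = [0.0] * trace_len     # per-sample sum of squared deviations

    def update(self, trace):
        """Fold one acquired trace into the running statistics."""
        self.n += 1
        for i, x in enumerate(trace):
            d = x - self.mean[i]
            self.mean[i] += d / self.n
            self.m2[i] += d * (x - self.mean[i])

    def violates(self, trace):
        """True if any sample lies outside the k-sigma band."""
        if self.n < 2:
            return False                # not enough history for a variance
        for i, x in enumerate(trace):
            sigma = (self.m2[i] / (self.n - 1)) ** 0.5
            if abs(x - self.mean[i]) > self.k * sigma:
                return True
        return False
```

The resource appeal is that each sample position needs only two accumulators (mean and M2), updated once per acquisition, which maps naturally onto block RAM plus a DSP slice.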
It's important to distinguish between things which must be done in real time and things which simply need to appear to be done in real time. Most of what a DSO does need not happen in hard real time. A screen refresh delay is of no concern. Trigger point alignment, AFE correction, anti-alias filtering, downsampling and a few other things must be done in hard real time, but once the data are in the format needed for the selected data view, the time constraints become quite relaxed.
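Trigger point alignment is a good example of the hard-real-time side: the trigger crossing almost never lands exactly on a sample, so the fractional crossing time has to be recovered before successive traces can be overlaid. A minimal sketch using linear interpolation (the function name is mine, and real hardware would of course do this in the FPGA, not in Python):

```python
def trigger_crossing(samples, level):
    """Return the fractional sample index of the first rising-edge
    crossing of `level`, via linear interpolation between the two
    samples that bracket it. Returns None if there is no crossing."""
    for i in range(1, len(samples)):
        a, b = samples[i - 1], samples[i]
        if a < level <= b:
            # fraction of a sample period past index i-1
            return (i - 1) + (level - a) / (b - a)
    return None
```

Shifting each record by this fractional offset (by interpolation or a fractional-delay filter) is what keeps repeated sweeps from jittering by up to one sample period on screen.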
I have the view that a DSO should do everything it is possible to do with the resources available.
My primary concern now is the AFE input filter. It should be a high-order Bessel-Thomson filter to preserve waveform shape accurately. I've got every reference I can find, but unfortunately the maximally flat group delay response gets skimpy treatment, and I've still not figured out how to analyze and design one from first principles. I can do a design by hand or with software, but I can't write the derivation on a whiteboard. More work required.
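For what it's worth, the polynomials themselves are easy to generate even if the first-principles derivation of the flat-delay property is not: the denominator of an nth-order delay-normalized Bessel-Thomson low-pass is the reverse Bessel polynomial theta_n(s), which obeys the recurrence theta_n(s) = (2n-1) theta_{n-1}(s) + s^2 theta_{n-2}(s), with theta_0 = 1 and theta_1 = s + 1. A quick sketch (function name mine):

```python
def reverse_bessel_coeffs(n):
    """Coefficients, highest power first, of the reverse Bessel
    polynomial theta_n(s) -- the denominator of an nth-order
    delay-normalized Bessel-Thomson low-pass.
    Recurrence: theta_n = (2n-1)*theta_{n-1} + s^2 * theta_{n-2}."""
    prev2 = [1]        # theta_0, stored lowest power first
    prev1 = [1, 1]     # theta_1 = 1 + s
    if n == 0:
        return prev2
    if n == 1:
        return prev1[::-1]
    for k in range(2, n + 1):
        term1 = [(2 * k - 1) * c for c in prev1]  # (2k-1)*theta_{k-1}
        term2 = [0, 0] + prev2                    # s^2 * theta_{k-2}
        size = max(len(term1), len(term2))
        term1 += [0] * (size - len(term1))
        term2 += [0] * (size - len(term2))
        prev2, prev1 = prev1, [a + b for a, b in zip(term1, term2)]
    return prev1[::-1]
```

For n = 3 this gives s^3 + 6s^2 + 15s + 15, matching the tabulated third-order Bessel denominator; the transfer function is then H(s) = theta_n(0) / theta_n(s). This generates the answer mechanically, of course, without explaining *why* equating the first n derivatives of the group delay to zero lands on these polynomials, which is the whiteboard derivation I'm still after.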
I'd very much like to see threads discussing how to time synchronize waveforms, implement advanced triggers, do signal processing operations e.g. FFT, etc.
I keep reading a lot of "you can't do this" and "you have to do that", but precious little "this is how you implement that". It would be nice to have more of the latter and less of the former.
Have Fun!
Reg