As I haven't used the scope in person yet, I didn't comment on the UI performance earlier, but here are some general observations (probably fairly obvious):
- Most users will have spent thousands of hours interacting with a smooth, responsive, and refined touchscreen UI that they carry around in a pocket.
- Anything else built with a touch UI will automatically be compared against that, whether it's fair or not, so the UI will always be critical to the user's impression of the device.
- If it's (technically and economically) feasible to build a scope using Android as a base (the way higher-end systems use Windows), maybe that would help solve a lot of the UI responsiveness issues without having to redo all the work and optimisation that has already gone into the problem. I think a number of car entertainment systems do this but don't advertise what they are built on.
- Experienced users will already know where the button they want will appear on the menu they bring up. Any animation should finish in less time than it takes to move a finger over to that next button; if it can't, there should be an option to disable animations (rough sketch below).
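
To illustrate that last point, here's a minimal sketch of what I mean, assuming a hypothetical Android-based UI as speculated above. The preference flag and the 150 ms figure are made up for illustration; only the View.animate() call is a real Android API.

```kotlin
import android.view.View

// Hypothetical user-facing "disable animations" preference; real firmware would persist this.
var animationsEnabled = true

// Keep menu animations short enough that they finish before the finger has moved
// to the next button; 150 ms is an illustrative figure, not a measured value.
const val MENU_ANIM_MS = 150L

fun showMenu(menu: View) {
    menu.alpha = 0f
    menu.visibility = View.VISIBLE
    menu.animate()
        .alpha(1f)
        .setDuration(if (animationsEnabled) MENU_ANIM_MS else 0L)  // 0 ms: menu appears instantly
        .start()
}
```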
This is not a criticism of the R&S effort or implementation (as I said, I haven't used it in person, and it sounds like a lot of the UI design is done extremely well), but rather a few thoughts on the different expectations and challenges of moving from knobs and buttons to a touch UI.