Sorry for the length of this.
You're really showing your age now and I'm a good bit older than you.
This isn't age related. It's actually down to real, quantifiable experience.
One of the products I was wholly in charge of in the early 00s was a virtual interface platform that allowed people to prototype hardware- and software-based interfaces and wire them to the underlying electronics. It consisted of a COTS PC, various standard PCI digital interface cards, and numerous Matrox cards connected to physical controls and virtual displays. There was a whole load of VC++ I am not proud of under there.
... The result was that your avionics module could be tried in situ in simulator platforms, and the controls could be moved around to look for HCI-related problems with the end users (pilots, military personnel, etc.). This spanned avionics, ground support systems, production line controls, everything.
One of the major findings from this platform was that what matters is user preference, and dividing the control surface into two groups of functions: primary and secondary. Primary functions are things that are critical to the operation of the product. Secondary functions are less important and used only occasionally.
We found that primary functions MUST be physical tactile controls with haptic feedback, and must be comfortable, or fatigue builds up over time, leading to errors, physical pain, and a generally low opinion of the product. I've even seen people develop shoulder problems after having to look around the side of something their hand was obscuring a hundred times a day. Even little things like that only come from observing the user for days at a time and doing several hardware prototype revisions (which was expensive, and why I built the platform above).
For secondary functions it really doesn't matter, as they are used so infrequently. Stick them in a menu somewhere!
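Just to make that split concrete, here's a toy C++ sketch of the rule of thumb. None of this is from the actual platform; every name in it is made up for illustration. The idea is simply that every primary function gets a dedicated tactile control, and only secondary functions are allowed to live in menus:

```cpp
// Toy sketch of the primary/secondary control split described above.
// All names here are hypothetical, invented purely for illustration.
#include <iostream>
#include <string>
#include <vector>

enum class Priority { Primary, Secondary };
enum class Mounting { PhysicalTactile, MenuEntry };

struct Control {
    std::string name;
    Priority    priority;
    Mounting    mounting;
};

// The rule of thumb: every primary function needs a dedicated
// tactile control; secondary functions may live in menus.
bool layoutFollowsRule(const std::vector<Control>& panel) {
    for (const auto& c : panel) {
        if (c.priority == Priority::Primary &&
            c.mounting != Mounting::PhysicalTactile) {
            std::cerr << "Primary function buried in a menu: "
                      << c.name << '\n';
            return false;
        }
    }
    return true;
}

int main() {
    std::vector<Control> scope = {
        {"vertical scale", Priority::Primary,   Mounting::PhysicalTactile},
        {"timebase",       Priority::Primary,   Mounting::PhysicalTactile},
        {"trigger level",  Priority::Primary,   Mounting::PhysicalTactile},
        {"screen colour",  Priority::Secondary, Mounting::MenuEntry},
    };
    std::cout << (layoutFollowsRule(scope) ? "layout OK" : "rule broken")
              << '\n';
    return 0;
}
```

Run it and it flags any primary function that's been buried in a menu, which is exactly the design review that the touch-only kit seems to skip.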
Literally everything you have on a standard analogue scope, DMM, or signal generator should still be a primary physical control on every piece of test gear, unconditionally. This comes from watching hundreds of people do real work with hardware for 3 years, using this software and building interfaces. It is quantifiable, and there's a ton of research out there which backs it up. Go and have a look around.
This is why the TDS210/220, while not having an amazing spec sheet, is such a nice bit of kit to use. It has that functionality clearly divided and clearly implemented. The interface is literally "right".
A few years back I'd have thought similar, but every bit of equipment has different UI requirements. Like, WTF would you want 'pinch to zoom' on an AWG, for instance?
Placement of the touch screen elements has everything to do with usability, and if they're aligned with physical buttons, the choice is yours of which to use. For some elements a touch screen is faster and less effort, quite contrary to your misguided beliefs. Other elements of course require user feedback, such as detented encoders and numerical keypads, and it's unlikely they'll be replaced anytime soon with touch controls, especially in instruments with small displays and/or complex multi-menued UIs.
You comment on the user's hand obscuring the display; well yes, but in some implementations a quick-access panel is available to place (drag and drop) anywhere that suits.
Like you, I've never been a great fan of touch displays, but the more I get to use instruments that have them, the more I understand their advantages given a good implementation.
I think I've covered the above already.
I will say that if you look at the quantifiable disaster that is Metro/UWP on Windows platforms over the last few years, you will see that it has started to move away from the touch focus and back to a keyboard-and-mouse focus, because it literally didn't work in the real world. That's tough to admit. People should learn from other people's mistakes. Win32 was not a mistake. UWP is.
Panels fragile? BS, double BS!
You really need to get out more!
I think I did
They are incredibly fragile. If they weren't, there wouldn't be at least 100 phone repair shops within a mile of me. And phones are pretty much the best bits of engineering out there; they have had the highest overall investment in technology and reliability over the last decade. But they still get broken, because you don't get to choose the hands in which they are placed.
Same goes for test gear, which is pretty much ritually abused. Back when I was at university, someone popped a DMM screen with a flying test lead, for example, and that had a layer of Fluke around it. And while working in the test gear department of a large company for a bit, I saw all the creative ways people fucked up their kit.
Just to wind some of ya's up some more: in not too many years, I predict there will be instrument ranges from many manufacturers available only with touch displays, plus ranges of instruments with physical controls too, but at an additional cost.
Such is the pace of change in the last few decades.
This is a side effect known as the feature bell curve. What we have at the moment is roughly the pinnacle of feature completeness: a decent-quality, reliable feature load, devices which are completely functional and reliable, and low-cost manufacturing. Unfortunately, when we get into this state, there is a latent desire to innovate, out of fear of other manufacturers innovating first. This results in one of two outcomes:
1. Firstly, we have negative innovation. Apple are good at this. They start removing things people use and need in order to build the ultimate clean interface and system. This harms the user by removing established patterns.
2. Secondly, we have negative innovation again. This is where established paradigms are broken simply for the sake of labeling something as innovation.
This is a type 2 failure mode. People are afraid to do minor revisions of their products and instead reinvent them every few years, due to pressure from above and from the marketing teams. There is also pressure to cut costs. If you tick both boxes, you think you have won.
Yaesu are a fine example of sticking a finger up to this. They took the FT-817 platform, which is nigh on 20 years old now, and released the FT-818. People nearly shit a brick because it was literally almost exactly the same as the FT-817. 20 years of progress?! What is this?! Why should you buy this?! That was the sentiment. Well, it turns out you shouldn't if you have an FT-817. This is shocking, apparently, to the masses, who have been programmed with itchy upgrade dick syndrome (fear of obsolescence) and the manufacturers' desire to sell new products all the time to the same people. What did they do with the FT-818? They re-engineered the guts so that Yaesu could provide the same functionality with newer parts. That was it. The interface works. It didn't need changing. You can drive it with your eyes shut.
Same with HP/Agilent/Keysight. Over the last 25 years some of their products haven't changed, like the E36xx platform and the 34401A, for example. When you look at their newer lines, taking the E36312A for example, there are still physical controls. Even the high-end InfiniiVision 6000 scopes with the touch screen still have ALL of the primary controls on the panel, with no channel sharing, for example. They paid for a user study.
This is simply the bottom end of the market throwing cost-cutting fad crap out to outdo each other, without even bothering to do a user study.
I'll throw another "right" user interface on the table: the DG1022Z.
I'd just like to say that the DG1022Z interface isn't great, especially if you compare it to the above, but the thing is still an order of magnitude better than the newer touch devices, for sure.