It's always been a complete non-issue for me. I walk into the lab, turn on power supplies, power meters, OSAs, spectrum analyzers, whatever... I sip my coffee for a minute, complain a bit with the nearest co-worker about the latest clueless idiot, and get to work. This is pro equipment designed for pros, and things like boot time and screen angle are pretty much irrelevant. What annoys me is equipment performing out of spec, strange GPIB lockups in the middle of the night that turn all my testing into garbage, etc.
I'm not saying that it wouldn't be BETTER in some sense if this other stuff was better, but it's really not even on the radar when I'm selecting equipment. Shoot, it takes me longer to remember where I was from yesterday than it does for any of this stuff to boot up.
Several issues come to mind here:
*) In many cases with top-shelf equipment you have physical limitations on the power-on delay, like ovens needing to warm up, etc. Here there is often little point in having the UI operational after a few seconds. If you actually need the precision, then you gotta wait for things to warm up properly (assuming you actually do turn off your frequency standards and the like, of course). In this case the UI boot delay is not a usability problem.
*) However, many people actually do move their test equipment around on the shop floor, wheeling it around down in the cellar next to the particle accelerator, and so on. That is why many of the smaller pieces of Agilent test gear have rubber corners and a handle. In these situations you frequently power-cycle the equipment several - or even many - times a day. If you work in product calibration, your test setup may consist of, say, a bunch of power supplies, DMMs, counters and more, all wheeled around on a cart. Here the boot delay rarely serves any physical purpose, and it gets to be a real pain in the rear, for a whole bunch of reasons.
*) My home lab is probably not a good example to go by. However, most labs I have worked in seem to have been temporary affairs (for some suitable definition of 'temporary'), frequently being both fairly small and with questionable A/C - where it exists at all. In these situations you tend not to keep your whole collection of stuff powered up all day long, due to heat build-up and potentially noise as well. Thus, when you realize you need the DMM, you get to wait for it to boot.
WinCE, because they already have the test procedures, the build libraries and the firmware in use on other devices, so there is no big cost or learning curve in reusing it here. For them the cost is essentially zero; anything else would mean a bigger investment in time and development.
I suspect few would question Agilent's wisdom in focusing on a single OS across their product range. What I am asking is whether people have heard or seen Agilent mention any technical merits of WinCE that warrant the UI boot delay. If Agilent had chosen, say, QNX instead of WinCE, a fast UI boot would have been an option. Our DMMs could then be ready in seconds, while the 'scope, with its complex power-on chaining and self-calibration routines, would be ready when the hardware is ready.
To put it differently: I personally prefer instruments with LED, VFD and CRT readouts over LCD screens, due to the higher brightness and contrast those technologies offer. However, upgrading to a full-color LCD screen as seen on the 34461A is a valid design choice IMHO. It opens up the possibility of a UI that is both more complex/powerful and still easier to use/operate, compared to being limited by, say, a VFD readout.