Expanding on my pet peeve above:
User interfaces should provide zoom, pan/scroll, rotation, and similar events directly, instead of applications synthesizing them from lower-level events.
This would mean that instead of each application doing all the work itself, usually with a slightly different interface, all applications would use the same events. The OS would provide configurable mappings for cases where the user does not have a Human Interface Device that directly produces those events.
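To make that concrete, here is a rough sketch of what the application side could look like. None of this is an existing API; the event types, the handler table, and the deliver() routine are all hypothetical, just to show the shape of the idea:

```python
# Purely hypothetical semantic events the OS could deliver to applications.
from dataclasses import dataclass

@dataclass
class Pan:          # 2D scroll/pan delta, in device-independent units
    dx: float
    dy: float

@dataclass
class Zoom:         # multiplicative zoom factor (1.0 = no change)
    factor: float

@dataclass
class Rotate3D:     # 3D rotation delta, e.g. from a 3D mouse or trackball
    rx: float
    ry: float
    rz: float

# The application only subscribes to the semantic events it cares about;
# it never sees the raw device events at all.
def on_pan(ev: Pan):
    print(f"scroll view by ({ev.dx}, {ev.dy})")

def on_rotate(ev: Rotate3D):
    print(f"rotate model by ({ev.rx}, {ev.ry}, {ev.rz})")

HANDLERS = {Pan: on_pan, Rotate3D: on_rotate}

def deliver(ev):
    """What the OS/toolkit side would do: route each event by its semantic type."""
    handler = HANDLERS.get(type(ev))
    if handler:
        handler(ev)

deliver(Pan(dx=0.0, dy=-12.5))
deliver(Rotate3D(rx=0.0, ry=0.1, rz=0.0))
```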
The main benefit would be that new interface devices would not need any additional application support. They would Just Work.
Have a useless nipple stick on your laptop? No worries, map it to pan/scroll or to 3D rotation, whichever you prefer.
Have two fancy trackballs? Use one for 3D rotation in 3D design programs, and the other for the normal user interface.
Make your own input device for engineering and design tasks, and have applications work with it by default; no need to plead with application vendors to add support for it.
Am I asking for a new standard? No, everything needed exists already, especially in USB HID. All I'm asking is for the mapping of these events to be moved from individual applications to the OS's UI services. Even games would benefit from this, although they are probably the furthest along in doing it already.
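For example, the OS-side mapping could be little more than a per-device table from existing HID usages to the semantic gestures above. The usage names here (Generic Desktop X/Y/Rx/Ry/Rz, Consumer AC Pan) are from the existing USB HID Usage Tables; the device names, mapping format, and map_axis() function are made up for illustration:

```python
# Hypothetical per-device mapping the OS could expose in its settings UI.
DEVICE_MAPPINGS = {
    "Some 3D trackball": {
        ("Generic Desktop", "Rx"): ("rotate3d", "rx"),
        ("Generic Desktop", "Ry"): ("rotate3d", "ry"),
        ("Generic Desktop", "Rz"): ("rotate3d", "rz"),
    },
    "Laptop nipple stick": {
        ("Generic Desktop", "X"): ("pan", "dx"),
        ("Generic Desktop", "Y"): ("pan", "dy"),
    },
    "Mouse with tilt wheel": {
        ("Consumer", "AC Pan"): ("pan", "dx"),
    },
}

def map_axis(device: str, usage_page: str, usage: str, value: float):
    """Translate one raw HID axis report into a (gesture, axis, value) tuple."""
    target = DEVICE_MAPPINGS.get(device, {}).get((usage_page, usage))
    return (*target, value) if target else None

print(map_axis("Laptop nipple stick", "Generic Desktop", "Y", -3.0))
# ('pan', 'dy', -3.0)
```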
This is also nothing new; it has been happening all along, albeit slowly. A good example is two-finger and edge scrolling on touchpads: applications don't see those as touch events; the OS converts them to pan/scroll events automagically. I just wish we'd do the same for the rest of the 2D and 3D controls.
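For a rough idea of what that existing conversion involves (greatly simplified, and not any particular driver's actual code): raw touch contacts go in, a pan delta comes out, and the pan delta is all the application ever sees.

```python
def two_finger_pan(prev_contacts, contacts):
    """Given previous and current (x, y) positions of two touch contacts,
    return their averaged movement as a (dx, dy) pan delta."""
    if len(prev_contacts) != 2 or len(contacts) != 2:
        return None  # not a two-finger gesture
    dxs = [c[0] - p[0] for p, c in zip(prev_contacts, contacts)]
    dys = [c[1] - p[1] for p, c in zip(prev_contacts, contacts)]
    return (sum(dxs) / 2, sum(dys) / 2)

print(two_finger_pan([(10, 10), (30, 10)], [(10, 22), (30, 23)]))
# (0.0, 12.5)
```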
Feels silly to be stuck with twenty-year-old limitations for no real reason.