My complaint has nothing to do with hardware performance; it is a software design problem. A couple of times I have tracked this down to the use of modern languages that rely on non-deterministic garbage collection, but there are other causes as well. I think there is an assumption that faster hardware will fix it, but that has not been the case for decades, so why would it change now?
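To make the garbage-collection point concrete, here is a minimal sketch (my own illustration in CPython, not anything from the original complaint) that records how long each cycle-collection pass takes while ordinary allocation churn is going on. The point is that the pauses arrive whenever the collector decides to run, not when the program is at a convenient spot:

    # Sketch: observe non-deterministic GC pauses in CPython by timing
    # each cycle-collection pass via gc.callbacks.
    import gc
    import time

    _pauses = []
    _start = 0.0

    def _gc_callback(phase, info):
        # Called by the interpreter before ("start") and after ("stop")
        # every cycle collection.
        global _start
        if phase == "start":
            _start = time.perf_counter()
        else:
            _pauses.append(time.perf_counter() - _start)

    gc.callbacks.append(_gc_callback)

    def churn(n=200_000):
        # Allocate reference cycles so the collector has real work to do.
        kept = []
        for i in range(n):
            a, b = [], []
            a.append(b)
            b.append(a)          # cycle: only the GC can reclaim it
            if i % 1000 == 0:
                kept.append(a)
        return kept

    churn()
    gc.collect()

    print(f"collections observed: {len(_pauses)}")
    if _pauses:
        print(f"longest pause:  {max(_pauses) * 1000:.2f} ms")
        print(f"shortest pause: {min(_pauses) * 1000:.2f} ms")

As I understand CPython's collector, the length of each pause depends on how much happens to be live when it fires, which is exactly the kind of cost that is invisible in the source code and hostage to the runtime.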
It doesn't help that, as time goes on, software developers unnecessarily pile abstraction on top of abstraction, resulting in call stacks that are many tens of levels deep (sometimes approaching 100 levels!). The amount of code executed to do even relatively simple things is now horribly excessive. You know things have gotten out of hand when you depend on the computer to figure out the relationships in the code for you, because they are too complex and too numerous to figure out "by hand".
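As a deliberately silly toy illustration of the abstraction-stacking point (all names invented for the example), here are forty do-nothing delegation layers wrapped around a single dictionary lookup; the result is unchanged, but the stack depth balloons:

    # Sketch: layers of pure delegation inflate the call stack without
    # adding any behaviour.
    import traceback

    def make_layer(inner):
        # Each layer only forwards the call.
        def layer(key):
            return inner(key)
        return layer

    def storage(key):
        # The actual work: one dictionary lookup.
        depth = len(traceback.extract_stack())
        print(f"answering '{key}' at stack depth {depth}")
        return {"answer": 42}.get(key)

    handler = storage
    for _ in range(40):      # forty layers of indirection, none necessary
        handler = make_layer(handler)

    handler("answer")

The lookup itself is one line; the other forty frames exist only to be traversed.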
It would be one thing if what is being accomplished were that much more complex than what a simpler architecture could accomplish, but from what I've seen in the software world (which is the world I live in), with very few exceptions, it just isn't.
It seems that, instead of powerful tools being used to better handle pre-existing complexity, the complexity is later scaled up to match the power of the tools, just because it can be (and, more importantly, because that is erroneously perceived as easier and cheaper than deliberately designing and building for minimal complexity -- a false tradeoff if there ever was one).