The best hardware and software tech has disappeared from the market because of company mistakes and bad marketing. If the best stuff had survived the 80s and 90s, our computers, software included, would look very different from what we have today. It's a shame so much great stuff is gone, or never got the support it deserved.
But how do we define "the best"?
There were indeed many "beautiful" architectures that eventually dropped out of the mainstream or disappeared entirely. They were, without exception, closed platforms owned by companies that were often mismanaged (post-Tramiel Commodore, for example, not to say the old man didn't make mistakes in his day). The PC architecture won out because of its openness, while being "good enough".
How would the world be different if the PC hadn't happened? We'd probably still have completely incompatible architectures, with software houses having to develop separately for Amiga, for Atari, for DEC, for Sun, for NeXT, for whatever else... By choosing a computer you'd be choosing an entire ecosystem. We probably wouldn't have USB and SATA - instead there would be Amiga external HDDs, Atari external HDDs, DEC external HDDs, Sun external HDDs, NeXT external HDDs...
Coding assembly would probably be nicer than it is on x64... But nobody would be coding assembly anyway, because I doubt the laziness of coders would have evolved differently. And you'd probably see massive feature withholding in software, as everyone would be coding to the lowest common denominator - writing in a way that could be compiled for each architecture with minimal code changes.
Machines would probably be marginally faster per clock cycle, but definitely not per US dollar, as closed architectures would have prevented the commoditization of hardware that came with the advent of the PC.
So... I guess the PC was probably the right choice in the end.