I hate to break it to you, but the first thing your ears do is a Fourier analysis in the cochlea; then your nerves translate that into something like a set of digital PCM signals, and finally your brain tries to reconstruct the signal using a neural network to comprehend it. So any argument that the whole natural signal chain is analogue falls apart.
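For what it's worth, the "Fourier analysis" half of that claim is easy to illustrate in software. This is only a loose analogy to cochlear filtering; the sample rate and test tones below are arbitrary illustrations, not physiology:

```python
import numpy as np

# Build a two-tone test signal (440 Hz + 1000 Hz, amplitudes 1.0 and 0.5)
# and recover both component frequencies with an FFT, much as the cochlea
# separates a sound into frequency bands.
fs = 8000                       # sample rate in Hz (arbitrary choice)
t = np.arange(fs) / fs          # one second of samples
signal = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 1000 * t)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
peaks = freqs[spectrum > 0.1 * spectrum.max()]
print(sorted(peaks.tolist()))   # the two component frequencies stand out
```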
Quite.
Add in that the whole ear-brain system is horribly non-linear in every dimension you can think of. If it weren't, lossy audio compression wouldn't work!
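One concrete example of exploiting that non-linearity: μ-law companding (ITU-T G.711) spends more resolution on quiet amplitudes, roughly matching the ear's logarithmic loudness response. A minimal sketch of the continuous μ-law curve, assuming samples normalized to [-1, 1] (the real codec quantizes to 8 bits on top of this):

```python
import math

MU = 255.0  # mu-law parameter used by ITU-T G.711

def mu_compress(x):
    """Compress a sample in [-1, 1] non-linearly, giving small (quiet)
    amplitudes a larger share of the code space -- mirroring the ear's
    roughly logarithmic loudness response."""
    return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

def mu_expand(y):
    """Inverse of mu_compress (exact round trip before quantization)."""
    return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

quiet, loud = 0.01, 0.8
# A 1% amplitude sample is mapped to ~23% of the output range,
# so the quantizer that follows wastes far fewer codes on loud peaks.
print(mu_compress(quiet), mu_compress(loud))
```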
Perhaps so. But has anyone determined our hearing's sample rate?
You'll have to define what you mean by that.
For example, if you consider the long (by electronic standards) response time of the nervous system, you could conclude that (say) 100 samples per second would be sufficient to cover it.
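Taken at face value, a 100-sample/s channel runs into the Nyquist limit: it can only represent content below 50 Hz, and anything higher folds back down as an alias. A hedged sketch of that consequence; the tone and rates are arbitrary illustrations, not physiology:

```python
import numpy as np

# Sample a 440 Hz tone at only 100 samples/s. By the Nyquist criterion,
# nothing above 50 Hz can be represented, so the tone aliases down:
# 440 Hz folds to 440 mod 100 = 40 Hz.
fs = 100
f_tone = 440
t = np.arange(fs) / fs
samples = np.sin(2 * np.pi * f_tone * t)

spectrum = np.abs(np.fft.rfft(samples))
alias_hz = np.fft.rfftfreq(len(samples), 1 / fs)[spectrum.argmax()]
print(alias_hz)  # the 440 Hz tone shows up at 40 Hz
```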
Mmmmhmmm... but we're talking about ~15,000 cilia in the human ear, all processed in parallel. x2 for two ears. And I for one have no idea what the bit depth per single cilium would be; certainly much more than binary.
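The parallel-channel arithmetic above is easy to sketch. Every number below is a guess for illustration: the cilia count and the 100/s rate come from the posts above, while the bit depth is invented, and real auditory nerves carry spike trains rather than PCM samples:

```python
# Back-of-envelope aggregate data rate for the "many slow parallel
# channels" picture of hearing. All figures are assumptions.
cilia_per_ear = 15_000       # figure quoted in the thread
ears = 2
rate_hz = 100                # the "slow nervous system" rate suggested above
bits_per_sample = 4          # pure guess at per-channel resolution

aggregate_bps = cilia_per_ear * ears * rate_hz * bits_per_sample
print(f"{aggregate_bps / 1e6:.0f} Mbit/s")  # 12 Mbit/s with these guesses
```

Even with deliberately modest guesses, the aggregate is in the megabit range, which is the point of the parallel-vs-serial comparison.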
You appear to be doing the equivalent of confusing baud rate with bit rate, or clock rate with instructions per second.
Bet it's infinitely better than any digital format. After all, billions of years of evolution.
In the same way that the human eye is better than a camera? NOT!
It is obvious that the human eye is superior to a camera at what it is intended to detect.
The human eye/brain is crap, both absolutely and relative to other independently-evolved eye variants. Start by considering why our eyes have a blind spot. Other eyes have avoided that grossly bad engineering. Then go on to consider the remarkably small field of decent vision, i.e. the fovea centralis.
We can't perceive just how crap, since we are limited to what we can perceive. Well, that's not quite true, as shown by optical illusions. Have a look at the 148(!) examples at https://michaelbach.de/ot/index.html Many are familiar, but some are astounding; this one is gobsmacking: https://michaelbach.de/ot/mot-mib/index.html

It's both the sensor and the processing that make a working eye vs a working camera; otherwise we wouldn't have all these publicized problems with Apple's child-safety bunk identifying a digitized pic of a dog as probable child porn.
That, and Teslas' poor performance, don't make the human eye/brain good.
The difference with a camera is how much more dynamic range the eye can detect, and of course, once again, the parallel bus of the eye vs the serial readout of a camera.
Too simplistic, but at least you are acknowledging that cameras are better in many ways.
For a real appraisal, you cannot separate the human eye from the visual cortex... it is a system all directly tied into the brain itself. The problems we have quantifying how it all works are similar to that of trying to gauge sound with a tape measure; the tools we have at our disposal are... ahem... less than ideal.
Indeed you can't separate them; the brain does a surprisingly good job of guessing/inventing what the eye omits.
Analogies are dangerous, and often confuse the issues.