Prologue: Feel absolutely free to find me boring and old-school!
Every analogue measurement is, at bottom, just averaging.
(As philosophy has entered the thread ...) Nope. (a) A measurement is gathering/determining quantitative information at some point in time. No more, no less. (b1) Everything that we humans can (really) measure is necessarily analog, because the world directly accessible to us is analog. (b2) If you allow a quick reminder: "digital" is just a simplifying "bundling" of analog; instead of dealing with an infinite set of values we simply say, e.g., "everything below 0.8 V is 0, everything above 2.3 V is 1, and everything in between is undefined".
But I get your point. Already 8 1/2 digits are much more than a healthy brain can imagine. And though it really works well to make an 8-bit ADC measure 12 bits if you define your analogue filters well and do enough averaging, it gets exponentially more complicated with every additional bit or digit.
The funny and hard-to-understand thing is that sometimes you have to add noise to increase measurement accuracy. Maths is strange sometimes...
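To make the "add noise to measure better" point concrete, here's a minimal Python sketch. The ideal quantizer and the 3.3 input are made up for illustration; the point is that without dither, averaging a quantized signal gains nothing, while uniform dither lets the average recover the sub-LSB value.

```python
# Sketch: why dither helps an averaging ADC (assumed ideal quantizer, made-up values).
import random

random.seed(0)  # reproducible run

def quantize(v, lsb=1.0):
    """Ideal ADC: round the input to the nearest LSB."""
    return round(v / lsb) * lsb

true_value = 3.3          # hypothetical input, sits between code 3 and code 4
n_samples = 100_000

# Without noise: every sample lands on the same code, so averaging gains nothing.
no_noise = sum(quantize(true_value) for _ in range(n_samples)) / n_samples

# With uniform dither of +/- 0.5 LSB: the input straddles the code boundary
# part of the time, and the long-run average converges toward the true value.
with_noise = sum(quantize(true_value + random.uniform(-0.5, 0.5))
                 for _ in range(n_samples)) / n_samples

print(no_noise)    # always exactly 3.0 - the quantizer hides the 0.3
print(with_noise)  # close to 3.3
```

Note this recovers resolution below the LSB, but it says nothing about the converter's gain or offset error, which is exactly the accuracy-vs-resolution distinction argued later in the thread.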
Uhm, I quite frequently deal with numbers far beyond 8 (or even 80) digits in my job.
More importantly though: NO, no amount of analog filtering somehow turns 8 (or whatever) bits into 12 (or whatever) bits. While, yes, that works in theoretical ponderings and is often used in science, measurements in electronics are, by their very nature, about physics. What filtering can do, though, is get you more valid/"clean" bits among those your ADC gave you. But if your ADC is 8 bits then you can't somehow magically get 12.
Not sometimes: you always must add noise. Without noise, the averaging trick doesn't work at all. And the noise has to be bigger than 1 LSB. And not just any kind of noise: it has to be uniformly distributed, or else you'll get lying results. Everything else has to remain perfectly constant, which in practice it is not.
Nope, the noise must be random, but I guess that's what you meant with "uniformly".
But no, at least in most cases one should not add noise, because that's counter-productive. The goal of filtering out noise is to get a "cleaner" result, and pretty much always there's already noise in the chain. Besides, what does one get by adding (random) noise and then filtering it out? In the best of cases one gets what one had before adding the noise.
Even so, averaging does not increase accuracy, nor precision; it only increases the resolution.
Accuracy, Precision and Resolution are 3 different aspects, they are not interchangeable words, and an instrument has to have all three to produce useful results. Infinite resolution would be meaningless without corresponding accuracy and precision.
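The three-aspects point can be shown numerically. Here's a minimal Python sketch of a hypothetical meter (all numbers are made up): a systematic offset degrades accuracy, random noise degrades precision, and the display step sets the resolution. The three are independent knobs.

```python
# Sketch: accuracy, precision and resolution as three separate properties
# (toy numbers, assumed Gaussian noise and an ideal display quantizer).
import random
import statistics

random.seed(42)
true_value = 10.000   # volts, hypothetical

def read_meter(offset, noise_sd, lsb):
    """Hypothetical meter: 'offset' models an accuracy error, 'noise_sd'
    models a precision limit, 'lsb' is the display resolution step."""
    raw = true_value + offset + random.gauss(0, noise_sd)
    return round(raw / lsb) * lsb

# High resolution (1 mV step) and good precision (1 mV noise),
# but poor accuracy: a constant 100 mV offset.
readings = [read_meter(offset=0.100, noise_sd=0.001, lsb=0.001)
            for _ in range(1000)]
mean_reading = statistics.mean(readings)
print(mean_reading)   # near 10.100: precise, finely resolved, and wrong
```

No amount of averaging of these readings reveals the 100 mV offset, which is exactly why extra resolution without accuracy is meaningless.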
Yes and no. Yes, they are different aspects, and in particular, YES (hurray!), infinite resolution, or let's say much more resolution than accuracy and precision, is meaningless! (I'm very pleased to see that statement here.)
But how much is "too much" or basically useless? My personal view is that I'm not interested in more than n+1 digits of resolution when accuracy and precision are n digits.
Example: I don't give a flying fuck about an 8.5-digit multimeter with de facto 5.5 to, on a lucky day, 6 digits of accuracy and precision (and that's the good ones).
But alas, companies are about profit, which leads to marketing coming up with ever new BS. The sad state is that if one wants real 6-digit measurements one is bound to buy an "8.5 digit" DMM.
Now, since we are in word-salad territory, here's a reflection about metrology and the philosophical "Do we have free will?" (in the sense of whether the Universe is deterministic or not):
The short answer is yes, we do have free will because "metrology".
How so? Well, there are known mathematical examples of chaotic functions, where the same function will have a wildly different outcome when even the slightest numerical deviation is plugged in as the initial condition. The fancy term is "butterfly effect".
Now, if we switch from math to the real world, getting the initial condition means measuring the current state. And in metrology, a certain amount of error is inevitable. Therefore, even if we were to know all the physics, we still cannot measure the current state with zero error, so the butterfly effect will still creep into our prediction, and we will still get an unpredictable result.
Even if the Universe were deterministic, our lives would still remain unpredictable because of metrological errors in measuring the initial conditions.
Fate or not, you will appear to have free will anyway. Because "metrology".
Careful there! Philosophical excursions tend to look attractive but actually have plenty of spikes ...
So, just two quick remarks:
No, the fancy term is "avalanche effect". "Butterfly effect" is the (frankly, idiotic IMO) Hollywood term. And btw, functions with an avalanche effect are not (necessarily) "chaotic". In fact we usually try hard to have them be perfectly deterministic, yet avalanching (e.g. in cryptography).
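For what it's worth, the avalanche effect in a perfectly deterministic function is easy to show with a cryptographic hash. A minimal Python sketch (the message strings are made up; hashlib is the standard library):

```python
# Sketch: avalanche effect in SHA-256 - fully deterministic, yet flipping
# a single input bit changes roughly half of the 256 output bits.
import hashlib

def digest_int(data: bytes) -> int:
    """Return the SHA-256 digest as one big integer, for easy bit comparison."""
    return int.from_bytes(hashlib.sha256(data).digest(), "big")

a = digest_int(b"avalanche demo 0")
b = digest_int(b"avalanche demo 1")    # '0' vs '1' differ in exactly one bit
changed = bin(a ^ b).count("1")        # count differing output bits
print(changed, "of 256 output bits differ")   # typically near 128
```

No chaos involved: run it twice and you get the identical answer, which is exactly the deterministic-yet-avalanching point.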
And the main reason for us not being capable of zero-error measurements is not really complex, but the simple fact that there is pretty much always (at least) one element between us, who measure, and the target of the measurement, and those element(s) usually introduce errors (e.g. a DMM). Another factor often encountered is limited accuracy and precision (e.g. parallax error).
Generally speaking, statistics is not a friend of metrology. It tends to be useful, though, when dealing with large (measurement) result sets. And statistics *never* creates reality, with one single (occasional) exception: humans.