Thanks Andreas. I must admit I am a bit confused.
Parts per million - ppm - is widely used. A search on the web turns up ppm in chemistry, production quality, electronics, economics, medicine and many other areas.
I was under the impression that ppm was just a dimensionless ratio [1/10^6] - as in this tutorial (imo easy to read and a quite good introduction to many aspects of voltage references - absolutely worth reading):
Accuracy Parts per Million
Another reference accuracy unit found in data sheets is parts per million, or ppm. This unit is typically used to specify temperature coefficients and other parameters that change very little under varying conditions. For a 2.5V reference, 1ppm is one-millionth of 2.5V, or 2.5µV. If the reference is accurate to within 10ppm (extremely good for any reference), its output tolerance is:
2.5V × 10/10⁶ = 25µV
Converting this to voltage accuracy:
2.5V ± 25µV = 2.499975V to 2.500025V
Converting to percent:
±(25 × 10⁻⁶ V) × 100 / 2.5V = ±0.001%
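Just to sanity-check the app note's arithmetic for myself (link just below), here is a minimal Python sketch that only restates the numbers in the quote (the variable names are my own):

# 1 ppm is just a dimensionless ratio of 1e-6
vref = 2.5                          # reference voltage in volts
ppm = 10                            # stated accuracy in parts per million
tol = vref * ppm * 1e-6             # ≈ 25 µV tolerance
lo, hi = vref - tol, vref + tol     # ≈ 2.499975 V ... 2.500025 V
pct = tol / vref * 100              # ≈ 0.001 %
print(tol, lo, hi, pct)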
http://www.maximintegrated.com/app-notes/index.mvp/id/719

But then I found what you say about using RMS, ppm and standard deviation:
"a root mean square (RMS) voltage (identical to the noise standard deviation) in volts"
http://en.wikipedia.org/wiki/Noise_%28electronics%29

and the "6 standard deviation" and "6.6 sigma" rule (many variants), for instance in:
en = 1.26 µVrms or 8.3 µVp-p {my comment: 8.3 / 1.26 ≈ 6.6}
http://www.ni.com/white-paper/3295/en/#toc4

ADI has even made a whole video to explain that to convert from RMS to peak-to-peak you multiply by 6 or 6.6 (depending on the chosen confidence level).
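As I read it, that conversion is nothing more than multiplying the RMS value by a "crest factor". A minimal Python sketch using the NI numbers above (the function name is my own; the 6 / 6.6 factors assume zero-mean Gaussian noise, where RMS equals one standard deviation):

# Estimate peak-to-peak noise from an RMS value via a crest factor.
def rms_to_pp(v_rms, factor=6.6):
    return v_rms * factor

en_rms = 1.26e-6                    # 1.26 µVrms from the NI white paper
print(rms_to_pp(en_rms, 6.0))       # ≈ 7.6 µVp-p (±3 sigma, ~99.7 % of samples)
print(rms_to_pp(en_rms, 6.6))       # ≈ 8.3 µVp-p (±3.3 sigma, ~99.9 % of samples)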
In general I find the whole area a mess, with a lot of quasi-formulas that lack assumptions and documentation. So I will not reflect more on it now, except for using the same ADC with the same software to compare different voltage references (with an inter-comparison of 10/5/2.5 volt references it doesn't matter much which measure the software uses).
But the topic interests me, so I think that next year I will make a project out of Noise in Voltage References, DACs and ADCs. A good starting point for me will be understanding this first:
Quantification
The noise level in an electronic system is typically measured as an electrical power N in watts or dBm, a root mean square (RMS) voltage (identical to the noise standard deviation) in volts, dBµV or a mean squared error (MSE) in volts squared. Noise may also be characterized by its probability distribution and noise spectral density N₀(f) in watts per hertz.
A noise signal is typically considered as a linear addition to a useful information signal. Typical signal quality measures involving noise are signal-to-noise ratio (SNR or S/N), signal-to-quantization noise ratio (SQNR) in analog-to-digital conversion and compression, peak signal-to-noise ratio (PSNR) in image and video coding, Eb/N0 in digital transmission, carrier to noise ratio (CNR) before the detector in carrier-modulated systems, and noise figure in cascaded amplifiers.
Noise is a random process, characterized by stochastic properties such as its variance, distribution, and spectral density. The spectral distribution of noise can vary with frequency, so its power density is measured in watts per hertz (W/Hz). Since the power in a resistive element is proportional to the square of the voltage across it, noise voltage (density) can be described by taking the square root of the noise power density, resulting in volts per root hertz (V/√Hz). Integrated circuit devices, such as operational amplifiers, commonly quote equivalent input noise level in these terms (at room temperature).
http://en.wikipedia.org/wiki/Noise_%28electronics%29
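To make the V/√Hz part concrete for myself: for white noise the density can simply be multiplied by the square root of the measurement bandwidth to get an RMS voltage, which can then be turned into a peak-to-peak estimate as above. A minimal Python sketch (the 100 nV/√Hz density and 10 Hz bandwidth are made-up illustration values, and it assumes purely white, Gaussian noise):

import math

e_n = 100e-9                        # assumed noise density in V/√Hz
bw = 10.0                           # assumed measurement bandwidth in Hz
v_rms = e_n * math.sqrt(bw)         # ≈ 316 nVrms for white noise
v_pp = 6.6 * v_rms                  # ≈ 2.1 µVp-p with the 6.6 crest factor
print(v_rms, v_pp)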