From my (very) limited understanding of the Nyquist sampling theorem, to faithfully measure the sine wave I need to sample it at a minimum of twice its frequency. In this case it's just a 50Hz sine wave so I will sample it at a frequency of at least 100Hz.
Disclaimer: I'm not familiar with advanced maths.
But a 50Hz sine wave goes through 0V 100 times a second, so if you sample at exactly 100Hz you might be sampling it precisely when it goes through 0, both times.
In practice that does not really matter, though, because it assumes the sampling of your ADC is instantaneous.
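A quick numeric sketch of that pitfall (the sample count and the slightly faster 128Hz comparison rate are just made-up illustrative values, not anything from the question):

```python
import numpy as np

f_signal = 50.0                   # the 50Hz sine from the question
t_100 = np.arange(20) / 100.0     # 20 sample instants at exactly 100Hz
t_128 = np.arange(20) / 128.0     # 20 sample instants at a hypothetical, slightly faster 128Hz

samples_100 = np.sin(2 * np.pi * f_signal * t_100)
samples_128 = np.sin(2 * np.pi * f_signal * t_128)

# At exactly twice the signal frequency, every phase-aligned sample lands
# on a zero crossing, so the sine is invisible to the ADC.
print(np.allclose(samples_100, 0.0))   # True
print(samples_128[:4])                 # non-zero: the sine shows up again
```

This is also why the Nyquist rate is a strict lower bound: sampling at exactly 2f only captures the signal if you can control the sampling phase.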
Assuming that the ADC takes some time to sample the voltage (which is likely, since it is only running at 100Hz?), finding the exact, 100%-correct accuracy of the real-world system would require some very complex mathematics involving the sine wave plus knowledge of the internal architecture of the ADC, beyond what is in its datasheet. This is because the ADC takes some time to measure the voltage, during which the voltage changes; depending on the ADC's architecture, this can cause inaccurate measurements.
Basically, the datasheet of the ADC assumes that the input voltage is constant throughout the sampling period. Since it's not, the actual voltage that the ADC reports might be an average of the voltage during the sample period, or the value at a specific point in time during the conversion.
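As a toy illustration of that effect (this models a hypothetical averaging converter, not the actual ADC's architecture; the 4ms sample instant and 1ms conversion window are made-up numbers):

```python
import numpy as np

f_signal = 50.0      # the 50Hz sine from the question
t_sample = 0.004     # hypothetical sample instant, 4ms into the cycle
t_conv = 0.001       # hypothetical 1ms conversion window

# Instantaneous value at the sample instant.
instantaneous = np.sin(2 * np.pi * f_signal * t_sample)

# Average of the input over the conversion window, which is what an
# averaging-style converter would effectively report.
t = np.linspace(t_sample, t_sample + t_conv, 1000)
window_average = np.mean(np.sin(2 * np.pi * f_signal * t))

print(f"instantaneous: {instantaneous:.4f}")    # ~0.9511
print(f"window average: {window_average:.4f}")  # ~0.9836, a different reading
```

The two readings differ because the input moved during the conversion, which is exactly the error the datasheet's constant-input assumption hides.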
BTW, since you know it's a sine wave, why don't you just convert it to DC, measure that voltage, and work your way back to the AC voltage?
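For example, a rough software analogue of that idea, assuming a full-wave rectify-and-average scheme (an ideal rectified sine averages 2/π of its peak, so the peak can be recovered from the measured DC level; the 2.5V amplitude is just an assumed value):

```python
import numpy as np

amplitude = 2.5                           # assumed peak of the AC component
t = np.linspace(0.0, 0.02, 10_000)        # one full cycle of a 50Hz sine

# Full-wave rectification: what an ideal precision rectifier would output.
rectified = np.abs(amplitude * np.sin(2 * np.pi * 50.0 * t))

dc_level = rectified.mean()               # steady DC level a slow ADC could measure
recovered_peak = dc_level * np.pi / 2.0   # invert the 2/pi rectified-mean relationship

print(f"DC level: {dc_level:.4f} V")              # ~1.5915 V
print(f"recovered peak: {recovered_peak:.4f} V")  # ~2.5 V
```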
In this case I am using 100% of the range of the ADC already. The sine wave is DC biased around 2.5V with a peak amplitude of 2.50V.
Yup, you're right, I missed that.
How did you get 4.2721519%?
It's from the previous "With the voltage reference at the worst case 5.056V the ADC will see 0.0191455696202532 of the reference voltage..."
0.0191455696202532 is 4.2721519% off from the expected ratio of 0.02.
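For reference, the arithmetic behind that figure is a plain relative-error calculation against the expected ratio:

```python
expected = 0.02                  # expected fraction of the reference voltage
measured = 0.0191455696202532    # fraction seen with the worst-case 5.056V reference

error_percent = (expected - measured) / expected * 100
print(f"{error_percent:.7f}%")   # 4.2721519%
```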