I have used a slightly different method, but it is not as foolproof against faults. It can, however, be built without exotic analog parts. If you
know what you are doing, you can measure the line directly with a difference amplifier with suitably large input resistors.
That means at least 3 or 4 large resistors in series with mains in each "leg" of the difference amplifier inputs. The resistors should be chosen so that even with one or two short-circuit failures, the leakage current is still not lethal. I'd use something like 470 kΩ–1 MΩ resistors. Four 470 kΩ resistors in series leak about 120 µA from mains, which is much less than the leakage current of an ordinary mains line filter (about 300 µA, like the one used in a computer PSU). Two short-circuit failures in the resistor chain increase the current to about 245 µA, which is still less.
Do not use a single resistor in any case. Ordinary resistors are only rated for about 200 volts, and you want to be sure that a single short-circuit failure cannot create a hazard.
You can see the schematic
here. That schematic contains other things too, but the voltage measurement amplifier is built around U1. Umeas_cm needs to sit at the middle of your ADC conversion range; in other words, if you are using a 5 volt reference for the ADC, then Umeas_cm is 2.5 volts. That is quite easily achieved with just a resistor divider. Upre goes to your µC ADC input. If you want to use filtering, I suggest setting the cut-off to at least 20 kHz or so.
The resulting amplifier output can then be sampled directly by the µC ADC and the true RMS value calculated in the µC. This is much cheaper than (and as accurate as) the analog RMS chip & ADC approach, although it needs care when designing the input stage and some thought when calculating the RMS. Also, the RMS calculation period should be an integer multiple of the line period.
Regards,
Janne