Very interesting discussion, who would've thought such a simple thing could be so complicated!
I finally managed to get some ADC readings and experiment a bit (that's really quick for me, I still have stuff from before Y2K filed under "new - to do"), and it seems that source impedance isn't even making a difference. Even a 220k resistor in series does nothing to the ADC reading (at least at the 8-bit precision I've programmed for now).
Now, the datasheet says:
"The ADC is optimized for analog signals with an output impedance of approximately 10 k? or
less. If such a source is used, the sampling time will be negligible. If a source with higher impedance
is used, the sampling time will depend on how long time the source needs to charge the
S/H capacitor, with can vary widely."
Ok, fine, higher source impedance means a slower reading. I understand the theory, how the ADC samples, etc. But what about numbers: how high can the source impedance go if I'm willing to "wait for it"? That's the step I'm missing.
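For a ballpark answer you can treat it as a plain RC charging problem: the S/H capacitor has to charge through the source impedance (plus the ADC's internal series resistance) to within 1/2 LSB during the sample window. Here's a rough back-of-the-envelope sketch in C; the 14 pF S/H capacitance, the 100k worst-case internal resistance and the 1.5-cycle sample window at a 125 kHz ADC clock are assumptions taken from typical AVR datasheets, so check the figures for your part:

/* Rough settling-time math for an AVR-style ADC input, to put numbers
 * on "how high can the source impedance go". All hardware values are
 * assumptions from typical AVR datasheets -- check yours.
 */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double c_sh  = 14e-12;  /* S/H capacitor (assumed ~14 pF)          */
    const double r_int = 100e3;   /* internal series R, worst case (assumed) */
    const double r_src = 220e3;   /* your external source impedance          */
    const int    bits  = 8;       /* resolution you actually need            */

    /* Settling to within 1/2 LSB takes about ln(2^(bits+1)) time constants. */
    const double n_tau = log(pow(2.0, bits + 1));

    double t_needed = (r_src + r_int) * c_sh * n_tau;
    printf("settling time needed for %d bits: %.1f us\n", bits, t_needed * 1e6);

    /* The built-in sample window is 1.5 ADC clock cycles,
     * i.e. ~12 us at the common 125 kHz ADC clock (assumed). */
    double t_window = 1.5 / 125e3;
    double r_max = t_window / (c_sh * n_tau) - r_int;
    printf("max source impedance for %d bits in that window: %.0f kOhm\n",
           bits, r_max / 1e3);

    return 0;
}

With those worst-case numbers a 220k source needs more than the ~12 us default window, so the fact that your readings still look fine at 8 bits is probably down to the internal resistance being well below the worst case and the S/H cap starting out near the previous sample's voltage when the input changes slowly. If the numbers come out short, the usual tricks are slowing the ADC clock (which stretches the sample window) or hanging a small capacitor from the pin to ground so it, rather than the high-impedance source, supplies the charge.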
In practical terms most of my problem is probably moot: I can just use high-value resistors and calibrate the input (which I plan to do anyway), and if the readings are consistent for the same voltage, that's the end of it.
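And for the calibration part, if the readings repeat, a simple two-point calibration really is all it takes. A minimal sketch of the idea; the raw counts and voltages below are made-up placeholders, not measured values:

/* Minimal two-point calibration sketch: map raw ADC counts to volts
 * using two known calibration points. The numbers are placeholders --
 * substitute your own measurements.
 */
#include <stdio.h>

/* Calibration points: raw ADC reading vs. known input voltage. */
static const double RAW_LO = 23.0,  V_LO = 0.50;   /* placeholder */
static const double RAW_HI = 231.0, V_HI = 4.50;   /* placeholder */

static double adc_to_volts(unsigned raw)
{
    /* Linear interpolation between the two calibration points. */
    double slope = (V_HI - V_LO) / (RAW_HI - RAW_LO);
    return V_LO + (raw - RAW_LO) * slope;
}

int main(void)
{
    unsigned raw = 128;  /* pretend this came from the ADC */
    printf("raw %u -> %.3f V\n", raw, adc_to_volts(raw));
    return 0;
}

One nice side effect: any pin leakage current flowing through a couple hundred kilohms shows up as a fixed offset, and calibrating against the real circuit (rather than nominal resistor values) absorbs that too.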