20k looks awfully low just for a sensor; that's 15mA and 4.5W at full voltage! You want to strike a suitable compromise between accuracy, leakage, and tolerable loading (dissipation). That's what engineering is: optimization.
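Just to put numbers on that tradeoff, here's a quick sketch (Python; the 300V full scale is inferred from your 15mA / 4.5W figures, so treat it as an assumption and adjust to your actual rail):

```python
# Loading / dissipation tradeoff for the sense resistor.
# V_FULL = 300V is assumed, back-calculated from 15mA and 4.5W at 20k.

V_FULL = 300.0  # volts, assumed full voltage

for r in (20e3, 100e3, 400e3, 1e6):  # candidate sense resistances, ohms
    i = V_FULL / r          # current drawn from the node (loading)
    p = V_FULL ** 2 / r     # power dissipated in the resistor
    print(f"R = {r/1e3:5.0f}k  ->  I = {i*1e3:6.2f} mA,  P = {p:5.2f} W")
```

At 20k that reproduces the 15mA / 4.5W above; at 400k you're down to fractions of a milliamp and a quarter watt, at the cost of more sensitivity to leakage and ADC input current.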
Ah, the 5V source has a diode and a 400k resistor? Then that diode can be fairly small as well; you won't find a 1N4148 in 400V+ ratings (aside from specialty high-voltage rectifiers), so a UF4004 or better would be fine. Or, if you don't mind the 400k loading the circuit under test, just leave it hooked up all the time, with no diode at all: 1mA of backflow into the 5V source probably isn't the worst that can happen, and might do nothing at all, or need only a little work to be made reliable (like a zener to keep the supply voltage from running away under excess current). You could just as well bias the ADC input with a resistor and connect that, in turn, to the load; or even use just one diode, so the ADC measures the "high" end of the diode. You lose a diode drop, but I'm guessing at this point that you're not going for 10+ bit accuracy or anything.
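Rough numbers for that hookup, if it helps (Python sketch; the 400V worst-case input and the 0.6V forward drop are my assumptions, not from your schematic):

```python
# Back-of-envelope numbers for the 400k + diode hookup.
# V_PEAK and V_F are placeholder assumptions, not from the original post.

V_PEAK = 400.0   # volts, assumed worst-case input
V_RAIL = 5.0     # ADC / logic supply
R_BIAS = 400e3   # series resistor from the post

i_backflow = (V_PEAK - V_RAIL) / R_BIAS
print(f"Backflow into the 5V rail: {i_backflow*1e3:.2f} mA")  # ~1 mA

# What "losing a diode drop" costs against ADC resolution:
V_F = 0.6                    # assumed silicon forward drop, volts
lsb_10bit = V_RAIL / 2**10   # ~4.9 mV per LSB at 10 bits, 5V reference
print(f"Diode drop = {V_F/lsb_10bit:.0f} LSB at 10 bits")  # ~123 LSB
```

So the diode drop alone is on the order of a hundred 10-bit LSBs, which is why it only makes sense if you've already given up on that kind of accuracy.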
Again, you're counting on everything settling to the expected voltage; I don't know exactly what you're looking to get out of this, but the accuracy isn't looking good. You might just use a comparator instead of the ADC (well, technically a comparator is still a 1-bit ADC) to do a dumb go/no-go check, for instance.
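And if you keep the ADC but only want the go/no-go answer, the firmware side is about this much; a minimal sketch, where the count thresholds are made-up placeholders you'd set from your divider ratio (the gap between them gives you hysteresis so a noisy or slowly-settling input doesn't chatter):

```python
class GoNoGo:
    """1-bit decision from ADC counts, with hysteresis.

    Thresholds are hypothetical example values in ADC counts.
    """

    def __init__(self, lo: int = 600, hi: int = 700):
        self.lo, self.hi = lo, hi
        self.state = False  # start in the "no-go" state

    def update(self, counts: int) -> bool:
        if counts >= self.hi:
            self.state = True    # clearly above threshold: go
        elif counts <= self.lo:
            self.state = False   # clearly below threshold: no-go
        # in between: hold the previous state (hysteresis band)
        return self.state
```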
Tim