Yes, this is done with ADCs too - you inject noise into the input voltage, then take say 100 readings, sum them, and divide by 100. Averaging reduces the noise by the square root of the sample count, so 100 readings gives roughly a 10x improvement - about 3.3 extra bits of resolution.
The problem is that you just get more resolution, not usually more accuracy, and the ADCs and DACs on-chip in most micros are so crippled by noise that the last 1-2 bits are fiction anyway.