Probably the leads acting as antennas. It takes very little current through 10 Mohm to generate a few mV, and as long as it doesn't happen when the meter is actually connected to something, it should be fine. All meters with an 'infinite' (>10 Gohm) input impedance show the same effect. As long as the output impedance of your test point isn't too high, this shouldn't matter. That said, 100 mV in DC mode is a bit high for 10 Mohm I believe; my Fluke 189 reads <1 mV unless I'm moving the leads, and in AC it's more like 500 mV with the leads close to power cords. The high reading in DC mode might indicate poor CMRR. Not a calibration issue in my opinion, possibly a design issue.
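Just to put numbers on the "it doesn't take a lot of current" claim, here's a quick back-of-the-envelope sketch (plain Ohm's law, the 10 Mohm figure is the meter's input impedance; the pickup currents are made-up illustrative values):

```python
# How little coupled pickup current it takes to show a reading
# across a 10 Mohm meter input. V = I * R, nothing more.
R_IN = 10e6  # meter input impedance in ohms

def reading_mv(induced_current_a: float) -> float:
    """Voltage the meter would display (in mV) for a given induced current (in A)."""
    return induced_current_a * R_IN * 1e3

# 1 nA of capacitively coupled mains pickup already gives ~10 mV,
# and ~100 mV only needs ~10 nA -- tiny currents by any standard:
floating_reading = reading_mv(1e-9)    # ~10 mV
high_reading = reading_mv(10e-9)       # ~100 mV
```

So a few nanoamps of stray coupling into open leads fully explains mV-level floating readings.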
If it's a true-RMS meter, it's perfectly normal that it doesn't read zero in AC mode with the leads shorted. The response of the true-RMS converter isn't exactly linear close to 0 V, so it's adjusted to be accurate in the linear part of its range; the accuracy specs are usually not valid below roughly 20% of full scale. You should not subtract this residual from other readings. There's a note on the Keithley website somewhere that explains this. I'm not sure about non-true-RMS (false-RMS?) meters, as I don't own any decent meter without true RMS, but I can't see any reason why one wouldn't read close to zero.
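One numeric illustration of why subtracting the shorted-leads residual is wrong (my own sketch, not from any app note; the 5 mV noise floor and 10 mV signal are made-up figures): uncorrelated noise combines with the signal in quadrature, not linearly, so the residual doesn't simply add to a real reading in the first place.

```python
import math
import random

random.seed(1)

def rms(samples):
    """True RMS of a list of sample values."""
    return math.sqrt(sum(x * x for x in samples) / len(samples))

N = 100_000
# "Noise floor" seen with the leads shorted, ~5 mV RMS (illustrative figure):
noise = [random.gauss(0, 0.005) for _ in range(N)]
# A genuine 10 mV RMS, 50-cycle sine riding on that same noise:
signal = [0.010 * math.sqrt(2) * math.sin(2 * math.pi * 50 * i / N) + n
          for i, n in enumerate(noise)]

shorted = rms(noise)     # ~0.005 V
measured = rms(signal)   # ~sqrt(0.010**2 + 0.005**2) ~= 0.0112 V,
                         # NOT 0.005 + 0.010 = 0.015 V
```

Subtracting the 5 mV residual from the 11.2 mV reading gives 6.2 mV, nowhere near the true 10 mV, so the "correction" makes things worse, on top of the converter-linearity issue above.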