Anyway, this is the workhorse we have in the lab (actually 5 units), and all of them read slightly higher than the correct value.
For this reason I wanted to test, on one of them, whether the reading can be corrected.
For official use in front of customers we use the Agilent 34401, which is periodically calibrated by an external company.
Fascinating that you have not one but three reference meters that I couldn't possibly justify even one of, yet your everyday 'workhorse', in a use where precision appears to matter, literally costs less than my test leads. You have to admit that many will find that hard to understand!
So, as for getting your meters to read accurately: is the high reading consistent over time and scale? Do you get 10.008 volts from all of them every time you try? If not, how much variance? And how about 5.00000 and 15.00000? Do they read 5.004 and 15.012? If not, what is changing the gain constant really going to get you?

If the error is that consistent, and assuming you don't solve the issue of changing the calibration memory, perhaps you could simply try adding 8K of resistance to the test leads and see if the results are acceptable. If they are, find the 10M input resistor and add 8K in series with it. Or remove and measure it, and sub in a low-tempco resistor with about 8K more resistance.

From another thread, I think the 10M input is R29 + R30, but I'm not at all sure since I don't have an example here.
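To put rough numbers on that suggestion, here is a minimal sketch of the arithmetic, assuming the hypothetical readings above and a 10M input resistance (both assumptions, not measurements): a reading about 0.08% high corresponds to roughly 8K against 10M, since a series resistance R_s scales the reading by R_in / (R_in + R_s).

```python
# Sketch of the "add ~8K in series" arithmetic. Assumed values only:
# a 10 Mohm input and the hypothetical 5.004 / 10.008 / 15.012 readings.

R_IN = 10e6  # assumed input resistance, ohms


def series_r_for_gain_error(reading, true_value, r_in=R_IN):
    """Series resistance that would null a consistent 'reads high' gain error."""
    gain = reading / true_value      # e.g. 10.008 / 10.000 = 1.0008
    return r_in * (gain - 1.0)       # R_s is about r_in * error fraction


def corrected_reading(reading, r_series, r_in=R_IN):
    """Reading after adding r_series in front of the r_in input."""
    return reading * r_in / (r_in + r_series)


if __name__ == "__main__":
    for true_v, read_v in [(5.0, 5.004), (10.0, 10.008), (15.0, 15.012)]:
        rs = series_r_for_gain_error(read_v, true_v)
        print(f"{true_v:5.1f} V: reads {read_v:.3f}, "
              f"needs ~{rs / 1e3:.1f} K in series, "
              f"with 8 K it becomes {corrected_reading(read_v, 8e3):.4f} V")
```

If the error really is a constant fraction of reading across ranges, the same 8K corrects all three points; if the errors don't scale like that, a single series resistor (or a single gain constant) won't fix it.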