I personally don't like any digital instrument for reading a null. As I have mentioned elsewhere in the forums, I did compare two 10.00000 VDC sources using my HP419A analog null meter and it worked perfectly for the resolution I was looking for. The equipment did indicate the two supplies were stable to within about 1 µV, which is certainly good enough for my non-commercial use. True, my HP419A is over 50 years old and there may be some better, newer meters now, but I would not use a meter with a digital display for this application. YMMV.
The HP419A has an error of 2% of range, i.e. on the selected 100 µV range we will have an error of 2 µV. A digital instrument gives roughly 100 times better accuracy with 10 times the range margin.
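To put rough numbers on this, a back-of-the-envelope sketch in Python. The HP419A figure comes from above; the nanovoltmeter spec (50 ppm of reading + 20 ppm of range on a 1 mV range) is a placeholder I made up for illustration, not from any datasheet:

```python
# Rough error-budget comparison: analog null meter vs. digital nanovoltmeter.
# The nanovoltmeter numbers below are illustrative placeholders, not a real spec.

def analog_error(range_v: float, pct_of_range: float) -> float:
    """Absolute error of an analog null meter specified as % of range."""
    return range_v * pct_of_range / 100.0

def digital_error(reading_v: float, range_v: float,
                  ppm_of_reading: float, ppm_of_range: float) -> float:
    """Absolute error of a digital meter specified as ppm of reading + ppm of range."""
    return reading_v * ppm_of_reading / 1e6 + range_v * ppm_of_range / 1e6

# HP419A on the 100 uV range, 2% of range:
print(analog_error(100e-6, 2.0))            # 2e-06   -> 2 uV

# Hypothetical nanovoltmeter reading 100 uV on a 1 mV range (10x range margin),
# with an assumed 50 ppm + 20 ppm spec:
print(digital_error(100e-6, 1e-3, 50, 20))  # 2.5e-08 -> 25 nV, ~2 orders better
```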
People working with Josephson standards use digital instruments.
Using an 845A or an EM nanovoltmeter/amplifier on a 100 µV range to compare standards is an unlikely scenario (assuming you want to compare two 10 V standards). You didn't say anything about what the measurement is: 10V-10V, or maybe 1000V-10V? In the latter case you'd need much more expensive gear than nanovoltmeters alone, such as an F720A KVD, a reference divider like the 752A, or a precision calibrator.
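The reason the 1000V-10V case needs a ratio device is that you null (V_high / N) against V_ref, so the divider's ratio error adds directly to the comparison. A minimal sketch, with an assumed 0.5 ppm divider uncertainty (illustrative only, not a 720A/752A spec):

```python
# Comparing 1000 V against a 10 V reference through a 100:1 divider.
# The 0.5 ppm ratio uncertainty is an assumed value for illustration.

v_high = 1000.0          # volts
v_ref  = 10.0            # volts
n      = v_high / v_ref  # required ratio: 100:1

divider_ppm = 0.5                 # assumed ratio uncertainty of the divider
null_v  = v_high / n - v_ref      # ideally 0 V at the null detector
floor_v = v_ref * divider_ppm / 1e6

print(n)        # 100.0
print(null_v)   # 0.0
print(floor_v)  # 5e-06 -> the divider alone limits you to ~0.5 ppm (5 uV at 10 V)
```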
I am considering the simple case of 10V-10V.
The range was chosen by budgeting 10 ppm for the total difference between the two standards, i.e. 100 µV at 10 V. Even if we allow only 1 ppm of total difference (which, in my opinion, is the ideal case), the nanovoltmeter still beats an ordinary null meter. The arithmetic is spelled out below.
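Spelled out, since the range choice follows directly from the ppm budget discussed above:

```python
# Range selection: a worst-case difference of ppm_budget at v_nominal
# must fit on the null detector's range.

def required_range(v_nominal: float, ppm_budget: float) -> float:
    """Smallest detector range that covers a ppm_budget difference at v_nominal."""
    return v_nominal * ppm_budget / 1e6

print(required_range(10.0, 10))  # 0.0001 -> 100 uV range for a 10 ppm budget
print(required_range(10.0, 1))   # 1e-05  -> 10 uV for the 1 ppm "ideal" case
```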
You might add the Keithley 182 to your table. Its minimum range of 3mV gives it some nice headroom over 1mV devices, even though it has 1 fewer digit.
Thanks for the addition. The 181 and 182 are close to the leaders.
A $10 DIY null meter with any DMM would do <100 nV resolution.
Resolution is not equal to accuracy
±10 µV input → ±10.01 mV output
I see an accuracy of 0.1%, and as I understand it, that figure is not guaranteed. The best industrial multimeters give only 0.5%. But I have added this device to the table.
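For what it's worth, here is how those numbers work out, assuming a nominal gain of 1000 for the preamp and a basic 2000-count DMM on its 200 mV range (both are my assumptions, not stated above):

```python
# Checking the DIY preamp figures: +/-10 uV in -> +/-10.01 mV out
# implies a gain of 1001, i.e. 0.1% high if the nominal gain is 1000
# (the nominal gain of 1000 is an assumption, not stated in the thread).

v_in, v_out = 10e-6, 10.01e-3
gain = v_out / v_in                  # 1001.0
gain_error = gain / 1000.0 - 1.0     # 0.001 -> 0.1%
print(gain, gain_error)

# Resolution vs. accuracy (the point made above). Even a basic 2000-count
# DMM with 100 uV resolution on its 200 mV range resolves, through the amp:
dmm_lsd = 100e-6
print(dmm_lsd / gain)                # ~1e-07 -> ~100 nV referred to input

# ...but the *accuracy* is still limited by the 0.1% gain error:
print(100e-6 * gain_error)           # ~1e-07 -> ~100 nV error on a 100 uV null
```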