I fully agree with you. I can do it, but it would be something of an experiment with unknown results. I don't know how the HP 34401A calibrates internally, i.e. which circuits and algorithms are executed. So I would expect an official requirement regarding the maximum output resistance of the DC reference (standard), but it seems to me there is no such requirement.
I imagine they expect any cal lab to use a calibrator like the Fluke 5700, Fluke 5440, maybe Fluke 5100, Datron 4700, etc. These calibrators all have a very low output impedance, except on the lowest ranges, where the meter's input impedance would be >> 10 MOhm anyway. So I guess the output resistance was never a consideration.
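To put a number on why output resistance never mattered with those calibrators: the loading error is just the voltage-divider ratio between the source's output resistance and the meter's input resistance. A minimal sketch (the resistance values are illustrative assumptions, not taken from any calibrator or 34401A spec):

```python
# Loading error of a DC source with output resistance R_out driving a
# meter with input resistance R_in: V_meas/V_src = R_in / (R_in + R_out).
def loading_error_ppm(r_out_ohm, r_in_ohm):
    """Fractional reading error, in parts per million."""
    return 1e6 * r_out_ohm / (r_in_ohm + r_out_ohm)

# Hypothetical milliohm-class calibrator output into a 10 MOhm input:
print(loading_error_ppm(1e-3, 10e6))  # sub-ppb, utterly negligible

# A hypothetical high-impedance reference (1 kOhm) into the same input:
print(loading_error_ppm(1e3, 10e6))   # roughly 100 ppm of error
```

That second case is why a high-impedance reference would be a problem: ~100 ppm of loading swamps the meter's DC accuracy, whereas a proper calibrator's output resistance contributes nothing measurable.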
Before these calibrators existed, the practice was something like the Fluke 7105 system, where a lower-accuracy calibrator was adjusted against a standard cell through a voltage divider. But the calibrator was used to drive the device under test, not the divider.