Yes, it's not doable on all meters; the Fluke 87V, for example, will not allow partial calibrations (doing only one or two ranges): it's all or none. However, on older Flukes in the series with pot adjustments you had free rein in choosing not only which range but also what reference voltage to use; the meter's linearity took care of the rest. After the DIY calibration you can check values across the entire range of the meter, from mV to kV, to ensure it matches your reference meter. But truly, the Fluke 80 series works well in heat and humidity; mine never needed adjustment in 10 years. The only problem was performance checks: who was off, the Fluke or the reference voltage? In the Pacific, the temperature would swing from 75F to 120F daily, in the shade.
High-frequency and high-voltage AC sources are not easy to come by, and it's tricky doing the high-current calibrations on AC and DC. The good news: accurate AC isn't as critical for my needs as DC. But in the olden days we'd use AC voltage regulators [typical in the boonies to run gear like PCs], which under no load provide a stable 120-240Vac output at 60Hz, typically stable down to 100mVac and most stable at 1Vac; that's what was easily available.
You can get stable, low-cost kV DC sources by looking for surplus electrophoresis power supplies; they deliver 3-6kV (older ones up to 600Vdc) and are stable to the mV. I got one on eBay for $10. The stability matters because protein electrophoresis requires a stable, reliable voltage source or the migration will be off. These power supplies are dumped by biological labs every year; they cost $1K-10K each new, and well, that's bioresearch!
Many new batteries (my favorite now is the eneloop NiMH LSD) have voltage outputs stable to 10uV, even 1uV for a few minutes, particularly when measured with very high input impedance meters. Fresh-charge them, then let them 'rest' for a day before using them as a voltage reference to transfer measurements between two meters. You can put them in series to go from 1.2Vdc to XVdc, but it gets noisier, as the contact and uV errors magnify with each series cell.
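Stacking cells hurts the error budget in two ways, which can be sketched numerically: junction offsets (thermal EMFs, contact resistance) grow with every added cell, while random per-cell noise grows roughly with the square root of the count. The numbers below are illustrative assumptions, not measured eneloop data:

```python
import math

def stacked_reference(cells_v, contact_uV=2.0, cell_noise_uV=1.0):
    """Estimate output and error budget for N series cells.

    Assumed numbers: contact_uV is a per-junction contact/thermal-EMF
    offset (adds linearly, worst case); cell_noise_uV is per-cell random
    noise (adds in quadrature).
    """
    n = len(cells_v)
    total_v = sum(cells_v)
    worst_offset_uV = (n - 1) * contact_uV   # one junction between each pair
    noise_uV = math.sqrt(n) * cell_noise_uV  # uncorrelated noise sums in RMS
    return total_v, worst_offset_uV, noise_uV

v, off, noise = stacked_reference([1.25] * 8)  # eight cells -> about 10 V
print(f"{v:.2f} V, worst-case offset {off:.1f} uV, noise {noise:.2f} uV")
```

The linear growth of the junction term is why a single rested cell on a high-impedance meter is the cleanest transfer reference, as described above.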
Precision resistors are stable to the 1-10uV level for many minutes; stock up on them when you find <=0.01% types, and the higher the wattage, the less likely they are to drift from thermal and electrical heating. For best results, use as low a voltage as possible to generate the mA you need for cal. Values in the mid-kohm range are best, as megohm types are noisiest and single-digit-ohm values require too much power.
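As a quick sizing check, the required source voltage and the self-heating both follow from Ohm's law. The 10%-of-rated-power derating threshold below is my own rule-of-thumb assumption, not a spec:

```python
def resistor_current_source(target_mA, r_ohm, rated_W):
    """Check whether a precision resistor can carry target_mA without
    significant self-heating drift. Assumed rule of thumb: keep
    dissipation under 10% of rated power."""
    i = target_mA / 1000.0
    v = i * r_ohm       # voltage the source must supply (V = I*R)
    p = i * i * r_ohm   # dissipation in the resistor (P = I^2 * R)
    ok = p <= 0.10 * rated_W
    return v, p, ok

# 1 mA through a 10 kohm, 1 W precision resistor
v, p, ok = resistor_current_source(1.0, 10_000, 1.0)
print(f"need {v:.1f} V, dissipates {p * 1000:.0f} mW, within derating: {ok}")
```

This also shows why mid-kohm values are the sweet spot: low-ohm parts need large currents (high dissipation), while megohm parts need high voltages and pick up noise.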
You can use resistors for precision voltage dividers too, but the weak link is the quality of the connection between any two resistors; soldering is the best bet.
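A rough way to see why the joint matters: even a small series resistance at the junction shifts the divider ratio. A minimal sketch with illustrative values (the 50 milliohm joint resistance is an assumption, modelled in series with the bottom leg):

```python
def divider_out(vin, r_top, r_bot, r_joint=0.0):
    """Ideal divider output vs. output with a small series resistance
    r_joint (ohms) at the junction between the two resistors."""
    ideal = vin * r_bot / (r_top + r_bot)
    # a resistive joint in the bottom leg shifts the ratio slightly
    real = vin * (r_bot + r_joint) / (r_top + r_bot + r_joint)
    return ideal, real

# 10:1 divider from 10 V using 9k / 1k, with a 50 mohm joint
ideal, real = divider_out(10.0, 9000.0, 1000.0, r_joint=0.05)
print(f"ideal {ideal:.6f} V, with joint {real:.6f} V, "
      f"error {(real - ideal) * 1e6:.0f} uV")
```

Even that tiny joint resistance produces an error in the tens-of-ppm range, comparable to the tolerance of a 0.01% resistor, which is why a clean soldered connection matters.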
On the cheap, it's possible to adjust any meter with non-metrological sources, so long as the source is stable for a few minutes and the reference meter is equal to or better than the DUT. For accuracy, the reference meter needs to be calibrated. It's not a preferred way because it's non-standardized and difficult to deploy in metrology houses.
Sure, that works, as long as the short-term stability and noise are good enough and you can generate the value it wants. This can actually be tricky at higher voltages and for AC. Do you have a stable 1000VDC source? Or something like a stable 300VAC 10kHz source? I've actually adjusted the 30V range of a multimeter with something like 7Vrms because that was the max output of my only stable function generator. Not great for accuracy, but the previous cal constants had a slope of zero (the result was 10V regardless of input), so at least the accuracy got much better.
For example, suppose you have a 5.00V source that, measured down to the uV, is stable for five minutes at, say, 5.0015V. You can measure it with the reference meter, then the DUT, adjust, measure again with the DUT, then the reference, and repeat the cycle to ensure at any time [...]
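That alternating cycle amounts to a small loop that flags the run if the source drifts too much between reference readings. The helper and the 50uV threshold below are hypothetical, just to make the bookkeeping concrete (the actual adjustment step is manual):

```python
def transfer_cal(read_ref, read_dut, cycles=3, max_drift_uV=50.0):
    """Alternate reference-meter and DUT readings of the same source.

    read_ref/read_dut are callables returning volts (hypothetical
    hooks for the two meters). Returns the spread of the reference
    readings in uV and whether it stayed within max_drift_uV.
    """
    refs = []
    for _ in range(cycles):
        refs.append(read_ref())  # log the reference reading each cycle
        _ = read_dut()           # DUT reading to compare/adjust against
    drift_uV = (max(refs) - min(refs)) * 1e6
    return drift_uV, drift_uV <= max_drift_uV

# demo with a simulated, slightly drifting 5.0015 V source
src = iter([5.001500, 5.001508, 5.001512])
drift, ok = transfer_cal(lambda: next(src), lambda: 5.00152)
print(f"reference drift {drift:.1f} uV, run usable: {ok}")
```

If the drift check fails, the transfer is thrown out and repeated once the source has settled; that is the whole point of bracketing the DUT readings with reference readings.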
I've done this, and I agree it works fine (although I wouldn't consider that meter calibrated). I don't like to invest much in calibration sources because they are basically single-purpose devices. Something like a multimeter is much more versatile, so I've always had to improvise. The more resolution the meter has, the more interesting this exercise becomes. Lab supplies are not very stable if you're measuring down to the µV level, and 1/4W resistors are completely unstable (not even stable for a few seconds) due to self-heating from the test current. But fortunately I don't think the UT71E is in that territory.
The issue I see with the Uni-T meter is that the supplied calibration procedure suggests to me that the trimmer is only for the lowest VDC range, and the other ranges just require a certain input value. It doesn't show an option for entering an arbitrary value the meter should expect. In that case, you have to generate the 5.0000V +/-0.03%, or whatever the spec is, for at least a few minutes. Good luck doing that with some random LM723-based lab PSU. Even getting a potmeter to sit exactly on that value is hard; most pots have fairly limited settability, even the multi-turn ones.
I finally saw it 'suggested' when I received my Agilent 1252a; it's actually in the calibration page of the manual. I don't often see it mentioned anywhere. Here the Agilent folks suggest the 3458a, probably one of the last 8.5-digit meters around, since the 1252a has a resolution of 1uV, but it may be possible to use a lesser meter for the other scales.
An 8.5-digit meter may not actually be necessary (a 6.5-digit, ~20ppm meter should be enough); maybe they just recommend it because most cal labs tend to have a reference multimeter like that around. But yes, extremely linear multimeters have largely replaced the old way of dividers and null meters. A stable source (although they probably don't abuse a lab supply for it) and an accurate multimeter is all you need in that case. Sometimes even the pros struggle with accuracy: I think the calibration procedure for the HP 3468A 5.5-digit bench meter states that the reference multimeter (an HP 3455A, I believe) should be calibrated within 24h before calibrating the 3468A, because the 90-day accuracy spec was not good enough.