The ohms ranges are relative to a reference resistor. In the ohms mode the voltage reference should not have a significant effect.
In the voltage mode the voltage reference can be an important part of the uncertainty - so this may very well be the limiting factor.
I don't think I have ever known a meter to have better accuracy on its Ohms ranges than its DCV ranges.
I see what you are saying about a known reference resistor and doing a comparison against the unknown, but I thought all/most meters adopted the approach of passing an accurate known current through the unknown resistor and measuring the voltage on a DCV range? Or is this only done on some more expensive meters?
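For what it's worth, the two approaches can be contrasted with a quick numerical sketch (Python, purely illustrative values - not the circuit of any particular meter):

```python
# Two common ways a DMM can measure resistance (illustrative values only).

R_unknown = 4700.0  # resistor under test, ohms

# (a) Constant-current method: force a known current through the unknown
#     and measure the drop on a DCV range; accuracy hinges on both the
#     current source and the DCV range.
I_test = 1e-3                      # 1 mA test current
V_drop = I_test * R_unknown        # voltage seen by the DCV stage
R_cc = V_drop / I_test             # meter computes R = V / I

# (b) Ratiometric method: put a known reference resistor in series with
#     the unknown and compare the two voltage drops; the excitation
#     voltage and ADC reference largely cancel out of the ratio.
R_ref = 10_000.0                   # reference resistor, ohms
V_exc = 1.0                        # excitation voltage (cancels below)
V_unk = V_exc * R_unknown / (R_ref + R_unknown)
V_ref = V_exc * R_ref / (R_ref + R_unknown)
R_ratio = R_ref * V_unk / V_ref

print(R_cc, R_ratio)  # both ideally recover 4700 ohms
```

The ratiometric route is why the ohms ranges need not depend much on the voltage reference: the reference resistor's accuracy dominates instead.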
For the comparison, ±5 digits is not the same for a 3.5 and a 4.5 digit meter. The digit steps on the 4.5 digit meter are smaller, so 0.5% plus 1 digit at 3.5 digit resolution would be equivalent to 0.5% plus 10 digits at 4.5 digit resolution. So the WG025 is still slightly higher in accuracy, though not by very much.
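To put numbers on that equivalence (a quick Python sketch, assuming a 2 V range, so a 3.5-digit display resolves 1 mV and a 4.5-digit display resolves 0.1 mV; the 1.5 V reading is just an example):

```python
def uncertainty(reading, pct, digits, step):
    """Spec of the form +/-(pct% of reading + N digits)."""
    return reading * pct / 100 + digits * step

v = 1.5  # volts, example reading on a 2 V range

u35 = uncertainty(v, 0.5, 1, 0.001)    # 0.5% + 1 digit, 3.5 digits
u45 = uncertainty(v, 0.5, 10, 0.0001)  # 0.5% + 10 digits, 4.5 digits

print(u35, u45)  # identical: 1 digit at 3.5 = 10 digits at 4.5
```

So the two specs describe the same absolute uncertainty; the extra digit of resolution only changes how the "+ N digits" term is counted.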
Yes you are correct. I hadn't given this enough thought before posting this point.
Unless one has a much better source / meter, I would not change the calibration / adjustment trimmers. The usual rule of thumb is to use a DMM that is about 10 x better than the meter to calibrate.
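As a rough illustration of why 10x is the usual ratio (Python, combining independent uncertainties root-sum-square; the 0.5% figure is just an example, not a spec):

```python
import math

u_meter = 0.5        # % uncertainty of the meter being adjusted (example)
u_ref = u_meter / 10 # % uncertainty of a reference 10x better

# Root-sum-square combination of independent uncertainties: a 10:1
# reference inflates the combined figure by only about 0.5%.
u_total = math.sqrt(u_meter**2 + u_ref**2)

print(round(u_total, 4))  # ~0.5025 %: the reference barely matters
```

With only a 3:1 ratio the same sum comes out around 0.527%, which is why a marginal reference starts to eat noticeably into the budget.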
I do have 2 Solartron 7150s and a Datron 1065, all needing attention/repairs (eBay purchases), which have most probably all lost their calibration data, so I'm no better off ATM with no accurate reference sources, lol.
I am currently rebuilding my electronics lab after many years, so am at the stage of cleaning everything up and repairing where needed, and then I will calibrate everything, hence wanting to obtain the specs of this meter when I come to calibrate it.
When everything is repaired I'm likely going to hire a multi-function calibrator or a high-precision lab standard multimeter for a week and calibrate all my equipment with it.
I hope at this stage to have the money to have built my own voltage reference and precision resistance box which I will also characterise against the hired lab standards.
The eventual goal is to have my own lab standards with which to calibrate all my equipment, requiring the hire of expensive lab standards equipment only very rarely (say every 2 years) as a reference check.
Over the years I will then be able to plot my uncertainty budget to a reasonable degree, good enough for what I need.
From the picture it looks like the CAT rating is more like the Chinese fake type, based on wishful thinking / best case rather than real tests. The connector for the hFE test is at least a bit better than typical, but still likely not really compatible with a CAT II rating.
This meter, and a second one I just acquired off eBay (not sure why I bought a second one, other than the sentimental value of having had the first one so long, and the fact it's got real components in it I can repair, lol), are going to be relegated to breadboard development/design only, i.e. low voltages and low currents. For this they will be perfect, especially the ultra-sensitive DC current ranges. So I'm not worried about the CAT ratings at all.