The reference is only about 5% of the work needed for a precision meter.
LM399 and similar reference chips are used in high-end gear because they are very stable (better than 10 ppm/year is not a rarity), not because of initial tolerance. The absolute output voltage in such a case is not important, because it's corrected and calibrated in software/firmware, but making sure the set value doesn't shift by millivolts after a day/week/month is vital. The resistors alone for such references are often selected/aged and cost more than the reference chip itself. For example, Vishay foil resistors with temperature drift comparable to the reference's can easily cost over 50 USD per piece.
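To get a rough sense of scale, here is a small sketch of that stability budget. All numbers are illustrative assumptions, not from any datasheet; only the ~7 V zener voltage is typical of an LM399:

```python
# Hypothetical drift-budget sketch for a buried-zener reference plus
# its divider resistors. Assumed values, not datasheet specs.

V_REF = 7.0           # nominal LM399 zener voltage, volts
REF_DRIFT_PPM = 10    # assumed long-term drift, ppm/year
R_TC_PPM_PER_C = 2    # assumed foil-resistor tempco, ppm/degC
DELTA_T = 5           # assumed ambient swing inside the enclosure, degC

# Reference drift in microvolts per year
ref_drift_uv = V_REF * REF_DRIFT_PPM

# Error contributed by resistor tempco over the ambient swing, in ppm
resistor_err_ppm = R_TC_PPM_PER_C * DELTA_T

print(f"reference drift:       {ref_drift_uv:.0f} uV/year")
print(f"resistor tempco error: {resistor_err_ppm} ppm over {DELTA_T} degC")
```

Even with these optimistic assumptions, the resistor tempco term alone eats a meaningful slice of a tight accuracy spec, which is why those resistors get selected and aged.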
And even if you spend countless hours designing and debugging the whole thing, the reference, the ADC stage (which needs to be very stable and precise as well), current sources for resistance measurements, all the firmware tricks and hacks, proper thermals and mechanicals, you will end up spending way more than $2500.
And even then, how are you going to verify that such a product meets spec? The common rule: to test the accuracy of something, you need a source or meter at least 4, ideally 10, times better than the device under test. Which brings you back to the high-dollar arena, because 7.5/8.5-digit meters cost $5-10k, and calibrators cost five times that.
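That 4:1/10:1 rule (the test uncertainty ratio) can be sketched in a few lines. The 20 ppm DUT spec below is a made-up example, not a spec of any particular meter:

```python
# Test-uncertainty-ratio (TUR) sketch: what accuracy the reference
# standard needs to verify a given DUT spec. Illustrative numbers only.

dut_spec_ppm = 20  # hypothetical DC accuracy spec of the device under test

results = {}
for tur in (4, 10):
    # The standard's uncertainty must be the DUT spec divided by the TUR
    required_ppm = dut_spec_ppm / tur
    results[tur] = required_ppm
    print(f"TUR {tur}:1 -> standard must be {required_ppm} ppm or better")
```

A 2-5 ppm standard is calibrator or metrology-lab territory, which is exactly why verification pushes the real cost far past the parts bill.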
Of course there are ways to cut corners, but building a device like the Agilent mentioned, which comes supported, under warranty, and well calibrated, would cost much more than $2.5k.