There are various IC voltage references with fairly good stability, but the initial accuracy is usually much worse than the stability, so you really need to have them calibrated for optimal accuracy, and to verify the accuracy again after some time. It would probably be hard to get a commercial cal lab to calibrate a DIY piece of kit. References also need to be kept powered on and at a constant temperature for optimal accuracy, which is why transfer standards often have batteries and are shipped overnight. A simple 10V reference only lets you calibrate the 10V (or 20V/30V) range, and, with reduced accuracy, the 100V range. For lower ranges you need a precision divider (resistors with better specs than a simple 4.5 digit DMM are quite expensive). For resistance you need resistors with a lower tolerance than your meter (e.g. 0.025% for a simple 4.5 digit DMM). Currents (up to 1A or so) are even harder to source accurately, although your meter's tolerance on current ranges is usually looser anyway.
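To see why the reference or resistor has to be tighter than the meter, it helps to work out the meter's worst-case error from its spec. A minimal sketch, using a generic "±(% of reading + counts)" spec format; the 0.05% + 3 counts figure is illustrative, not from any particular datasheet:

```python
# Worst-case DMM error from a typical "+/-(% of reading + counts)" spec.
# The spec numbers below are generic examples, not from a real datasheet.

def dmm_error(reading, pct_of_reading, counts, resolution):
    """Worst-case absolute error for one reading, in the reading's units."""
    return reading * pct_of_reading / 100 + counts * resolution

# A plausible 4.5-digit DC voltage spec: +/-(0.05% + 3 counts) on the 20 V
# range, where one count on a 19.999 V display is 1 mV.
err = dmm_error(10.000, 0.05, 3, 0.001)
print(f"+/-{err * 1000:.1f} mV at 10 V")  # +/-8.0 mV, i.e. 0.08% of reading
```

So to meaningfully check such a meter at 10 V, your reference needs to be good to a few mV, i.e. a few times better than the 0.08% worst-case spec; the same logic gives the 0.025% resistor figure above.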
My strategy is just to have multiple precision DMMs (you need multiple meters anyway) and to use any stable source to compare them. Just connect them in parallel to a variable power supply or function generator, or to the mains for high-voltage AC. If multiple meters from different manufacturers and places all agree within spec, either they all drifted by exactly the same amount, or they're still within spec. Not NIST traceable by any means, but good enough for hobby use in my opinion.
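The agreement check above can be sketched in a few lines: two meters reading the same source are consistent if their readings differ by no more than the sum of their individual spec limits. This is my own framing of the comparison, not a standard procedure:

```python
# Pairwise cross-check of several meters reading the same stable source.
# Each meter contributes (reading, worst-case spec limit at that reading).

from itertools import combinations

def meters_agree(readings):
    """True if every pair of readings differs by no more than the sum
    of the two meters' spec limits."""
    return all(abs(v1 - v2) <= u1 + u2
               for (v1, u1), (v2, u2) in combinations(readings, 2))

# Three meters in parallel on a nominally 10 V source:
print(meters_agree([(10.002, 0.008), (9.997, 0.005), (10.001, 0.010)]))  # True
```

If this returns False for some pair, at least one of those two meters is out of spec; which one still takes a third opinion or a known reference to decide.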
A lot depends on the tolerance: a cheap 3.5 digit meter with 0.3% accuracy or so is much easier to check than a meter with lots of digits and very high accuracy. This is usually also reflected in commercial calibration prices.
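Putting rough numbers on that difference, with generic spec figures of my own choosing rather than real datasheet values:

```python
# Absolute error bands of a cheap 3.5-digit meter vs a 4.5-digit meter
# at the same 10 V reading; spec figures are illustrative examples.

def error_band(reading, pct, counts, resolution):
    return reading * pct / 100 + counts * resolution

v = 10.0
cheap = error_band(v, 0.3, 1, 0.01)       # 3.5 digits: 0.3% + 1 count (10 mV)
precise = error_band(v, 0.05, 3, 0.001)   # 4.5 digits: 0.05% + 3 counts (1 mV)
print(f"{cheap * 1000:.0f} mV vs {precise * 1000:.0f} mV")  # 40 mV vs 8 mV
```

A reference that's merely good to 10 mV easily verifies the cheap meter but is useless for the precise one, which is exactly why calibrating the latter costs more.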