Am I crazy to try to get a baseline I can use to establish a minimal level of credibility on my bench? Am I descending into some new form of nuttery? What do you all do?
I appoint devices to be my in-house standards. They're generally the most stable device in a category, or the one I have the most confidence in. All the others are compared to the designated standard and either brought in line or have their offsets noted. If I buy a new device with a known calibration, or can borrow a standard with a known history, I'll use it to validate the in-house standard.
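The "offsets noted" part is really just bookkeeping. A toy sketch of the idea in Python, if it helps (device names and numbers are invented):

```python
# Offsets measured against the in-house standard (my 34401A),
# recorded as (device reading - standard reading) at a given point.
# Device names and values below are made up for illustration.
offsets_v_at_10v = {
    "U1282A":    +0.00012,   # volts high at 10 V
    "bench_psu": -0.00310,   # volts low at 10 V
}

def corrected(device: str, reading: float) -> float:
    """Remove a device's noted offset so its reading agrees with the standard."""
    return reading - offsets_v_at_10v.get(device, 0.0)

# The PSU's front panel says 10.0000 V, but relative to the
# in-house standard it's actually sitting at about 10.0031 V.
print(corrected("bench_psu", 10.0000))
```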
For example, my Agilent 34401A is my DMM standard. When I got a new U1282A, I used its in-cal status to sanity-check the 34401A on all functions and ranges. Each time one of the USA Cal Club voltage and resistance refs gets around to me, I check it with the 34401A to see how much the meter has drifted.
Case in point: over the past few years the 34401A has consistently read about 200µV high at 10V, which is only 20 ppm. Have I adjusted it? Nah. I'm OK with having a little more potential than everyone else.
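If you keep a drift log, the arithmetic is trivial. A quick sketch (the dates and readings here are invented, and the ~40 ppm one-year 10 VDC spec is from memory, so check the manual):

```python
# Toy drift log: 34401A readings of a 10 V cal-club reference over time.
# Dates and readings are invented for illustration.
REF_VALUE = 10.0   # the ref's certified value (pretend it's exactly 10 V)
SPEC_PPM = 40      # rough 34401A one-year 10 VDC spec, from memory

readings = [
    ("2021-05", 10.000195),
    ("2022-06", 10.000201),
    ("2023-04", 10.000198),
]

for date, volts in readings:
    err_ppm = (volts - REF_VALUE) / REF_VALUE * 1e6
    status = "within spec" if abs(err_ppm) <= SPEC_PPM else "OUT of spec"
    print(f"{date}: {err_ppm:+.1f} ppm ({status})")
```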
It's close enough at that level to adjust my power supplies, lower-res DMMs, etc.
For frequency, if you don't want a GPSDO-based central frequency source (or don't have an external clock input on all your gear), you can still use one of the inexpensive GPS modules to tune up a frequency counter pretty well. That's still on my to-do list.
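The usual trick is to count your counter's timebase (or its 10 MHz output) against the GPS module's 1PPS and see how many cycles land in each second. The math, sketched in Python (the counts are invented; averaging over more 1PPS intervals beats down the ±1-count quantization and the PPS jitter of cheap modules):

```python
# Estimate a 10 MHz timebase's error from cycle counts gated by GPS 1PPS.
# Counts below are invented for illustration.
NOMINAL_HZ = 10_000_000

# Cycles counted between consecutive 1PPS edges.
counts = [10_000_003, 10_000_002, 10_000_003, 10_000_002]

measured_hz = sum(counts) / len(counts)
frac_error = (measured_hz - NOMINAL_HZ) / NOMINAL_HZ

print(f"measured: {measured_hz:.2f} Hz")
print(f"error:    {frac_error * 1e9:+.1f} ppb ({frac_error * 1e6:+.3f} ppm)")
```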