I see a lot of what I consider obfuscation and misdirection in the policies of many so-called cal labs here in the US. That may seem harsh, but in my view the common no-data cal cert is just a way for underperforming cal labs to collect money, for work they are unqualified to do, from customers who often don't care.
This, I admit, is common; most customers don't know and don't look at a cert. They just accept it as a necessary evil that has to be done. You do get some that look, and you get the odd question or two. I have just had to explain to a test lab what uncertainty is and isn't; he assumed it was the specification. I think it's partly down to information about it being kept to the nerds and not spoken about in public. Most customers just want to know: is it in spec?
I would like to hear from a non-OEM cal lab about how they actually deal with the exact topics you listed, as well as what standards a device like a DMM should actually be calibrated to. For example, if I send one of my 6.5 digit DMMs for calibration, what I want is before/after data and all adjustments made to better than the 24-hour specifications, with equipment demonstrated to have an accuracy better than a 5:1 TUR relative to those 24-hour specs, or alternatively at least 3:1 TUR with sufficient guardbanding. And I'd like that in a 23C +/-1C environment with a 72-hour acclimatization and a 2-hour warmup. A typical sales rep from a typical cal lab will flounder around for half an hour about their various services and accreditations without managing to actually address any of the issues.
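For anyone unfamiliar with the ratio being asked for here, this is a minimal sketch of the arithmetic in Python. The 8 ppm spec and 15 uV uncertainty are assumptions for illustration, not numbers from any real DMM or lab:

```python
# Minimal TUR arithmetic. All figures are assumptions for illustration,
# not real numbers from any particular DMM or cal lab.

def tur(tolerance: float, expanded_uncertainty: float) -> float:
    """Test Uncertainty Ratio: UUT tolerance half-width divided by the
    lab's expanded measurement uncertainty (k=2)."""
    return tolerance / expanded_uncertainty

# Hypothetical 6.5-digit DMM, 10 V range, 24-hour spec of 8 ppm of reading.
tolerance_v = 10.0 * 8e-6        # +/-80 uV allowed error
lab_u95_v = 15e-6                # assumed lab expanded uncertainty: 15 uV

print(f"TUR = {tur(tolerance_v, lab_u95_v):.1f}:1")   # 5.3:1 -> meets 5:1
```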
The whole idea of ratio should be downgraded; it's an old-school way of thinking and often leads people to assume that a bigger ratio means a better calibration. It doesn't take other factors into account, and in many cases those ratios are simply not achievable if you do a proper uncertainty budget. I remember chatting to a friendly UKAS auditor about this, and they said it was a very strong thought process in America.
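To make that concrete, here is a toy uncertainty budget (every figure invented) showing why a flattering ratio against the standard alone can mislead; the other terms in the budget can dwarf the standard's contribution:

```python
import math

# Toy uncertainty budget -- every figure is invented. The reference
# standard's accuracy (the number the "ratio" mindset fixates on) is only
# one contribution; the terms combine in quadrature (root-sum-square).
contributions_ppm = {
    "reference standard (from its cert)": 1.0,
    "temperature coefficient of setup": 2.5,
    "lead and thermal EMFs": 1.5,
    "DMM noise and resolution": 1.0,
}

u_combined = math.sqrt(sum(u ** 2 for u in contributions_ppm.values()))
U_expanded = 2 * u_combined   # k=2, approx. 95 % coverage

print(f"combined standard uncertainty: {u_combined:.2f} ppm")
print(f"expanded uncertainty (k=2):    {U_expanded:.2f} ppm")
# The standard contributed 1 ppm, yet the budget lands near 6.5 ppm:
# the ratio against the standard alone told you very little.
```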
The certificate should make clear the method used to do the calibration. For example: "The voltage source was left in a temperature controlled environment and allowed to stabilise for a minimum of 24 hours, then the equipment was turned on and allowed to warm up for 1 hr prior to the calibration, where it was measured using a digital multimeter." This is so that if it was sent to another lab, the method could be reproduced, because it may equally have been compared to a known voltage source using a DMM, and so on.
Our lab is 22C +/-2, and for electrical that is perfectly fine; it's common for UK electrical labs to run at this temp because of the Yanks and cheapness on the aircon, or that was the story I was told. I think the 22C comes via Fluke. Otherwise it would be 20C, like we run in the mech labs.
From what I have seen, it is the 1 yr spec that labs aim for, as the 24 hr spec is almost what I would call repeatability in the absence of a repeatability study. Think of it this way: unless you get it calibrated every 24 hours, that spec is worthless.
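A quick numerical illustration of that point, using assumed "% of reading + % of range" figures of the order you see on 6.5-digit DMM datasheets, not taken from any specific instrument:

```python
# Why labs aim for the 1-year spec: with a 1-year cal interval, the 24 h
# figure expires long before the next cal. Spec numbers are assumed,
# merely of the order seen on 6.5-digit DMM datasheets.

def limit_v(reading, pct_reading, pct_range, range_v):
    """Allowed error from a '% of reading + % of range' spec line."""
    return reading * pct_reading / 100 + range_v * pct_range / 100

reading_v, range_v = 10.0, 10.0
spec_24h = limit_v(reading_v, 0.0015, 0.0004, range_v)   # 190 uV
spec_1yr = limit_v(reading_v, 0.0035, 0.0005, range_v)   # 400 uV

print(f"24 h limit: {spec_24h * 1e6:.0f} uV")
print(f"1 yr limit: {spec_1yr * 1e6:.0f} uV")
```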
If no adjustments have been made, then the cert should say that it has not been adjusted. If it has, then you need to show before and after adjustment results, because you need to work out the risk to your prior measurements. Say it's 0.5V out of spec: you would need to check that you hadn't had anything passed or failed because of that error, then maybe look at recalling those jobs to rectify your measurements. Yes, it is a big can of worms if that happens.
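In code form, that review boils down to flagging every past job whose pass/fail margin was smaller than the newly discovered error. The job records and the 0.5 V figure here are made up for illustration:

```python
# Hedged sketch of the "can of worms": reviewing past pass/fail decisions
# after a standard is found out of tolerance. Job records and the 0.5 V
# error are hypothetical.

OOT_ERROR_V = 0.5   # error the standard was found to have at its recal

# (job id, measured value in V, lower limit, upper limit) -- hypothetical
jobs = [
    ("J101", 10.00, 9.5, 10.5),
    ("J102", 10.45, 9.5, 10.5),   # passed, but only by 0.05 V
    ("J103", 10.60, 9.5, 10.5),   # failed, but only by 0.10 V
]

for job_id, value, lo, hi in jobs:
    margin = min(abs(value - lo), abs(value - hi))
    # If the margin is smaller than the discovered error, the original
    # pass/fail decision could have flipped -> recall for review.
    verdict = "RECALL for review" if margin < OOT_ERROR_V else "decision stands"
    print(f"{job_id}: measured {value} V, margin {margin:.2f} V -> {verdict}")
```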
Here is something from a long-ago thread about a dodgy eBay voltage reference. The dodgy reference isn't the issue I'm interested in; this cal certificate is. Comments?
That is an interesting cert in that they don't make it clear whether it was compared to a reference standard; there are hints about it, but in a confusing manner. They say before and after adjustment, but the results are the same, so no adjustment has been made. The temp looks a little chilly and the humidity is nice and dry (not sure if that is a little too dry). It also doesn't state the lab tolerance on temp; it might drift by 5C over 24 hrs, so the kit could have sat at 14C and been taken up to 19C only an hour beforehand.
The upper and lower specifications are confusing as hell, as they have the wrong specifications there. I would be aiming for an uncertainty of around 6 ppm for the general measurement. Also, they should measure the other outputs. Now, I would assume there is a separate uncertainty for the 24 hr constant measurements, as that would differ from a single measurement.
Now, the interesting thing is that they state compliance, and yet the uncertainty says they can't make that statement under today's rules with a decision rule applied. See attached, which is based on a 6 ppm tolerance.
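For anyone who hasn't met decision rules yet, here is a hedged sketch of a simple guard-banded rule in the spirit of ILAC-G8. The 6 ppm tolerance is from above; the 4 ppm expanded uncertainty and the 30 uV deviation are assumed for illustration, not taken from the actual cert:

```python
# Sketch of a simple guard-banded decision rule, in the spirit of ILAC-G8.
# The 6 ppm tolerance is from the discussion above; the 4 ppm expanded
# uncertainty and the 30 uV deviation are assumed for illustration.

NOMINAL_V = 10.0
TOL_PPM = 6.0      # tolerance half-width
U95_PPM = 4.0      # assumed expanded uncertainty (k=2)

tol_v = NOMINAL_V * TOL_PPM * 1e-6      # +/-60 uV
u95_v = NOMINAL_V * U95_PPM * 1e-6      # 40 uV

# Guard band: shrink the acceptance zone by the expanded uncertainty, so a
# "pass" carries low risk of a false accept.
accept_v = tol_v - u95_v                # +/-20 uV acceptance limit

error_v = 30e-6                         # assumed measured deviation
if abs(error_v) <= accept_v:
    print("Pass: a statement of compliance can be made")
elif abs(error_v) <= tol_v:
    print("Indeterminate: inside tolerance but outside the guard band;"
          " no unconditional statement of compliance can be made")
else:
    print("Fail")
```

With those numbers the result lands in the indeterminate zone, which is exactly the situation where a cert that flatly states compliance is overreaching.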