I never said otherwise. All I said is that EEs in general tend to ignore the cal sheet.
Er, no... you worded it in a way that suggested that all EEs are sloppy at reading calibration data.
You should really stop reading stuff in other people's posts that simply isn't there.
There's really nothing 'sloppy' in not reading cal data on every occasion. Far from it. As an EE, if an instrument has passed calibration and the specified accuracy of that instrument is good enough for my measurement, why should I read the calibration data?
That doesn't change the fact that there's usually a certain difference in the adherence to numbers between EEs and scientists.
You implied they all just look at the pass/fail column on the sheet when they read it. See below.
Yes, because that's my experience. YMMV of course; after all, who knows, maybe in your labs engineers spend half an hour going through the cal data (which just one post ago you said you don't trust anyways) before taking a simple 100 MHz scope to do a basic measurement.
Of course, in real life EEs just look at the pass/fail column, and if the instrument passed then they'll just take readings as results, i.e. they ignore the known deviation and the 'adjustment' they could make.
I'm afraid it only takes one diligent EE to prove you wrong, but in reality there will be a lot of EEs who do study calibration data. I rarely do it at my place of work, but when I do it will typically be for something like a noise source's ENR vs. frequency, or the efficiency correction factors (across frequency) for a power meter head. But other EEs may study calibration data much more than me. I just do RF.
Yes, you have to look at the calibration data for some simpler noise sources like the HP 346A because they are hardly more than a simple noise diode in a solid housing. Each diode has a specific operating envelope which differs slightly from the next one, so each noise source is calibrated and the output vs. frequency response is printed directly on the source. And since such a simple noise source has no compensation, controls, or indications, you have to refer to that table to use it.
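
Just to make concrete what "referring to that table" means in practice, here's a minimal Python sketch: interpolate the calibrated ENR at the measurement frequency and plug it into a Y-factor noise figure calculation. The table values and frequencies are invented placeholders, not real 346A data:

import math

# Hypothetical ENR table as printed on the noise source label:
# frequency in GHz -> calibrated ENR in dB (placeholder values, not real 346A data)
ENR_TABLE = {0.01: 5.1, 1.0: 5.2, 5.0: 5.0, 10.0: 4.9, 18.0: 4.7}

def enr_db(freq_ghz):
    """Linearly interpolate the calibrated ENR (in dB) at the measurement frequency."""
    freqs = sorted(ENR_TABLE)
    if freq_ghz <= freqs[0]:
        return ENR_TABLE[freqs[0]]
    if freq_ghz >= freqs[-1]:
        return ENR_TABLE[freqs[-1]]
    for lo, hi in zip(freqs, freqs[1:]):
        if lo <= freq_ghz <= hi:
            frac = (freq_ghz - lo) / (hi - lo)
            return ENR_TABLE[lo] + frac * (ENR_TABLE[hi] - ENR_TABLE[lo])

def noise_figure_db(freq_ghz, y_factor):
    """Y-factor method: NF = ENR(dB) - 10*log10(Y - 1), with Y as a linear power ratio."""
    return enr_db(freq_ghz) - 10.0 * math.log10(y_factor - 1.0)

# Example: a measured Y-factor of 3.2 (linear) at 2.4 GHz
print(noise_figure_db(2.4, 3.2))  # ~1.71 dB with the placeholder table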
The same is true for old power meters which don't have frequency response compensation, so again you have to look at the sticker with the frequency response curve or table if you want to get useful data. Modern power meters store the calibration data in the power meter head, which is read out by the meter, so manual compensation is no longer required.
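
For the power meter case, the manual compensation amounts to dividing the indicated power by the head's calibration factor at the measurement frequency. A rough sketch along the same lines, with invented table values (a real head's sticker would give you the actual numbers):

# Hypothetical cal factor table from a power sensor's sticker:
# frequency in GHz -> calibration factor in percent (invented values)
CAL_FACTOR_PCT = {1.0: 99.1, 5.0: 97.8, 10.0: 95.5, 18.0: 93.2}

def corrected_power_mw(indicated_mw, freq_ghz):
    """True power = indicated power / (cal factor / 100).
    Uses the nearest tabulated frequency; a real meter would interpolate."""
    nearest = min(CAL_FACTOR_PCT, key=lambda f: abs(f - freq_ghz))
    return indicated_mw / (CAL_FACTOR_PCT[nearest] / 100.0)

# A 1.000 mW reading at 10 GHz is really about 1.047 mW
print(corrected_power_mw(1.000, 10.0))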
Both are pretty much edge cases which really require an engineer to read and apply some calibration data to get any meaningful measurements. And even there you'd usually rely on the factory-provided table or graph showing the frequency response, and not on the latest calibration certificate that shows by how much the noise source or PM head deviates from that table.
For other pieces of test gear (RF generators, scopes, PSUs, AWGs, spectrum/signal/network analyzers, whatever), I guess even you don't bother checking the calibration data you already said you don't trust anyways, as long as the instrument has passed (i.e. is known to be working within spec). And despite not trusting calibration data, my guess is that you're not doing your own calibration of every piece of test equipment before using it in a measurement to ascertain its spec compliance (stuff like cable/adapter calibration on VNAs excluded, of course).
And frankly, there's nothing wrong with that.