Author Topic: Uncertainty Budgets and Decision Rules  (Read 16495 times)

0 Members and 1 Guest are viewing this topic.

Offline mendip_discoveryTopic starter

  • Frequent Contributor
  • **
  • Posts: 998
  • Country: gb
Re: Uncertainty Budgets and Decision Rules
« Reply #75 on: August 18, 2021, 03:07:27 pm »
[...] having to take into account Height Above Sea Level, Atmospheric Pressure, Gravity [...]
The worst calibration I have ever heard about is the BIPM's watt balance.
They told me they would have to calibrate out the gravity gradient, that is, not the direction of the force of gravity but the way its magnitude varies across the apparatus, because BIPM is literally located on the side of a mountain.
How? No idea, but they knew they would have to.
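For scale, the textbook free-air gradient already shows why the gradient matters at Kibble-balance levels. A back-of-envelope sketch: the 3.086e-6 (m/s^2 per metre) figure is the standard free-air value, everything else here is illustrative, not BIPM's actual numbers.

```python
# Back-of-envelope: why a Kibble (watt) balance must worry about the
# vertical gravity gradient.  The free-air gradient (g falls roughly
# 3.086e-6 m/s^2 per metre of height) is a standard textbook value;
# the heights below are illustrative only.

G_LOCAL = 9.80665             # nominal g, m/s^2 (standard gravity)
FREE_AIR_GRADIENT = 3.086e-6  # |dg/dh|, (m/s^2) per metre of height

def relative_g_error(height_uncertainty_m: float) -> float:
    """Relative error in g from not knowing the effective mass height."""
    return FREE_AIR_GRADIENT * height_uncertainty_m / G_LOCAL

# Even a 3 cm uncertainty in the effective height of the test mass
# shifts g by about 1 part in 1e8 -- already at the level a Kibble
# balance chases, hence the need to map the gradient, not just g.
for h in (0.01, 0.03, 0.10):
    print(f"dh = {h*100:4.0f} cm -> dg/g = {relative_g_error(h):.2e}")
```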

17025 is similar to a maths test: most of the marks are for showing your working, not just the answer. I am sure the 17025 accreditation bodies are full of pedantic people... you know... forum members. They spend the day wondering about things that might affect the reading and then ask the lab to work it out. Electrical is fairly easy in the scheme of things.
Motorcyclist, Nerd, and I work in a Calibration Lab :-)
--
So everyone is clear, Calibration = Taking Measurement against a known source, Verification = Checking Calibration against Specification, Adjustment = Adjusting the unit to be within specifications.
 

Offline bdunham7

  • Super Contributor
  • ***
  • Posts: 8010
  • Country: us
Re: Uncertainty Budgets and Decision Rules
« Reply #76 on: August 18, 2021, 06:05:49 pm »
Poor labs were able to do this with little recourse, relying on customer ignorance. Although it is ultimately the user's responsibility to contract a laboratory with a capability that suits their needs, it can be very hard to navigate or understand a scope of accreditation, and I think we also have a duty of care to make sure our capability is likely to be suitable

Yes, and yes... but it goes further.  Some customers are well aware of what they are getting and use subsequent obfuscation to hide the fact that their calibration certificate is worthless.  It isn't just companies that need to meet some ISO requirement that every device used for any purpose, like testing 9 volt batteries, has an annual calibration.  You see this in the sale of test equipment, the calibration of subsequent devices and so on.  The scammer states that the device is calibrated and lists the calibration certificate number, but no mention is made of the fact that it was calibrated with an uncertainty two orders of magnitude worse than what most people would expect.  And I suppose in many cases they can get away with this because they know that devices like voltage standards and long-scale DMMs will probably not drift that far in any case, thus relying on the quality of the original instrument rather than the most recent calibration.  This is, in fact, what the guy was doing with that cal certificate that I posted; there was a long previous thread about the guy who sells "2 ppm" voltage standards on eBay.

This is why I would like to see the rule that any calibration done to less than OEM specs be denoted as such on the sticker and the certificate, with perhaps an additional requirement that anyone listing the certificate number has to disclose that as well, with failure to do so being deemed actionable fraud. 
A 3.5 digit 4.5 digit 5 digit 5.5 digit 6.5 digit 7.5 digit DMM is good enough for most people.
 

Offline mendip_discoveryTopic starter

  • Frequent Contributor
  • **
  • Posts: 998
  • Country: gb
Re: Uncertainty Budgets and Decision Rules
« Reply #77 on: August 18, 2021, 06:36:11 pm »
This is why I would like to see the rule that any calibration done to less than OEM specs be denoted as such on the sticker and the certificate, with perhaps an additional requirement that anyone listing the certificate number has to disclose that as well, with failure to do so being deemed actionable fraud. 

Other than that it won't happen: who verifies that the published spec is actually realistic? I can see the likes of Fluke and Keysight making it virtually impossible for external firms to calibrate their stuff, as they are the only ones with the funds and equipment to do it (plus outside labs won't see the secret box). Then everyone will whinge that it takes six weeks and costs a gazillion bucks to get a meter calibrated, killing off competition and preventing any new labs from entering the market unless they are willing to spend half a billion on kit.

 

Offline bdunham7

  • Super Contributor
  • ***
  • Posts: 8010
  • Country: us
Re: Uncertainty Budgets and Decision Rules
« Reply #78 on: August 18, 2021, 06:56:35 pm »
17025 is similar to a maths test: most of the marks are for showing your working, not just the answer. I am sure the 17025 accreditation bodies are full of pedantic people... you know... forum members. They spend the day wondering about things that might affect the reading and then ask the lab to work it out. Electrical is fairly easy in the scheme of things.

If you are encouraged to throw whatever you can think of into the equation and just give a measurement and an uncertainty, no matter how inappropriate for the device at hand, then that method of operation works.  If your only fear is accidentally understating your uncertainty, then perhaps it is the way to do it.  The only downside is that in some cases the calibration results become relatively meaningless.  You've gone through the steps correctly but you haven't arrived at a helpful answer.  That sack of potatoes is 10 kg ±50% ±2 kg!  Oh dear!

However, I can't imagine actually operating that way in practice!  Surely at least some of your customers demand a go/no-go decision?  And if you unnecessarily increase your uncertainty, you are increasing the possibility of unwarranted false rejects or conditional results, which AFAIK are regarded as just as undesirable as false acceptances.  Or isn't that true for you?
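The go/no-go tension above can be sketched as a simple guard-banded decision rule (in the spirit of an ILAC-G8-style binary statement; the symmetric w = U guard band and all numbers here are my own illustrative choices, not anyone's actual procedure):

```python
# Minimal sketch of a guard-banded decision rule.  The w = U guard
# band is one common convention, not a universal requirement.

def decide(reading, nominal, tol, U):
    """Classify a result given tolerance 'tol' and expanded uncertainty U."""
    err = abs(reading - nominal)
    if err <= tol - U:
        return "pass"          # inside tolerance even allowing for U
    if err > tol + U:
        return "fail"          # outside tolerance even allowing for U
    return "conditional"       # in the guard band: no firm statement

# Same instrument, same reading -- but padding the uncertainty budget
# turns a clean pass into a conditional result:
print(decide(10.00005, 10.0, tol=0.0002, U=0.00005))  # modest U
print(decide(10.00005, 10.0, tol=0.0002, U=0.00018))  # inflated "safe" U
```

Inflating U shrinks the acceptance zone from both sides, which is exactly why an over-padded "safe" budget drives up conditional results and false rejects.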

Quote
My lab would quote an unc of 60 ppm + 1 mV as I only have a 6.5-digit meter at the moment.

If I haven't beaten you up too badly, could you share how your lab gets such a seemingly terrible uncertainty result?  Assuming your 6.5-digit DMM is the typical ~35 ppm/reading + ~5 ppm/range sort, where does the additional uncertainty come from?
 

Offline bdunham7

  • Super Contributor
  • ***
  • Posts: 8010
  • Country: us
Re: Uncertainty Budgets and Decision Rules
« Reply #79 on: August 18, 2021, 07:13:25 pm »
Other than that it won't happen: who verifies that the published spec is actually realistic? I can see the likes of Fluke and Keysight making it virtually impossible for external firms to calibrate their stuff, as they are the only ones with the funds and equipment to do it (plus outside labs won't see the secret box). Then everyone will whinge that it takes six weeks and costs a gazillion bucks to get a meter calibrated, killing off competition and preventing any new labs from entering the market unless they are willing to spend half a billion on kit.

I'm not sure what you mean about published specs being realistic: do you think they are too conservative or too optimistic? Fluke and Keysight have tons of published information detailing how they justify their specifications, although I don't know how that squares with UKAS.  If that isn't good enough, we can all just examine the products once they are in the field.  My observation is that typical closed-case-calibration Fluke or HP DMMs easily meet their specifications the vast majority of the time unless they are broken, regardless of how long ago the last cal was.  I have numerous such Fluke bench meters, and one HP 34401A, that meet their 24-hour specs for decades on end.  As for Fluke making it impossible to calibrate their stuff, they publish full calibration manuals telling you exactly how to do it and they sell the very equipment needed to do the job.  There's no secret formula that I'm aware of.
 

Offline mendip_discoveryTopic starter

  • Frequent Contributor
  • **
  • Posts: 998
  • Country: gb
Re: Uncertainty Budgets and Decision Rules
« Reply #80 on: August 18, 2021, 07:18:24 pm »
I have just had a quick read of the service manual and it does throw up some potential issues: the effect of using the guard or not, thermal EMF errors, leads, stability of the oven, etc. But at the end of the day the imported lab uncertainty is going to be a big part of it, until you get down to what I think Fluke UK claims of 0.75 µV.

Though I will admit I am intrigued that they don't just use an 8.5-digit meter to measure the reading and record that; it is more of a comparison between two 10 V sources and a null meter. A few more items added to the long-standing wish list for the lab.
 

Offline mendip_discoveryTopic starter

  • Frequent Contributor
  • **
  • Posts: 998
  • Country: gb
Re: Uncertainty Budgets and Decision Rules
« Reply #81 on: August 18, 2021, 07:50:24 pm »
Other than that it won't happen: who verifies that the published spec is actually realistic? I can see the likes of Fluke and Keysight making it virtually impossible for external firms to calibrate their stuff, as they are the only ones with the funds and equipment to do it (plus outside labs won't see the secret box). Then everyone will whinge that it takes six weeks and costs a gazillion bucks to get a meter calibrated, killing off competition and preventing any new labs from entering the market unless they are willing to spend half a billion on kit.
I'm not sure what you mean about published specs being realistic: do you think they are too conservative or too optimistic? Fluke and Keysight have tons of published information detailing how they justify their specifications, although I don't know how that squares with UKAS.  If that isn't good enough, we can all just examine the products once they are in the field.  My observation is that typical closed-case-calibration Fluke or HP DMMs easily meet their specifications the vast majority of the time unless they are broken, regardless of how long ago the last cal was.  I have numerous such Fluke bench meters, and one HP 34401A, that meet their 24-hour specs for decades on end.  As for Fluke making it impossible to calibrate their stuff, they publish full calibration manuals telling you exactly how to do it and they sell the very equipment needed to do the job.  There's no secret formula that I'm aware of.

I can see some firms, not necessarily Fluke, coming up with some interesting specs just to sell a product, and it's not until later that we find out they don't hold up. Fluke and Keysight currently make data available, but what if they didn't? Clare flash testers, now Seaward, don't produce calibration gear to the same standard as they used to. I have a force meter in the lab at the moment for which I have to pay a firm each year for a code to get it to work again, as it bricks once the cal is out. I have had to turn away gas alarms because the company that makes them wants 4k for the cradle and codes to recalibrate them, and yes, they brick after the cal runs out. Just because a few people do play nice, don't expect it always to be the case.


17025 is similar to a maths test: most of the marks are for showing your working, not just the answer. I am sure the 17025 accreditation bodies are full of pedantic people... you know... forum members. They spend the day wondering about things that might affect the reading and then ask the lab to work it out. Electrical is fairly easy in the scheme of things.

If you are encouraged to throw whatever you can think of into the equation and just give a measurement and an uncertainty, no matter how inappropriate for the device at hand, then that method of operation works.  If your only fear is accidentally understating your uncertainty, then perhaps it is the way to do it.  The only downside is that in some cases the calibration results become relatively meaningless.  You've gone through the steps correctly but you haven't arrived at a helpful answer.  That sack of potatoes is 10 kg ±50% ±2 kg!  Oh dear!

However, I can't imagine actually operating that way in practice!  Surely at least some of your customers demand a go/no-go decision?  And if you unnecessarily increase your uncertainty, you are increasing the possibility of unwarranted false rejects or conditional results, which AFAIK are regarded as just as undesirable as false acceptances.  Or isn't that true for you?

My UKAS auditor does groan when I refer to our unc as a safe unc. It's big, but I am just getting my head around things and trying to understand the elements that are there and which ones I need to look at. I never had a handover from the previous two lab managers, and I happen to be the lucky bloke who can have a go at almost anything, so I get all the fun jobs on site and my time to drill into it is limited, but I am getting there.

Quote
Quote
My lab would quote an unc of 60 ppm + 1 mV as I only have a 6.5-digit meter at the moment.

If I haven't beaten you up too badly, could you share how your lab gets such a seemingly terrible uncertainty result?  Assuming your 6.5-digit DMM is the typical ~35 ppm/reading + ~5 ppm/range sort, where does the additional uncertainty come from?

Most of it is that I am using an Agilent 34401A, and the 10 V source is on the threshold of two ranges of my unc, so I went for the 10V-100V range (safe mode). That has a DC spec of 45 ppm (35 ppm for the 1-10V range), so I need to gather some history data so I can start to use my own spec rather than the manufacturer's. The same goes for the mV, as again it is the linearity spec holding me back. 21 ppm is from the lab that calibrated my kit. I can get 0.712 mV, but it has been rounded up to 1 mV in the past and I haven't taken the effort to go down to µV and start pounding at that. I have tweaked the unc a little this year, as I did drop the ppm down to 57 ppm. If I used the 1-10V range I would be at 46 ppm + 60 µV.  I have just sent the meter off to another lab with better customer service and lower unc, so I will need to plumb that in; I also need to have a proper look at the specs and how well my meter has done over the years.
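For what it's worth, a figure near the 57 ppm mentioned above is roughly what falls out of a standard RSS budget if the 45 ppm spec is treated as a rectangular distribution and the 21 ppm import as quoted at k=2. A sketch under those assumptions (mine, not the lab's actual budget):

```python
import math

# Rough reconstruction of an RSS uncertainty budget.  Assumptions:
# the 45 ppm meter spec is treated as rectangular (divide by sqrt(3))
# and the 21 ppm imported uncertainty is taken as expanded at k=2.

contributions = {
    "34401A 100V-range spec (rectangular)": 45e-6 / math.sqrt(3),
    "imported cal uncertainty (k=2)":       21e-6 / 2,
}

u_c = math.sqrt(sum(u**2 for u in contributions.values()))  # combined std. unc.
U = 2 * u_c                                                 # expanded, k=2

print(f"combined standard uncertainty: {u_c*1e6:.1f} ppm")
print(f"expanded uncertainty (k=2):    {U*1e6:.1f} ppm")    # ~56 ppm
```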

 

Offline binary01

  • Contributor
  • Posts: 41
  • Country: au
Re: Uncertainty Budgets and Decision Rules
« Reply #82 on: August 19, 2021, 12:51:29 am »
I remember doing pressure gauges to UKAS, having to take into account height above sea level, atmospheric pressure, gravity, and also how high the gauge sits relative to the deadweight tester. It was such a nightmare for a gauge you know was just going to be used to tell them that there is pressure there. The worst pressure I had to do was 4000 bar; that was a day for wearing brown trousers.
Yes, we have to consider lots of environmental factors and uncertainty contributions when using deadweight testers.  However, they are excellent references, and the effort is worth it for the brilliant long-term stability and low uncertainty.  We calibrate DWTs and have written, and sell, a software package for our customers to simplify it as much as possible  :D
Working at high pressure is scary at first, but as long as you use the correct high-pressure fittings and tube, it should fail "safely": you should only get a spray of oil and a loud bang of dropping masses if a fitting doesn't hold.  The loud noise can certainly cause a surprise, and is good reason for the brown trousers.
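For readers unfamiliar with DWTs, the first-order pressure equation shows why local gravity dominates the fuss. A sketch with purely illustrative values (a real UKAS budget also adds piston thermal expansion, pressure distortion of the effective area, fluid head corrections, and more):

```python
# First-order deadweight-tester pressure equation (textbook form).
# All numerical values below are illustrative, not from any real cal.

def dwt_pressure(mass_kg, g_local, area_m2,
                 rho_air=1.2, rho_mass=7920.0):
    """p = m * g * (1 - rho_air/rho_mass) / A_eff  (air-buoyancy corrected)."""
    buoyancy = 1.0 - rho_air / rho_mass
    return mass_kg * g_local * buoyancy / area_m2

# 5 kg on a 1 cm^2 piston, local g vs standard g: the difference is
# far larger than a good gauge's uncertainty, hence the fuss about g.
p_std   = dwt_pressure(5.0, 9.80665, 1e-4)
p_local = dwt_pressure(5.0, 9.8117, 1e-4)   # e.g. a higher-latitude site
print(f"{p_std:.1f} Pa vs {p_local:.1f} Pa "
      f"({(p_local/p_std - 1)*1e6:.0f} ppm difference)")
```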

This is why I would like to see the rule that any calibration done to less than OEM specs be denoted as such on the sticker and the certificate, with perhaps an additional requirement that anyone listing the certificate number has to disclose that as well, with failure to do so being deemed actionable fraud.
This is precisely what we do whenever the accuracy specification used for compliance is different from the OEM specification.  This ensures it is clear to any user of the instrument, even if they don't have immediate access to the calibration report.  We do this for pressure and electrical instruments.  If we cannot confirm the desired/OEM specification due to instrument performance or UoM, we discuss it with the customer to see if the performance that we can confirm will still meet their needs.  We don't find this difficult or onerous to do, and we can then be confident that our calibration will be useful for them.
 

Offline bdunham7

  • Super Contributor
  • ***
  • Posts: 8010
  • Country: us
Re: Uncertainty Budgets and Decision Rules
« Reply #83 on: August 19, 2021, 01:22:54 am »
Just because a few people do play nice, don't expect it always to be the case.

Many smaller specialty manufacturers have funky calibration setups that are very specific to their device.  It may be that they want to keep the revenue in house, and it may also be that their small volume means their support capability is limited; sometimes it is one person and one calibration setup in the whole world.  If they will sell you the necessary stuff, then it is just a matter of whether you are willing to pay the price.  There's no requirement that equipment be serviceable with generic or ad-hoc methods.  In some cases they may be worried about liability from measurement tools that have been inadequately tested.  Whether customers accept that probably depends on the product.

Quote
My UKAS auditor does groan when I refer to our unc as a safe unc. It's big, but I am just getting my head around things and trying to understand the elements that are there and which ones I need to look at. I never had a handover from the previous two lab managers, and I happen to be the lucky bloke who can have a go at almost anything, so I get all the fun jobs on site and my time to drill into it is limited, but I am getting there.

It sounds like you are learning by being thrown in the deep end of the pool right off the bat.

Quote
Most of it is that I am using an Agilent 34401A, and the 10 V source is on the threshold of two ranges of my unc, so I went for the 10V-100V range (safe mode). That has a DC spec of 45 ppm (35 ppm for the 1-10V range), so I need to gather some history data so I can start to use my own spec rather than the manufacturer's. The same goes for the mV, as again it is the linearity spec holding me back. 21 ppm is from the lab that calibrated my kit. I can get 0.712 mV, but it has been rounded up to 1 mV in the past and I haven't taken the effort to go down to µV and start pounding at that. I have tweaked the unc a little this year, as I did drop the ppm down to 57 ppm. If I used the 1-10V range I would be at 46 ppm + 60 µV.  I have just sent the meter off to another lab with better customer service and lower unc, so I will need to plumb that in; I also need to have a proper look at the specs and how well my meter has done over the years.

There appear to be some specific errors in that process that it might be helpful to review.

First, the 10V range on the 34401A goes to 12V, so there would be no sane reason not to use this range, and high-impedance mode should be selected for voltage standards, although it will make no discernible difference in this case.

Second, the specs on a 34401A are neither drift nor linearity, and unless UKAS rules are insane or are being misapplied, there's no point in breaking it down like that.  The 34401A specs, as received or if calibrated properly to manufacturer specs (as in by Keysight), are inclusive of all errors: drift, tempco, linearity, calibration reference uncertainty, etc.  And, if the manual I have is to be believed, they are to k=4.  At least that is what this manual says, which is different from the k=2 I always presumed to apply to HPAK gear.

Third, while calibrating a 34401A with an 'imported uncertainty' of 21 ppm on the DC ranges seems like a very poor choice (why?), if you do go this route the errors break down into scale, offset (range) and INL (linearity).  Drift only accounts for a certain part of the first two.  Drift, tempco within the 18-28C range, random jumps and so on are not separated out.  Only INL, which is specified as 2 ppm of reading / 1 ppm of range, is separately listed.  While you may want (and UKAS may allow you) to use the device history to come up with a lower uncertainty, you should be careful.  Perhaps you can infer the in-bounds tempco from the listed out-of-bounds figure and so on, but I don't know if you can make a solid theoretical case for doing it this way.  It may be stable to 0.1 ppm for years, then one day when you power it on it jumps up 5 ppm for no particular reason.  That is accounted for in the OEM specs, but possibly not in your historically derived uncertainty.  OTOH, it appears you are unlikely to underestimate your uncertainties anytime soon.
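As an aside, a "ppm of reading + ppm of range" spec converts to an absolute limit like this; the ~35 ppm/5 ppm figures are the ones quoted earlier in the thread, so check your own manual and cal interval before reusing them:

```python
# How a "ppm of reading + ppm of range" spec becomes an absolute
# accuracy limit at a given point.  The 35/5 ppm figures come from
# the thread above; specs differ by cal interval, so verify them.

def spec_limit(reading, rng, ppm_reading, ppm_range):
    """Absolute accuracy limit for a reading on a given range."""
    return reading * ppm_reading * 1e-6 + rng * ppm_range * 1e-6

# 10 V measured on the 10 V range (which actually spans to 12 V,
# so a 10 V standard fits on it):
limit = spec_limit(10.0, 10.0, ppm_reading=35, ppm_range=5)
print(f"limit = {limit*1e6:.0f} uV -> {limit/10.0*1e6:.0f} ppm of 10 V")
```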
 

Offline TiN

  • Super Contributor
  • ***
  • Posts: 4543
  • Country: ua
    • xDevs.com
Re: Uncertainty Budgets and Decision Rules
« Reply #84 on: January 14, 2022, 07:46:58 am »
Been playing a little with the recently published Sandia PSL Uncertainty Calculator and it's quite a powerful thing.
Also have a prebuilt/precompiled version here.

I was looking for something like this for quite a while. It's also available as source code and uses an open toolchain, which is yet another plus.  :-+
YouTube | Metrology IRC Chat room | Let's share T&M documentation? Upload! No upload limits for firmwares, photos, files.
 
The following users thanked this post: MegaVolt

Offline bsdphk

  • Regular Contributor
  • *
  • Posts: 206
  • Country: dk
Re: Uncertainty Budgets and Decision Rules
« Reply #85 on: January 14, 2022, 05:55:22 pm »
I can highly recommend this Sandia Tech Report as companion reading:

https://www.osti.gov/biblio/886899-sensitivity-risk-analyses-uncertain-numbers

 

Offline TiN

  • Super Contributor
  • ***
  • Posts: 4543
  • Country: ua
    • xDevs.com
Re: Uncertainty Budgets and Decision Rules
« Reply #86 on: January 16, 2022, 08:40:51 am »
Played with resistance data and Sandia's toolkit.
So far pretty happy with the capabilities. Have found a few very minor bugs :).

Here's the dataset I've been massaging, with resistance measurements. Summary in graph form and report below.

PDF with the maths, uncertainty models and analysis.





« Last Edit: January 16, 2022, 08:46:33 am by TiN »
 
The following users thanked this post: e61_phil

Offline MegaVolt

  • Frequent Contributor
  • **
  • Posts: 930
  • Country: by
Re: Uncertainty Budgets and Decision Rules
« Reply #87 on: January 17, 2022, 12:09:16 pm »
Played with resistance data and Sandia's toolkit.
How can I compare resistors with uncertainty 0.1ppm?
 

Offline mzzj

  • Super Contributor
  • ***
  • Posts: 1282
  • Country: fi
Re: Uncertainty Budgets and Decision Rules
« Reply #88 on: January 17, 2022, 12:33:09 pm »
Played with resistance data and Sandia's toolkit.
How can I compare resistors with uncertainty 0.1ppm?
Typically with resistance bridges. Some examples would be the ASL F18 or various MI models:
https://mintl.com/productcategories/metrology/resistance-measurement/resistance-bridges/
These offer accuracies as low as 0.015 ppm.
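A sketch of why bridges get there: the bridge measures only the dimensionless ratio of unknown to standard, so (to first order) the ratio uncertainty and the standard's uncertainty combine by RSS. All figures below are illustrative, not any particular bridge's spec sheet:

```python
import math

# Ratio-bridge comparison: Rx = ratio * Rs.  To first order, the
# relative uncertainties of the ratio reading and of the standard
# resistor combine by RSS.  Numbers here are purely illustrative.

def rx_with_uncertainty(ratio, u_ratio_ppm, r_std, u_std_ppm):
    """Return (Rx, combined relative uncertainty in ppm)."""
    rx = ratio * r_std
    u_ppm = math.sqrt(u_ratio_ppm**2 + u_std_ppm**2)
    return rx, u_ppm

# A 0.015 ppm bridge reading against a 10 kohm standard known to
# 0.05 ppm: the standard, not the bridge, dominates the budget.
rx, u = rx_with_uncertainty(1.0000012, 0.015, 10_000.0, 0.05)
print(f"Rx = {rx:.4f} ohm, u = {u:.3f} ppm")
```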
 
The following users thanked this post: MegaVolt

Offline bsdphk

  • Regular Contributor
  • *
  • Posts: 206
  • Country: dk
Re: Uncertainty Budgets and Decision Rules
« Reply #89 on: January 17, 2022, 01:02:38 pm »
But TiN, please read the document I posted a link to, because "Gaussian" is not the only game in town, and you likely underestimate your uncertainty by assuming/postulating Gaussian noise processes.
 

Offline MegaVolt

  • Frequent Contributor
  • **
  • Posts: 930
  • Country: by
Re: Uncertainty Budgets and Decision Rules
« Reply #90 on: January 17, 2022, 01:19:34 pm »
Typically with resistance bridges. Some examples would be the ASL F18 or various MI models:
https://mintl.com/productcategories/metrology/resistance-measurement/resistance-bridges/
These offer accuracies as low as 0.015 ppm.
https://www.ebay.com/itm/154493840861

$3000 :))))
 

Offline TiN

  • Super Contributor
  • ***
  • Posts: 4543
  • Country: ua
    • xDevs.com
Re: Uncertainty Budgets and Decision Rules
« Reply #91 on: January 17, 2022, 11:44:19 pm »
bsdphk
Valid point, I'm still learning all this. I have yet to find clear guidelines or documents on which distribution should be applied to what. For the time being I get pretty uniformly Gaussian results from the actual measurements, so the analysis above should still be in the correct ballpark. I'd like to add tempco corrections next, so the analysis can also be done for setups without thermal chambers.
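One way to see bsdphk's point is a toy GUM-Supplement-1-style Monte Carlo in which one input is rectangular rather than Gaussian; all figures below are invented for illustration:

```python
import random, statistics

# Toy Monte Carlo propagation (in the spirit of GUM Supplement 1):
# one Gaussian input and one rectangular (uniform) input pushed
# through a simple additive model.  The combined distribution is
# not Gaussian, even though its standard deviation matches the
# usual RSS of the component standard uncertainties.

random.seed(1)
N = 200_000

samples = []
for _ in range(N):
    drift = random.gauss(0.0, 2e-6)        # Gaussian, u = 2 ppm
    tempco = random.uniform(-5e-6, 5e-6)   # rectangular, half-width 5 ppm
    samples.append(drift + tempco)

u = statistics.stdev(samples)
# Analytic check: sqrt((2e-6)^2 + (5e-6/sqrt(3))^2) ~= 3.51e-6
print(f"Monte Carlo combined u ~ {u*1e6:.2f} ppm")
```

Swapping the rectangular input for a heavier-tailed distribution widens the coverage interval even when the standard deviation barely moves, which is exactly why the distribution assumption matters.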

MegaVolt
More like $32k. I plan to repeat the measurements with more typical 8.5-digit DMMs too, once I have an analysis template with the DMM's tempco and the DMM's own errors. I started with a bridge because it's easier (with a bridge you don't have to worry about tempco, linearity, drift, etc.).
« Last Edit: January 17, 2022, 11:47:06 pm by TiN »
 

