Hi guys, I'm thinking about replicating this mains breakout box:
http://www.farnell.com/datasheets/1650902.pdf
As you can see, the box gives you the option of either using the internal sense resistor of the power meter or its own 1 ohm resistor for low-power measurements. I am interested in replicating this feature. However, I am curious about the required temperature tolerance of the sense resistor. A 1 ohm resistor with 1 A (the maximum allowed) through it should dissipate 1 W, obviously. But what about the resistance change caused by heating of the resistor? What I am concerned about is that any resistance error is effectively multiplied by the mains voltage (230 VAC): a 1% error at 1 A and 230 VAC shifts the indicated power by 2.3 W. That seems like quite a lot?
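To make that error propagation explicit, here's a quick back-of-the-envelope sketch (plain Python, numbers taken from above; unity power factor assumed): a fractional error in the shunt resistance shows up as the same fractional error in the inferred current, and the indicated power scales with it.

```python
# Back-of-the-envelope: how a sense-resistor error propagates to indicated power.
V_MAINS = 230.0   # V, nominal mains voltage
I_LOAD = 1.0      # A, maximum current through the 1 ohm shunt

def indicated_power_error(fractional_r_error: float) -> float:
    """Power error (W) caused by a fractional shunt-resistance error.

    The meter infers current from the shunt voltage, so an x% resistance
    error reads as an x% current error; at unity power factor the
    indicated power is then off by roughly x% of V * I.
    """
    return V_MAINS * I_LOAD * fractional_r_error

print(indicated_power_error(0.01))   # 1 % resistance error -> ~2.3 W
```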
I was thinking about using this ER58 resistor (https://www.te.com/usa-en/product-1-1623750-1.datasheet.pdf), which is what I have available. It's a 7 W resistor, so there should be some headroom for dissipating 1 W. The datasheet specifies a maximum of 60 ppm/degC, which I guess is fairly acceptable? I am a bit confused about the ppm value, since the temperature range isn't clearly defined. But am I correct in assuming that a 1 ohm resistor at 25 degC would show a maximum deviation of 60 ppm/degC * 50 degC = 3000 ppm, i.e. about 3 mOhm, at 75 degC?
A 3 mOhm change is equal to 0.3%, so that should limit the tempco error to about 0.7 W at full power. That's still quite a bit, considering standby power measurements often deal in the 0.1 W range. Do you guys know if special resistors are used for this and, if so, what can you recommend? (The arithmetic I'm using is sketched below.)
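For reference, here is the tempco chain spelled out, assuming the worst-case 60 ppm/degC applies over the full 50 degC rise (my assumption; the datasheet doesn't state the reference range clearly):

```python
# Worst-case shunt drift from the ER58 tempco, and the resulting power error.
R_NOMINAL = 1.0        # ohm, nominal shunt value
TEMPCO_PPM = 60.0      # ppm/degC, datasheet maximum
DELTA_T = 50.0         # degC rise assumed (25 degC -> 75 degC)
V_MAINS = 230.0        # V
I_LOAD = 1.0           # A

delta_r = R_NOMINAL * TEMPCO_PPM * 1e-6 * DELTA_T   # ~0.003 ohm = 3 mOhm
fractional_error = delta_r / R_NOMINAL              # ~0.3 %
power_error = V_MAINS * I_LOAD * fractional_error   # ~0.69 W at full load

print(f"dR = {delta_r * 1e3:.1f} mOhm, "
      f"error = {fractional_error:.2%}, "
      f"power error = {power_error:.2f} W")
```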
Thanks!
Addition:
I tried measuring the 1 ohm resistor over a period of 12 hours (overnight). It shows a variation of ~3.4 mOhm (max-min). This is on both my Keithleys. I assume the temperature stability of the meter is as important as that of the resistor under test?
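For what it's worth, converting that overnight drift back into an equivalent temperature swing (a rough sanity check, assuming the resistor's 60 ppm/degC worst-case tempco were the only contributor, which it clearly isn't):

```python
# Sanity check: what temperature swing would explain the observed drift
# if the resistor's tempco were the only error source?
R_NOMINAL = 1.0        # ohm
TEMPCO_PPM = 60.0      # ppm/degC, ER58 datasheet maximum
DRIFT_OHM = 3.4e-3     # ohm, observed max-min over 12 h

drift_ppm = DRIFT_OHM / R_NOMINAL * 1e6    # ~3400 ppm
implied_delta_t = drift_ppm / TEMPCO_PPM   # ~57 degC

print(f"drift = {drift_ppm:.0f} ppm -> implied swing = {implied_delta_t:.0f} degC")
```

A ~57 degC swing overnight on the bench is not plausible, which does suggest the meter's own drift is a large part of what I'm seeing rather than the resistor alone.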