I finally managed to obtain a better DMM, if that can be said for the Fluke 287. It's new, manufactured two months ago, so I presume its calibration is good enough. Since in firmware M5 we are adding a third decimal place for output values, I'd like to see whether that is achievable in practice. The chosen voltage reference, combined with the 15-bit ADC and 16-bit DAC, should provide 1 mV / 1 mA resolution.
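Here is a minimal sketch of that resolution arithmetic, assuming full-scale ranges of roughly 40 V and 5 A (the exact full-scale values depend on the calibration points, so take the numbers as illustrative only):

```c
#include <stdio.h>

int main(void) {
    /* Assumed full-scale ranges; adjust to the actual calibrated ranges. */
    const double U_FULL_SCALE = 40.0;   /* V */
    const double I_FULL_SCALE = 5.0;    /* A */

    const double dac_lsb_u = U_FULL_SCALE / (1 << 16);  /* 16-bit DAC step */
    const double adc_lsb_u = U_FULL_SCALE / (1 << 15);  /* 15-bit ADC step */
    const double dac_lsb_i = I_FULL_SCALE / (1 << 16);
    const double adc_lsb_i = I_FULL_SCALE / (1 << 15);

    printf("DAC: %.3f mV / %.3f mA per step\n", dac_lsb_u * 1e3, dac_lsb_i * 1e3);
    printf("ADC: %.3f mV / %.3f mA per step\n", adc_lsb_u * 1e3, adc_lsb_i * 1e3);
    return 0;
}
```

With those assumed ranges the DAC step is about 0.6 mV / 0.08 mA and the ADC step about 1.2 mV / 0.15 mA, so a 1 mV / 1 mA display resolution is at least plausible.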
In CV mode of operation it seems that the programmed output voltage can be accurately maintained for many hours (if the initial drop without remote sensing is ignored). I don't know if the same can be said for many days or weeks, which is what professional units usually claim; I haven't left it running for such a long period yet.
Another scenario is CC mode of operation, and there I have something that I cannot explain. Let's take as an example the extreme case when max. output current is set. With the voltage set to 15 V and a 2 Ohm power resistor as the load, the output will surely enter CC mode when the current is set to 5 A (the load would otherwise draw 7.5 A). The current control loop measures current as the voltage drop across R65 (see bottom right corner, Sheet 3). Power dissipation on R65 is 0.5 W at 5 A, which heats it considerably very quickly (it can handle that, since it's rated for 2-3 W), and that affects its resistance no matter how low its TCR is.
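For reference, a quick back-of-the-envelope check of those numbers (the 20 mOhm value for R65 is inferred from the 0.5 W figure, not taken from the schematic):

```c
#include <stdio.h>

int main(void) {
    const double u_set   = 15.0;   /* programmed voltage, V */
    const double r_load  = 2.0;    /* load resistor, Ohm */
    const double i_limit = 5.0;    /* programmed current limit, A */

    /* Current the load would draw in CV mode; above the limit -> CC mode. */
    double i_cv = u_set / r_load;                      /* 7.5 A */
    printf("CV-mode current would be %.1f A -> %s\n",
           i_cv, i_cv > i_limit ? "CC mode" : "CV mode");

    /* Sense resistor dissipation, assuming R65 = 20 mOhm: P = I^2 * R. */
    const double r_sense = 0.020;
    printf("P(R65) = %.2f W at %.1f A\n", i_limit * i_limit * r_sense, i_limit);
    return 0;
}
```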
A change in its resistance should affect the output current, and that is something that can be easily monitored on the Fluke, which has a relative mode of operation (the reference is the starting value). I got the following situation after about a minute:
The current starts to drop as R65 heats up. That I understand. What I don't understand is why the ADC cannot measure that difference. That -0.34%, or 17.2 mA on 5 A, represents almost 113 LSBs/levels/steps on the 15-bit scale, yet the ADC reports changes of barely +/-1 LSB. The ADC (IC13, see Sheet 4) measures I_MON (pin 6 on IC7A).
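The LSB count comes out of simple scaling, again assuming a 5 A full scale for the 15-bit reading:

```c
#include <stdio.h>

int main(void) {
    const double i_full_scale = 5.0;                        /* assumed full scale, A */
    const double adc_lsb      = i_full_scale / (1 << 15);   /* ~0.153 mA per step */
    const double drop         = 0.0172;                     /* measured drop, A (-0.34 %) */

    printf("1 LSB = %.3f mA\n", adc_lsb * 1e3);
    printf("drop  = %.1f LSBs\n", drop / adc_lsb);          /* ~113 LSBs */
    return 0;
}
```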
Why do I need that? If the ADC could measure that difference, the firmware could, to some extent, maintain the set current in CC mode despite the Rsense resistance fluctuation. Or should I simply forget about it, since the difference is not that huge (I didn't measure more than -0.4%)? Another possibility is to push the user during the calibration process to wait e.g. 60 or more seconds before entering the externally measured value that will be used for calculating the required correction.
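If the ADC readback turns out to be usable, the correction could look something like the sketch below: a slow loop that nudges the DAC setpoint when the readback drifts from the programmed current. The function names (read_i_mon_amps, write_i_set_dac, amps_to_dac_counts) are hypothetical placeholders for the real firmware hooks, and the deadband/clamp values are just guesses:

```c
#include <stdlib.h>

/* Hypothetical firmware hooks -- not the actual driver functions. */
double read_i_mon_amps(void);            /* averaged ADC readback of I_MON */
void   write_i_set_dac(int counts);      /* reprogram the 16-bit current DAC */
int    amps_to_dac_counts(double amps);  /* calibrated A -> DAC counts */

/* Call e.g. once per second while the channel is in CC mode. */
void cc_drift_correction(double i_set_amps) {
    static int correction_counts = 0;

    double error = i_set_amps - read_i_mon_amps();   /* positive if output sags */
    int error_counts = amps_to_dac_counts(error);

    /* Ignore noise-level errors; only react above a few LSBs. */
    if (abs(error_counts) < 3) {
        return;
    }

    /* Apply a fraction of the error and clamp the total correction
       so a bad readback cannot run the output away. */
    correction_counts += error_counts / 4;
    if (correction_counts >  200) correction_counts =  200;
    if (correction_counts < -200) correction_counts = -200;

    write_i_set_dac(amps_to_dac_counts(i_set_amps) + correction_counts);
}
```

Of course this only makes sense if the ADC actually resolves the drift; otherwise the calibration-delay approach looks like the safer option.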