@garrettm
It would be easier to verify the calibration status if the displayed readings were shown as plain decimal values, not in scientific notation.
The DMM has a 10 MΩ input resistance. Even if the leads had a total of 1 Ω of resistance, the voltage drop across them would only be about half a microvolt.
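The lead drop is a quick voltage-divider calculation (assuming a 5 V output, as in the screenshots in this thread):

```python
# Voltage divider: the DMM's 10 MOhm input resistance in series
# with the test leads. Nearly all of the source voltage appears
# across the DMM input, so the drop across the leads is tiny.
V_SOURCE = 5.0   # supply output in volts (assumed)
R_INPUT = 10e6   # DMM input resistance in ohms
R_LEADS = 1.0    # assumed total lead resistance in ohms

v_drop = V_SOURCE * R_LEADS / (R_INPUT + R_LEADS)
print(f"Voltage drop across leads: {v_drop * 1e6:.2f} uV")  # ~0.50 uV
```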
If I had the results you're showing here I'd be worried. It's saying the output is quite a bit higher than it should be, unless the cables you're using have absolutely no voltage drop, which is unlikely.
McBryce.
I had also planned to edit the list of calibration points from this thread myself with a bash script. Tomorrow I will make time for this.
Here's a picture of my calibrated DP832(A) sending 5.000V to my 34461A
I decided to do the print format edit first. Now everything is lined up and prints out values in volts and amps, so no mucking about with annoying scientific notation. I also reduced the settling time to 2 seconds for volts and 3 seconds for current. This is in anticipation of the larger cal point arrays, which will make the calibration take longer.
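For anyone curious, switching from scientific notation to aligned fixed-point output is a one-line formatting change in Python (a sketch of the idea, not the script's actual code):

```python
reading = 4.99985e+00  # raw value as returned by the DMM

# Scientific notation (hard to scan during calibration):
print(f"{reading:e} V")      # 4.999850e+00 V

# Fixed-point, right-aligned in a 10-char field with 5 decimals,
# so successive readings line up column for column:
print(f"{reading:10.5f} V")  # "   4.99985 V"
```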
Let me know if you have any problems with new changes and thanks for using the script!
UPDATE: There was a minor typo in TelnetTest that I fixed. It would cause the DMM to report an error because of an unfinished measurement. I accidentally called readDMM() instead of dmm.read() inside the for loop. Whoops.
As soon as you issue an SCPI command the instrument will switch into remote mode and the local panel is disabled. There's usually a way to regain local control.
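One tidy way to handle this is to always hand the panel back at the end of a remote session. A minimal sketch, assuming a LAN/USB instrument that accepts SYSTem:LOCal (common on Keysight and Rigol gear; GPIB instruments use the bus's go-to-local message instead, so check your programming manual):

```python
from contextlib import contextmanager

@contextmanager
def remote_session(send):
    """Run SCPI commands through `send` (any write function for
    your transport: Telnet, VISA, raw socket), then re-enable the
    front panel with SYSTem:LOCal when the session ends."""
    try:
        yield send
    finally:
        send("SYST:LOC")  # hand control back to the local panel

# Usage with a stand-in transport that just records commands:
log = []
with remote_session(log.append) as send:
    send("MEAS:VOLT:DC?")
print(log)  # ['MEAS:VOLT:DC?', 'SYST:LOC']
```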
*RST disables continuous auto triggering on some instruments, so unless you start a measurement (MEASure), set a trigger, or auto-trigger-and-wait (READ), the instrument won't take measurements or display a readout (a readout being a calibration-data-corrected measurement). My Keithley 2001 works the same way; if I recall correctly, it also lets you choose which settings to load on reset, with the default being a complete reset with everything disabled (which is more practical for remote operation).
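The point above can be sketched as a command sequence: after *RST the trigger has to be restored before the display updates again. Command spellings follow common SCPI-99 conventions (CONFigure, TRIGger:SOURce, INITiate:CONTinuous); your instrument's programming manual is authoritative:

```python
def reset_and_rearm():
    """Return the SCPI commands to reset a DMM and re-enable
    continuous internal triggering so readings resume."""
    return [
        "*RST",          # full reset: auto triggering disabled
        "CONF:VOLT:DC",  # configure a DC voltage measurement
        "TRIG:SOUR IMM", # internal (immediate) trigger source
        "INIT:CONT ON",  # re-arm continuously so the display updates
    ]

for cmd in reset_and_rearm():
    print(cmd)
```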
You can also use INIT to begin a measurement, *OPC? to query whether the measurement is done, and FETC? to move the reading to the output buffer. Of course, the DMM must be triggered, as you pointed out: internally, externally, or over the bus. I generally let the DMM handle the triggering (TRIG:SOUR IMM), but you can trigger manually by sending *TRG (if using TRIG:SOUR BUS).
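The INIT / *OPC? / FETC? flow can be sketched as a small helper that builds the command sequence for one reading (the `("write", ...)` / `("query", ...)` pairs are a hypothetical stand-in for whatever transport, Telnet or VISA, you actually use):

```python
def measure_sequence(trigger_over_bus=False):
    """Return the SCPI command sequence for one triggered reading.
    Entries tagged 'query' expect a response from the instrument."""
    cmds = []
    if trigger_over_bus:
        cmds.append(("write", "TRIG:SOUR BUS"))  # wait for *TRG
    else:
        cmds.append(("write", "TRIG:SOUR IMM"))  # trigger immediately
    cmds.append(("write", "INIT"))               # arm/start the measurement
    if trigger_over_bus:
        cmds.append(("write", "*TRG"))           # fire the bus trigger
    cmds.append(("query", "*OPC?"))              # returns 1 when done
    cmds.append(("query", "FETC?"))              # reading -> output buffer
    return cmds

for kind, cmd in measure_sequence(trigger_over_bus=True):
    print(kind, cmd)
```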
@Trident900fi are you using the Python script or are you writing your own? If you're using the Python script you might want to also get help over at https://www.eevblog.com/forum/testgear/dp832-calibration-using-python-pycharm-running-on-windows/. You can also use my Telnet script if you'd like. If you do, test your setup first using test_run.bat.
Are you guys running the 01.16 firmware now? I'm on 01.14 - should I upgrade it?