I have several old devices that use the rather ancient MP2734 ADC.
This ADC has the following parameters:
- 14 bit
- ±0.5 LSB Differential Linearity
- ±1 ppm FSR/°C Differential Nonlinearity (tempco)
- 0.005% p-p Noise
- ±3 ppm FSR/°C Offset Tempco
- 250 Ohm Impedance
- Relative Accuracy 0.006%
- Absolute Accuracy 0.006%
- Noise 83 µV rms
- Full scale 0-10 V
Before this ADC there are multiple gain stages; the lowest gain is 4, so roughly 2.5 V at the input produces full scale (10 V) at the ADC. The gain is software-selectable, with the main settings being [4, 10, 20, 40, 100, 200, 500, 1000]. At a gain of 1000, 10 mV at the input produces the full-scale range.
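To make the numbers concrete, here is a small sketch (my own arithmetic, not from any datasheet) that lists the input-referred full-scale voltage for each gain setting:

```python
# Input-referred full scale per gain setting:
# the ADC always sees 0-10 V, so the input full scale is 10 V / gain.
ADC_FULL_SCALE_V = 10.0
GAINS = [4, 10, 20, 40, 100, 200, 500, 1000]

for gain in GAINS:
    fs_in = ADC_FULL_SCALE_V / gain
    print(f"gain {gain:5d}: input full scale = {fs_in * 1000:8.2f} mV")
```

This just confirms the two anchor points above: gain 4 gives 2500 mV and gain 1000 gives 10 mV of input full scale.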
For this, I would like to build a calibration device that can produce at least three calibration points on the scale, for example [500 mV, 1500 mV, 2000 mV] at a gain of 4. I do not expect to need gains higher than 100, so I only need to calibrate the stages [4, 10, 20, 40, 100].
Important for the calibration is that a repeatable, exactly matching pulse can be produced. The pulsed voltage does not need to hit a specific target; any value will do, but it must always be precisely the same. For example, it does not matter whether I can set exactly 1.00001 V, because 1.02345 V works just as well, as long as that 1.02345 V is always exactly the same value.
My guess is that the best approach would be a voltage reference + DAC + op-amp, with an MCU to set the values.
But since there are a lot of different op-amps, DACs, and references, I would like to ask if someone could give me some tips on how to choose the parts.
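To give a feel for how stable the source would have to be, here is my own back-of-the-envelope budget (an assumption on my part: I target 0.5 LSB of repeatability at the ADC, using the 14-bit, 0-10 V figures above, and refer it back to the input by dividing by the gain):

```python
# Rough repeatability budget for the calibrator source.
# One ADC LSB is 10 V / 2^14 at the ADC (~610 uV); referred to the
# calibrator input, the allowed drift shrinks by the gain setting.
ADC_FULL_SCALE_V = 10.0
ADC_BITS = 14
LSB_V = ADC_FULL_SCALE_V / (1 << ADC_BITS)  # LSB size at the ADC input

for gain in [4, 10, 20, 40, 100]:
    # allow at most 0.5 LSB of movement, input-referred
    budget_uv = 0.5 * LSB_V / gain * 1e6
    print(f"gain {gain:4d}: source repeatability budget ~ {budget_uv:7.2f} uV")
```

By this reckoning the source only needs to repeat to roughly 76 µV at gain 4, but to about 3 µV at gain 100, which is what drives the reference/DAC tempco and noise requirements.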