Can one calibrate an SDS1204X-E against external accurate sources in any real sense?
I now have a GPSDO as a reference clock for my SDG2122X and a very accurate voltage source available, and the measurements performed by the DSO are off by "significant" amounts, though still well within spec.
All Siglent X-series DSOs provide a Self Calibration feature for the amplitude-related parameters. The internal reference has sufficient accuracy and stability to serve this purpose for the entire lifetime of the instrument.
I do not know whether there is any internal clock frequency adjustment (which would require a VCTCXO as the internal clock source). Midrange instruments like the SDS5000X provide the option of using an external reference clock when better timing accuracy is critical.
Here are some general considerations:
• Calibration accuracy in a DSO is limited by the resolution of the ADC. The SDS1000X-E, SDS1000X, SDS2000X-E and current SDS2000X have 25 LSB per division, so accuracy can never be better than 1/25 (4%) of a division.
• At very low levels, calibration accuracy is further limited by noise, as well as by the output range and resolution of the internal calibration source (which might not be optimized for the high sensitivities below 10 mV/div).
• In general, it makes little sense to adjust an instrument to an accuracy far better than its specification. Even if a certain (better) accuracy could be achieved, the calibration would be void after a short while because of thermal drift and limited long-term stability. DSO acquisition is rather complex, and the attenuators, amplifiers and references are optimized for wide bandwidth and decent pulse response rather than high accuracy and stability, which would be difficult to achieve at the same time (and very expensive).
• It’s the same for all instruments. For instance, a decent DMM is more accurate than a DSO (but with an extremely limited bandwidth), and it can be adjusted during the calibration process. But even if you adjust it to a much higher accuracy than specified, that is just a momentary situation and the specification still applies, i.e. the error margins have not changed for the specified calibration interval. You’d need a better reference, better dividers, more precise amplifiers and a better ADC (and then some), where “better” means more resolution, less noise, less offset and linearity error, less temperature drift, better long-term stability, and a design that takes care of interference, voltage drops and thermal EMF more effectively. In a DSO, there are other (partially contradicting) priorities.
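To put a number on the first point above, here is a small sketch of the quantization arithmetic. It only illustrates the 25 LSB/division figure from the post; the signal amplitudes and settings chosen in the example are assumptions for illustration, not instrument specifications.

```python
# Illustrative arithmetic: quantization floor of a DSO with
# 25 LSB per vertical division (as stated for the SDS1000X-E family).
# The 1 V/div setting and 6-division signal below are example values.

LSB_PER_DIV = 25  # ADC codes per vertical division

def quantization_floor_volts(volts_per_div):
    """Smallest amplitude step the ADC resolves at this setting (1 LSB)."""
    return volts_per_div / LSB_PER_DIV

def best_case_error_pct(signal_vpp, volts_per_div):
    """Quantization-limited error as a percentage of the signal,
    assuming a 1 LSB uncertainty on the reading."""
    return 100.0 * quantization_floor_volts(volts_per_div) / signal_vpp

# At 1 V/div, 1 LSB = 1/25 V = 40 mV. For a signal spanning
# 6 divisions (6 Vpp), even a perfect calibration cannot do
# better than 0.04 V / 6 V, i.e. about 0.67% of reading.
lsb = quantization_floor_volts(1.0)
err = best_case_error_pct(6.0, 1.0)
print(f"1 LSB = {lsb * 1000:.0f} mV, floor = {err:.2f}% of a 6-div signal")
```

This also shows why a signal that fills more of the screen measures more accurately: the 1/25-division floor is fixed in divisions, so its relative share shrinks as the signal grows.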