Hi
This is something I don't understand but would like to, so that I don't damage my oscilloscope.
It's not about probes/attenuators or RF effects, just the basic question of how much voltage can be applied directly to the BNC input (DC to 1 kHz, for example).
If you compare the specs of different scopes, you will notice the following variants:
1. Specifications in Vrms / Vpk / Vpp (where I assume e.g. 300 Vrms ≈ ~424 Vpk ≈ ~849 Vpp for a sine wave)
2. Values given in x1 or x10 mode.
3. Additional CAT ratings.
4. It is not always clear whether the spec means the maximum voltage before the device is damaged, or the maximum measurable voltage.
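To make sense of variant 1, here is a quick sketch (assuming a pure sine wave, which is the usual convention behind these specs) of how the three quantities relate:

```python
import math

def sine_conversions(v_rms: float) -> dict:
    """Convert an RMS sine-wave voltage to its peak and peak-to-peak values."""
    v_pk = v_rms * math.sqrt(2)  # peak = RMS * sqrt(2) for a sine
    v_pp = 2 * v_pk              # peak-to-peak spans both polarities
    return {"Vrms": v_rms, "Vpk": v_pk, "Vpp": v_pp}

print(sine_conversions(300.0))
# 300 Vrms works out to about 424 Vpk and 849 Vpp
```

Note that for a DC level or a non-sinusoidal waveform the sqrt(2) factor doesn't apply, which is part of why "DC + AC peak" style specs exist.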
Here are a few examples of specs:
Siglent SDS800X: Max. input voltage: 1 MΩ ≤ 400 Vpk (DC + AC), DC~10 kHz (no CAT rating)
Rigol DHO800: Max. input voltage: CAT I 300 Vrms, 400 Vpk (DC + Vpeak)
Hantek DSO2000: Max. input voltage: 300 Vrms (10X); Overvoltage category: 300 V CAT II
Fnirsi DSO-TC2: Max. input voltage: 1:1 probe: 80 Vpp (±40 V); 10:1 probe: 800 Vpp (±400 V)
What does it all mean?
I find the Fnirsi spec the most understandable, with its explicit ±40 V.
So the Rigol and Siglent can withstand 300 Vrms on the BNC (1x)? But they can't actually measure it, with a maximum of 10 V/div?
The Hantek specification seems to me to be based more on the maximum measurable voltage (8 divisions × 10 V/div).
Although it says 300 Vrms (10X), it is rated CAT II, whereas the Rigol is only CAT I.
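The "maximum measurable" interpretation above can be sketched like this (assuming 8 vertical divisions, which is typical for these scopes, and treating the probe factor as a simple multiplier):

```python
def max_measurable_vpp(volts_per_div: float, divisions: int = 8,
                       probe_factor: int = 1) -> float:
    """Full-scale peak-to-peak voltage the display can show."""
    return volts_per_div * divisions * probe_factor

print(max_measurable_vpp(10))                    # 80 Vpp at 10 V/div, 1X
print(max_measurable_vpp(10, probe_factor=10))   # 800 Vpp with a 10X probe
```

That 80 Vpp full-scale figure at 1X happens to match the Fnirsi's ±40 V spec, which is what makes me suspect some vendors quote measurable range rather than damage threshold.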
Does anyone have a clue?
A simple question as an example of what I'm asking: will the Hantek break if you apply +60 V DC directly to the BNC?
Thanks for any enlightenment.