1. The package box says 12-bit, but the label on the device itself says 14-bit resolution.
RE: Yes, it's a problem with our packaging; we have started planning to change it.
To me, this reads as if they were initially only doing 12-bit interpolation, but then someone said "if we can turn 8-bit ADC data into 12 bits, can't we also do 14?", so the specs (and software) were changed while the device packaging was overlooked.
12 bits make a difference in the noise level. It is one of the very few high-resolution scopes, and at around $450 it is a great deal compared with a comparable data acquisition card.
If you get true 12 bits, sure. But I wouldn't trust this kind of company either to use a real 12-bit ADC or to perform the amount of oversampling required to approximate one with an 8-bit ADC. Sure, the scope says it uses 12-bit values (and probably displays them as such), but how accurate are they?
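For what it's worth, here's a rough simulation of what averaging oversampled 8-bit data actually buys. It's an idealized sketch, not anything resembling their firmware: the quantizer model, signal frequency, and dither level are all made up for illustration.

```python
import numpy as np

def quantize(signal, bits):
    """Round a signal in [-1, 1] onto an ideal grid with the given bit width."""
    step = 2.0 / 2 ** bits             # LSB size for the full +/-1 range
    return np.round(signal / step) * step

ratio = 10                             # oversampling factor (1 GS/s -> 100 MS/s)
n = 1_000_000
t = np.arange(n)
rng = np.random.default_rng(0)
# a little input noise acts as dither; without it, averaging identical
# quantized samples gains nothing
signal = 0.5 * np.sin(2 * np.pi * 0.00037 * t) + rng.normal(0, 2.0 / 256, n)

raw = quantize(signal, 8)              # what the 8-bit ADC would deliver
avg = raw.reshape(-1, ratio).mean(axis=1)      # average each block of 10 samples
ref = signal.reshape(-1, ratio).mean(axis=1)   # the "true" value of each block

def effective_bits(err):
    """Bits implied by the RMS error of an ideal quantizer (err_rms = LSB/sqrt(12))."""
    return np.log2(2.0 / (err.std() * np.sqrt(12)))

print("raw 8-bit samples:  ~%.1f bits" % effective_bits(raw - signal))
print("10x block averages: ~%.1f bits" % effective_bits(avg - ref))  # roughly a bit and a half more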
It's even worse to claim 14 bits of resolution from an 8-bit ADC without doing the required oversampling, which is exactly what they're doing:
10. A single channel at 14-bit resolution is limited to a 100 MS/s sample rate.
RE: Yes, that's correct. Due to hardware limits, we can only offer the 1 GS/s sample rate in 8-bit mode.
So in essence, they use a 1 GS/s 8-bit ADC and claim it delivers 14 bits of resolution with 10x oversampling. But each additional bit of effective resolution requires 4x oversampling, so 10x only yields about 1.7 additional bits; they're lying. For an additional 6 bits of resolution, they'd have to oversample by 4^6 = 4096x. Quite the difference, I'd say.
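A quick back-of-the-envelope check of that, under the textbook assumption of white, decorrelated quantization noise (half a bit of resolution gained per doubling of the oversampling ratio):

```python
import math

def extra_bits(oversampling_ratio):
    """Ideal resolution gain from averaging N samples: 0.5 * log2(N) bits."""
    return 0.5 * math.log2(oversampling_ratio)

def required_ratio(extra_bits_wanted):
    """Oversampling ratio needed for a given bit gain: 4x per additional bit."""
    return 4 ** extra_bits_wanted

# 1 GS/s decimated to a 100 MS/s output is 10x oversampling:
print(extra_bits(10))       # ~1.66 -> about 9.7 effective bits, nowhere near 14
# going from 8 bits to a true 14 bits means 6 extra bits:
print(required_ratio(6))    # 4096 -> roughly 410 GS/s for a 100 MS/s output rate
```

And that is the best case; it assumes the quantization noise is actually decorrelated (dithered), which nothing in their answers suggests they've thought about.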