Comparing the latest 2.5.12 with the earliest version I have, 2.5.3. Both in demo mode. Attempting to set intensity to give the same shading.
IMO, these speckles are always going to raise questions for the user about whether they are dealing with a scope problem or a signal problem. Looking forward to the updated firmware. I have not yet tried bumping the triggers above 30k as suggested.
The range of the brightness slider is increased in 2.5.12 compared to 2.5.3. Attached is a screenshot with both at the same internal setting.
The speckles are inherent to any single-comparator CDF sampler, and will tend asymptotically to zero as Nmin is increased. The math is in Section 2.2.5 of the new manual revision. The new firmware revision will remove the CDF quantization noise (which contributes about half the statistical noise at Nmin = 10k), which will improve but not eliminate the speckles.
Our hope is that the expanded manual section is clear enough for users to understand why the speckles occur and what factors control their intensity. Practically, this limits the BER fidelity to about 10^-5 for a reasonable acquisition time with a single comparator.
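(Not from the manual, just an illustrative sketch of the statistics described above: at each pixel a single-comparator CDF sampler counts how many of Nmin trigger events land below the threshold, so the estimated CDF value carries binomial counting noise with standard deviation sqrt(p(1-p)/Nmin). That counting noise is what shows up as speckle, and it falls off as 1/sqrt(Nmin). The function name and parameter values here are made up for the demonstration.)

```python
import random

def cdf_speckle_std(p=0.5, n_triggers=10_000, trials=200, seed=1):
    """Monte Carlo estimate of the speckle (counting) noise at one
    pixel of a single-comparator CDF sampler: each trigger is one
    comparison against the threshold, and the pixel's estimated CDF
    value is hits / n_triggers, which carries binomial noise of
    roughly sqrt(p * (1 - p) / n_triggers)."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(trials):
        hits = sum(rng.random() < p for _ in range(n_triggers))
        estimates.append(hits / n_triggers)
    mean = sum(estimates) / trials
    var = sum((e - mean) ** 2 for e in estimates) / (trials - 1)
    return var ** 0.5

# Speckle amplitude vs. Nmin: doubling the digits costs 100x triggers.
for nmin in (1_000, 10_000, 100_000):
    print(nmin, round(cdf_speckle_std(n_triggers=nmin), 5))
```

The 1/sqrt(Nmin) falloff is why the speckles only tend asymptotically to zero: each extra decade of smoothness costs 100x the acquisition time, which is the practical ceiling on single-comparator BER fidelity mentioned above.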
I am sure you were aware of the speckles early on in the design phase, most likely before even starting on the hardware. I envision the signal processing was simulated first, but maybe not. I am curious: if you knew changing the architecture would have solved it, why didn't you just change it? Was the added cost really that big of a factor?
Had the dual-comparator approach been used, how would it have affected the sweep speed compared to your future firmware approach?
There's a mix of reasons. Adding another comparator is not trivial: it opens up part-matching issues, and the math and feedback algorithms get substantially more complex. It would likely have added another 6 months of development time.
When the GigaWave was launched, we had no idea how customers would react to a CDF sampling scope. There wasn't any existing product to directly compare to. It was also our first product, so we wanted to minimize the hardware complexity to reduce the chance of things going wrong. (And as you know, things still managed to go wrong with the initial firmware revision.)
The original target application was in photonics and ultrafast laser research, where they mainly want to measure pulse widths and risetimes (with very high repetition rates), with less emphasis on eye diagrams and BER. We realized only after launch that the latter market might be much larger.
We have been turning away customers who ask about BER applications, due to the single-comparator design. Had we known what we know now, we would have gone with the dual comparator design. Of course, all the problems are obvious in hindsight.
If we introduce a dual-comparator version of the GigaWave, it would be similar enough to integrate into the existing software. The sweep speed would be the same (or faster).
For a dedicated SI analyzer, we can substantially simplify the trigger, and run the comparator clock at GHz speeds. This would allow for an eye diagram that updates in real-time, as well as BER testing to 10^-12.
Hope this cleared things up. May not be what you were looking for - but it's the honest story, and the best answer we can give.