I was referring to what I've read elsewhere, written by people who have taken their scope apart and looked and seem to know more than I.
Sure, but remember, most people don't know what they are doing (uncalibrated generators, wrong cables, wrong measurement techniques and so on).
A 1GSa/s scope can do "accurate" single shot for much higher frequencies than 75MHz - as long as the total clock jitter x 2.5
stays within one sample cycle (assuming the analog frequency response is flat).
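To make that rule concrete, here is a minimal Python sketch of the budget check (the 2.5 factor and the 1ns sample period at 1GSa/s come straight from the statement above; the jitter numbers fed into it are just examples):

```python
# Rough budget check: at 1 GSa/s one sample cycle is 1 ns,
# so the total clock jitter x 2.5 has to stay inside that cycle.
sample_rate = 1e9                     # 1 GSa/s
sample_period = 1.0 / sample_rate     # 1 ns

def jitter_budget_ok(total_jitter_s, factor=2.5):
    """True if jitter x factor still fits into one sample cycle."""
    return total_jitter_s * factor < sample_period

print(jitter_budget_ok(300e-12))   # 300 ps x 2.5 = 0.75 ns  -> True
print(jitter_budget_ok(450e-12))   # 450 ps x 2.5 = 1.125 ns -> False
```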
However, if you study the Rigol design, you will see that the FPGA is too small to deliver the 16k sample memory in one go, which
tells me that multiple acquisition cycles are needed to fill the 16k - this can result in some additional "inaccuracy" because of
the jitter between the 4 clock cycles.
I haven't traced the clock pins, but afaik the ADC clocks come from non-dedicated PLL clock output pins (roughly doubling the jitter).
So let's calculate the worst case (non-dedicated clock pins) for the Rigol: (650ps pin jitter + 100ps PLL phase-shift jitter) x 2.5 = 1.875ns per cycle.
So the max usable frequency per cycle is 0.35 / 1.875ns ≈ 186MHz. We should additionally add 3 x 100ps potential jitter between cycles
(4 cycles = 3 cycle-to-cycle differences). That gives (650ps + 100ps + 300ps) x 2.5 = 2.625ns, so 0.35 / 2.625ns ≈ 133MHz max measured frequency before the
jitter becomes relevant (actually it is always relevant, but let's talk about limits only).
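The same arithmetic as a small Python sketch, using only the jitter figures quoted above (worst-case assumptions, not measured values):

```python
# Worst-case numbers quoted above (assumptions, not measurements)
PIN_JITTER   = 650e-12   # non-dedicated FPGA clock pin
PLL_JITTER   = 100e-12   # PLL phase-shift jitter
CYCLE_JITTER = 100e-12   # jitter between acquisition cycles
FACTOR       = 2.5

single_cycle = (PIN_JITTER + PLL_JITTER) * FACTOR                      # 1.875 ns
four_cycles  = (PIN_JITTER + PLL_JITTER + 3 * CYCLE_JITTER) * FACTOR   # 2.625 ns

# 0.35 / t is the usual rise-time <-> bandwidth rule of thumb
print(0.35 / single_cycle / 1e6)   # ~186 MHz within one cycle
print(0.35 / four_cycles / 1e6)    # ~133 MHz across 4 cycles
```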
As you can see, from a design point of view the DS1152E is above these specs, giving some errors above 133MHz
(this is now calculated for an input clock with 30ps jitter). In the real world there are other aspects that reduce the max single-shot frequency
(or better said, the accuracy, because of waveform distortion).
This is of course for single shot; if you use average sampling the error will be reduced by the firmware itself, so the
150MHz (DS1152E) is definitely possible within the DSO accuracy specs.
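As a side note on why averaging helps: for random, uncorrelated noise/jitter the residual error shrinks roughly with the square root of the number of averages. A tiny simulation of that idea (the 10MHz test signal, noise level and 16 averages are arbitrary choices, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1e-6, 1000)            # 1 us window
clean = np.sin(2 * np.pi * 10e6 * t)        # 10 MHz test signal

def noisy_capture():
    # random noise as a crude stand-in for jitter-induced error
    return clean + rng.normal(0.0, 0.05, t.size)

single   = noisy_capture()
averaged = np.mean([noisy_capture() for _ in range(16)], axis=0)

print(np.std(single - clean))     # ~0.05
print(np.std(averaged - clean))   # ~0.0125, i.e. about 0.05 / sqrt(16)
```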
A question: if one were to start changing these resistors, what effect would that have on the accuracy of the unit? I understand that calibration data unique to each scope is established at the factory and stored in the flash ROM, and that if this data is changed the scope will no longer be accurate. Would the inverse not also be true -- would modifying the hardware require a modification to the calibration data?
The manufacturer calibration data is supposed to capture the differences in signal amplitude and the timing skew between the ADCs.
Of course it is unique, simply because all components have their own tolerance spread.
This data is then used during self-calibration to correct the measured results. In principle any modification needs changes
to the manufacturer calibration data. However, in the real world, if you replace 1% resistors with 0.1% ones (with lower values to match the frequency response at HF), the potential difference will be smaller than the worst case before. This is of course only true if your DSO was really built with 1%-quality parts; if there was already a bigger difference, say between one of the gain resistors and the others, then a new set
of 0.1% resistors will make your DSO unusable (simply because the DSO was already calibrated with the "bad" parts).
So yes, there is a potential risk; it is always a good idea to mark all replaced parts just in case you have to solder them back in.
That's for the op-amp resistors.
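As a rough illustration of why the tolerance of the gain resistors matters for the stored calibration, here is a worst-case gain-error estimate for a simple non-inverting stage (the resistor values and the single-stage topology are made up for illustration; the real front end is more complex):

```python
# Worst-case gain error of a non-inverting stage G = 1 + R2/R1
# with both resistors pushed to the edge of their tolerance band.
# R1/R2 are made-up values, only the tolerance comparison matters.
R1, R2 = 1_000.0, 9_000.0      # nominal gain of 10

def worst_case_gain_error(tol):
    nominal = 1 + R2 / R1
    worst   = 1 + (R2 * (1 + tol)) / (R1 * (1 - tol))
    return (worst - nominal) / nominal * 100.0

print(worst_case_gain_error(0.01))    # ~1.8 %  with 1% parts
print(worst_case_gain_error(0.001))   # ~0.18 % with 0.1% parts
```

The point being: the self-calibration was done against whatever spread the original parts had, so a new set of parts shifts the gain relative to that stored correction.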
As for the input resistors, they actually change the rise-time response, which is not really that important (unless you need higher accuracy
above 150MHz). I tested many values (and did a re-calibration each time, which costs a lot of time) to improve my Tekway;
the difference is marginal and not worth playing with on the Rigol (as the firmware does not allow more than 150MHz anyway).
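For a feel of how an input-path RC relates rise time to bandwidth, here is a toy single-pole model (the R and C values are purely illustrative, not the actual Rigol/Tekway parts):

```python
# Single-pole RC low-pass as a toy model of the input path;
# component values are purely illustrative.
R = 50.0        # ohms
C = 10e-12      # farads

tau = R * C
t_rise    = 2.2 * tau          # 10-90% rise time of a single pole
bandwidth = 0.35 / t_rise      # same 0.35 rule of thumb as above

print(t_rise * 1e9, "ns")       # ~1.1 ns
print(bandwidth / 1e6, "MHz")   # ~318 MHz
```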
The RC circuit in the HF path is mostly responsible for the compensation response, so every change there requires changes to
the compensation settings. It is always a good idea to change this part first and recalibrate the compensation before
you start with other modifications. <- oops, I see now that this part doesn't even exist in the Rigol input circuit, so don't worry about it either.
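For reference, the condition the compensation adjustment is chasing is the classic compensated-divider relation R1*C1 = R2*C2 (generic textbook values below, not the actual input network of any of these scopes):

```python
# Classic compensated attenuator: the division ratio is frequency-
# independent only when R1*C1 == R2*C2. Values are illustrative.
R1, C1 = 9e6, 10e-12      # series arm
R2, C2 = 1e6, 90e-12      # shunt arm

print(R1 * C1, R2 * C2)    # equal time constants -> compensated
print(R2 / (R1 + R2))      # DC attenuation, 0.1 (i.e. 10:1) here
```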
The trigger response can be adjusted with one resistor in the middle of the trigger circuit; however, the value used
in the DS1102E is good enough up to 150MHz anyway, so don't worry about it.
So let's summarize: yes, we can (oh no, not this again), as long as we follow some rules. I spent many hours modifying my DSOs,
but only because I was looking for a good flat response up to 200MHz. Without these mods the response was still less than 1%
away from an original 200MHz unit (talking now about a Tekway soft-hacked from 100 to 200MHz). Generally speaking, soft hacks are
in principle good enough for most people (I'm one of those people who do tweaks to add 20hp to a 1000hp engine).
The major problem with all these cheap DSOs is the "money-saving" stuff, like not enough caps (Tekway),
high-ripple PSUs, bad overall design (sorry Rigol, but I don't like the 7cm distance between ADC input and PSU - with nothing in between),
high-jitter clocks coming from FPGA I/O pins (all manufacturers), not the best PCB design (like vias in the signal path - again Rigol), and
sometimes even insufficient/bad manufacturer calibration (Uni-T) or ugly firmware errors (all cheap DSO manufacturers except Instek).
Some of these things can be fixed, others not (unless you spend a lot of money on jitter attenuators - but then you could just
buy an Agilent DSOX directly). Some will never change, simply because cheap products are not really supported after they are sold
(firmware bugs never get fixed).
For most applications such a cheap DSO, soft-modified to the max of what the manufacturer calculated for (so 150MHz for the Rigol),
is good enough; don't forget a DSO is not a high-accuracy measurement device. If you need more, you will have to spend some
money one way or the other (the time/equipment/parts needed to modify a cheap DSO vs. a better-quality DSO).