Philips PM3340: This scope offered 2 GS/s back then, but that was equivalent-time sampling. Does anybody know whether it also offered real-time sampling, and what the actual bandwidth was?
The PM3340 manual shows that it has a 10 bit ADC, not 14 bit.
I had a look at some old documents. The scope I had was a PM3320A, not PM3343 (not sure the latter even exists).
I couldn't find the specs in a quick search, but you're probably right that it was 10-bit only.
Are you sure it was 250 MSps? It's hard to believe they could build even 10-bit 250 MSps ADCs back then.
Yes, 200MHz bandwidth and 250MSa/s sample rate.
As someone already pointed out, the PM3340 was 2 GHz, not 2 GSps. I have the PM3320A, the little cousin of the 3340. The PM3320A is only 250 MHz but did up to 10 GSps effective-time sampling. At 5 ns/div the register is 512 samples deep, which gives 50 samples per division (slightly more than 10 divisions are shown on screen). 50 samples per 5 ns is 10 samples per ns, or 10 GSps. The PM3320A service manual goes into much more detail about the sampling system than the user manual does. Do you have the PM3340 service manual? The following is for the PM3320A, but the PM3340 is so similar that this should be useful for understanding the PM3340 too. The PM3320A has three sampling modes: real time, effective time, and random.
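That back-of-the-envelope rate calculation is easy to check in a couple of lines (the 10.24-division sweep width is my assumption, inferred from 512 samples at 50 per division):

```python
# Effective-time sample rate from record depth and timebase, using the
# PM3320A figures quoted above. The 10.24-division sweep width is an
# assumption inferred from 512 samples / 50 samples per division.
def effective_rate(record_depth, time_per_div, divisions=10.24):
    """Samples per second across one full sweep."""
    return record_depth / (divisions * time_per_div)

print(f"{effective_rate(512, 5e-9) / 1e9:.0f} GSps")  # -> 10 GSps
```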
The PM3320A had two different real-time sampling modes. The first uses the ADC directly. This was good up to 200 kSps, i.e. down to 2 ms/div (4096 samples across the ~10-division screen, or roughly 400 samples per division; at 2 ms/div that works out to about 5 microseconds per sample, or 200 kSps). The other way uses the CCD, a charge-coupled device, a.k.a. bucket-brigade device. Analog samples are clocked into the CCD at a high rate, then clocked out later at a lower rate (50 kHz or so, I think) for digitization. The CCD was only 512 samples deep (actually there are two 512-sample CCDs, but every other sample on each is the ground reference, so still 512 effective samples), so for true real time the resolution is limited to 512 samples per screen (1/8 of the maximum resolution), with interpolation between those for the display. The "Max Res" function combines 8 captures of 512 samples each, shifted slightly in time, into a 4096-sample-deep record, but that is no longer real time. The maximum real-time sample rate is 250 MSps, reached at 200 ns/div; using "Max Res" at that same setting gives an effective-time sample rate of 2 GSps (8x as high).
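All three rate figures fall straight out of record depth divided by sweep time. A quick sanity check (again assuming a ~10.24-division sweep width, which is my inference, not a manual spec):

```python
# Sanity-check the real-time rates quoted above. The 10.24-division sweep
# width is an assumption; the manual's round numbers suggest something close.
def rate(depth, time_per_div, divs=10.24):
    """Sample rate given record depth and timebase setting."""
    return depth / (divs * time_per_div)

print(f"ADC path: {rate(4096, 2e-3) / 1e3:.0f} kSps")    # 4096 samples at 2 ms/div
print(f"CCD path: {rate(512, 200e-9) / 1e6:.0f} MSps")   # 512 samples at 200 ns/div
print(f"Max Res:  {rate(4096, 200e-9) / 1e9:.0f} GSps")  # 8 x 512 samples combined
```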
Beyond the 200 ns/div or 250 MSps limit, the PM3320A did random sampling. The CCD is still filled at 250 MSps, but when reading/digitizing the samples, only 1 in N are kept and stored in the register (sample memory). At 5 ns/div, only 1 in 40 are kept, meaning that only 6 samples are captured and stored per trigger. The displayed waveform is built up over many triggers, so obviously the input waveform must be repetitive and stable. It could take a few seconds to get a complete waveform on the display, and if you were also averaging to reduce noise, you were in for a good wait while enough hits accumulated in each sample slot. Luckily you can watch the waveform building on the display. It is called random sampling because the phase/time difference between the 250 MHz sampling clock and the trigger is effectively random, so the first captured sample lands at a random position on the incoming waveform. This contributes to the rather long capture time: luck determines how quickly those 6 randomly placed samples per trigger fill in the full set of 512.
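For a feel of why the wait can be long, here is a crude coupon-collector toy model (my own sketch, not how the instrument actually schedules anything): assume each trigger deposits 6 samples at effectively random register slots and count triggers until all 512 slots have been hit at least once.

```python
import random

def triggers_to_fill(slots=512, per_trigger=6, seed=42):
    """Crude model of random sampling: each trigger deposits a few samples
    at effectively random register slots; return the number of triggers
    needed before every slot has been hit at least once."""
    rng = random.Random(seed)
    filled = [False] * slots
    remaining, triggers = slots, 0
    while remaining:
        triggers += 1
        for s in rng.sample(range(slots), per_trigger):
            if not filled[s]:
                filled[s] = True
                remaining -= 1
    return triggers

print(triggers_to_fill())  # typically several hundred triggers for 512 slots
```

The classic coupon-collector estimate, n·ln(n) ≈ 512 × 6.2 ≈ 3200 random samples, divided by 6 per trigger, predicts roughly 500 triggers, which is consistent with the few-second fill times described above.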
The antiquated sampling system of the PM3320/PM3340 is offset by a couple of features that set it apart from the modern competition. First, true 10-bit resolution; I don't need to say more. Second, the screen is 4k. Yes, that's right, 4k: the displayed waveform on the CRT is 4000 pixels wide by 1024 tall. What's the screen resolution of your Rigol? Or of your >$30k Keysight?