Author Topic: Specsmanship : sample rate vs bandwidth vs # of bits vs # of channels  (Read 1835 times)


Offline theoldwizard1Topic starter

  • Regular Contributor
  • *
  • Posts: 175
So Pico has introduced a new product line, the 6000E series: 8 channels, 500 MHz bandwidth, 4 GS of capture memory.  Very impressive, especially since they do it with 2 ADCs.  They also use USB 3 speed for connectivity.  (But WHICH USB 3?  Probably 5 Gb/s.)

But things get confusing to me when you go a little deeper.  Each ADC is capable of 5 GS/s and has 4 channels, so that is actually 1.25 GS/s per channel with all four channels of one ADC in use, in 8-bit mode.  After that, there is a lot of qualifying/"hand waving" about sample rates in the 10/12-bit modes.  (The best 12-bit rate is 1.25 GS/s for one channel, down to 625 MS/s for eight channels in 10-bit mode.)

Those are still impressive numbers, but I am old enough to remember the first digital 'scopes and I still have "heartburn" about bandwidth vs sampling rate.  I worked in the "hard real time" world, where outputs had to occur with μsec accuracy.  Of course these events were widely spaced, up to msec apart, so bandwidth did not mean much.

My first question is, would anyone realistically use a tool like this for signals close to 500 MHz?  250 MHz?  100 MHz?

Second question: are people these days willing to use a separate "display" device instead of something built into the 'scope?  The key advantages are a larger display and the ability to store many waveforms.
« Last Edit: April 16, 2020, 04:30:25 pm by theoldwizard1 »
 

Online nctnico

  • Super Contributor
  • ***
  • Posts: 27575
  • Country: nl
    • NCT Developments
Re: Specsmanship : sample rate vs bandwidth vs # of bits vs # of channels
« Reply #1 on: April 16, 2020, 04:49:08 pm »
When it comes to Pico I'd be very wary about 10/12 bit being real (as in the ADC actually has 12 bit resolution). In many of Pico's products the extra bits are gained through oversampling. In theory oversampling works but in practical situations you will want the ADC to have the specified resolution.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline rvalente

  • Frequent Contributor
  • **
  • Posts: 730
  • Country: br
Re: Specsmanship : sample rate vs bandwidth vs # of bits vs # of channels
« Reply #2 on: April 16, 2020, 05:05:08 pm »
For such a device, they really should include a keyboard mimicking the layout of a scope, much like MIDI consoles mimic CDJs and vinyl decks for DJing.
Using virtual instruments with a mouse on a screen is such a terrible experience, even if you have a touch screen.
 

Offline chickenHeadKnob

  • Super Contributor
  • ***
  • Posts: 1060
  • Country: ca
Re: Specsmanship : sample rate vs bandwidth vs # of bits vs # of channels
« Reply #3 on: April 16, 2020, 05:28:12 pm »
Second question.  Are people these days willing to use a separate "display" device instead of something built into the 'scope ?  The key advantages are a larger display and the ability to store many waveforms.

The last question has an easy answer. Simply invert the question to its contrapositive. I would not buy a multichannel MSO (> 2 channels + digital) without HDMI out these days. I don't like squinting at some small square waves crammed at the bottom of the built-in screen, along with the tiny fonts.
 
The following users thanked this post: Someone

Online nctnico

  • Super Contributor
  • ***
  • Posts: 27575
  • Country: nl
    • NCT Developments
Re: Specsmanship : sample rate vs bandwidth vs # of bits vs # of channels
« Reply #4 on: April 16, 2020, 05:32:20 pm »
Nowadays many oscilloscopes can stream their output to a web browser in real time.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 17016
  • Country: us
  • DavidH
Re: Specsmanship : sample rate vs bandwidth vs # of bits vs # of channels
« Reply #5 on: April 17, 2020, 03:08:51 am »
But things get confusing to me when you go a little deeper.  Each ADC is capable of 5 GS/s and has 4 channels, so that is actually 1.25 GS/s per channel with all four channels of one ADC in use, in 8-bit mode.  After that, there is a lot of qualifying/"hand waving" about sample rates in the 10/12-bit modes.  (The best 12-bit rate is 1.25 GS/s for one channel, down to 625 MS/s for eight channels in 10-bit mode.)

I cannot say exactly what Picotech is doing, but modern DSOs commonly use pipelined ADCs, also known as subranging, which have multiple stages that provide more bits than the final result.  At the highest speed the extra bits are used for error correction, but many support a lower sampling rate at increased resolution.  Bandwidth is constant since it only depends on the input sampling stage.

This is in contrast to DSOs which gain increased resolution through decimation, at the expense of effective sample rate and bandwidth.
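The two-stage subranging idea can be sketched numerically. This is a toy model, not Pico's (or anyone's) actual silicon, and it leaves out the redundant bits that real pipelined converters carry per stage for error correction:

```python
import math

def subranging_adc(v, vref=1.0):
    # Idealized two-stage subranging ADC: 4 coarse bits + 4 fine bits = 8 bits.
    # Real pipelined converters carry redundant bits per stage for error
    # correction; that is omitted here for clarity.
    coarse = min(max(math.floor(v / vref * 16), 0), 15)  # stage 1: coarse code
    residue = v / vref * 16 - coarse                     # what stage 1 missed
    fine = min(max(math.floor(residue * 16), 0), 15)     # stage 2: amplified residue
    return (coarse << 4) | fine

# The combined code matches a direct ideal 8-bit conversion
for v in (0.0, 0.1, 0.5, 0.9):
    print(v, subranging_adc(v), math.floor(v * 256))
```

Each stage only needs to be a 4-bit converter, but the pair together resolves 8 bits, which is why such designs can trade speed for resolution by re-using stages.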

Quote
My first question is, would anyone realistically use a tool like this for signals close to 500 MHz?  250 MHz?  100 MHz?

It is the same as with any other oscilloscope; however, it is often more relevant to consider transition time rather than bandwidth, unless you are using it as a spectrum analyzer.

Quote
Second question.  Are people these days willing to use a separate "display" device instead of something built into the 'scope ?  The key advantages are a larger display and the ability to store many waveforms.

In some applications they obviously are.  I would be worried that newer personal computers and software will not support the DSO hardware, though.  At least if my DSO has an embedded operating system and controls, it will continue to work after Microsoft drops support for its OS.
 

Offline TheUnnamedNewbie

  • Super Contributor
  • ***
  • Posts: 1211
  • Country: 00
  • mmwave RFIC/antenna designer
Re: Specsmanship : sample rate vs bandwidth vs # of bits vs # of channels
« Reply #6 on: April 17, 2020, 07:00:04 am »
How are oversampled bits not real bits? Almost every ADC now uses some form of oversampling, either 'real' (in the form of delta-sigma) or something like SAR.
The best part about magic is when it stops being magic and becomes science instead

"There was no road, but the people walked on it, and the road came to be, and the people followed it, for the road took the path of least resistance"
 

Offline Someone

  • Super Contributor
  • ***
  • Posts: 4770
  • Country: au
    • send complaints here
Re: Specsmanship : sample rate vs bandwidth vs # of bits vs # of channels
« Reply #7 on: April 17, 2020, 09:19:25 am »
How are oversampled bits not real bits? Almost every ADC now uses some form of oversampling, either 'real' (in the form of delta-sigma) or something like SAR.
Some less scrupulous brands have advertised bit depths that, while technically present in the data path, were so noisy and non-linear that quoting them was misleading. SFDR or ENOB come into play as better measurements of performance for comparison.
 

Offline TheUnnamedNewbie

  • Super Contributor
  • ***
  • Posts: 1211
  • Country: 00
  • mmwave RFIC/antenna designer
Re: Specsmanship : sample rate vs bandwidth vs # of bits vs # of channels
« Reply #8 on: April 17, 2020, 09:23:37 am »
How are oversampled bits not real bits? Almost every ADC now uses some form of oversampling, either 'real' (in the form of delta-sigma) or something like SAR.
Some less scrupulous brands have advertised bit depths that, while technically present in the data path, were so noisy and non-linear that quoting them was misleading. SFDR or ENOB come into play as better measurements of performance for comparison.

I fully agree that SFDR and ENOB are much better metrics, but that is separate from the seeming claim that you have 'real bits' and 'oversampled bits', with the latter somehow being inferior.
The best part about magic is when it stops being magic and becomes science instead

"There was no road, but the people walked on it, and the road came to be, and the people followed it, for the road took the path of least resistance"
 

Online HKJ

  • Super Contributor
  • ***
  • Posts: 3004
  • Country: dk
    • Tests
Re: Specsmanship : sample rate vs bandwidth vs # of bits vs # of channels
« Reply #9 on: April 17, 2020, 09:53:04 am »
I do not have problems seeing whether it is real bits or oversampling on my PicoScope:

[two screenshots of the resolution selection, combined into one image]

I have copied the two images together to get the above image; I cannot have both selections open at the same time.
 

Online nctnico

  • Super Contributor
  • ***
  • Posts: 27575
  • Country: nl
    • NCT Developments
Re: Specsmanship : sample rate vs bandwidth vs # of bits vs # of channels
« Reply #10 on: April 17, 2020, 10:18:28 am »
How are oversampled bits not real bits? Almost every ADC now uses some form of oversampling, either 'real' (in the form of delta-sigma) or something like SAR.
Oversampling requires that the signal carries uniformly distributed noise of at least 1 LSB, and that the ADC is more linear than necessary for the number of bits it has. Think of it like this: you have 4 comparators to detect a voltage level from 0V to 5V, so each step is 1.25V. Now apply 2V to this setup. How are you going to determine that the applied voltage is 2V? You can only see that the input is greater than 1.25V and smaller than 2.5V.

If you look at a delta-sigma ADC you'll see it has extra circuitry to increase the resolution of each step. SAR ADCs OTOH have fixed steps.
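The comparator example can be simulated. A toy sketch (the 1.25 V step and 2 V DC input follow the example above; the ideal quantizer and noise level are assumptions): with a quiet input every sample returns the same code, so averaging gains nothing, while with 1 LSB of dither the average does converge on the true value.

```python
import numpy as np

rng = np.random.default_rng(1)
lsb = 1.25          # coarse ADC: 1.25 V per step over a 0..5 V range
vin = 2.0           # DC input from the example

def adc(v):
    return np.round(v / lsb) * lsb   # idealized mid-tread quantizer

# Without noise every sample is the same code, so averaging gains nothing
quiet = adc(np.full(100_000, vin))
print(quiet.mean())                  # stuck at 2.5 V

# With 1 LSB (peak-to-peak) of uniform dither the average converges on 2 V
dithered = adc(vin + rng.uniform(-lsb / 2, lsb / 2, 100_000))
print(dithered.mean())               # close to 2.0 V
```

This is the "specific circumstances" caveat in numeric form: the extra resolution only appears when the input (or added dither) toggles the comparators.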
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 
The following users thanked this post: I wanted a rude username

Offline TheUnnamedNewbie

  • Super Contributor
  • ***
  • Posts: 1211
  • Country: 00
  • mmwave RFIC/antenna designer
Re: Specsmanship : sample rate vs bandwidth vs # of bits vs # of channels
« Reply #11 on: April 17, 2020, 10:54:54 am »
How are oversampled bits not real bits? Almost every ADC now uses some form of oversampling, either 'real' (in the form of delta-sigma) or something like SAR.
Oversampling requires that the signal carries uniformly distributed noise of at least 1 LSB, and that the ADC is more linear than necessary for the number of bits it has. Think of it like this: you have 4 comparators to detect a voltage level from 0V to 5V, so each step is 1.25V. Now apply 2V to this setup. How are you going to determine that the applied voltage is 2V? You can only see that the input is greater than 1.25V and smaller than 2.5V.

If you look at a delta-sigma ADC you'll see it has extra circuitry to increase the resolution of each step. SAR ADCs OTOH have fixed steps.

The extra circuitry is used for noise shaping. Noise shaping helps get the most out of oversampling, since it pushes the quantization noise to higher frequencies, where you filter it out later when you decimate the sample rate. But it is not required: any ADC can oversample. With correct calibration, even linearity is not the biggest issue (in fact, that is a weakness of noise-shaping ADCs: since they use a feedback loop, calibrating for quantizer non-linearity is much more challenging than when you don't apply noise shaping).

Your example of 2V applied to an oversampling ADC is not a good one, since that is a DC signal, and oversampling cannot help you there without applying some form of dithering to push the error to higher frequencies. But say it is an AC sine wave with an amplitude of 2V; then you can use the oversampled bits, together with the knowledge that your signal is band-limited to a lower frequency, to figure out the amplitude.

Assume I have a tone at frequency \$f_{in}\$ with amplitude \$Y\$. I use an ADC with a given number of bits, and so I will have a certain amount of quantization noise, equal to at most 1/2 LSB. For a second, let's assume we have a perfect anti-aliasing filter at \$f_a \geq f_{in}\$ and that I sample at exactly Nyquist, \$f_{sample} = 2 \cdot f_a\$; the SNR (as an amplitude ratio) will be roughly \$2Y/\mathrm{LSB}\$.
If I now sample at twice that rate, the spectral density of my input signal is still exactly the same, since the tone and its amplitude remain unchanged. The quantization noise is also the same at 1/2 LSB, but it is now spread over twice the bandwidth. I can now apply a filter and decimate my sample rate by half, and this gets rid of half of my quantization noise power, since it sits at higher frequencies. As a result, I end up with half the quantization noise power while my signal power is the same: a 3 dB improvement in SNR, equivalent to half an extra bit. Each further doubling of the oversampling ratio buys another 3 dB, so 4x oversampling gains one full bit.

Any non-linearity in the quantizer will result in spurs, since the quantization itself becomes non-linear.
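The oversample-then-decimate argument can be checked numerically. A toy sketch in Python/NumPy (signal amplitude, noise level, and the plain boxcar average standing in for the decimation filter are all assumptions): sampling 4x faster and decimating by 4 should halve the noise amplitude, i.e. about 6 dB or one extra bit.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1 << 14
t = np.arange(4 * n)

# Band-limited tone (amplitude 100 LSB) sampled at 4x the needed rate,
# with about 1 LSB of white noise so the oversampled bits are "live"
sig = 100.0 * np.sin(2 * np.pi * t / 512.0)
raw = np.round(sig + rng.normal(0.0, 1.0, sig.size))

# Decimate by 4 with a boxcar average: noise amplitude halves (~6 dB, ~1 bit)
dec = raw.reshape(-1, 4).mean(axis=1)
ref = sig.reshape(-1, 4).mean(axis=1)

print(np.std(raw - sig))   # about 1 LSB of error at the full rate
print(np.std(dec - ref))   # about 0.5 LSB after 4x decimation
```

Note the ~1 LSB noise floor in the sketch: remove it and the quantization error becomes correlated with the signal, which is exactly the DC case discussed above.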
The best part about magic is when it stops being magic and becomes science instead

"There was no road, but the people walked on it, and the road came to be, and the people followed it, for the road took the path of least resistance"
 
The following users thanked this post: _Wim_, I wanted a rude username

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 17016
  • Country: us
  • DavidH
Re: Specsmanship : sample rate vs bandwidth vs # of bits vs # of channels
« Reply #12 on: April 17, 2020, 11:57:40 am »
How are oversampled bits not real bits? Almost every ADC now uses some form of oversampling, either 'real' (in the form of delta-sigma) or something like SAR.

Could you give some examples?  Except for some rare instrumentation converters which oversample a SAR ADC, only delta-sigma converters commonly oversample.
 

Online nctnico

  • Super Contributor
  • ***
  • Posts: 27575
  • Country: nl
    • NCT Developments
Re: Specsmanship : sample rate vs bandwidth vs # of bits vs # of channels
« Reply #13 on: April 17, 2020, 12:03:09 pm »
How are oversampled bits not real bits? Almost every ADC now uses some form of oversampling, either 'real' (in the form of delta-sigma) or something like SAR.
Oversampling requires that the signal carries uniformly distributed noise of at least 1 LSB, and that the ADC is more linear than necessary for the number of bits it has. Think of it like this: you have 4 comparators to detect a voltage level from 0V to 5V, so each step is 1.25V. Now apply 2V to this setup. How are you going to determine that the applied voltage is 2V? You can only see that the input is greater than 1.25V and smaller than 2.5V.

If you look at a delta-sigma ADC you'll see it has extra circuitry to increase the resolution of each step. SAR ADCs OTOH have fixed steps.

The extra circuitry is used for noise shaping. Noise shaping helps get the most out of oversampling, since it pushes the quantization noise to higher frequencies, where you filter it out later when you decimate the sample rate. But it is not required: any ADC can oversample. With correct calibration, even linearity is not the biggest issue (in fact, that is a weakness of noise-shaping ADCs: since they use a feedback loop, calibrating for quantizer non-linearity is much more challenging than when you don't apply noise shaping).

Your example of 2V applied to an oversampling ADC is not a good one, since that is a DC signal, and oversampling cannot help you there without applying some form of dithering to push the error to higher frequencies. But say it is an AC sine wave with an amplitude of 2V; then you can use the oversampled bits, together with the knowledge that your signal is band-limited to a lower frequency, to figure out the amplitude.
Where this goes wrong is with signals which are DC or have a very low frequency; an oscilloscope needs to work for those too. I have been in situations where oversampling didn't work due to lack of noise. This is why I'm very wary of real ADC bits versus the ones created by oversampling: the latter need specific circumstances to appear. Because an oscilloscope can be used in so many different ways, it is not wise to depend on oversampling.
« Last Edit: April 17, 2020, 12:09:31 pm by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline theoldwizard1Topic starter

  • Regular Contributor
  • *
  • Posts: 175
Re: Specsmanship : sample rate vs bandwidth vs # of bits vs # of channels
« Reply #14 on: April 17, 2020, 01:53:28 pm »
I would not buy a multichannel MSO (> 2 channels + digital) without HDMI out these days. I don't like squinting at some small square waves crammed at the bottom of the built-in screen along with the tiny fonts.
Very well stated !
 

Offline theoldwizard1Topic starter

  • Regular Contributor
  • *
  • Posts: 175
Re: Specsmanship : sample rate vs bandwidth vs # of bits vs # of channels
« Reply #15 on: April 17, 2020, 01:55:15 pm »
Nowadays many oscilloscopes can stream their output to a web browser in real time.
If you believe that is "realtime", I have a bridge to sell !
 

Online nctnico

  • Super Contributor
  • ***
  • Posts: 27575
  • Country: nl
    • NCT Developments
Re: Specsmanship : sample rate vs bandwidth vs # of bits vs # of channels
« Reply #16 on: April 17, 2020, 02:05:04 pm »
Nowadays many oscilloscopes can stream their output to a web browser in real time.
If you believe that is "realtime", I have a bridge to sell !
If you think that the display on a DSO is updated more often than 10 times per second, you are in for a surprise. There is a limit to what you can register with your brain; 5 display updates per second is at the upper limit of what you can keep up with. Take that back to the function of a DSO and you quickly realise that a DSO has to display an anomaly for a relatively long time in order for the user to register it. In turn, that means you don't need a high frame rate to make streaming to a web interface useful.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline Nanoman

  • Newbie
  • Posts: 6
  • Country: gb
Re: Specsmanship : sample rate vs bandwidth vs # of bits vs # of channels
« Reply #17 on: June 29, 2020, 11:31:43 pm »
When it comes to Pico I'd be very wary about 10/12 bit being real (as in the ADC actually has 12 bit resolution). In many of Pico's products the extra bits are gained through oversampling. In theory oversampling works but in practical situations you will want the ADC to have the specified resolution.

These products use E2V ADCs, one 8-bit and the other 10-bit.  The '12-bit mode' is created by averaging the results from four 10-bit ADC cores; the averaging gives lower noise than can be obtained from a single 10-bit core.
The drop in sample rate between the 8-bit and 10-bit modes is due to memory bandwidth.
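Assuming the noise in the four cores is uncorrelated, averaging them divides the noise amplitude by √4 = 2, i.e. roughly one extra effective bit. A quick numeric sketch (the signal and per-core noise levels are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
signal = rng.uniform(0.0, 1023.0, n)   # made-up test signal, 10-bit range

# Four 10-bit cores digitizing the same input, each with ~1 LSB of its own noise
cores = np.round(signal + rng.normal(0.0, 1.0, (4, n))).clip(0, 1023)

single = cores[0]
avg4 = cores.mean(axis=0)

print(np.std(single - signal))   # about 1 LSB from one core
print(np.std(avg4 - signal))     # about 0.5 LSB: sqrt(4) lower, ~1 extra bit
```

Halving the noise is roughly one extra bit of noise performance, which illustrates why a mode built this way sits between the 10-bit cores and a true 12-bit converter.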
 

