Hello everyone, I am new to the forum, but I have been following Dave's videos for a while. I am currently working as a student at a company in Germany, where I test telemetry systems. I measure the frequency response of the digital and analog outputs and then process the data into a graph with the percentage error on the y-axis and the frequency on the x-axis. It is the same as a Bode plot, but in percent instead of dB.

I use a function generator to sweep the frequency from 1 Hz to 1000 Hz in 100 seconds. I measure the output (first analog, then digital) and then process the data with special software developed by the company. I compute the percentage error as follows (I tried several other methods; this one just happened to work better than the others): I slide a time window over the measured output and compute the maximum every couple of seconds, then compute the percentage error with the help of a couple of virtual channels, and finally plot everything against the frequency axis.

When I apply this method to the analog output, I get the results I expect: at 1 kHz the value is about 0.7% below nominal, with about 0.05% tolerance. (The way I measure the tolerance is to repeat the measurement five times and plot all five curves in one graph.) The problem comes when I measure the digital output. According to the engineers who developed the ADC, it is supposed to have zero tolerance, i.e. the five measurements should be exactly the same. The problem is that they aren't: we see about 0.05% tolerance at around 200 Hz. My colleagues and I think that the sweep is too fast, so the system doesn't have enough time to collect enough samples at any given frequency, and we can't be sure that we have sampled a value close enough to the real maximum of the signal.
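To make the sliding-window method easier to discuss, here is a minimal Python sketch of how I understand it. The function name, the 2 s window, and the assumption of a linear 1-1000 Hz sweep over 100 s are my own choices for illustration, not the company software:

```python
import numpy as np

def peak_percent_error(t, y, nominal, window_s=2.0,
                       f_start=1.0, f_stop=1000.0, sweep_s=100.0):
    """Block-wise peak detector: split the record into windows of
    window_s seconds, take the maximum of each window, and convert
    it to a percentage error against the nominal amplitude.
    Assumes uniform sampling and a LINEAR sweep over sweep_s."""
    dt = t[1] - t[0]                           # uniform sample spacing
    n_win = max(1, int(round(window_s / dt)))  # samples per window
    n_full = (len(y) // n_win) * n_win         # drop the partial tail
    peaks = y[:n_full].reshape(-1, n_win).max(axis=1)
    t_mid = t[:n_full].reshape(-1, n_win).mean(axis=1)
    freqs = f_start + (f_stop - f_start) * t_mid / sweep_s
    err_pct = (peaks - nominal) / nominal * 100.0
    return freqs, err_pct
```

For example, a 10 s record of a 50 Hz sine with amplitude 0.993 against a nominal of 1.0 produces five windows, each reporting roughly -0.7%.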
We are thinking of doing a static sweep instead (for example, stepping the frequency in 10 Hz increments and staying at each frequency for a fixed period of time). My question is the following: is there a mathematical way to calculate how long I have to stay at a specific frequency to be sure that I have sampled as many distinct values as possible? I know I will never capture the exact maximum, but I want to stay within a specific tolerance (0.1%). I know this can't be done for frequencies that are 2, 3 or 5 times smaller than the sampling frequency, because then we always sample at the same phases, but for frequencies that are, for example, 2.3 times smaller than the sampling frequency, we need to sample more than one period to hit all the possible values.
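To partly answer my own question numerically: assuming we know the sampling rate fs, a brute-force check (all names and the 720-point starting-phase scan below are my own assumptions) is to compute, for every starting phase, how far the best sample falls below the true peak, and grow the dwell window one period at a time until the worst case is within tolerance:

```python
import numpy as np

def worst_case_peak_error(f, fs, n_samples, n_phases=720):
    """Worst relative error between the largest of n_samples samples of a
    unit sine at frequency f (sampled at fs) and the true amplitude 1.0,
    scanned over n_phases starting phases."""
    n = np.arange(n_samples)
    sample_phases = 2 * np.pi * f * n / fs            # phase of each sample
    phis = np.linspace(0, 2 * np.pi, n_phases, endpoint=False)
    # rows: starting phases, columns: samples
    peaks = np.max(np.sin(sample_phases[None, :] + phis[:, None]), axis=1)
    return np.max(1.0 - peaks)

def min_dwell_time(f, fs, tol=0.001, max_periods=200):
    """Smallest dwell time (in whole signal periods, returned in seconds)
    such that the sampled maximum is within tol of the true peak for ANY
    starting phase.  Returns None when f divides fs so evenly that the
    samples keep hitting the same phases and tol is never reached."""
    for periods in range(1, max_periods + 1):
        n_samples = int(np.ceil(periods * fs / f))
        if worst_case_peak_error(f, fs, n_samples) <= tol:
            return periods / f
    return None
```

The underlying math: near the peak, sin(pi/2 + d) is approximately 1 - d^2/2, so staying within 0.1% only requires one sample within d = sqrt(2 * 0.001), about 0.045 rad, of the peak; the loop just finds how many whole periods it takes before that is guaranteed for any starting phase. For f = 230 Hz and fs = 1000 Hz it converges at 0.1 s (23 periods, i.e. 100 samples covering all 100 distinct phases), while for f = 200 Hz = fs/5 the samples repeat over five phases and it never converges, exactly the integer-submultiple problem described above.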
Thanks in advance, and if you have any trouble understanding what I mean, just ask and I will be more than happy to explain.
Georgi