Reading your manual and thinking of writing some simple software to try it out. I'm curious about the calibration: you mention that once the device has reached steady state, the need to calibrate every second is reduced. With that in mind, do you have a way to monitor the Xilinx or other parts for temperature? The manual does not mention this and I don't see it in your software.
When using the serial interface, we only guarantee proper operation when calibrating once per second or faster (i.e. a 6% duty cycle of calibration, 60 ms every second).
We do not recommend decreasing the calibration frequency even in thermal steady-state.
The comment about steady-state in the manual is rough guidance for those who want/need to push this frequency down - we then no longer guarantee 1 ps timing accuracy, and it's up to you to check if the resulting accuracy is sufficient for your application.
There are no thermometers to measure individual chips. In our testing, no arrangement of thermometers was anywhere near as accurate as "using the delay chip itself" as a thermometer - this is what we do in practice.
For the next manual revision, we have modified this section as follows:
The 1 ps timing precision of the GigaWave is guaranteed only up to one second after this command is issued. We therefore recommend issuing this command at least once per second. If data acquisition (R) takes more than one second, the CAL command must be issued immediately before the corresponding delay (D) and data acquisition (R) commands.
If the device is in thermal steady-state (∼5 minutes after warmup), the 1 ps precision can often
be maintained for up to 30 seconds without recalibration. If calibrating at a reduced frequency,
the 1 ps specified accuracy is not guaranteed, and it is the user’s responsibility to verify that the
timing precision meets the application requirements.
The commands seem fairly easy to follow, until I get to the Acquire CDF (R) command. I assume the software pulls the data for whatever channels are selected and increments the delay, runs a cal, pulls the next data, repeating the cycle for however much data we want.
The R command only takes data and does not perform a cal or increment the delay. For example, in normal operation the official software more-or-less issues the sequence:
CAL D R D R D R ... D R CAL D R D R D R ... D R ... (you get the picture)
This gives you lots of flexibility on which delays you want to take data at.
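As a rough sketch, here's one way to build that command sequence in Python. The function name, the recalibration cadence parameter, and the "D &lt;delay&gt;" argument format are our illustrative assumptions - check the manual for the exact wire format of each command before sending anything over the serial port.

```python
def acquisition_sequence(delays_ps, recal_every=10):
    """Build a CAL / D / R command list for a sweep of delays.

    delays_ps:   the post-trigger delays (in ps) to acquire at
    recal_every: reissue CAL after this many delay steps (hypothetical
                 cadence - in practice, recalibrate at least once per
                 second, per the manual)
    """
    cmds = []
    for i, d in enumerate(delays_ps):
        if i % recal_every == 0:
            cmds.append("CAL")          # periodic recalibration
        cmds.append(f"D {d}")           # set the delay (argument format assumed)
        cmds.append("R")                # acquire CDF data at this delay
    return cmds
```

You would then write each command to the serial port (e.g. with pyserial) and read back the response after each R. The point of building the list yourself is exactly the flexibility mentioned above: nothing forces the delays to be evenly spaced or monotonic.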
The third byte represents the value of F(V ;Δt) multiplied by 255.
I'm lost. 2.2.1 doesn't explain things in simple enough terms with enough detail for me to follow along with my limited math skills. An example in plain text would be helpful.
Or, do I need to find a stats book and start reading?
The cumulative distribution function (CDF), denoted F(V; Δt), gives the probability that the signal at the post-trigger delay Δt is less than V.
If you're trying to measure a sine wave, for example, you'd expect F(V; Δt) at any fixed Δt to be equal to zero for V < V0 and equal to one for V > V0, where V0 is the actual voltage of the sine wave at Δt. In other words, a step function at V0.
For single-valued signals (e.g. a simple periodic waveform), you can more-or-less average the two neighboring voltages where F(V; Δt) first goes from <0.5 to >0.5 to find the location of the step (V0). (The software does something fancier, fitting a Gaussian error function.)
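A minimal sketch of that neighbor-averaging approach (the function name is ours; the official software's error-function fit is more robust):

```python
def estimate_v0(voltages, cdf):
    """Estimate the step location V0 from a sampled CDF.

    voltages: sorted list of voltage sample points
    cdf:      F(V; dt) at each voltage, for one fixed delay

    Returns the average of the two neighboring voltages where the
    CDF first crosses 0.5, or None if it never crosses.
    """
    for i in range(len(cdf) - 1):
        if cdf[i] < 0.5 <= cdf[i + 1]:
            return 0.5 * (voltages[i] + voltages[i + 1])
    return None
```

Running this at every delay Δt gives you one reconstructed voltage per delay, i.e. the waveform itself.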
For more intricate stuff (e.g. eye diagrams), you would need to take the derivative with respect to V to obtain the probability density function. In practice, you'd implement this as a finite difference (F2-F1)/(V2-V1).
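The finite-difference version is equally short - a sketch, again with an illustrative name:

```python
def pdf_from_cdf(voltages, cdf):
    """Approximate the PDF as the finite difference (F2-F1)/(V2-V1).

    Returns one value per interval between adjacent voltage samples.
    """
    return [(cdf[i + 1] - cdf[i]) / (voltages[i + 1] - voltages[i])
            for i in range(len(cdf) - 1)]
```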
We've added this explanation as well as an example program to the next manual revision (see attached).
Hope that cleared things up a bit - let us know if not.