I don't know the design of the bg7tbl GPSDO, but most designs are phase locked to the GPS - i.e. on average the frequency will be exact within the limits of the GPS system. The GPSDO tries to minimise the phase difference between the local oscillator and what it thinks the GPS is telling it. However, what it thinks the GPS is telling it and the actual GPS time can differ slightly due to conditions between the satellites and the receiver.
What this means is the actual frequency of the GPSDO will wander between slightly high and slightly low as conditions change.
Comparing two GPSDOs of the same nominal frequency should yield nothing more than a phase difference between the two. If the phase difference is changing, then the instantaneous frequency of the two is different by the rate of change of the phase difference. It doesn't say if either is better than the other.
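The relationship above (frequency difference equals the rate of change of the phase difference) can be sketched in a few lines of Python. This is a minimal illustration, not any particular instrument's method; the phase readings here are hypothetical, assumed to be in degrees at 1 second intervals.

```python
# Sketch: recover the frequency difference between two sources from the
# rate of change of their phase difference.
# Hypothetical input: phase-difference readings (degrees), one per second.

def freq_offset_hz(phase_deg, interval_s=1.0):
    """Least-squares slope of phase (degrees) vs time, converted to Hz.

    One full cycle (360 degrees) of phase slip per second equals a
    frequency difference of 1 Hz.
    """
    n = len(phase_deg)
    t = [i * interval_s for i in range(n)]
    t_mean = sum(t) / n
    p_mean = sum(phase_deg) / n
    num = sum((ti - t_mean) * (pi - p_mean) for ti, pi in zip(t, phase_deg))
    den = sum((ti - t_mean) ** 2 for ti in t)
    slope_deg_per_s = num / den
    return slope_deg_per_s / 360.0  # degrees per second -> Hz

# A source slipping a steady 3.6 degrees per second is 0.01 Hz off:
readings = [i * 3.6 for i in range(10)]
print(freq_offset_hz(readings))  # approximately 0.01
```

Fitting a slope rather than differencing adjacent samples averages out reading noise, which matters when the real phase slip per second is tiny.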
Two similar GPSDOs running in the same location from similar or (as some people do) the same antenna are going to react similarly to changes in conditions between the satellites and the receiver. Their frequency variations will track together so the phase difference won't show much.
One way of finding out which GPSDO is better is phase comparison with a local oscillator known to be stable. It doesn't need to be accurate. If the phase change over time (e.g. per second) is constant, then the GPSDO is also stable. And if it is stable, knowing it is accurate in the long term confirms it is also accurate in the short term (which is the desired outcome). This may be beyond the capability of a hobbyist, as the second-to-second variation in phase change can be too small for many instruments to detect.
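The stability test described above - is the per-second phase change constant? - can be expressed as a small sketch. Again this assumes hypothetical phase readings in degrees at 1 second intervals against the stable reference.

```python
# Sketch: judge a source's stability from phase readings taken against a
# stable (not necessarily accurate) local oscillator.
# Assumed input: phase differences (degrees) sampled once per second.

def phase_rate_spread(phase_deg):
    """Return (mean, max deviation) of the second-to-second phase change.

    A small spread indicates a stable source; the mean itself only
    reflects a fixed frequency offset, not instability.
    """
    deltas = [b - a for a, b in zip(phase_deg, phase_deg[1:])]
    mean = sum(deltas) / len(deltas)
    spread = max(abs(d - mean) for d in deltas)
    return mean, spread

# A perfectly steady 2 deg/s slip: stable, merely offset in frequency.
mean, spread = phase_rate_spread([0, 2, 4, 6, 8, 10])
print(mean, spread)  # -> 2.0 0.0
```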
Maybe the best the hobbyist can do is try to minimise external variability. A good quality OCXO in a stable environment (temperature, supply voltage, loading) controlled by a stable control voltage derived from a timing GPS receiver with a well placed antenna. Just how far one is prepared to refine the accuracy of a GPSDO is determined by what it is used for. If it is used for nothing more than the joy of getting ever better accuracy then the rabbit hole goes very deep. If the use is more practical, most GPSDOs are far more accurate than most hobbyists need.
Update - as I'm writing this the Amazon OCXO is now remaining stable over 10 second gates, so maybe it just needed longer to warm up and/or I need to keep the AC off.
Hi Electro Fan,
the topic of timing, accuracy and stability is a much wider and deeper field than it looks from the outside (the rabbit hole of time nuttery).
And I am by no means an expert in this.
If you feed the counter input with its own 10 MHz output, you see the "hard limit" of the counter. Your results can never be better than that, but it does not mean your results will be that good.
If you feed the counter with a signal from a generator and both are synced to the same source, IMHO there is not much to gain from that.
If you want to evaluate an oscillator, you need another one which is at least an order of magnitude better.
The BG7TBL GPSDOs are rumored to have an inherent error in the range of about 2 mHz. It is not clear to me which models are affected.
I have a BG7TBL and a Samsung GPSDO, and I have found this to be the case for my unit, but I have not done extended tests.
I suggest to check out the tinyPFA Phase Frequency Analyzer:
https://www.tinydevices.org/wiki/pmwiki.php?n=TinyPFA.Homepage
I got a used NanoVNA-H4, which can be flashed with the TinyPFA firmware.
I found this to be a very interesting low cost device to experiment with GPSDO, OCXO and counters.
The FA-2/FA-3 and the TinyPFA do work with the Timelab software.
This can keep you busy for quite some time.
Regards
Chris
If this happened to be correct, we'd just be down to whether they are truly in sync, including not only frequency but also phase. Assuming they might be close but not identical in frequency, why would we potentially care about some amount of phase offset? It seems like we would only care about phase offset if we were within some specified amount of frequency, i.e. to some Hz or some fraction of a Hz... yes/no? Thanks
When two sources identical in frequency are compared, there is a constant phase offset (which may be zero, in which case they are in sync). If they are not identical in frequency, then the phase offset changes at the rate of the difference in frequency. If the two frequencies are wide apart, say a difference of 100Hz, then general terminology sees the difference as a beat frequency of 100Hz or a heterodyne. If the two frequencies are within a few Hz then the terminology talks about change of phase differences (in radians or degrees). For instance, comparing 10MHz with 10,000,000.1Hz, the phase changes by 1/10th of a circle, or 36 degrees, per second. There's no difference between the fundamentals of the two examples, just the terminology changes.
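The 36 degrees per second figure above follows from a one-line relationship: phase slips at 360 degrees per second for every hertz of frequency difference. A trivial sketch:

```python
# The phase between two sources slips at the frequency difference,
# i.e. 360 degrees per second per Hz of offset.

def phase_slip_deg_per_s(f1_hz, f2_hz):
    return abs(f1_hz - f2_hz) * 360.0

# 10 MHz vs 10,000,000.1 Hz: 0.1 Hz apart -> 36 degrees per second.
print(phase_slip_deg_per_s(10_000_000.0, 10_000_000.1))  # roughly 36
```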
So when comparing two close frequencies, it is more practical to measure the change in phase offset over some period than to count cycles. For example, if the difference is 10µHz then it will take 100,000 seconds for one signal to gain one cycle on the other. The period of a 10MHz signal is 100ns (nanoseconds) and that of a 9,999,999.99999Hz signal is 100.0000000001ns. So the difference between, say, the zero crossings of each signal changes by 0.0000000001ns per cycle (a ridiculously small amount of time). But over a 1 second period the difference is 0.001ns, i.e. 1ps (one picosecond). This is measurable with a nanoVNA configured as a Phase Frequency Analyzer.
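The picosecond figure can be checked with simple arithmetic: the accumulated time offset is just the fractional frequency error times the observation time. A minimal sketch:

```python
# Checking the worked numbers: a 10 µHz offset at 10 MHz is a fractional
# error of 1e-12, so the zero crossings drift apart by 1 ps per second.

def time_drift_s(f_nominal_hz, offset_hz, duration_s):
    """Accumulated time offset between two sources over duration_s."""
    return (offset_hz / f_nominal_hz) * duration_s

drift = time_drift_s(10e6, 10e-6, 1.0)
print(f"{drift * 1e12:.3f} ps")  # -> 1.000 ps
```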
So phase offset changes are of importance when comparing two close frequencies. From the specification of the BG7TBL FA-2 Counter it would appear it uses phase change to measure frequency as it claims a resolution of 0.0001Hz@10MHz with a 1s gate. However, this says nothing about its accuracy. That is determined by whatever reference signal it is using.
Does that make sense?
Thanks for all the info, it makes sense, mostly. The only thing I'm wrestling with is why we would care much about phase offset if two signals are more than 1 Hz apart (other than possibly to observe drift such as jitter)? I can see the value in examining phase offset if two signals are less than 1 Hz apart. Thx again.