Output frequency calibration.
...
Brilliant. Adjusting the frequency calibration sends new values to registers 0x25 to 0x2A, which are also used during start-up.
I don't know what is inside but we now have a way to find out.
Could that mean a way to minimize jitter with a lookup table, perhaps?
So, just in case I was wrong about my "unmuteable fixed clock and PLL" theory, I did some tests.
Alas, correcting the frequency does not actually change the output's jitter.
For example, a 10MHz square wave is normally perfectly jitter-free, since 10MHz divides the 250MHz clock exactly.
Correcting it to 9.900000 causes jitter to appear at the 10MHz setting (which now actually outputs 10.101010MHz), while the 9.900000 setting (which now actually outputs 10.000000MHz) stays jitter-free. To me that shows they are not changing the clock at all, just counting more or fewer ticks of it, and the jitter-free points stay fixed on the frequencies that divide the 250MHz clock exactly.
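The divisor arithmetic can be sketched like this (a minimal check, assuming the correction simply rescales the output linearly; the function name and the 10/9.9 scale factor are my own working):

```python
# Jitter-free outputs are those where one output period is a whole
# number of 250 MHz clock ticks.
CLOCK_HZ = 250_000_000

def ticks_per_period(freq_hz):
    """Clock ticks per output period; only an integer count is jitter-free."""
    return CLOCK_HZ / freq_hz

nominal = 10_000_000                # the 10 MHz setting
corrected = nominal * 10 / 9.9      # assumed actual output with a 9.900000 correction

print(ticks_per_period(nominal))    # 25 ticks exactly -> no jitter
print(ticks_per_period(corrected))  # ~24.75 ticks -> fractional, jitter appears
```

So the corrected "10MHz" lands between tick counts, while the grid of jitter-free outputs stays parked on the exact divisors of 250MHz.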
So I see no way to use this to reduce the jitter issue.
While it does not make the jitter worse, it does make it less predictable and harder to calculate.
For example, if you were to set a realistic correction value of 9.999982, you'd get these jitter-free values:
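Rather than working them out by hand, the jitter-free settings for a given correction can be generated programmatically (a sketch under my assumption that the correction scales the output linearly, i.e. actual = set × 10/correction; the function name and the chosen divisor range are mine):

```python
# For each exact divisor N of the 250 MHz clock, the actual output
# 250 MHz / N is jitter-free; this computes which display setting
# lands on it under a given correction value.
CLOCK_HZ = 250_000_000

def jitterfree_settings(correction, divisors=range(10, 51, 5)):
    """(divisor, actual Hz, display setting Hz) for jitter-free outputs."""
    rows = []
    for n in divisors:
        actual_hz = CLOCK_HZ / n                  # exact divisor -> no jitter
        display_hz = actual_hz * correction / 10  # assumed linear correction
        rows.append((n, actual_hz, display_hz))
    return rows

for n, actual, display in jitterfree_settings(9.999982):
    print(f"/{n}: actual {actual/1e6:.6f} MHz -> set {display/1e6:.6f} MHz")
```

For instance the /25 row gives: to get the jitter-free 10.000000MHz output under that correction, you would dial in 9.999982MHz.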
I often use some of these, usually not because the jitter actually matters to the circuit, but mostly because the scope screen is easier to read without those annoying double lines.