Having looked at a few datasheets, I can fairly confidently say there's generally an inverse relationship between power consumption and noise for things like ADC drivers, ADC ICs, clock sources, and voltage regulators. Why that is, I'm not exactly sure, but the pattern is there. The highest-precision systems tend to draw at least a few watts, which isn't exactly friendly to handheld battery applications.
I'm trying to get the best performance I can out of about 50 mW: hopefully 18 or 19 effective bits, limited to audio frequencies.
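To put that target in concrete terms, here's the rough arithmetic I'm working from (using the standard ENOB-to-SNR relation for a full-scale sine; treating "audio frequencies" as roughly a 20 kHz bandwidth is my own assumption):

```python
def enob_to_snr_db(enob: float) -> float:
    """Standard ENOB <-> SNR relation for a full-scale sine wave."""
    return 6.02 * enob + 1.76

for bits in (18, 19):
    print(f"{bits} effective bits -> ~{enob_to_snr_db(bits):.0f} dB SNR")
# 18 effective bits -> ~110 dB SNR
# 19 effective bits -> ~116 dB SNR
```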
I'm focused now on crystal oscillator clock sources. Phase noise seems to decrease with higher power consumption. What are some considerations/strategies to achieve optimal phase noise per watt?
It seems that power consumption doesn't increase with clock frequency, and I think I read that if you reduce the frequency of a clock signal you gain a phase noise advantage. Is this true? Does it depend on the method used to reduce the frequency (e.g. a counter/divider circuit versus other methods), and if so, how?
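For reference, the version of the claim I have in mind is that an ideal (noiseless) divide-by-N improves phase noise by 20·log10(N) dB, because the phase deviation in radians is scaled down along with the frequency. A quick sanity check of that arithmetic, as I understand it (please correct me if the premise is wrong):

```python
import math

def division_phase_noise_gain_db(n: int) -> float:
    """Phase-noise improvement from an ideal, noiseless divide-by-N:
    phase deviation (radians) scales down with frequency, so the
    phase-noise power drops by 20*log10(N) dB."""
    return 20 * math.log10(n)

# e.g. dividing 100 MHz down to 1 MHz (N = 100):
print(f"{division_phase_noise_gain_db(100):.0f} dB")  # 40 dB
```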
A reference design I saw appeared to be using a precision 100 MHz crystal oscillator and a clock output from an FPGA (presumably pretty noisy) fed together into a D flip-flop, with the flip-flop output driving the ADC. The ADC had a max clock rate of 1 MHz, so presumably this gets the precision of the higher-frequency clock at the lower, FPGA-controlled frequency; otherwise the FPGA clock could have been used directly. Is this a common technique I should try to leverage? How does it work?
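In case it clarifies what I'm asking about, here's a toy numerical sketch of how I think the retiming works: the noisy FPGA clock only decides roughly when each edge happens, while the D flip-flop snaps the actual output transition onto the next edge of the clean 100 MHz oscillator, so the output jitter should be set by the oscillator rather than the FPGA. All the jitter numbers below are made up purely for illustration:

```python
import random, statistics

random.seed(0)

F_CLEAN = 100e6   # clean crystal oscillator (Hz)
F_OUT   = 1e6     # divided-down / FPGA-generated clock (Hz)
N_EDGES = 2000

CLEAN_JITTER_RMS = 1e-12    # assumed 1 ps RMS jitter on the oscillator edges
FPGA_JITTER_RMS  = 200e-12  # assumed 200 ps RMS jitter on the FPGA output

t_clean = 1.0 / F_CLEAN     # 10 ns
t_out   = 1.0 / F_OUT       # 1 us

# FPGA edges: nominally centred within a clean-clock cycle (i.e. with
# setup/hold margin), but smeared by the FPGA's own jitter.
fpga_edges = [i * t_out + 0.5 * t_clean + random.gauss(0, FPGA_JITTER_RMS)
              for i in range(1, N_EDGES)]

def retime(edge: float) -> float:
    """D flip-flop model: the output only changes on the next rising edge
    of the clean clock, so it inherits the oscillator's (small) jitter."""
    k = int(edge // t_clean) + 1          # index of the next clean-clock edge
    return k * t_clean + random.gauss(0, CLEAN_JITTER_RMS)

retimed_edges = [retime(e) for e in fpga_edges]

def rms_jitter(edges):
    """RMS deviation from the ideal 1 MHz grid (constant offset removed)."""
    return statistics.pstdev(e - i * t_out for i, e in enumerate(edges, start=1))

print(f"FPGA clock jitter:    {rms_jitter(fpga_edges) * 1e12:7.1f} ps RMS")
print(f"Retimed clock jitter: {rms_jitter(retimed_edges) * 1e12:7.1f} ps RMS")
```

If that mental model is wrong, I'd like to know where it breaks down.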