The LPRO-101s are getting rather long in the tooth, leaving you wondering just how long you've got before you resort to refurbishing the physics package: cleaning 20+ years' worth of contaminants off the glassware with IPA, and taking a hairdryer to the lamp in the hope of redistributing the rubidium back to where it belongs. They're still a better choice than the FE-5680As if you need a low-phase-noise reference you can use to transvert an HF transceiver right up to 24GHz without having to add a clean-up XO.
Most of the problems that I've seen mentioned with LPROs are electronic, not lamp-related. Although the lamp has a more defined ageing characteristic than most other components, it is otherwise just one reliable component out of very many, so it tends not to be the cause of too many problems. Lamps have been gradually refined over the decades, and by the late '90s lamp ageing appears to have become much less of an issue.
It's not just noise that causes problems with other types. The SRS PRS12 is excellent, but it uses a DAC for all its adjustments. This limits the resolution to between 1E-12 and 2E-12, and there is DAC non-linearity on top of that.
The FE-5680A types come in many different variants, and usually have a coarse temperature compensation that causes frequency jumps.
I once checked with Temex about their FRS series and was told that the analogue C-field adjustment was truly analogue and step-less, but neither of the two old Temex ones that I tried worked brilliantly.
Most rubidiums are so noisy that for RF applications you're often going to need to add an OCXO and lock that to the rubidium anyway.
I take your point over the use of a simple fan to control temperature. You've just got to pick a target temperature that's not too high for the sake of the electronics, yet not so low that the heaters in the physics package end up running too close to the limit for comfort at the lowest expected ambient temperature.
There's a table in the manual that shows the MTBF falling from 381,000 hours at 20°C to 134,000 hours at 60°C. Even so, there's really far too much concern out there about needing to run them at lower baseplate temperatures. The LPRO has a minimum operating temperature of -30°C, and it will often be operating in equipment with fans blowing air over it, keeping the whole case at ambient (probably not at -30°C...). Keeping the baseplate down towards room temperature - although not necessary - is not a crazy target, but it will increase the power consumption a bit and probably won't be practical with fan cooling unless you live in a far-northern cave. If the set point is too low, it will only take a sunny day, or a heater left on for longer than expected, for it to get too hot and potentially mess up a long test run.
I reckon my chosen instrument case might be large enough to avoid the need for ventilation: by varying the speed of recirculated airflow targeting the baseplate, I can control the heat dissipated into the relatively large surface area of the enclosure's panels. I've always got the option of adding ventilation to the case if such a cooling scheme proves inadequate during the final commissioning tests. It'll be interesting to see what stability improvement I can achieve with thermal regulation alone (first things first, especially the easiest ones).
I think it might be a challenge to keep the temperature down with a closed case. I tried that with a power supply and there was still a lot of heat build-up.
One issue with having the fan in more or less open air is that if the air temperature changes quickly (such as when a door is opened) it tends to cause a temperature spike - adding an extra sensor measuring the temperature of the cooling air should help. Something this size doesn't have a single temperature anyway, so it may be as well to just compensate for any remaining temperature fluctuations rather than trying too hard to remove them.
I was experimenting with a PWM fan that stopped completely at minimum rather than just running slowly, and combining that with feedback should give good control.
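The stop-at-minimum idea plus proportional feedback can be sketched roughly like this - Python pseudologic rather than actual Nano firmware, and the target temperature, gain, and hysteresis band are illustrative guesses, not measured values:

```python
def fan_duty(temp_c, target_c=35.0, gain=20.0, stop_below=-1.0, start_above=0.5,
             running=True, min_duty=25, max_duty=100):
    """One control step: return (duty_percent, running).

    The fan stops entirely well below target and only restarts once clearly
    above it, so it doesn't chatter on and off around the set point.
    All numbers here are placeholder assumptions.
    """
    error = temp_c - target_c
    if running and error < stop_below:
        return 0, False          # well under target: stop the fan completely
    if not running and error < start_above:
        return 0, False          # stay stopped until clearly over target
    duty = min_duty + gain * error           # simple proportional term
    duty = max(min_duty, min(max_duty, int(duty)))  # clamp to usable range
    return duty, True
```

The asymmetric stop/start thresholds are the part that matters: a plain proportional loop with a fan that stalls at low duty would otherwise hunt around the stall point.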
I've already purchased a BMP280 module from here:- https://www.ebay.co.uk/itm/BMP280-Pressure-Sensor-Module-Arduino-Precision-Atmospheric-BMP180-Replacement/283518687928?hash=item420307dab8:g:3gAAAOSwLdVcpzYQ
along with a couple of 4-channel bi-directional logic-level converters to interface it to my Nano3. That will let me experiment with the BMP280 libraries and create a suitable pressure-and-temperature sensor module to replace whatever thermistor-based fan controller I cobble together for my initial thermal regulation experiments.
Off the top of my head, I don't think you can sample barometric pressure much faster than ten times a second with the BMP280 anyway, which is at least an order of magnitude quicker than I'd need - even if I use an oversampling technique to filter out sampling noise for a once-per-minute update to the signal driving the EFC input of the RFS. Even that rate is probably far faster than would actually be called for.
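The oversample-then-slow-update scheme amounts to very little arithmetic. A minimal Python sketch of it - the reference pressure and the sensitivity figure are placeholder assumptions, and the mapping from fractional correction to actual EFC volts would need calibrating against the real unit:

```python
REF_PRESSURE_HPA = 1013.25     # assumed reference point, not a calibrated value
SENSITIVITY_PER_HPA = 1e-13    # order-of-magnitude LPRO pressure sensitivity

def averaged_pressure(samples_hpa):
    """Boxcar-average one burst of readings to knock down sensor noise."""
    return sum(samples_hpa) / len(samples_hpa)

def efc_correction(pressure_hpa):
    """Fractional-frequency correction to steer out via the EFC input.

    Sign convention assumed: higher pressure pulls the frequency up,
    so the correction is applied with the opposite sign.
    """
    return -(pressure_hpa - REF_PRESSURE_HPA) * SENSITIVITY_PER_HPA
```

On the Nano you'd collect a burst of samples once a minute, average them, and update the EFC DAC once; everything in between is idle time.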
Changes of barometric pressure, even with rapidly approaching storms here in the UK, seem unlikely to exceed 10 hPa per hour. Checking my inexpensive I.T. Works weather station's pressure-trend histogram just now shows that between 4 and 8 hours ago there was a 'sudden jump' (basically a 1 hPa per hour rise) of 4 hPa up to the current reading of 1010 hPa. Actually, the histogram just updated to show that same jump as now spanning an 8-hour period! At those rates of change, the Nano3 won't be working hard so much as hardly working at all.
Since the pressure sensitivity of an LPRO is <1E-13/mbar, while its nominal 1 s stability is far larger at 3E-11, there's not usually much need to worry about noise or sampling rate for pressure.
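A quick back-of-envelope check on those two figures - the 20 mbar storm swing is my own generous assumption:

```python
SENSITIVITY_PER_MBAR = 1e-13   # upper bound quoted for the LPRO
ADEV_1S = 3e-11                # nominal 1 s stability

def pressure_shift(swing_mbar):
    """Worst-case fractional frequency shift for a given pressure swing."""
    return SENSITIVITY_PER_MBAR * swing_mbar

shift = pressure_shift(20.0)   # a big storm-sized excursion: ~2e-12
margin = ADEV_1S / shift       # 1 s noise floor is roughly 15x larger
```

So even the whole excursion of a passing storm sits more than an order of magnitude below the 1 s noise, which is why a leisurely once-per-minute correction is plenty.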
That remark about your LPRO being so good that you lost interest in GPSDOs made me smile - I fully appreciate where you're coming from. However, if you want to check and redo the calibration from time to time, you'll need to keep hold of at least one GPSDO, even if its short-term phase stability leaves something to be desired compared to an RFS.
The timescales involved are long enough that raw GPS will usually be as good as or better than a GPSDO anyway, as long as there are no major sawtooth issues. For calibration, I usually just log 1000-second averages of the 1PPS derived from the LPRO against the 1PPS from GPS. That makes it very easy to check the drift over hours or days.
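That drift check is just fitting a slope to the logged offsets: the slope of phase offset versus elapsed time is the fractional frequency error. A minimal sketch - the least-squares fit is my choice of method, not necessarily how the logging is actually done:

```python
def fractional_offset(times_s, offsets_s):
    """Least-squares slope of 1PPS offset (s) vs. elapsed time (s).

    A constant offset between the two 1PPS edges drops out; only the
    rate of change matters, and that rate IS the fractional frequency error.
    """
    n = len(times_s)
    mean_t = sum(times_s) / n
    mean_o = sum(offsets_s) / n
    num = sum((t - mean_t) * (o - mean_o) for t, o in zip(times_s, offsets_s))
    den = sum((t - mean_t) ** 2 for t in times_s)
    return num / den
```

For example, an offset creeping by 1 ns per 1000 s corresponds to a fractional frequency error of 1E-12 - easily resolved over a day's logging.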
I checked the price of a ZED-F9-based plug-'n'-play module - too bloody expensive for my taste. I'll make do with a humble single-frequency timing module for my current GPSDO experiments for the time being. As for the idea of acquiring a TDC (I had to google that acronym - I'd seen it used before but couldn't recall exactly what it referred to), I'll manage a bit longer without one, just as I managed without an RFS until I simply had to take a closer look at how my GPSDOs were responding to ionospheric disturbance of the satellite signals' ToF. And to think this all started as a means of accessing a high-accuracy frequency reference to calibrate a TCXO (followed by an OCXO) upgrade of the crappy 50MHz XO SMD IC in a cheap FeelTech FY6600 AWG function generator.
With me it was to calibrate an oscillator for a clock...
There are lots of cheap boards with the TDC-GP21 or GP22 on them, and there are Arduino libraries for them. Just add an Arduino and a few other components and, for very little money, you can measure short time differences of up to 100us quite well. Much cheaper to buy and run than a frequency counter.
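For anyone wondering how these chips resolve intervals far finer than their reference clock: conceptually the result is a coarse reference-clock count plus a fine interpolator count. The tick and LSB values below are made-up round numbers to show the arithmetic, not GP21 datasheet figures:

```python
COARSE_TICK_S = 250e-9   # hypothetical 4 MHz reference clock period
FINE_LSB_S = 100e-12     # hypothetical interpolator least significant bit

def interval_s(coarse_count, fine_count):
    """Reconstruct a measured interval from coarse + fine counts."""
    return coarse_count * COARSE_TICK_S + fine_count * FINE_LSB_S
```

The interpolator is what buys the sub-nanosecond resolution; the coarse counter just extends the range out to the tens of microseconds mentioned above.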
The TI TDC7200 is probably easier to use, but I've not seen any cheap boards for them.
This may be getting a little OT for this thread, but it is one of the few ways of actually testing a GPSDO. The self referencing measurements can only go so far.