I haven't looked much at the cal system yet, but I would think that if you run the cal and it still reports back that cal is needed, then you've probably got a hardware issue.
On the other hand, let's have a look at what affects auto-cal:
First there is the temperature drift auto-cal setting located under Utilities->Preference Setup:
Hypothesis: Leave it on and auto-cal will run periodically as the room/instrument temperature changes; leave it off and auto-cal will not run, but accuracy will then be a function of temperature.
Service menu (have to log in) Production->AladdinAcqBoard->Calibration:
It appears that you can inspect, adjust, verify, and recalibrate the various curves here. There is also an "Auto cal" checkbox that does not seem to track the user-settable temperature auto-calibration, so I think it might be something else.
Hypothesis: Unchecking this auto-cal box may disable automatic calibration when switching timebase and vertical input scales.
Service menu (have to log in) Service->AladdinAcqBoard->AutoCal:
Here we can see and perform the individual calibrations that are done as part of auto-calibration. These are pretty self-explanatory except for "Hopping." Perhaps someone else knows more about this term than I do. LeCroy has a patent where they refer to using "hopping circuitry" to resolve digital divider phase. My guess would be that the "Hopping" cal either synchronizes digital dividers across channels or has something to do with ADC interleave alignment.
To run the calibration you stop the scope (Trigger->Stop), check the boxes for the calibrations you want to run, then press the "Calibrate Scope" button.
Then in the bottom-right is the front-end gain cal optimize setting:
These are curious choices since they don't seem like a natural trade-off. I guess in the one case you will have fewer but longer calibrations and in the other you will have more frequent but shorter calibrations. Sounds like a wash in the end to me...
Finally there is the calibration cache which I have mentioned before in another thread on here:
The cache is a password-protected Access database with a table for the instrument setup and tables for each of the calibration curves (ADC delay, ADC FE gain, ADC FE offset, ADC gain, ADC offset, trigger propagation delay, and positive and negative trigger threshold).
An instrument setup defines the instrument state that the calibration curves were taken at. The setup is made up of the volts/div, fixed gain, input attenuation, bandwidth, and input coupling of each channel; the horizontal time scale; whether or not an external clock is being used; the trigger type and source channel; and temperature. Change any one of those variables and you get a new row in the setup table, which probably means a new calibration will be run. I highlight temperature because it is the one thing outside of user control.
Each curve table has a foreign key referencing the setup table's (CacheSetup) primary key, and it looks like there are up to 24 curve rows per curve table per setup. For 4 channels that would be 6 points per channel.
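To make the structure concrete, here is a rough sketch of how I read the schema, using SQLite as a stand-in for the real Access file. Every table and column name except CacheSetup is my own guess, purely for illustration:

```python
import sqlite3

# Toy model of the cal cache described above. SQLite stands in for the
# password-protected Access database; names other than CacheSetup are guesses.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE CacheSetup (
    id INTEGER PRIMARY KEY,
    volts_per_div TEXT,   -- per-channel V/div (per-channel columns in reality)
    fixed_gain TEXT,
    input_atten TEXT,
    bandwidth TEXT,
    coupling TEXT,
    time_scale REAL,
    ext_clock INTEGER,
    trigger_type TEXT,
    trigger_source INTEGER,
    temperature REAL      -- the one variable outside user control
);
-- One table like this per curve (ADC delay, ADC FE gain, ADC FE offset, ...)
CREATE TABLE AdcGainCurve (
    setup_id INTEGER REFERENCES CacheSetup(id),
    channel INTEGER,
    point INTEGER,
    value REAL
);
""")

# A distinct combination of settings + temperature is one setup row.
db.execute("INSERT INTO CacheSetup (id, temperature) VALUES (1, 24.5)")

# Up to 24 curve rows per curve table per setup: 4 channels x 6 points.
db.executemany(
    "INSERT INTO AdcGainCurve VALUES (1, ?, ?, 1.0)",
    [(ch, pt) for ch in range(4) for pt in range(6)],
)

n = db.execute("SELECT COUNT(*) FROM AdcGainCurve WHERE setup_id = 1").fetchone()[0]
print(n)  # 24 = 4 channels * 6 points
```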
Hypothesis: Leaving the cal cache enabled (both boxes checked) will, over time as it grows, reduce the number of automatic calibrations required when twirling knobs at the same temperature. Leaving it off will maximize auto-calibration pain.
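If that hypothesis is right, the lookup would work something like the toy sketch below: the scope keys a calibration on the full setup including temperature, so identical knob settings at the same temperature hit the cache, while any change forces a fresh auto-cal. The whole-degree temperature quantization here is pure guesswork on my part:

```python
# Toy model of the cache-hit hypothesis; not the actual firmware logic.
cache = {}

def get_cal(vdiv, timebase, temp_c):
    # Hypothetical: temperature presumably only has to match within some
    # band; quantizing to whole degrees here purely for illustration.
    key = (vdiv, timebase, round(temp_c))
    if key not in cache:
        cache[key] = f"curves for {key}"  # stand-in for running auto-cal
        print("auto-cal ran for", key)
    return cache[key]

get_cal(0.5, 1e-6, 24.2)  # first time at this setup: auto-cal runs
get_cal(0.5, 1e-6, 24.4)  # same setup, same temp band: cache hit, no cal
get_cal(0.5, 1e-6, 27.9)  # temperature drifted: new setup row, cal again
print(len(cache))  # 2 distinct setups cached
```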