The drift in the array is high, so I'm going on the principle that the thermal capacitance of each pixel is low. Any heat on the pixel, whether from incoming radiation or from the measurement itself, will quickly fill that tiny thermal well, since it is so much smaller than anything achieved before. That puts you in new territory where barely measurable changes in heat in the pixel cause rapid swings in resistance. The engineers likely decided on a longer integration time to get the signal above the ROIC read noise, but that also increases the time that current passes through the pixel from frame to frame. With each pixel at a unique resistance, every one self-heats by a different amount of current, and combined with a small thermal well that adds up to quick thermal drift and frequent NUC events. My theory is to reduce the integration time to just enough to generate signal above the read noise, sacrificing some low-end temperature response in order to minimize drift. It would also provide more samples to average out any other noise. It *is* just a theory, so don't take any of it too seriously.
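To put rough numbers on the idea: per-frame self-heating is Joule power times integration time divided by thermal capacitance, ΔT ≈ I²·R·t_int / C_th. Here's a back-of-the-envelope sketch; every value (bias current, resistance, capacitance, integration times) is a made-up placeholder, not from any real sensor, and it ignores heat leaking back out through the pixel legs, so it's a worst-case toy model:

```python
def self_heating_K(bias_current_A, resistance_ohm, t_int_s, c_th_J_per_K):
    """Temperature rise of one pixel from Joule heating during a single
    integration window, assuming no heat escapes (worst case)."""
    power_W = bias_current_A ** 2 * resistance_ohm
    return power_W * t_int_s / c_th_J_per_K

C_TH = 1e-9      # thermal capacitance, J/K -- illustrative guess
R = 100e3        # pixel resistance, ohms -- illustrative guess
I_BIAS = 5e-6    # bias current, A -- illustrative guess

long_dT = self_heating_K(I_BIAS, R, 100e-6, C_TH)   # 100 us integration
short_dT = self_heating_K(I_BIAS, R, 30e-6, C_TH)   # 30 us integration

print(f"long integration:  dT per frame = {long_dT:.3f} K")   # 0.250 K
print(f"short integration: dT per frame = {short_dT:.3f} K")  # 0.075 K
```

Self-heating scales linearly with integration time, so cutting t_int by 3x cuts the per-frame heat deposit by 3x. The catch, as noted above, is that signal also grows with t_int while read noise doesn't, which is exactly the low-end response you'd be trading away.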