I have a strong belief about what is going on; it's just difficult for me to put it into words.
The gradient is programmed, but the relationship between temperature and the colors of the gradient spectrum is established by the first temperature reading. As an example, the pantry image, where 46F showed as purple/blue, was the same color as my first reading when the camera was in the freezer. Different temperatures, but the same color. The gradient is dynamic in the sense that when the sensor detects higher radiation in one corner, the rest of the frame compensates by shifting that blue color away from its initial location. So what was once blue is now white, and what was once orange is now blue.
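If I'm right, that would be like the camera auto-ranging its colormap every frame. Here's a rough sketch of what I mean (purely hypothetical, nothing to do with the camera's actual firmware): normalize each frame against its own min and max before picking colors, and the same absolute temperature lands on different colors depending on the scene.

```python
import numpy as np

def colorize_auto_range(frame):
    """Hypothetical auto-ranging palette: color depends on the frame's
    own min/max, not on absolute temperature."""
    lo, hi = frame.min(), frame.max()
    norm = (frame - lo) / max(hi - lo, 1e-6)  # 0..1 within THIS frame only
    return norm  # imagine 0.0 -> blue, 0.5 -> orange, 1.0 -> white

# -18C freezer scene: its coldest spot normalizes to 0.0 (blue)
freezer = np.array([[-18.0, -17.5], [-17.0, -16.5]])
# 8C pantry scene: ITS coldest spot also normalizes to 0.0 (blue)
pantry = np.array([[8.0, 9.0], [10.0, 12.0]])

print(colorize_auto_range(freezer)[0, 0])  # 0.0 -> blue
print(colorize_auto_range(pantry)[0, 0])   # 0.0 -> blue, same color, very different temp
```

With a mapping like that, the coldest pixel in any scene always gets the same color no matter what its real temperature is, which would explain the freezer/pantry match.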
If radiation on the sensor exceeds what it first detects from inside the housing, there will be no gradient.
The sensor detects radiation from the chips, enough to overpower the temperature reading in the freezer. The chips must be better isolated from the sensor.
If the sensor were placed directly over the chips, centered on the PCB, I am almost certain there would be no linear gradient; at worst, a radial one.
I also noticed that the way a traditional camera reacts to different light conditions by adjusting exposure is the same thing the thermal camera does with changing levels of radiation, and it creates an awful image.
The lens has a fixed aperture, and I'm wondering if different apertures would create a sharper image and affect the range of the gradient.
It's more than one issue.
This video shows how the color association changes. Keep your eye on the bottom left. It shouldn't change color at all, as the freezer temp hasn't changed. The color only changes as my hand moves toward it, re-associating the color with the changing temperature introduced by my hand.
Video:
http://youtu.be/e5M3v7k3qa8?list=UUnTjoJWDkUC85-eqBk2lLyQ

This video shows how the camera falsely associates color with temperature. When hotter temps are detected, the color previously associated with a lower temp changes, and so does the gradient range. When there is very little difference between hot and cold temps, such as when the camera is pointed at the radiator, there is very minimal gradient and a Gaussian blur is applied. When the camera is taken into a room that is 10C or so, the curve drops and all hell breaks loose: the TV is oversaturated, showing white in areas that would make one assume it was a high temp, when really it was only a few degrees above this room's ambient temp.
Video:
http://youtu.be/-GG8PaPEkF0?list=UUnTjoJWDkUC85-eqBk2lLyQ

Now, if the camera had maintained the range of the gradient and the colors associated with each temperature, the image quality should have stayed the same regardless of ambient temperature. The TV would show a hue of orange, perhaps, and the rest of the items in the cold room hues of blue.
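That's basically a fixed-range mapping instead of an auto-ranging one. Same toy sketch as above, but with the colors anchored to an absolute temperature span (the 0-40C range here is just something I picked for illustration):

```python
import numpy as np

def colorize_fixed_range(frame, t_min=0.0, t_max=40.0):
    """Hypothetical fixed-range palette: every temperature always maps
    to the same color, regardless of the rest of the scene."""
    norm = (frame - t_min) / (t_max - t_min)
    return np.clip(norm, 0.0, 1.0)  # 0.0 -> blue, 1.0 -> white (toy palette)

# A 30C TV in a 20C room, and the same TV in a 10C room
warm_room = np.array([[30.0, 20.0]])
cold_room = np.array([[30.0, 10.0]])

print(colorize_fixed_range(warm_room))  # TV stays at ~0.75 -> same orange hue
print(colorize_fixed_range(cold_room))  # only the background color changes
```

Anchored like that, the TV keeps roughly the same hue in both rooms and only the background shifts, which is what I'd have expected the camera to do.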
I don't know if any of this makes sense, but these are my thoughts, and if I had the technical ability I would tear the camera apart and test these theories out.