To be very pedantic, the "CRI is poor" --> "you can see the imperfect white in the screen" conflation is a bit of a red herring -- although the huge blue peak contributes to both a blue cast in the light and a poor CRI, those are totally different and independent consequences.
For example, you could tune three red, green and blue lasers to produce a mixture that looked like perfect white to the human eye. It's only when you use that light to illuminate some fruit* that you can suddenly tell that something is terribly wrong. Hence the three lasers would combine to make a very pure white, but with a very poor CRI.
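Just to put rough numbers on that thought experiment -- here's a minimal sketch in Python/NumPy, with laser wavelengths and CIE 1931 colour-matching values that are approximate figures I've picked for illustration, not from any table or datasheet. It solves for the relative laser powers that put the three-spike mixture exactly on equal-energy white:

```python
import numpy as np

# Rough CIE 1931 colour-matching values (x-bar, y-bar, z-bar) at three laser
# lines -- approximate numbers from memory, purely for illustration.
# Columns: 450 nm, 530 nm, 635 nm.
cmf = np.array([
    [0.336, 0.166, 0.542],   # x-bar
    [0.038, 0.862, 0.217],   # y-bar
    [1.772, 0.042, 0.000],   # z-bar
])

# Target: equal-energy white E, whose tristimulus values are simply (1, 1, 1).
white_XYZ = np.array([1.0, 1.0, 1.0])

# Solve cmf @ powers = white_XYZ for the relative laser powers.
powers = np.linalg.solve(cmf, white_XYZ)
print("relative laser powers (450/530/635 nm):", np.round(powers, 3))

# Chromaticity of the mixture: it lands exactly on white, even though the
# spectrum is just three spikes and its CRI would be dreadful.
XYZ = cmf @ powers
x, y = XYZ[0] / XYZ.sum(), XYZ[1] / XYZ.sum()
print(f"mixture chromaticity: x={x:.3f}, y={y:.3f}  (white E is x = y = 1/3)")
```

All three powers come out positive and the chromaticity sits at x = y = 1/3, i.e. dead-neutral white from a spectrum with huge holes in it.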
On the flip side, a cool white incandescent (if such a thing even exists) would have a noticeable blue tinge, but fruit* would look right -- it'd have a pretty decent CRI. So the warmth/coolness/pinkishness of a light source is almost completely orthogonal to its CRI, hence the existence of the CRI measurement.
* "Fruit" used here as a placeholder for "things that our brains are deeply familiar with the normal daylight response for."
An LCD backlight does not need a good CRI. For a large color range and the best efficiency you actually want three peaks, sitting right where the color filters of the LCD panel are. So it's no wonder the spectrum drops all the way to zero between them. One thing I am wondering about is why there is so little red light -- maybe the TV was not the best at showing bright red pictures.
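As a toy illustration of the efficiency point -- the Gaussian shapes for the backlight emission and the green colour filter below are completely made up, just to show the effect of lining a narrow peak up with the filter passband:

```python
import numpy as np

wl = np.arange(380.0, 781.0)  # wavelength grid in nm

def gaussian(centre, width):
    return np.exp(-0.5 * ((wl - centre) / width) ** 2)

green_filter = gaussian(535, 25)   # hypothetical green colour-filter passband

narrow_peak = gaussian(535, 10)    # backlight peak sitting right on the filter
broad_peak  = gaussian(555, 60)    # broad, daylight-ish emission

def transmitted(spectrum):
    """Fraction of the backlight power that makes it through the filter."""
    return (spectrum * green_filter).sum() / spectrum.sum()

print(f"narrow peak on the filter: {transmitted(narrow_peak):.0%} transmitted")
print(f"broad emission:            {transmitted(broad_peak):.0%} transmitted")
```

With these made-up shapes, most of the narrow peak survives the filter while the majority of the broad emission is simply absorbed -- which is why a peaky, low-CRI backlight is the efficient choice here.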
Indeed -- the absolute optimal spectrum for a domestic light source is a CRI=100, broad, daylight-like spectrum, whereas the optimal spectrum for an LCD display is three sharp peaks (which this newfangled quantum dot TV stuff is ostensibly working towards) in order to maximise the gamut of the display.
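Rough numbers on the gamut point, as a sketch: the laser chromaticities below are approximate spectral-locus values from memory for roughly 635/532/450 nm, compared against the standard sRGB primaries, with triangle areas from the shoelace formula.

```python
# Triangle areas in CIE 1931 xy chromaticity space, via the shoelace formula.
# Laser chromaticities are approximate spectral-locus points (~635/532/450 nm);
# the sRGB primaries are the standard ones.
def triangle_area(pts):
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

laser_primaries = [(0.714, 0.286), (0.170, 0.797), (0.157, 0.018)]
srgb_primaries  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]

print(f"laser-primary gamut area: {triangle_area(laser_primaries):.3f}")
print(f"sRGB gamut area:          {triangle_area(srgb_primaries):.3f}")
```

The monochromatic primaries cover roughly twice the area of the sRGB triangle, which is the whole appeal of pushing the backlight peaks narrower.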
Regarding the weak red response: maybe the LCD panel had a warm/reddish cast when it was told to go fully white, so that the entire system ended up with a fairly neutral look?