Apparently they made a special low-bandgap LED with about 1 mW of light output for about 100 mW of electrical input, so it is a 1% efficient LED. Normal LEDs have efficiencies of roughly 4% to 18%, so this one is pretty useless for practical purposes.
They then heated it to 135 deg C and lowered the electrical input to 30 picowatts. It emitted 69 picowatts of light, so by subtraction it must have been cooling itself by about 40 picowatts. They claim 230% electrical efficiency (ignoring the heat energy they are pumping into the LED), but the cooling itself is far too small an effect to detect thermally. At 85 deg C the efficiency only reached about 10% at 30 picowatts, and at room temperature it had fallen to about 0.1% at 30 picowatts.
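For anyone who wants to check the arithmetic, here is a quick sketch using just the power figures quoted above (the "efficiency" is simply optical power out over electrical power in, and any excess has to be drawn from the LED's own heat):

```python
# Sanity check of the numbers quoted above (all values taken from the post).
P_in_normal = 100e-3    # W, electrical input in normal operation
P_out_normal = 1e-3     # W, optical output -> about 1% wall-plug efficiency

P_in_low = 30e-12       # W, electrical input at 135 deg C
P_out_low = 69e-12      # W, optical output at 135 deg C

eff_normal = P_out_normal / P_in_normal   # = 0.01  (1%)
eff_low = P_out_low / P_in_low            # = 2.3   (230%)
cooling = P_out_low - P_in_low            # = 39e-12 W drawn from the device as heat

print(f"Normal efficiency:           {eff_normal:.0%}")
print(f"Efficiency at 30 pW, 135 C:  {eff_low:.0%}")
print(f"Implied cooling power:       {cooling * 1e12:.0f} pW")
```

So the 230% figure and the ~40 picowatts of cooling both follow directly from the 30 pW in / 69 pW out numbers.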
Sounds like the idea is a long way from being a useful technology. By pumping in lots of heat, they were able to get an unmeasurably small cooling effect at nearly unmeasurable power levels. I think a far higher operating temperature would be needed to get any kind of useful effect.
Richard.