http://dvice.com/archives/2012/03/230-efficient-l.php

> "When the LED gets more than 100% electrically efficient, it starts to cool itself down, which is another way of saying that it's stealing energy (in the form of heat) from its environment and converting that heat into those over-unity photons."
Don't get your hopes up, though; it currently only works at 69 picowatts:

> "LED that produces 69 picowatts of light while using just 30 picowatts of power."
Sounds like it might be a more efficient cooling device than light emitter.
Very interesting, more so for the potential of creating a new type of thermal imaging device. If the heat influences the brightness of these LEDs, then I'd imagine you could create an LED matrix, focus infra-red light on it, and potentially form an image in visible light. For example, you could use a CCD with these LEDs deposited on top of each CCD pixel, then use image processing to detect brightness fluctuations, etc.
I want to know how you accurately measure that an LED is emitting 69 picowatts of light. It wouldn't be hard to make an error.
So, does it just not put out as much heat, plateau, or does it actually get cold after you reach 100%?
> I want to know how you accurately measure that an LED is emitting 69 picowatts of light. It wouldn't be hard to make an error.
Some photomultiplier tubes and avalanche photodiodes can detect light at one photon at a time if you are so inclined.
It is certainly possible to get cooled CCDs to detect single photons and measure the energy levels.
This seems to be what is basically a Peltier/Seebeck module with the output being photons rather than heat.
You can't get over 100% efficiency... if that's what this is, wouldn't it break the laws of percentage?
100% efficiency already produces no heat... but instead this cools the LED down? WTF?
Agreed, but what do you mean by laws of percentage?
So it operates in an open system, with external heat energy being converted to photons. This means that the device is operating at less than 100% efficiency, but has a COP of 2.3. Not impossible, considering the heat exchangers and compressor in a heat pump operate at less than 100% efficiency, yet the heat pump generally operates at COPs above 1.
> I want to know how you accurately measure that an LED is emitting 69 picowatts of light. It wouldn't be hard to make an error.
One way would be to create an array of 10^12 devices and check that they emit 69 watts of light when supplied with 30 watts.
Scientists do use lasers to cool down things. I know, it's counter-intuitive, but it's true. So I can totally believe that at a certain level an LED could cause a cooling effect. I would, however, like to see them repeat this particular experiment and collect temperature readings of the diode and the ambient environment for a long time. That would be interesting.
Now I want my flashlight that can be strapped to a can of beer. A bright portable fridge, folks!
I think it is the substance horses leave behind!
> So it operates in an open system, with external heat energy being converted to photons. This means that the device is operating at less than 100% efficiency, but has a COP of 2.3. Not impossible, considering the heat exchangers and compressor in a heat pump operate at less than 100% efficiency, yet the heat pump generally operates at COPs above 1.
Well, the wall (or whatever) hit by the light warms up, so the total amount of heat increases. It will be interesting to see if they can scale these devices up to, say, 1 W or more.
Apparently they made a special low-bandgap LED with about 1 mW of light output when the power in was about 100 mW. So that is a 1% efficient LED. Normal LEDs have efficiencies of about 4% to 18%, so this is a pretty useless LED for practical purposes.
They then heated it to 135 deg C and lowered the electrical power down to 30 picowatts. It emitted 69 picowatts of light, so by calculation, it must have been cooling the LED at about 40 picowatts. They claim a 230% electrical efficiency (ignoring the heat energy they are pumping into the LED). Far too low an effect to detect thermally. At 85 degrees, the efficiency only reached about 10% at 30 picowatts, and at room temperature, it had fallen to about 0.1% at 30 picowatts.
Sounds like the idea is a huge way from being a useful technology. By pumping in lots of heat, they were able to get an unmeasurable cooling effect at nearly unmeasurable power levels. I think to get any kind of useful effect, a far higher temperature is needed.
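The arithmetic from those figures checks out; a quick sketch, using just the numbers quoted in this thread (not taken from the paper directly):

```python
# Figures as quoted above (assumed from the thread, not the paper itself)
p_in = 30e-12    # electrical input power, watts
p_out = 69e-12   # optical output power, watts

efficiency = p_out / p_in   # "electrical" efficiency, ignoring the heat input
p_cooling = p_out - p_in    # net power drawn from the lattice as heat

print(f"efficiency: {efficiency:.0%}")         # 230%
print(f"cooling: {p_cooling / 1e-12:.0f} pW")  # 39 pW, the "about 40" above
```

The missing 39 pW is what has to come out of the crystal lattice as heat, which is the entire cooling effect being discussed.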
Richard.
If they can get this going, it may work as part of the LED driver's cooling system; by dumping part of the waste heat as part of your target output, you can save on the big-ass heat sinks.
So for tens to hundreds of watts of heat and 30 picowatts of electrical power, they get 69 picowatts of light.
> So for tens to hundreds of watts of heat and 30 picowatts of electrical power, they get 69 picowatts of light.

![Roll Eyes ::)](https://www.eevblog.com/forum/Smileys/default/rolleyes.gif)
And it makes you wonder whether they are theoretical scientists or actual engineers.
Some of you are missing the point. The experiment was done in order to provide empirical evidence for a theory related to semiconductor physics, not to provide a practical solution for a particular problem.
> Some of you are missing the point. The experiment was done in order to provide empirical evidence for a theory related to semiconductor physics, not to provide a practical solution for a particular problem.
The researchers are actually promoting this idea as having the potential to be useful in the future, and the truth is, they are an extremely long way from that goal right now. It is not just abstract research to prove a theory.
At the moment, at 30 pW, only something like one in a thousand electrons flowing into the device is triggering light directly, so according to the theory, 99.96% of the emitted light is coming from the vibrational energy of the crystal lattice. I cannot help wondering if they could make a material that could emit light from thermal energy without needing any electricity. Or could you get some kind of chain-reaction point where the amount of electricity generated by the re-absorption of some of the generated light is enough to sustain the conversion process? So all you need is to expose a panel to light to start it converting heat to light, and then it does it on its own continuously from there. The only way to shut it down is to let the panel cool till it stops.
Richard.
> The researchers are actually promoting this idea as having the potential to be useful in the future, and the truth is, they are an extremely long way from that goal right now. It is not just abstract research to prove a theory.
Unfortunately, the research paper is behind a paywall, so I don't know for sure whether they are promoting anything. I didn't see it that way, though, and I certainly take second-hand reporting on these topics with a grain of salt.
> I cannot help wondering if they could make a material that could emit light from thermal energy without needing any electricity. Or could you get some kind of chain-reaction point where the amount of electricity generated by the re-absorption of some of the generated light is enough to sustain the conversion process? So all you need is to expose a panel to light to start it converting heat to light, and then it does it on its own continuously from there. The only way to shut it down is to let the panel cool till it stops.
Up-converting phosphors certainly do exist [1]. From memory, some phosphors had excitation wavelengths in the vicinity of 1000 nm, and peak emission around ~600 nm (I can't remember the exact figures), which in itself was pretty cool. I'd imagine shifting the excitation response into the thermal range is not a trivial exercise. At any rate, I don't know enough on the topic to offer anything more insightful. I just hope this tech will open up new avenues in light emission research.
1. Yamamoto et al., Phosphor Handbook, CRC Press, 2007
> The researchers are actually promoting this idea as having the potential to be useful in the future,
Of course they do, they need funding.
The authors do hint at it:
> Although this result is of basic scientific interest, recently it has also drawn interest in the applied community for its potential in lighting and solid-state cooling [3–10].
The new discovery is not so much the conversion of heat to light, but that the cooling power is more than the heating (loss) caused by inefficiency, hence the claim that the efficiency is > 100%.
> I cannot help wondering if they could make a material that could emit light from thermal energy without needing any electricity. Or could you get some kind of chain-reaction point where the amount of electricity generated by the re-absorption of some of the generated light is enough to sustain the conversion process? So all you need is to expose a panel to light to start it converting heat to light, and then it does it on its own continuously from there. The only way to shut it down is to let the panel cool till it stops.
The semiconductor junction needs to be biased, so I don't see it happening without electricity. Bias voltage was about 70 µV in this case, hence the extremely low power.
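As a sanity check on those figures, the junction current is just P/V, using the 30 pW and 70 µV values quoted above:

```python
p_elec = 30e-12   # electrical input power, watts (figure from the thread)
v_bias = 70e-6    # junction bias voltage, volts (figure from the thread)

i = p_elec / v_bias   # junction current, amps
print(f"current: {i * 1e6:.2f} uA")  # ≈ 0.43 µA
```

A sub-microamp current at tens of microvolts is well below the turn-on voltage of any ordinary LED, which is why this regime is so far from practical lighting.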
OK, put 15 billion of them side by side. Then we'll get 1 W of light from a 0.43 W PSU, and a nice air-conditioning effect.
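Those back-of-the-envelope numbers do roughly hold up; a quick sketch, again assuming the 69 pW out / 30 pW in figures from the thread apply per device:

```python
p_out = 69e-12   # light output per device, watts (assumed from the thread)
p_in = 30e-12    # electrical input per device, watts (assumed from the thread)

n = 1.0 / p_out        # devices needed for 1 W of light
p_supply = n * p_in    # total PSU power for the array

print(f"devices: {n:.2e}")             # ~1.45e10, i.e. about 15 billion
print(f"PSU power: {p_supply:.2f} W")  # ≈ 0.43 W
```

Whether 15 billion junctions would still behave this way when packed together and no longer held at 135 °C is, of course, another question entirely.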
> So, how does this not totally and completely invalidate the conservation of energy? We have to look at how energy flows through the entire system, not just at the electricity in and light out. When the LED gets more than 100% electrically efficient, it starts to cool itself down, which is another way of saying that it's stealing energy (in the form of heat) from its environment and converting that heat into those over-unity photons.
So they're claiming it's not breaking the first law of thermodynamics but the second? It's still total bollocks. Heat naturally flows from a hotter place to a cooler one; to get it to flow in the opposite direction requires a heat pump, which needs energy.