In one sense this is true: LEDs are cool to the touch because they generally don't produce heat in the form of infrared (IR) radiation (unless of course they are IR LEDs).
IR radiation heats the enclosures and surroundings of incandescent bulbs and other sources, making them hot to the touch. The absence of IR radiation allows LED fixtures to be positioned in locations where heating from conventional sources would cause a particular problem, e.g. when illuminating food or textiles.
However, crucially, heat is produced within the LED device itself, due to the inefficiency of the semiconductor processes that generate light. The wall-plug efficiency (optical power out divided by electrical power in) of LED packages is typically in the region of 5-40%, meaning that somewhere between 60 and 95% of the input power is lost as heat.
The energy consumed by a 100-watt GLS incandescent bulb produces around 12% heat, 83% IR and only 5% visible light. In contrast, a typical LED might produce 15% visible light and 85% heat.
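The relationship between wall-plug efficiency and waste heat is simple enough to sketch in a few lines of Python. The function and the operating values below are illustrative assumptions, not figures for any particular device:

```python
def heat_power(electrical_power_w, wall_plug_efficiency):
    """Power lost as heat: input power minus the optical power emitted."""
    return electrical_power_w * (1.0 - wall_plug_efficiency)

# A hypothetical 3 W LED package at 15% wall-plug efficiency:
heat_w = heat_power(3.0, 0.15)
print(f"{heat_w:.2f} W must be removed as heat")  # ~2.55 W of the 3 W input
```

Even at the upper end of the efficiency range quoted above, the majority of the input power still ends up as heat in the package.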
Especially with high-power LEDs, it is essential to remove this heat through efficient thermal management. Without good heat sinking, the internal (junction) temperature of the LED rises, and this causes the LED characteristics to change.
As the junction temperature of an LED is increased, both the forward voltage and the lumen output decrease (see figure 1). The output wavelength also shifts with a change in junction temperature.
A key factor here is that LED performance is measured by the manufacturers under laboratory conditions, and is usually specified at a junction temperature of 25 °C, even though this condition almost never occurs in real applications.
Most significantly, the junction temperature affects the lifetime of the LED. Unlike other light sources, LEDs don’t tend to fail catastrophically (although a small number do, especially if you cook them); instead, the output of the LED decreases over time. Note that this also happens for other light sources.
As shown in figure 2, elevated junction temperatures lead to more rapid deterioration. Driving LEDs above their rated peak current causes the junction temperature to rise to levels where permanent damage may occur.
What affects junction temperature?
The ambient temperature and the drive current both affect the junction temperature of LEDs. Other influences include whether the light output is steady state or pulsed, and the LED wattage per unit of heat-dissipating surface area.
The key factor is the thermal path from the LED junction to ambient, i.e. the environment outside the package (see figure 3). Heat should be conducted away from the LED in an efficient manner, and then removed from the area by convection. This latter process can be passive, involving convection from the outside of the package or from a finned heatsink with a large surface area, while higher-power arrays may require active convection using forced air cooling (i.e. a fan) or water cooling.
Optimizing heat transport
A key factor is to use materials with a high thermal conductivity to move heat away from the junction as quickly as possible. Unfortunately, some high thermal conductivity materials such as copper are also relatively expensive, and there is a trade-off between cost, performance, footprint, manufacturability and other factors.
Heat-spreading materials should be used that have high thermal conductivity both laterally (in the x and y directions) and vertically through the base of the device. It is also necessary to optimize the number of LEDs relative to the available surface area.
Each interface produces a resistance to heat transfer, which means the number of interfaces should be reduced, and the thermal resistance between mating surfaces should be minimized. Surfaces are rarely smooth, for example a metal heat sink might have grooves where it has been milled. If two rough surfaces are mated, most of the heat transfer takes place via point contacts, since air is a poor conductor of heat. This problem can be overcome by filling the gap with a soft, thermally conductive material (see figure 4).
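The contribution of a gap-filling layer can be estimated with the standard conduction formula R = thickness / (k × area). The sketch below uses assumed values for thickness, conductivity and contact area, chosen purely for illustration:

```python
def interface_resistance(thickness_m, k_w_per_mk, area_m2):
    """Thermal resistance of a uniform layer: R = t / (k * A), in degC/W."""
    return thickness_m / (k_w_per_mk * area_m2)

# 0.2 mm of a thermal pad (k = 3 W/m.K) over a 1 cm^2 contact patch:
r_pad = interface_resistance(0.2e-3, 3.0, 1e-4)
print(f"{r_pad:.2f} degC/W")  # roughly 0.67 degC/W added to the thermal path
```

The same formula shows why a thin, high-conductivity, well-wetted layer is preferable: halving the thickness or doubling the conductivity halves the added resistance.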
Calculating thermal resistance
The thermal resistance (units: °C/watt) between two points is the temperature difference that develops between those two points for each watt of power dissipated. Or:
temperature change = thermal resistance × LED power

(where LED power = voltage × current)
Clearly, running the LED at a higher current results in a greater temperature change, while a low thermal resistance is highly advantageous in keeping that change to a minimum.
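The relationship above can be written as a short sketch. The thermal resistance, forward voltage and drive current below are assumed example values, not data-sheet figures:

```python
def temperature_rise(r_th_c_per_w, forward_voltage_v, current_a):
    """Temperature change = thermal resistance * LED power, LED power = V * I."""
    led_power_w = forward_voltage_v * current_a
    return r_th_c_per_w * led_power_w

# A hypothetical LED with Vf = 3.2 V driven at 700 mA,
# through a total thermal resistance of 10 degC/W:
print(f"{temperature_rise(10.0, 3.2, 0.7):.1f} degC rise")  # ~22.4 degC
```

Doubling the current (at roughly constant forward voltage) doubles the power and therefore the temperature rise, which is why drive current and thermal design cannot be considered separately.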
The total thermal resistance (Rth) from junction to ambient is the sum of the individual Rth values. Referring to figure 3:
Rth(junction to ambient) = Rth(junction to case) + Rth(case to ambient)
Rth(junction to case) should be provided by the LED manufacturer's data sheet. Rth(case to ambient) can be calculated by measuring the temperature change from case to ambient. This is generally done using a thermocouple, ideally in a hole drilled in the side of the board directly under the device.
With the above information, it should be possible to calculate the junction temperature of the LED under different operating conditions, which in turn should allow the lifetime of the LEDs to be estimated in the fixture in question.
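Putting the pieces together, the junction-temperature estimate described above can be sketched as follows. The ambient temperature, Rth values and operating point are illustrative assumptions; in practice Rth(junction to case) comes from the data sheet and Rth(case to ambient) from the thermocouple measurement:

```python
def junction_temperature(t_ambient_c, r_jc_c_per_w, r_ca_c_per_w,
                         forward_voltage_v, current_a):
    """Tj = Ta + (Rth(junction-case) + Rth(case-ambient)) * (V * I)."""
    r_total = r_jc_c_per_w + r_ca_c_per_w
    led_power_w = forward_voltage_v * current_a
    return t_ambient_c + r_total * led_power_w

# Hypothetical fixture: 35 degC ambient, Rjc = 6 degC/W, Rca = 8 degC/W,
# LED at 3.2 V and 500 mA:
tj = junction_temperature(35.0, 6.0, 8.0, 3.2, 0.5)
print(f"Tj = {tj:.1f} degC")  # ~57.4 degC, well above the 25 degC spec point
```

The resulting Tj can then be read against the manufacturer's lumen-maintenance curves (as in figure 2) to estimate lifetime in the fixture.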
It’s imperative that the thermal performance of the entire system is taken into account. For example, LEDs have been placed into IP-rated fixtures that are used as floodlights. This creates a sealed skin around the LED module, forming an air pocket which prevents efficient thermal transfer to the outside surface. The lack of a radiator is a sign that heat sinking isn't a major consideration in the design – which can lead to significant long-term problems.
If you have any comments on this article, please contact us.