MNTL researchers report new thermal management method for GaN transistors
Transistors made from gallium nitride (GaN) semiconductors are hot commodities. They promise higher power densities (~60 W) and can operate at higher temperatures (~500 °C) than conventional silicon transistors, which is why GaN devices are replacing silicon transistors in a variety of high-power applications: photovoltaic inverters, cell phone base stations, and transformers, to name just a few. In the future, GaN transistors may also have a big impact on the operation of hybrid and electric vehicles and bullet trains.
However, GaN transistors, like all semiconductors, are hot in another way: they generate excess heat that can limit their performance. Because these devices carry such high currents and voltages, researchers continue to look for ways to cool them. One solution is to add a cooling system to the device, such as a heat sink or fan, but that adds cost and bulk to the system.
A team of researchers from the University of Illinois Micro + Nanotechnology Lab recently created a simple, cost-effective method of thermal management for GaN power transistors. Using technology computer-aided design (TCAD) simulations, the group of Electrical & Computer Engineering Assistant Professor Can Bayram demonstrated that the thickness of the GaN layers plays a role in overheating, affecting the device’s thermal budget and ultimately its performance.
“Thinner is cooler,” said Bayram, noting that conventional GaN transistors are deposited on thick substrates (e.g., silicon, SiC) that are not ideal thermal conductors. He also pointed to the challenges of the mismatched epitaxy of GaN on conventional substrates, which leads to devices that are tens, and in most cases hundreds, of microns thick.
“By thinning the device layer, one can reduce the hotspot temperature of a high-power GaN transistor by about 50 degrees Celsius,” said Bayram.
According to ECE graduate student Kihoon Park, who led the research study, there is a limit to how thin the device can be, though. “If you reduce it too much, you’d get the inverse effect and actually increase the temperature inside the device,” Park said. The goal, then, is neither too thick nor too thin, but an optimum thickness, which is around a micron for typical devices.
This work is significant because it provides thermal management design guidelines for GaN-based transistors. The optimal layer thickness is tied to the device’s thermal boundary resistance (TBR), a resistance to heat flow that arises at the interface between GaN and the adjacent epitaxial layers.
“We determined the optimum layer thicknesses minimizing the GaN transistor hot spot temperature considering various values of TBR,” said Park. “Also, the layer size depends on how the device will be used. If it’s for higher power applications, then you’d want thinner, sub-micron-thick layers, ideally.”
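The tradeoff the researchers describe can be illustrated with a toy thermal model: making the GaN layer thinner reduces the conduction resistance through the layer, but it also shrinks the area over which heat has spread by the time it reaches the TBR interface, so the TBR penalty grows. The sketch below is purely illustrative; the model form (1-D conduction plus 45-degree heat spreading) and every number in it (hotspot size, power, TBR value) are assumptions for demonstration, not figures from the Illinois study.

```python
import numpy as np

# Hypothetical parameters (NOT from the published work):
K_GAN = 130.0   # W/(m*K), bulk GaN thermal conductivity
TBR = 1e-8      # m^2*K/W, assumed GaN/substrate thermal boundary resistance
W_HOT = 2e-6    # m, side length of the square gate hotspot
POWER = 0.1     # W, dissipated power

def hotspot_rise(t_gan):
    """Hotspot temperature rise (K) vs. GaN layer thickness t_gan (m)."""
    a_hot = W_HOT ** 2                      # heat enters through the hotspot area
    r_cond = t_gan / (K_GAN * a_hot)        # conduction through the GaN layer
    a_spread = (W_HOT + 2.0 * t_gan) ** 2   # 45-degree spreading widens the
    r_tbr = TBR / a_spread                  # footprint seen at the TBR interface
    return POWER * (r_cond + r_tbr)

thicknesses = np.linspace(0.1e-6, 5e-6, 500)
rises = np.array([hotspot_rise(t) for t in thicknesses])
t_opt = thicknesses[np.argmin(rises)]
print(f"optimum thickness ~ {t_opt * 1e6:.2f} um")  # sub-micron for these numbers
```

With these particular (assumed) numbers, the minimum falls below one micron: a very thin layer is dominated by the TBR term, a very thick one by conduction through the GaN itself, consistent with the "neither too thick nor too thin" guideline the team describes.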
The next step for the Illinois team is to study the electrical properties of the GaN layers before actually making GaN transistors on substrates like engineered diamond or epitaxial graphene.
This Illinois research was supported by Young Investigator Program Grant No. FA9550-16-1-0224 from the Air Force Office of Scientific Research (AFOSR); the results were published in Applied Physics Letters on October 10, 2016.