The present work uses finite element thermal simulations of Gallium Nitride High Electron Mobility Transistors (GaN HEMTs) to evaluate the impact of device design parameters on the junction temperature. In particular, the effects of substrate thickness, substrate thermal conductivity, GaN thickness, and GaN-to-substrate thermal boundary resistance (TBR) on device temperature rise are quantified. In all cases examined, the TBR was a dominant factor in overall device temperature rise. It is shown that an increase in TBR can offset any benefit offered by a more conductive substrate, and that there exists a substrate thickness, independent of TBR, that results in a minimum junction temperature. Additionally, decreasing the GaN thickness provides a thermal benefit only at small TBRs. For TBRs on the order of 10⁻⁴ cm²·K/W or greater, decreasing the GaN thickness can actually increase the temperature, as heat from the highly localized source is not sufficiently spread before crossing the GaN-substrate boundary. The tradeoff between GaN heat spreading, substrate heat spreading, and the temperature rise across the TBR results in a GaN thickness with minimum total temperature rise. For TBR values of 10⁻⁴ cm²·K/W and 10⁻³ cm²·K/W, these GaN thicknesses are 0.8 μm and 9 μm, respectively.
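The tradeoff described above can be illustrated with a much simpler model than the paper's finite element simulations: a series thermal-resistance sketch in which heat spreads through the GaN layer at an assumed 45° angle, crosses the TBR over the enlarged footprint, and then spreads through the substrate to a heat sink. This is a rough illustration only; the conductivities, source dimensions, substrate thickness, and spreading angle below are assumptions, not values from the paper, and the sketch reproduces the GaN-thickness minimum only qualitatively.

```python
import numpy as np

# Assumed illustrative parameters (not taken from the paper's FE model)
k_gan = 130.0    # GaN thermal conductivity, W/(m*K)
k_sub = 370.0    # substrate (e.g., SiC) thermal conductivity, W/(m*K)
w_src = 0.5e-6   # heat source (gate) width, m
l_src = 100e-6   # heat source length, m
t_sub = 100e-6   # substrate thickness, m
power = 1.0      # dissipated power, W

def spreading_resistance(k, t, w, l, theta_deg=45.0):
    """Conduction resistance of an assumed spreading cone of depth t starting
    from a w x l source: integrate dR = dz / (k * A(z)) with
    A(z) = (w + 2 z tan(theta)) * (l + 2 z tan(theta))."""
    tan_t = np.tan(np.radians(theta_deg))
    z = np.linspace(0.0, t, 2000)
    area = (w + 2 * z * tan_t) * (l + 2 * z * tan_t)
    return np.trapz(1.0 / (k * area), z)

def junction_rise(t_gan, tbr_si):
    """Total rise = GaN spreading + TBR crossing + substrate spreading.
    tbr_si is the boundary resistance in m^2*K/W."""
    # Footprint of the heat after spreading through the GaN layer
    w_if = w_src + 2 * t_gan
    l_if = l_src + 2 * t_gan
    r_gan = spreading_resistance(k_gan, t_gan, w_src, l_src)
    r_tbr = tbr_si / (w_if * l_if)   # thinner GaN -> smaller footprint -> larger TBR rise
    r_sub = spreading_resistance(k_sub, t_sub, w_if, l_if)
    return power * (r_gan + r_tbr + r_sub)

# Sweep GaN thickness for the two TBR values quoted in the abstract
for tbr_cgs in (1e-4, 1e-3):
    tbr_si = tbr_cgs * 1e-4          # convert cm^2*K/W -> m^2*K/W
    t_gan = np.linspace(0.2e-6, 20e-6, 200)
    rises = [junction_rise(t, tbr_si) for t in t_gan]
    t_opt = t_gan[int(np.argmin(rises))]
    print(f"TBR = {tbr_cgs:g} cm^2K/W: minimum rise near t_GaN = {t_opt * 1e6:.1f} um")
```

In this simplified model, thinning the GaN lowers the in-layer conduction resistance but shrinks the area over which heat crosses the boundary, so the TBR term grows; the competition between these two terms is what produces an optimum GaN thickness, consistent with the trend the abstract reports.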
