Abstract
The study of background infrared radiation characteristics plays an important role in target detection and identification. Accurate modeling of ground infrared radiation characteristics relies on heat transfer theory to obtain background temperature variations under different environmental conditions. However, applying a theoretical model requires a comprehensive set of parameters, some of which are difficult to obtain accurately in practice. Solving the inverse heat transfer problem provides an approach to estimating optimal parameters by minimizing the discrepancy between calculated and measured temperatures. This article presents a method for optimizing the modeling parameters of ground surface infrared radiation using historical temperature measurements and the trust region reflective optimization algorithm. First, a sensitivity matrix analysis reveals correlations among the material parameters and guides the design of the optimization strategy. Thermal inertia is introduced to represent the strongly correlated thermal parameters, resolving the correlation problem. In addition, a day–night estimation strategy is employed: nighttime data are used to estimate the emissivity and thermal inertia, while daytime data are used to optimize the shortwave absorptivity. Long-term experimental validation shows that the root mean square error of the temperature predictions decreases from 4.72 °C (using literature-provided parameters) to 1.8 °C with the optimized parameters. Furthermore, the accuracy and applicability of the algorithm under various weather conditions are examined, revealing an average temperature error of 1.35 °C under stable meteorological conditions. The long-term comparison results suggest that the optimized parameters retrieved from measured surface temperatures can improve modeling accuracy, providing a new approach for predicting surface temperatures and infrared radiation in outdoor environments.
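To make the parameter-retrieval idea concrete, the following is a minimal sketch of how the trust region reflective algorithm (available in SciPy as `least_squares(method="trf")`) could be used to fit emissivity, thermal inertia, and shortwave absorptivity against measured surface temperatures. The forward model `simulate_surface_temperature`, the initial guess, and the bounds are hypothetical placeholders, not the paper's implementation; for brevity the sketch fits all three parameters jointly rather than splitting the estimation between nighttime and daytime data as the paper describes.

```python
# Illustrative sketch only: the forward model and the parameter bounds below are
# hypothetical stand-ins, not the paper's actual heat-transfer model or settings.
import numpy as np
from scipy.optimize import least_squares


def simulate_surface_temperature(params, met_data):
    """Hypothetical forward heat-transfer model.

    params = [emissivity, thermal_inertia, shortwave_absorptivity]
    met_data holds the meteorological forcing (air temperature, solar irradiance, ...).
    Returns modeled surface temperatures (°C) at the measurement times.
    """
    raise NotImplementedError("Replace with an actual heat-transfer model.")


def residuals(params, met_data, measured_temps):
    # Discrepancy between modeled and measured surface temperatures,
    # which the inverse heat transfer problem seeks to minimize.
    return simulate_surface_temperature(params, met_data) - measured_temps


def retrieve_parameters(met_data, measured_temps):
    x0 = np.array([0.95, 1500.0, 0.7])    # illustrative initial guess: emissivity, thermal inertia, absorptivity
    lower = np.array([0.80, 100.0, 0.1])  # illustrative physical bounds
    upper = np.array([1.00, 5000.0, 1.0])
    result = least_squares(
        residuals,
        x0,
        bounds=(lower, upper),
        method="trf",                     # trust region reflective algorithm
        args=(met_data, measured_temps),
    )
    return result.x                       # optimized parameter vector
```

Bounded least squares is the natural fit here because the retrieved parameters must stay within physically meaningful ranges, which is exactly the case the trust region reflective method is designed to handle.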