The time-averaged form of the radiative transfer equation (RTE) contains emission and absorption correlations that must be modeled. There is no general formulation for the absorption coefficient–radiation intensity correlation, which is therefore usually neglected (the optically thin fluctuation approximation, OTFA). Here, a model is described that computes this correlation, along with the other correlations in the time-averaged RTE. The formulation is based on the solution of two additional differential equations. The unclosed correlations in these equations are estimated by assuming that the joint probability density function (pdf) of the radiation intensity and the mixture fraction is a two-dimensional clipped Gaussian distribution. The model is applied to a turbulent jet diffusion flame, and a preliminary assessment is reported. It is shown that turbulence-induced fluctuations of the radiation intensity imply a correlation between the radiation intensity and the local properties of the medium. The assumed shape of the joint pdf of mixture fraction and radiation intensity yields satisfactory predictions when the turbulent fluctuations are moderate, but becomes inaccurate near the flame edge, where the fluctuations are very large. Nevertheless, the present results suggest that the proposed model may yield better predictions than the OTFA.
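The central point above can be illustrated with a small Monte Carlo sketch (not the paper's model, and with hypothetical numerical values): when the absorption coefficient and the radiation intensity fluctuate jointly, the time average of their product differs from the product of their averages, so the correlation term that the OTFA drops is nonzero. Samples are drawn from a correlated Gaussian and clipped to non-negative values, loosely mimicking the clipped-Gaussian joint pdf assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical means and covariance of the absorption coefficient kappa
# and the radiation intensity I; the positive off-diagonal term plays
# the role of turbulence-induced correlated fluctuations.
mean = [1.0, 5.0]
cov = [[0.04, 0.15],
       [0.15, 4.00]]
kappa, intensity = rng.multivariate_normal(mean, cov, size=n).T

# Clip to physically admissible (non-negative) values, in the spirit of
# the clipped-Gaussian assumption for the joint pdf.
kappa = np.clip(kappa, 0.0, None)
intensity = np.clip(intensity, 0.0, None)

mean_of_product = np.mean(kappa * intensity)            # <kappa I>
product_of_means = np.mean(kappa) * np.mean(intensity)  # OTFA estimate
correlation = mean_of_product - product_of_means        # <kappa' I'>

print(f"<kappa I>          = {mean_of_product:.4f}")
print(f"<kappa><I> (OTFA)  = {product_of_means:.4f}")
print(f"neglected term     = {correlation:.4f}")
```

With these illustrative inputs, the neglected term comes out close to the imposed covariance, showing that the OTFA underestimates the absorption term whenever the fluctuations of the absorption coefficient and of the intensity are positively correlated.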
