Turbine entry conditions are characterized by unsteady and strongly non-uniform velocity, temperature, and pressure fields. The uncertainty and lack of confidence associated with these conditions require the application of wide safety margins in the design of turbine cooling systems, which are detrimental to engine efficiency. These issues have been further complicated by the adoption of lean-burn technology in modern aeroengines, identified by many manufacturers as the most promising solution for a significant reduction of NOx emissions. Such combustors are in fact characterized by a very compact design, and the strong swirl component generated by the injector persists up to the end of the flametube due to the absence of the dilution holes that, in conventional combustors, provide the required pattern factor.
Bearing in mind the complexity and costs associated with the experimental investigation of combustor-turbine interaction, CFD has become a key and complementary tool for understanding the physical phenomena involved. Given the well-known limitations of the RANS approach and the steady growth of available computational resources, hybrid RANS-LES models, such as Scale-Adaptive Simulation (SAS), are proving to be a viable approach for resolving the main structures of the flow field.
This paper reports the main findings of the numerical investigation of a hot streak generator for the study of combustor-turbine interaction. The results were compared to experimental data obtained from a test rig representative of a lean-burn, effusion-cooled, annular combustor, developed in the context of the EU project FACTOR. Steady RANS and unsteady SAS runs were carried out to assess the improvements offered by hybrid models. Additional simulations were performed to investigate the effect of the periodicity assumption and the impact of liner cooling modelling on the exit conditions.