This article presents a study of common design challenges of large and small turbofans. Turbofan engines powering large transport aircraft have pursued markedly different design objectives than business-jet turbofans, including thrust, range, mission type, development cost, unit price, maintainability standards, and production quantities. Prolific use of “thermal barrier coating” has helped turbine designers compensate for the inability to distribute a large quantity of small-diameter film holes over the turbine blade surface. The historical trends in overall pressure ratio observed for both large and small turbofans have parallel slopes; small turbofans lag behind the larger engines due to the miniaturization required for the low flowrates characteristic of the smaller engines. These trends are demonstrated qualitatively, showing the growth in both overall engine pressure ratio and turbine inlet temperature over several decades. This paper notes that the importance of high-performance impeller designs and intricate turbine-blade cooling concepts at very low compressor-exit corrected flows has not yet been fully appreciated.

## Article

Historically, turbofan engines powering large transport aircraft have pursued markedly different design objectives than business-jet turbofans, including thrust, range, mission type, development cost, unit price, maintainability standards, and production quantities. In the 1960’s and 1970’s, commercial transports incorporated 3 or 4 engines; they competed with propliners, targeted transcontinental or trans-Atlantic markets, and operated under the expectation of low fuel costs without emissions restrictions. During that era, business travel largely utilized “repurposed” military transports, even bombers! After the 1973 Arab oil embargo, and with the advent of government scrutiny of auto and aircraft emissions, jet engine manufacturers began to focus heavily on fuel economy and emissions. The Environmental Protection Agency was founded in 1970, and the first aircraft engine smoke standards were defined in 1973. In the mid 1960’s, two functional parameters became the “guideposts” of turbofan engine architecture: “Bypass Ratio” (BPR) and “Thrust-Specific Fuel Consumption” (TSFC). By definition, bypass ratio is simply the mass flowrate of air entering the fan that does not enter the engine core divided by the mass flowrate of air that does enter the core. TSFC is defined as the engine fuel mass flowrate divided by the propulsive thrust. Typically, engines with larger bypass ratios deliver superior (lower) TSFC and, incidentally, reduced emissions.
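The two definitions above can be sketched in a few lines of code. This is an illustrative sketch only; the flow and thrust numbers are hypothetical and are not taken from any engine discussed in this article:

```python
# Illustrative sketch of the BPR and TSFC definitions given above.
# All numeric values are hypothetical, chosen only for demonstration.

def bypass_ratio(fan_flow_lbm_s: float, core_flow_lbm_s: float) -> float:
    """BPR = (fan air that bypasses the core) / (air entering the core)."""
    bypass_flow = fan_flow_lbm_s - core_flow_lbm_s
    return bypass_flow / core_flow_lbm_s

def tsfc(fuel_flow_lbm_hr: float, thrust_lbf: float) -> float:
    """TSFC = fuel mass flowrate / propulsive thrust, in lbm/(hr*lbf)."""
    return fuel_flow_lbm_hr / thrust_lbf

if __name__ == "__main__":
    # A notional high-bypass engine: 1200 lbm/s through the fan,
    # of which 100 lbm/s enters the core.
    print(bypass_ratio(1200.0, 100.0))  # 11.0
    print(tsfc(5000.0, 10000.0))        # 0.5
```

Note that a turbojet, with no bypass flow, has fan flow equal to core flow and therefore a BPR of exactly zero under this definition.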

The earliest propulsion gas turbines were referred to as “turbojets”. All of the engine inlet air was directed to the core, with thrust generated by the change in momentum resulting from the change in gas velocity from the engine inlet to the core exhaust nozzle. The engine itself was a “core”; there was no bypass flow, so the bypass ratio of these turbojets was zero. Brayton-cycle efficiency is largely improved by driving higher cycle pressure ratios and turbine inlet temperatures. Due to thermal and stress limitations of turbine components, internal pressures and temperatures in these early turbojets were very modest, such that all the work extracted by the turbomachinery was consumed in driving the compressor, with no excess power available to drive a fan. TSFC for a P&W JT3C (Boeing 707-100) engine at cruise conditions was about 0.9. Interestingly, the last generation of aircraft piston engines (e.g. the British Napier Nomad diesel), with gear-driven, slow-turning propellers, utilized effective bypass ratios of roughly 100 and could demonstrate an effective TSFC of about 0.45 at cruise conditions (assuming propeller propulsive efficiencies of about 0.85). In the mid 1950’s, advanced aircraft piston engines weighed twice as much as the emerging turbojets. However, since the Boeing 707 maintained a long-range cruise speed almost double that of the Boeing 377, its piston-engine-powered predecessor, the range and total fuel consumption of the jet versus the propliner were about the same. But the 707 got there twice as fast! If the core of the jet engine could be made more “energetic”, i.e., more powerful, then the bypass ratio could be increased and TSFC reduced. From Figure 1, the TSFC benefits of higher bypass ratios can be visualized. A documentary could be written describing the relentless quest for increased bypass ratio, enabled by engine cores capable of sustaining elevated compressor pressure ratios and turbine inlet temperatures.

But how have the technologies required to deliver these energetic cores differed for large commercial transports compared to small business jets? Increasing compressor pressure ratio and/or “intercooling” drives a reduction in flowpath annular area, meaning compressor-discharge flow areas shrink. At any thermodynamic station in the gas turbine, the “corrected” flow is described by the relation:

$$W_{cor} = W_{mass} \times \frac{\left(T_{gas}\,[^{\circ}\mathrm{R}]\,/\,519\right)^{0.5}}{P_{static}\,[\mathrm{lbf/in^2}]\,/\,14.696}$$

For a given mass flowrate ($W_{mass}$), a reduction in gas temperature ($T_{gas}$) or an increase in static pressure ($P_{static}$) reduces the local corrected flow, $W_{cor}$. This effect is valuable for making engines smaller, improving cycle efficiency, and enabling more energetic engine cores that allow higher fan bypass ratios. Conversely, though, as flowpath components become more “miniature”, a heavy burden falls on such design features as rotor-blade tip clearance and turbine-blade cooling passages. Manufacturing tolerances and the nature of surface-to-volume ratios (for turbine blade cooling) become critical for preserving aerodynamic performance and component durability. To prevent dangerous and efficiency-robbing compressor surge-and-stall phenomena, manufacturers of large, modern, high-pressure-ratio engines have resorted to complex, active clearance-control features in the last stages of the compressor and in the high-pressure turbine. But manufacturers of smaller, less-costly and less-sophisticated gas turbines, such as APU’s and small business-jet engines, have advanced the implementation of surge-resistant centrifugal compressors (impellers) to replace the last 2 or 3 stages of a multi-stage axial compressor system. These “small” gas turbines typically operate with $W_{cor} < 5$ lbm/sec.
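The corrected-flow relation above is easy to evaluate numerically. A minimal sketch, using the standard-day reference conditions (~519 °R and 14.696 lbf/in²) that appear in the relation:

```python
import math

# Standard-day reference conditions, as used in the corrected-flow relation.
T_REF_R = 518.67      # degrees Rankine (the relation rounds this to ~519)
P_REF_PSIA = 14.696   # lbf/in^2

def corrected_flow(w_mass_lbm_s: float, t_gas_r: float, p_static_psia: float) -> float:
    """W_cor = W_mass * sqrt(T_gas / T_ref) / (P_static / P_ref)."""
    theta = t_gas_r / T_REF_R
    delta = p_static_psia / P_REF_PSIA
    return w_mass_lbm_s * math.sqrt(theta) / delta

if __name__ == "__main__":
    # At standard-day conditions, corrected flow equals physical mass flow.
    print(corrected_flow(5.0, T_REF_R, P_REF_PSIA))       # 5.0 lbm/s
    # Raising static pressure tenfold cuts corrected flow tenfold, which is
    # how high pressure ratios shrink the rear stages of the compressor.
    print(corrected_flow(5.0, T_REF_R, 10 * P_REF_PSIA))  # 0.5 lbm/s
```

The second case illustrates the point made above: at a fixed mass flow, pushing compressor-exit pressure up drives the local corrected flow, and hence the flowpath annular area, down.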

Added to the requirement for managing surge and stall, flowpath area reductions present challenges to the design, fabrication, and durability of cooled high-pressure turbine airfoils and disks. Surface-to-volume ratio scales approximately as $L^{-1}$, where $L$ is the flowpath length scale. Figure 2 depicts how aggressively the surface-to-volume ratio varies with turbine corrected flow. At constant turbine inlet temperature, increasing the core pressure ratio results in an increase in the turbine-blade surface-to-volume ratio. A higher surface-to-volume ratio, in turn, leads to a requirement for a higher percentage of cooling flow to maintain blade metal temperature. Further complicating the turbine cooling challenge is the fact that film-cooling holes distributed over the surface of the airfoil suffer from blockage effects caused by sand ingestion if cooling-hole diameters are smaller than about 0.012”. So, to avoid allocating too much cooling air, an economical distribution of small film holes is desirable but restricted by hole diameter. Prolific use of “thermal barrier coating” (TBC) has helped turbine designers compensate for the inability to distribute a large quantity of small-diameter film holes over the turbine blade surface.
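The scaling argument above can be made concrete with a simple geometric sketch. It assumes only that corrected flow scales with flow area ($W_{cor} \sim L^2$) and surface-to-volume with the inverse length scale ($S/V \sim L^{-1}$); it is an illustrative reading of the trend, not a reconstruction of the Figure 2 data:

```python
import math

# Geometric sketch of why shrinking corrected flow raises surface-to-volume
# ratio. Assumption (illustrative, not from Figure 2): corrected flow scales
# with flow area, Wcor ~ L^2, while surface-to-volume scales as 1/L.

def length_scale(wcor_ratio: float) -> float:
    """Relative flowpath length scale for a given corrected-flow ratio."""
    return math.sqrt(wcor_ratio)          # L ~ sqrt(Wcor)

def surface_to_volume(wcor_ratio: float) -> float:
    """Relative surface-to-volume ratio: S/V ~ 1/L ~ Wcor^(-0.5)."""
    return 1.0 / length_scale(wcor_ratio)

if __name__ == "__main__":
    # Cutting corrected flow to one quarter doubles the surface-to-volume
    # ratio, roughly doubling the relative cooling burden on the blade.
    print(surface_to_volume(0.25))  # 2.0
```

Under these assumptions, every halving of turbine corrected flow raises the relative surface-to-volume ratio by a factor of about 1.4, which is why the cooling-flow percentage climbs so steeply for small cores.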

The historical trends in overall pressure ratio observed for both large and small turbofans have parallel slopes. Small turbofans lag behind the larger engines due to the miniaturization required for low flowrates characteristic of the smaller engines. These trends are qualitatively demonstrated in Figures 3 and 4, showing the growth in both the overall engine pressure ratio and turbine inlet temperature for several decades.

In Figures 3 and 4 we find the latest large commercial turbofan engines demonstrating Overall Pressure Ratios (OPR) exceeding 50:1, with engines in the 2020’s anticipating OPR’s greater than 60:1! Compressor technology developments at GE/CFM with LEAP, at Pratt & Whitney with the Geared Turbofan (GTF), and at Rolls-Royce with Advance and UltraFan may push OPR beyond 70:1! These 70,000 to 100,000 lbf thrust engines will have compressor-exit corrected flows that are actually driving down to levels currently sustained by much smaller business-jet engines. Figure 5 captures these trends. Bypass ratios on these high-OPR engines are expected to approach 15:1.

The importance of high-performance impeller designs and intricate turbine-blade cooling concepts for very low compressor-exit corrected flows has not yet been fully appreciated!

This article was inspired by a presentation delivered by Mr. Jim Kroeger at the 2015 ISABE conference. Mr. Kroeger is the Director of Military and Commercial Aircraft Engine Projects at Honeywell Aerospace.