Gas Turbine Association
Gas Turbine History

The Brayton Cycle and Gas Turbine History

The industrial revolution only became a reality with access to energy conversion devices that could be dispatched on demand. Before this innovation, ships moved by sail, and water wheels on rivers moved the machinery of the pre-industrial era, albeit at a snail's pace. Newcomen's original steam engine of 1712, a reciprocating design, was among the first breakthroughs away from the vagaries of the wind and waves, but it operated at very low efficiency. James Watt's addition of a condenser some years later nearly tripled efficiency, although the equipment was large, cumbersome, and dangerous to operate. Power output, and efficiency, would remain in single digits until the first turbine designs were introduced. The more familiar steam turbine was developed by Sir Charles Parsons at the end of the 19th century, nearly 150 years after the first engineering design of what is called a heat engine. Like most early-stage devices, this first steam turbine was tiny, producing just 7.5 kW, compared to steam turbines today that can reach 1,400 to 2,000 MW. Parsons gave his name to one of the world's great steam turbine engineering companies, and his design was later licensed to an American, George Westinghouse, becoming the basis for the rotating power-generation equipment of his rapidly growing company.

Almost in parallel, and completely unrelated, came George Brayton's constant-pressure design, which would evolve into the gas turbine. In contrast to the steam cycle, a closed vapor power cycle, the gas cycle was open. Air was continuously ingested, and energy was released directly into the working fluid (air) during combustion. This contrasted sharply with the steam cycle, where the energy had to be transferred through heavy, and expensive, heat exchangers.
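The open cycle described above has a well-known textbook idealization. As a rough illustration (an air-standard idealization, not drawn from the article and not a model of any real engine), the ideal Brayton cycle's thermal efficiency depends only on the compressor pressure ratio and the gas's ratio of specific heats:

```python
# Ideal (air-standard) Brayton cycle thermal efficiency.
# This is the textbook idealization only; real engines fall well short
# of these numbers due to component losses and temperature limits.

GAMMA = 1.4  # ratio of specific heats for air (assumed constant)

def brayton_efficiency(pressure_ratio: float, gamma: float = GAMMA) -> float:
    """Return 1 - r^(-(gamma-1)/gamma), the ideal-cycle thermal efficiency."""
    if pressure_ratio <= 1.0:
        raise ValueError("pressure ratio must exceed 1")
    return 1.0 - pressure_ratio ** (-(gamma - 1.0) / gamma)

if __name__ == "__main__":
    # Efficiency rises with pressure ratio, which is why compressor
    # development has driven so much of gas turbine history.
    for r in (5, 15, 30):
        print(f"pressure ratio {r:>2}: {brayton_efficiency(r):.1%}")
```

Note how efficiency is set by the compression process alone in this idealization; the fuel enters only through the combustor temperature, which is one reason the cycle proved so tolerant of fuel type.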

While the first Brayton cycles capitalized on internal combustion to introduce energy into the system, the process was not continuous; it was effectively similar to the reciprocating engines used in the typical automobile. In fact, Henry Ford spent years fighting over intellectual property issues related to the engine design and which type would be used in the first Ford motor vehicles.

The new internal combustion engines, first exploited in the Brayton and Otto cycles, arrived at the dawn of another era in human history: powered flight. While steam-based power conversion would rule the oceans and the railways, those engines were far too heavy to be useful in the early days of flight, when aircraft were made of fabric, wire, and balsa wood. Reciprocating piston engines, derivatives of those made for automobiles, were the engines of choice because of their high power-to-weight ratio. The connection between automobile and aircraft was so natural that several of the largest automobile manufacturers acquired aircraft engine companies, fully expecting to capture a developing aviation market similar to the consumer market for cars.

But the power to fly an aircraft was limited by the power of the engine and how much fuel could be pushed through it. A reciprocating engine would compress the air, ignite the fuel, expand, push out the remnants of the combustion process, and start over. This process was repeated thousands of times per minute in each cylinder, and new innovations such as supercharging (to push more mass through the engine at each stroke) increased the stress on components. Failure rates were very high and many aircraft crashed, but progress in the industry didn't stall. Engineers developed new materials and new ways of machining components, and redesigned the aircraft. Much of this may have been haphazard, since no single entity led the effort. Engine companies made aircraft engines, not aircraft. And fuel supply companies made aviation gasoline, not engines.

The other industrial innovation occurring at the same time was in petroleum and refining. Since the introduction of Parsons' first turbine designs, kerosene and gasified solid fuels had become the primary sources of illumination. Homes were not wired with copper wiring as today; they had copper tubing that delivered a toxic mixture of carbon monoxide, hydrogen, water, and carbon dioxide to provide the light that kept away the dark. Nearly 1,000 gasification plants dotted the landscape, providing towns and industry with what was considered the best illumination possible. Burning kerosene supplied by a nascent petroleum industry was another option. Natural gas for power generation was 50 years in the future, and shale gas was over a century away.

But the Parsons turbine design and its equipment manufacturers were eroding the illumination market that petroleum and gasification dominated. By the time Thomas Edison passed away during the Hoover administration, engineers would advise the president that it would not be possible to honor Mr. Edison's passing by turning off all the lights in the country; it would devastate the power grid, possibly causing irreparable harm.

Leading up to the Second World War, all powered flight was achieved with internal combustion piston engines. But not all flying was the same. The ability to accelerate an aircraft rapidly in flight was tied to the size of the engine, the amount of energy in the fuel, and how quickly that fuel could be burned in the engine. This was not a major issue for cars motoring through a city, but for an aircraft operating in combat, being able to accelerate into and out of deadly confrontations was a matter of life or death.

Part of the problem was the way the fuel burned, and how a fuel burns is tied to its chemical structure. Today's driver is familiar with the octane ratings listed at the fuel pump. A higher octane rating reduces the tendency to 'ping', an uncontrolled explosion inside the cylinder that can burn away minute parts of the engine. And anyone who owns a turbocharged engine knows that a higher octane rating is nearly always required. Fuel experts learned that simple hydrocarbons like butane and pentane would autoignite early and cause knocking. Fuels like kerosene would not evaporate properly, defeating the whole purpose of adding fuel, since it would not mix with the air. Without some change or innovation, a large percentage of what could be extracted from a barrel of oil was not going to find an outlet in the expanding automobile and aviation markets.

The chemical solution to the fuel challenge came from a French engineer, Eugene Houdry, who had discovered the catalytic properties of certain minerals in producing gasoline. He had been able to produce gasoline from low-rank coal, not too different from the diesel-quality fuel produced by Fischer and Tropsch in Germany at about the same time. Houdry would move to Paulsboro, New Jersey, joining the Vacuum Oil Company (which would later merge with Standard Oil of New York). The Houdry Process eventually became the basis of catalytic cracking of petroleum to produce very high-quality, high-octane gasoline from oil.

In Europe, a different kind of aircraft engine was being explored by two separate researchers, one in Germany and one in Great Britain, and the solution would be based on the open Brayton cycle. The technology would offer its own answer to the fuel combustion problem. In Germany, Hans von Ohain was exploring a continuous burning process in a turbine that bore great similarity to steam turbine designs. It included blades and vanes, but, unlike the steam turbine, also a combustor and a compressor. Von Ohain's first successful design operated on hydrogen, a fuel that is easy to ignite and has a wide flammability range. But it demonstrated one critical improvement: the gas turbine design was far less dependent on the chemical characteristics of the fuel to operate the thermodynamic cycle. While Germany had difficulty producing aviation gasoline much above 87 octane, the gas turbine engine could operate on this lower octane rating as easily as it could operate on any fuel that could be injected. This was a stunning differentiator. Much like the piston engine, the gas turbine had a high power-to-weight ratio, though not yet as good as the production piston engines of the day. But it was far less dependent on fuel qualities (the octane or cetane ratings that governed reciprocating engines).

Von Ohain's design incorporated an axial compressor, a combustor, and an axial expansion turbine that looked similar to a steam turbine. In England, Frank Whittle's design was developed in parallel, although it pursued a different, centrifugal compressor design. Both approaches yielded gas turbine features that are still manufactured today, although nearly all power generation designs are based on the axial compressor design first explored by von Ohain.

By the end of the war, the gas turbine had established the benchmark for powered flight. Supersonic flight achieved later in that decade would never have been possible without either a rocket or a gas turbine to propel it. And improvements in materials allowed engines to produce more power with less weight.

But few gas turbines were found in power generation. Steam systems were robust, large, and more familiar to the power industry. While steam turbine companies also made gas turbines, their target markets tended to be aircraft companies or the military. The flexibility of the gas turbine was occasionally exploited by companies such as US Steel, where blast furnace gas, which was essentially free, could be used to produce power at the mills (and also minimize the release of toxic gases into the environment). Turbine manufacturers would introduce heavy industrial gas turbines into the metals production industry. But most new power generation construction was still steam based. This included a new entrant into the market, nuclear steam power generation. Gas turbines in power generation would remain just over the horizon until one night in November 1965.

On the night of November 9, 1965, much of the Northeastern United States was plunged into a blackout. The cause was a protective relay incorrectly set at a switchyard near Niagara Falls, resulting in a cascading voltage collapse that left nearly 30% of the population in darkness for 13 hours. Only about five minutes passed from the time the first relay tripped until much of the region went dark. For many, a bright full moon became the only source of light in the outside world. The blackout would change the way the industry planned for growth, and planners would reach out to the aircraft engine manufacturers to provide small, rapidly starting generators that could be deployed at critical locations across the grid.

In the six years after the blackout, power generators acquired almost 780 gas turbines representing 20,000 MW of capacity. Sometimes even the facility names reflected the technology's derivation: names such as Fisk Jet or Waukegan Jet hinted at the connection these turbines had to their aviation jet aircraft roots. Nearly one third of these facilities were oil fired, meaning they did not depend on the natural gas supply to back up the grid. By the end of the 20th century, almost all of them were still in place, offering a relatively inexpensive insurance policy against further widespread incidents on the grid.

The gas turbine was so new in the power sector that it even garnered the special status of 'other' in the Clean Air Act. The Act focused regulatory emphasis on coal, gas, and oil steam generation systems. But the gas turbine (also called the combustion turbine) was so new that it was lumped into the catch-all industrial classification for anything not already defined in the new regulation.

The new market for gas turbines became a major turning point in the industry, and designers took note. The earliest designs were relatively small, between 10 and 30 MW. But with the smaller size came a smaller capital expenditure. And if load planning was off by 50 or 60 MW, the power generation industry finally had a fallback position: it could order a gas turbine to fill the gap between installed capacity and actual demand.

Over the next decades turbine size increased rapidly, from 30 MW to 50, then 80, then 100 and beyond. But the overall efficiency of these machines was not particularly high, perhaps 25%, not even as high as the supercritical coal plants coming on line at the time. That 25% figure was exceptional compared to the starting point decades earlier, but not quite good enough to produce viable cash flow in commercial operation or revenue service. But there was a caveat. The very high exhaust temperature of the gas turbine made it logical to add a heat recovery system and drive a steam turbine with the excess energy. The two cycles, the Brayton (gas) cycle and the Rankine (steam/vapor) cycle, evolved into the combined cycle. Combining them produced efficiencies beyond any thermal plant in operation, a distinction that continues to this day.
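The efficiency gain from combining the two cycles can be sketched with a simple idealization: if the steam bottoming cycle runs entirely on the heat the gas topping cycle rejects in its exhaust, the combined efficiency is the topping efficiency plus the bottoming efficiency applied to the remaining heat. The numbers below are hypothetical round figures for illustration, not data from the article, and the formula ignores stack losses and other real-plant effects.

```python
# Idealized combined-cycle efficiency: the Rankine (steam) bottoming cycle
# is assumed to receive all of the heat the Brayton (gas) topping cycle
# rejects in its exhaust. Real plants lose some exhaust heat up the stack.

def combined_cycle_efficiency(eta_gt: float, eta_st: float) -> float:
    """Return eta_gt + eta_st * (1 - eta_gt) for the two stacked cycles."""
    return eta_gt + eta_st * (1.0 - eta_gt)

if __name__ == "__main__":
    eta_gt = 0.35  # hypothetical gas turbine efficiency, standing alone
    eta_st = 0.30  # hypothetical steam bottoming-cycle efficiency
    # 0.35 + 0.30 * 0.65 = 0.545, i.e. 54.5%
    print(f"combined cycle: {combined_cycle_efficiency(eta_gt, eta_st):.1%}")
```

Even with modest numbers for each cycle alone, stacking them pushes the total well past what either could achieve by itself, which is the arithmetic behind the combined cycle's dominance.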

By the 1980s, the prior era of overbuilding pressured utilities to scale back future expansions. And while Three Mile Island may have dampened plans for a great expansion of nuclear power, there were already 50,000 MW of nuclear capacity yet to come on line. Orders for new gas turbines were drying up quickly. In contrast to the large volume of nuclear power still under construction, by 1982 new gas turbine orders had dwindled to a mere 320 MW. Yet little of this was destined for central power stations; the vast majority was smaller gas turbines for new cogeneration. In addition, pipelines, which had traditionally relied on reciprocating engines, were also switching to a new breed of smaller gas turbines.

The changes in the market would result in wholesale changes to the industry. Beyond the added capacity in the United States, there was a new change on the horizon: deregulation of electricity. With deregulation, it wasn't clear who had responsibility for building capacity to meet new demand, or who had responsibility for delivery. And who was ultimately responsible for system reliability? With this much uncertainty, long-range planning came to a halt. But the economy grew, and that growth accelerated. Using the stock market as a benchmark, stock prices began an unparalleled rise nearly coincident with the commencement of hostilities in the First Gulf War. A decade-long expansion lay ahead. Likewise, the US appetite for power expanded rapidly while reserve margins shrank to record lows. With uncertainty about how to recover capital costs, there was little appetite to sink millions into new generation.

By the end of the 20th century, an unplanned critical mass was reached in the industry. Suddenly a flurry of orders was placed for new equipment. Only this time the customer had changed dramatically. Orders came not only from the traditional integrated utilities, but also from new entrants: merchant generators, independent power producers, and consortiums with access to capital. Power generators that had not existed previously were now placing the bulk of the orders for new capacity. It was new, very different, and more detached from traditional long-range resource planning. In the past, if a utility needed 50 MW of capacity in a given year, it had to be planned nearly five years in advance. Forecasting being what it is, planners were almost always guaranteed to have planned for either too much capacity or too little. The previous technology of choice was large and labor intensive to construct. Now not only was the market different, but the technology was different too. By the first decade of the 21st century it was possible to build a combined cycle plant larger than the largest thermal plant in the country. Gas turbines were no longer limited to 10 MW emergency backup generation.

And there were new regulations, and lots of them. While no industry would escape the oversight of the EPA, natural gas fueled turbines would find the regulatory burdens less onerous. For the power industry, the major issues were emissions of NOx, CO, unburned hydrocarbons, particulates, and SO2. The choice of natural gas as a fuel mitigated most of these, but NOx was intractable. It was not a necessary step in the combustion process; it formed from the nitrogen in the atmosphere at high combustion temperatures. The only solution to this problem, the one still in use today, is to control the combustion zone temperature. Solving that challenge would take nearly 25 years of technical innovation; it was hardly an overnight success. And it came with a cost. Much of the fuel flexibility achieved in the early days, with simpler designs, was absent from the stock commercial gas turbine. The sophisticated combustion designs used in the US markets could not be easily adapted to operate on heavy oil or synthetic gases. Specialized designs would be required for anything other than natural gas and the occasional backup fuel (No. 2 distillate).

And the cost structure shifted with the technology. Not too different from the way an automobile is built, most of the components of a gas turbine are manufactured in a factory, an environment with cost controls and efficiencies that minimize the final product cost. A complete gas turbine and generator package could be fabricated in less than 90 days for a small package destined for a cogeneration or pipeline application, or in perhaps 12 months for a gas turbine as large as 300 MW. Unlike large thermal stations that might take 4 to 8 years to build, a gas turbine station might be up and running in 120 to 720 days.

Along with the new gas turbine designs came new manufacturing centers. To produce products faster and at lower cost, manufacturing now included criteria such as design for reliability, design for repair, and design for assembly. The new capacity and manufacturing centers appear to have arrived at just the right time for the North American power industry. By 2020, nearly all of the installed thermal fleet, the older fossil steam power systems built between the 1950s and 1970s, will be over 50 years old. Both the facilities and the workforce are edging closer to a retirement that might be anything but early. Couple this with the discovery of enormous volumes of unconventional shale gas in North America, and the result is the expectation of a new era for the gas turbine as the primary source of power generation in the United States and Canada.