By Andrew Freeman, Vice President - Global Sales, Pulsafeeder
There are three primary methods to cool water in power plants:
• Once-through systems take water from nearby rivers, lakes or oceans and circulate it through pipes to absorb heat from the steam in condensers. Once used, water is discharged back to its local source. About 30 percent of the legacy power plants on the east coast of the US still use this inefficient process. They were built several decades ago when it was possible (and inexpensive) to build plants near abundant water sources. However, most of the plants built since the 1980s leverage a more efficient cooling system that causes fewer disruptions to local ecosystems.
• Closed loop, wet-recirculating systems reuse cooling water in repeated cycles rather than immediately discharging it. These systems use cooling towers to expose water to ambient air. Some of the water evaporates; the rest is sent back to the plant’s condenser. These systems only withdraw water to replace what’s lost through evaporation – but in total, they end up consuming a higher percentage of incoming water and discharging less of it. About 60 percent of all power plants in the US (and almost every plant built on the west coast) feature wet-recirculating systems.
• Dry-cooling systems use air instead of water to condense the steam exiting the turbine. These systems can decrease water requirements by as much as 90 percent, but they require more fuel per unit of electricity. Most plants that use this system run on natural gas. Despite the name, dry-cooling systems still require water for system maintenance, cleaning and boiler blowdown.
How much water do power plants need?
Power plants that rely on once-through cooling systems waste a lot of water (which is why they aren’t built anymore). Coal-fired plants need about 30,000 gallons of water for every megawatt-hour of electricity produced. Nuclear plants need twice as much. In both types of plants, only about 1 percent of this water is consumed, while the remaining 99 percent must be treated before disposal.
Plants that use recirculating cooling systems require far less water. Coal-fired plants can make a megawatt-hour of electricity with about 1,200 gallons, while nuclear plants with recirculating systems need about 2,600 gallons per megawatt-hour. In either case, it takes a lot of water to make electricity, perhaps too much water. The cost of water and its increasing value and scarcity around the globe are driving momentum toward natural gas-powered plants that can leverage dry-cooling processes. This is why almost all of the new power plants being built in China, India, Africa and other parts of the developing world are standardizing on natural gas combined-cycle (NGCC) designs.
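The figures above are easiest to compare side by side. The short sketch below uses only the withdrawal numbers quoted in this article (gallons per megawatt-hour); the ratios it prints are simple arithmetic, not additional data.

```python
# Water withdrawal per megawatt-hour of electricity, using the
# figures quoted in the article (gallons per MWh).
WITHDRAWAL_GAL_PER_MWH = {
    ("coal", "once-through"): 30_000,
    ("nuclear", "once-through"): 60_000,   # "twice as much" as coal
    ("coal", "recirculating"): 1_200,
    ("nuclear", "recirculating"): 2_600,
}

def withdrawal_ratio(fuel: str) -> float:
    """How many times more water a once-through plant withdraws
    than a recirculating plant burning the same fuel."""
    once = WITHDRAWAL_GAL_PER_MWH[(fuel, "once-through")]
    recirc = WITHDRAWAL_GAL_PER_MWH[(fuel, "recirculating")]
    return once / recirc

for fuel in ("coal", "nuclear"):
    print(f"{fuel}: once-through withdraws about "
          f"{withdrawal_ratio(fuel):.0f}x more water than recirculating")
```

Run as written, this shows a once-through coal plant withdrawing roughly 25 times the water of its recirculating counterpart, and a once-through nuclear plant roughly 23 times.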
For more than 30 years, the state of California has been on the leading edge of building and maintaining natural gas-powered plants, as most of the state’s plants switched from coal to gas in the 1980s. The lessons learned are being adopted by the 50 or so remaining coal plants on the east coast of the US, through the “repowering” and modernization process of replacing coal boilers with gas-fired turbines.
NGCC systems can be up to 40 percent more efficient than coal-fired plants. They help plant owners reduce costs, specifically maintenance costs, as newer systems are easier to work on. And the switch to natural gas brings other savings because feedstocks are abundant, less expensive and easier to transport via pipelines.
How natural gas power plants operate:
The natural gas combined-cycle power plant uses a gas combustion turbine to generate electricity, and it also uses waste heat to make steam, which then generates additional electricity in a steam turbine. Because the first stage (the gas combustion turbine) has no steam to condense, it doesn’t require cooling. As a result, the combined process requires only 25 percent as much water per megawatt-hour of electricity as a coal-fired plant.
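Putting the 25 percent figure in concrete terms: the sketch below assumes the article's recirculating coal baseline of about 1,200 gallons per megawatt-hour applies, which is an interpretation on our part rather than a number stated for NGCC plants directly.

```python
# Rough NGCC water estimate, assuming the article's 25 percent figure
# applies to the recirculating coal-plant baseline of 1,200 gal/MWh.
COAL_RECIRCULATING_GAL_PER_MWH = 1_200
NGCC_FRACTION = 0.25  # combined cycle needs ~25% of coal's water

ngcc_gal_per_mwh = COAL_RECIRCULATING_GAL_PER_MWH * NGCC_FRACTION
print(f"Estimated NGCC water use: {ngcc_gal_per_mwh:.0f} gallons per MWh")
# → Estimated NGCC water use: 300 gallons per MWh
```

Under that assumption, an NGCC plant would withdraw on the order of 300 gallons per megawatt-hour, a fraction of even the most efficient coal configuration quoted above.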
To read the full article by Andrew Freeman, please contact Pump Engineer's Editor, Deirdre Morgan.