Since the incidents at Three Mile Island, Chernobyl, and Fukushima, many countries have shifted from nuclear power to fossil fuel-fired electricity production, despite the environmental consequences of burning fuels such as coal. A new study used data from the United States to analyze the costs and benefits of electricity production from coal-fired versus nuclear sources. The study's authors conclude that policymakers should view nuclear power as a low-carbon electricity source, but that utilities will need incentives to shift toward it.
The study, by researchers at Carnegie Mellon University (CMU) and the IZA Institute of Labor Economics, appears in Resource and Energy Economics.
“By calculating the economic and environmental costs associated with producing electricity using coal-fired power plants rather than nuclear sources, our study informs the ongoing policy debate about whether to subsidize existing nuclear power generation,” explains Akshaya Jha, assistant professor of economics and public policy at CMU’s Heinz College, who coauthored the study.
The researchers used monthly operations data from the Energy Information Administration on nearly every power plant in the United States from 1970 to 2014 to estimate the extent to which the buildout of nuclear power replaced fossil fuel-fired electricity generation. They also estimated the extent to which fossil fuel-fired generation increased during unplanned nuclear outages from 1999 to 2014, and they explored why a declining share of U.S. electricity generation came from nuclear sources despite the substantial air pollution caused by burning conventional fossil fuels.
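For intuition only, here is a minimal sketch of how a displacement estimate of this general kind might be set up, using synthetic monthly data and a simple regression. This is not the authors' actual specification; every variable and magnitude below is invented for illustration.

```python
# Illustrative sketch only: NOT the study's actual specification.
# Idea: regress coal-fired generation on nuclear generation (plus a
# demand control) to estimate how much coal output falls per unit of
# nuclear output. All data here are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_months = 240

# Hypothetical nuclear generation (GWh): zero, then a plant comes online.
nuclear_gwh = np.concatenate([np.zeros(120), np.full(120, 600.0)])

# Hypothetical demand proxy with seasonal variation.
demand = 5_000 + 300 * np.sin(np.arange(n_months) * 2 * np.pi / 12) \
         + rng.normal(0, 50, n_months)

# Synthetic "truth": each GWh of nuclear displaces ~0.33 GWh of coal.
coal_gwh = 0.8 * demand - 0.33 * nuclear_gwh + rng.normal(0, 40, n_months)

# OLS of coal generation on an intercept, nuclear generation, and demand.
X = np.column_stack([np.ones(n_months), nuclear_gwh, demand])
beta, *_ = np.linalg.lstsq(X, coal_gwh, rcond=None)
print(f"Estimated coal displaced per GWh of nuclear: {beta[1]:.2f} GWh")
```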
The installation of a nuclear plant led to an average reduction in monthly coal-fired generation of approximately 200 GWh (gigawatt-hours; one gigawatt-hour is a billion watt-hours) in the first year. Solely by displacing coal-fired electricity generation, the average opening of a nuclear plant reduced emissions in its first year by nearly 2 million metric tons of carbon dioxide, 5,200 metric tons of sulfur dioxide, and 2,200 metric tons of nitrogen oxides, the researchers concluded.
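As a rough consistency check (not a calculation from the paper), the reported first-year figures imply per-MWh emission rates in line with typical coal-fired plants. A minimal sketch, assuming the roughly 200 GWh/month displacement holds for all 12 months:

```python
# Back-of-the-envelope check of the reported first-year figures.
# Assumes the ~200 GWh/month displacement persists for 12 months;
# the study's exact accounting may differ.
monthly_displacement_gwh = 200
annual_displacement_mwh = monthly_displacement_gwh * 12 * 1_000  # GWh -> MWh

co2_tonnes = 2_000_000  # reported first-year reductions
so2_tonnes = 5_200
nox_tonnes = 2_200

print(f"Implied CO2 intensity: {co2_tonnes / annual_displacement_mwh:.2f} t/MWh")
print(f"Implied SO2 intensity: {so2_tonnes / annual_displacement_mwh * 1000:.2f} kg/MWh")
print(f"Implied NOx intensity: {nox_tonnes / annual_displacement_mwh * 1000:.2f} kg/MWh")
# ~0.83 t CO2/MWh is consistent with typical coal-fired emission rates.
```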
The study also found that forced outages at nuclear plants led to an increase in monthly coal-fired generation of approximately 200 GWh. Changes in generation from natural gas or fuel oil (rather than coal) were neither statistically nor economically significant, the researchers found. This is likely because both nuclear and coal plants are designed to run continuously throughout the year, while natural gas plants are more often designed to ramp production up or down quickly in response to changes in demand.
The production costs per MWh (megawatt-hour) of coal-fired and nuclear power are similar: although fuel costs are higher for coal than for nuclear, nuclear power carries higher nonfuel operations and maintenance costs. However, burning coal emits large quantities of global pollutants that increase climate risk and local pollutants that harm the health of exposed populations. The environmental costs of these emissions are substantial. Consequently, the costs associated with nuclear waste disposal or the expected costs of a nuclear accident (i.e., the probability of an accident multiplied by its costs) would have to be sizable to justify using coal-fired rather than nuclear sources.
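To make that expected-cost comparison concrete, here is a minimal sketch of the break-even logic with purely hypothetical numbers; none of these figures come from the study.

```python
# Hypothetical numbers for illustration only; none come from the study.
# With similar production costs per MWh, coal is justified only if
# nuclear's waste-disposal plus expected-accident costs exceed coal's
# environmental damages.
coal_env_damage_per_mwh = 40.0    # assumed $/MWh from CO2, SO2, NOx damages
nuclear_waste_cost_per_mwh = 2.0  # assumed $/MWh for waste disposal

# Expected accident cost per MWh = probability per MWh * cost if it occurs.
accident_prob_per_mwh = 1e-10     # assumed probability
accident_cost_dollars = 1e11      # assumed cost of an accident ($100 billion)
expected_accident_per_mwh = accident_prob_per_mwh * accident_cost_dollars

nuclear_extra = nuclear_waste_cost_per_mwh + expected_accident_per_mwh
print(f"Nuclear waste + expected accident cost: ${nuclear_extra:.2f}/MWh")
print(f"Coal environmental damages:             ${coal_env_damage_per_mwh:.2f}/MWh")
print("Coal justified" if nuclear_extra > coal_env_damage_per_mwh
      else "Nuclear justified")
```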
“Based on the results of our study, we think policymakers should consider the benefit of nuclear power generation as a low-carbon source of electricity,” says Edson Severnini, assistant professor of economics and public policy at CMU’s Heinz College, who coauthored the study. “But a substantial amount of regulatory pressure on fossil fuels, for example in the form of an emissions tax or regional emissions standards, would be needed to provide an incentive for utilities to shift toward increased nuclear generation.”