There are two interesting and related articles on the interwebs today. The first is an article from Climate Central, reprinted in Scientific American, that summarizes a study published in Proceedings of the National Academy of Sciences. The study, titled “Integrated Life-Cycle Assessment of Electricity Supply Scenarios Confirms Global Environmental Benefit of Low-Carbon Technologies,” looks at the life-cycle environmental and climate costs of a continuing shift to renewable energy through the year 2050. The researchers assumed that by 2050 renewables will generate 39 percent of global electricity and asked what the environmental impact would be.
We all know that renewable energy sources produce less CO2 during their operation than fossil-fuel plants do, but that’s not the whole picture. The life-cycle analysis looked at the environmental costs and benefits of renewables and traditional sources of power from their inception (such as mining the ores needed to produce the metal parts) through their entire operational lives. For example, wind turbines require more than 10 times the iron needed for comparable electricity generation powered by oil or coal, and solar panels require up to 40 times more copper. That’s a lot of additional mining, with the associated environmental impact. Are renewables really better for the environment?
The answer turns out to be, “Yes.” How can that be? There are two factors. One is that once a solar panel or a wind turbine is manufactured, it doesn’t require additional raw materials. Coal-, oil-, and gas-fired power plants require a continuous input of new raw materials that are extracted, transported, and then burned. The second is that although renewable energy equipment requires more material to produce, the amount required is small relative to global production. For instance, the copper needed for the estimated increase in solar panels over the next 36 years is only about two years of copper production (or 5 percent) at current rates.
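As a quick sanity check on that last figure, and treating the 36-year horizon and the two years’ worth of production as given, the arithmetic is just a ratio. The parenthetical 5 percent appears to be those two years of output spread across the build-out period; that reading is mine, not spelled out in the article:

```latex
% Back-of-the-envelope check (my assumption: "5 percent" = the share of
% current annual copper output consumed in an average year of the build-out).
\[
  \frac{2\ \text{years of current copper output}}{36\ \text{years of deployment}}
  \approx 0.056,\ \text{i.e., roughly 5--6 percent of annual production per year.}
\]
```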
An increase in environmentally friendly energy production is good news, because the second article, published in the op-ed section of today’s New York Times, argues that historical trends in lighting efficiency suggest we may see an increase, not a decrease, in energy consumption in the years to come. The authors explain that as the production of light became cheaper, moving from whale oil to coal gas to kerosene to electricity, demand for the cheaper light grew so much that overall energy consumption rose, a process called rebound. They note that
The I.E.A. and I.P.C.C. estimate that the rebound could be over 50 percent globally. Recent estimates and case studies have suggested that in many energy-intensive sectors of developing economies, energy-saving technologies may backfire, meaning that increased energy consumption associated with lower energy costs because of higher efficiency may in fact result in higher energy consumption than there would have been without those technologies.
That’s not a bad thing. Most people in the world, still struggling to achieve modern living standards, need to consume more energy, not less. Cheap LED and other more efficient energy technologies will be overwhelmingly positive for people and economies all over the world.
But LED and other ultraefficient lighting technologies are unlikely to reduce global energy consumption or reduce carbon emissions. If we are to make a serious dent in carbon emissions, there is no escaping the need to shift to cleaner sources of energy.
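For readers wondering what a rebound of “over 50 percent” actually means, here is a small illustrative sketch; the definition of rebound below is standard, but the worked numbers are mine, not the op-ed’s:

```latex
% r = rebound fraction; E = the energy saving an efficiency upgrade would
% deliver if demand stayed unchanged. The realized (net) saving is
\[
  \text{net saving} = (1 - r)\,E
\]
% r = 0.5 (the "over 50 percent" global estimate): only half of E is realized.
% r > 1: "backfire" -- total energy use ends up higher than it would have
% been without the more efficient technology, as described in the quoted passage.
```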