
In the Wake of the News

Democrats' Platform Up Close - 2016

“We believe America must be running entirely on clean energy by mid-century.”

The term “clean energy” is not defined in the platform, so it is not clear whether natural gas and propane are included, which leaves significant political flexibility and significant energy user uncertainty. If “clean” refers to “carbon pollution”, a term used elsewhere in the platform, then we are left with nuclear, solar, wind, biomass and possibly Ocean Thermal Energy Conversion (OTEC) and Wave Energy, which would make achieving this goal both more expensive and less likely.

“Democrats believe that carbon dioxide, methane, and other greenhouse gases should be priced to reflect their negative externalities, and to accelerate the transition to a clean energy economy and help meet our climate goals.”

The “carbon tax” rears its ugly head again. Note that there is no recognition of any positive externalities of increased atmospheric CO2 concentrations, such as the greening of the planet documented by NASA. Also, note that a “carbon tax” is a “sin tax”, purportedly intended to drive the “sin” out of existence, though focused more on revenue generation. However, as the “sin” begins to decline, the tax must be increased to sustain revenue production.

“The impacts of climate change will also disproportionately affect low-income and minority communities, tribal nations, and Alaska Native villages—all of which suffer the worst losses during extreme weather and have the fewest resources to prepare. Simply put, this is environmental racism.”

The Democrats just can’t help themselves. The “race card” must be dealt, early and often. At least they avoided the tried and true “women and children most affected”. However, they did highlight “environmental and climate justice”.

“All corporations owe it to their shareholders to fully analyze and disclose the risks they face, including climate risk. Those who fail to do so should be held accountable. Democrats also respectfully request the Department of Justice to investigate allegations of corporate fraud on the part of fossil fuel companies accused of misleading shareholders and the public on the scientific reality of climate change.”

The effort to criminalize dissent would continue. However, no fossil fuel company denies “the scientific reality of climate change”, though they do question the magnitude of the anthropogenic contribution to climate change and the likelihood of that anthropogenic contribution leading to climate catastrophe. Future climate catastrophe is not now “scientific reality”. The bigger danger to fossil fuel industry shareholders is the financial risk to their investments from precipitous and unnecessary government regulations.

“Democrats oppose efforts to undermine the effectiveness of the Endangered Species Act to protect threatened and endangered species.”

However, Democrats continue to advocate for wind energy expansion and incentives, despite the increasingly well documented impacts of wind farms on a number of endangered species of birds and bats. Democrats also continue to advocate for concentrating solar power, despite the documented effects on birds flying through the concentrated beams of solar energy.

Tags: Democrats, 2016 election

Republican Platform 2016

Republican Party Platform 2016 – Climate Change

Information concerning a changing climate, especially projections into the long-range future, must be based on dispassionate analysis of hard data. We will enforce that standard throughout the executive branch, among civil servants and presidential appointees alike. The United Nations’ Intergovernmental Panel on Climate Change is a political mechanism, not an unbiased scientific institution. Its unreliability is reflected in its intolerance toward scientists and others who dissent from its orthodoxy. We will evaluate its recommendations accordingly. We reject the agendas of both the Kyoto Protocol and the Paris Agreement, which represent only the personal commitments of their signatories; no such agreement can be binding upon the United States until it is submitted to and ratified by the Senate.

The Republican Platform 2016 is far less detailed and specific than the Democrat Platform with regard to climate change, but it offers the prospect of a more rigorous and more transparent process for the development of environmental law and regulation, with more direct congressional involvement. The specific reference to eliminating the cozy “sue and settle” approach taken by US EPA in dealing with environmental regulation signals a potential major step in the right direction. The Clean Air Act, as amended, is “black letter law”, which should not be open to broad reinterpretation by unelected bureaucrats.

The insistence on “dispassionate analysis of hard data” might well temper the wholesale “adjustment” and “re-adjustment” of data, which has been the hallmark of the NOAA / NCDC / NCEI and NASA GISS temperature anomaly products. It might also lead to insistence that  near-surface temperature measuring stations globally be brought up to the standards set for the US Climate Reference Network, since the measurements taken at these stations do not need to be subjected to myriad “adjustments”.

The insistence on “dispassionate analysis of hard data” might also lead to a more dispassionate (skeptical) analysis of the scenarios produced by the multiple climate models, which appear to be diverging from even the “adjusted” data. The models which have been in existence long enough to allow the scenarios they generated two to three decades ago to be compared against “adjusted” temperature anomalies have demonstrated little or no predictive skill.

The reference to the IPCC’s “intolerance towards scientists and others who dissent from its orthodoxy” might suggest a willingness on the part of a new Administration to fund research without regard to the likelihood that its results would support the UN / UNFCCC / IPCC narrative; or even specifically to fund research intended to aggressively test the validity of the narrative and perhaps falsify aspects of the narrative.

The identification of the IPCC as a political mechanism suggests that US EPA would be required to support new or expanded regulations based on the results of its own research, rather than on the work of the IPCC, as was the case for the CO2 Endangerment Finding.

The apparent willingness to reassert the Senate’s role in ratifying treaties is also encouraging.

Tags: 2016 election, Republican

Energy Policy: The Facts and Myths on the Consensus on Climate Change (In-Depth Article)

"No challenge--no challenge--poses a greater threat to future generations than climate change. 2014 was the planet’s warmest year on record.  Now, one year doesn’t make a trend, but this does: 14 of the 15 warmest years on record have all fallen in the first 15 years of this century," said President Obama in his 2015 State of the Union Address.


“I am skeptical humans are the main cause of climate change and that it will be catastrophic in the near future. There is no scientific proof of this hypothesis, yet we are told ‘the debate is over’ and ‘the science is settled’. … We have no proof increased carbon dioxide is responsible for the earth’s slight warming over the past 300 years,” said Dr. Patrick Moore, one of the founders of Greenpeace.


“Multiple studies published in peer-reviewed scientific journals show that 97 percent or more of actively publishing climate scientists agree: Climate-warming trends over the past century are very likely due to human activities. In addition, most of the leading scientific organizations worldwide have issued public statements endorsing this position.” NASA (the guys that put a man on the moon)(emphasis added)


The authors of the Nongovernmental International Panel on Climate Change “say the IPCC [United Nations’ Intergovernmental Panel on Climate Change] has exaggerated the amount of warming likely to occur if the concentration of atmospheric CO2 were to double, and such warming as occurs is likely to be modest and cause no net harm to the global environment or to human well-being.” NIPCC Summary for Policymakers, page 3 (2013)


Who could blame you if you were confused by all the conflicting claims about climate change?  The problem is that almost everything you hear has some truth to it, but much of it is exaggerated.  And frankly, both sides of the debate have extremists who are guilty of some fudging, if not outright prevarication. As with so many things, the devil is in the details, and who has time to ferret out all those details? (I have spent the last 5 years focusing on the impact of climate change on energy markets.)

But Climate Change Alarmism is now one of the main drivers of energy policy.  I call it the Golden Thread.  One’s views on energy policy are nearly completely a function of what one believes about climate change.  If the alarmists are wrong and you pull out this Golden Thread, then nearly all of current energy policy unravels or at least must be radically altered.

While I have opinions on many of the issues below, I have made a studious attempt to refrain from any pontificating and intend to simply follow Joe Friday’s advice in Dragnet “Just the facts, ma’am, just the facts.”  My main goal is to put claims that there is a “consensus” on climate change into some perspective so the debate over energy policy can be better understood. I have tried to present a range of opinions faithfully.

There does indeed seem to be a broad consensus on at least nine points:

  • First, since the dawn of time, the earth’s climate has changed, is changing, and will change in the future due to natural variability. The earth has historically been both colder and hotter than it is today. Carbon concentrations in the atmosphere have historically been both higher and lower than they are today.
  • Second, the science on the impact of releasing ever increasing carbon emissions is theoretically sound.  All other things being equal, there is a strong scientific consensus that more carbon in the atmosphere will increase the greenhouse effect. 
  • Third, man’s use of fossil fuels has increased the concentration of carbon in the atmosphere over the last century and will continue to increase it in the future under the status quo.
  • Fourth, increasing the concentration of carbon in the atmosphere has had and will “very likely” have some impact on increasing earth’s average temperature (temp).
  • Fifth, even scientists labeled as Skeptics (or more derogatorily “Deniers”) acknowledge that average temp has increased about 1 degree Centigrade between 1880 and the present.
  • Sixth, even scientists labeled Warmists (or more derogatorily “Alarmists”) acknowledge that average temp has been fairly stable (The Pause) over the last 18 or so years, i.e., it has not increased as predicted by the models despite dramatic increases in global carbon emissions.
  • Seventh, the United States acting alone cannot solve the problem, whatever that turns out to be. 
  • Eighth, US action to radically reduce carbon will have a profound effect on our economy.
  • Ninth, we don’t know what the “right” or optimal temperature for the earth should be.

So when you see a statement like “97% of scientists agree on climate change,” these facts are the strongest basis for that claim.  In fact, these claims about consensus would be largely correct if they were limited to these nine conclusions.

So end of article, right?  Actually, this is just the beginning and the easy part.  Unfortunately, some have tortured this “consensus” into confessing more than is actually supported by the science.  

Did you happen to notice that the statement about the rise in temp followed the statements about how carbon emissions created a greenhouse effect, that using fossil fuels had increased carbon emissions, and that there was a recent increase in temp?  You probably reasonably assumed that the temp increase was caused by man’s use of fossil fuels. Therein lies the rub.

Did you notice the phrase “all other things being equal”?  Well guess what? All other things are not equal.  The earth’s climate is an exceedingly complex phenomenon.

Let’s first start with how we measure the temp of the earth.  Think about what that even means: the temp of the earth.  You probably call to mind how we measure our own temp: put a thermometer in our mouth for a minute and read the temp.  It should be obvious there is no single place or device to measure the earth’s temp.  The earth is a pretty big place.  If it is hot in the northern hemisphere, it is cold in the southern hemisphere.  So measuring the earth’s temp is tricky business.

Today, there is a consensus that the most accurate temp readings are from satellite data.  Guess what?  There were no satellites a century ago, much less a millennium ago.  In addition to satellites, we use historical data from thermometers on the ground.  Not surprisingly, this yields some questionable results.  Some thermometers are affected by how the land they are located on has changed over time.  The “urban heat island effect” is one such development.  If a thermometer was located 50 years ago around vegetation and is now surrounded by parking lots and buildings, then its ability to compare today’s temp to historical data is tainted.

In some cases, we actually have consistent thermometer readings going back hundreds of years, but not enough to have a high degree of confidence in their ability to accurately measure the temp of the globe.  We thus use other surrogates for estimating the past correlation between carbon concentration and temp.  Popular methods include tree ring data, ice core data and ocean floor seashell deposits.  To make a long story short, there is actually a vigorous debate about how to accurately measure the past temp of the earth and how it correlates with the amount of carbon in the atmosphere.  While it is fair to say that there is a consensus on increases in the last century, it is also fair to say that nothing approaching 97% of scientists agree on the accuracy of different methods of measuring historic temp before satellite data and how the recent temp compares to historic temps.

Another important fact of climate history is that there are periods in which warming occurred while atmospheric CO2 was stable, with CO2 increasing only afterward.  Did the warming increase the growth of vegetation, which then emitted more CO2, thus increasing CO2?

One of the biggest controversies is the attempt to explain warmer temps in the medieval period (900 to 1200 AD) and colder temps that followed (1400 to 1600).  There had historically been strong support for the view that higher and lower temps existed during these periods: grapes grew where they can no longer grow; rivers froze that do not freeze today; ships could pass through areas now frozen and could not pass through areas that are now passable.  But since this was obviously before man started burning fossil fuels, it created a dilemma for the theory that fossil fuels alone were causing higher temps in recent times.

So several scientists reassessed the temp data of the medieval period and concluded the temps were actually colder during this period, thus adding support to the role of fossil fuels as the likely cause of the temp increase in the current period.  This reassessment set off a firestorm of controversy and has largely been discredited.  As part of this controversy over some scientific conclusions, someone hacked a prominent university’s email system and released the emails to the public.  Sadly, even a generous reading of the emails indicated that there was manipulation of data, and that political considerations influenced how important data and conclusions were interpreted, treated, and released.

Given the magnitude of some of the actions that would need to be taken if the alarmist theory is right, trust in the scientific community is essential.  This email episode and many of the actions of the United Nations Intergovernmental Panel on Climate Change have undermined that trust and conflated science and politics.

So what caused the higher temps in the medieval period?  There is certainly no significant consensus on the temps of the medieval period and thus this period remains a difficult anomaly for those who believe that fossil fuel is a very significant cause of any recent warming.

Second, there is the question of how much man’s burning of fossil fuels has contributed to any temp increase and will contribute to any future increase.  Even the UN IPCC (the main organization that tries to develop scientific consensus) acknowledges that climate change is “a change of climate which is attributed directly or indirectly to human activity that alters the composition of the global atmosphere and which is in addition to natural climate variability observed over comparable time periods.” (emphasis added)

As the UN IPCC recognizes, many natural factors affect the earth’s temp.  Let’s start with the most obvious: the sun.  The sun’s rays are not static over time but ebb and flow. Sunspots can go “silent” for periods of time and be very active in other periods.  The earth has clouds that block the radiative force of the sun.  Another little understood factor is cosmic ray bursts, which can significantly affect the earth’s cloud formation.  Then there are the oceans, which absorb and release carbon.  Volcanos and aerosols affect temp by blocking the sun. The axis of the earth has an effect, and this axis is constantly changing.  Additionally, there are periodic weather patterns called El Nino and La Nina that can have a profound effect on temperature and rainfall.  Like sunspot activity, the strength and thus impact of these weather patterns are not fully predictable.

No scientist denies that “natural variation” plays some role in the temp of the earth.  But it is a knotty problem to ferret out how much is attributable to man and how much is due to natural variation.  There is a vigorous debate about how much each of these factors, and many others, contribute to the temp of the earth.  For example, if the sun is responsible for 99% of the earth’s temp and carbon from fossil fuels for 1%, then you can see that there is probably not much advantage to reducing fossil fuel emissions, since it will not have much effect.  The fact is that there is a lot we don’t know about how clouds, oceans, aerosols, and many other factors affect temp. So while there is little debate that man has some impact, there is a vigorous debate about how significant man’s impact is to any potential future warming.

So how much carbon does man emit compared to these other factors? One scientist developed the following table to put man’s contribution into perspective with other natural factors:

Based on concentrations (ppb) adjusted for heat retention characteristics

                              % of Greenhouse Effect    % Natural    % Man-made
Water vapor                          95.000%             94.999%       0.001%
Carbon Dioxide (CO2)                  3.618%              3.502%       0.117%
Methane (CH4)                         0.360%              0.294%       0.066%
Nitrous Oxide (N2O)                   0.950%              0.903%       0.047%
Misc. gases (CFC's, etc.)             0.072%              0.025%       0.047%
Total                               100.00%              99.72%        0.28%

Table 4a. Anthropogenic (man-made) Contribution to the "Greenhouse Effect," expressed as % of Total (water vapor INCLUDED)
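
As a quick arithmetic check, the column sums can be verified directly. This is a minimal sketch in Python; the percentages are copied from the table above, and the only assumption is that the stated totals are rounded to two decimals:

```python
# Arithmetic check of Table 4a: each column should sum to the stated total
# (within rounding). Percentages are copied from the table above.
rows = [
    # (gas, total %, natural %, man-made %)
    ("Water vapor",               95.000, 94.999, 0.001),
    ("Carbon Dioxide (CO2)",       3.618,  3.502, 0.117),
    ("Methane (CH4)",              0.360,  0.294, 0.066),
    ("Nitrous Oxide (N2O)",        0.950,  0.903, 0.047),
    ("Misc. gases (CFC's, etc.)",  0.072,  0.025, 0.047),
]

total    = sum(r[1] for r in rows)   # stated total: 100.00%
natural  = sum(r[2] for r in rows)   # stated total:  99.72%
man_made = sum(r[3] for r in rows)   # stated total:   0.28%

print(f"Total greenhouse effect: {total:.3f}%")    # 100.000%
print(f"Natural contribution:    {natural:.3f}%")  # 99.723% -> rounds to 99.72%
print(f"Man-made contribution:   {man_made:.3f}%") # 0.278%  -> rounds to 0.28%
```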

While it may look like man’s contribution is minuscule (about a quarter of 1%), some scientists argue that this small amount is the tipping point that will cause the dramatic rise in temp.  Other scientists disagree and believe that man’s contribution is not and will not be a major factor in climate change.  But there is nowhere near a 97% consensus here on how much warming can be attributed to man versus natural variability.

Much of the discussion of “man’s” contribution centers around burning fossil fuels.  Yet there is a vigorous debate about how much carbon is emitted by agribusiness and the consumption of meat (see, e.g., the documentary Cowspiracy).

Related to man’s use of fossil fuel is the question of how quickly carbon dissipates once it is emitted. Some scientists believe (and historically the conventional wisdom was) that it dissipates within 5 years.  But more recently, some scientists have come to understand that the answer is much more complex, since there is a constant interchange of carbon between the earth (plants and oceans) and the atmosphere.  Some now believe that carbon emissions hang around for a much longer period than previously believed. Again, there is nowhere near a 97% consensus on this issue.

One of my favorite questions is: given the variability of the historical record, what is the right temp for the earth?  Why do we assume that the current temperature is scientifically proven to be optimal?  We certainly have a scientific basis for knowing that 98.6 degrees is the “normal” temp of the human body.  No such scientific consensus exists on the right temp for the earth.

To sum up, there is indeed a strong consensus on some issues, but there is also strong debate on other issues.

So given that many uncertainties still exist, how do we know what the temp will be a century from now?  We develop models.

Let’s try a thought experiment.  Think of the issue of how much money you will spend in April 2019.  Suppose I gave you and several other financial experts a million dollars to each come up with the best possible calculation.  You would take a spreadsheet and start listing all the categories of expenditure (and likely sources of income since that would operate as a boundary).  You would then try to estimate how much you would spend for each category.  You would try to anticipate all the life events that would happen in just 3 years.  You might get married or divorced.  You might have a child.  An elderly parent might come to live with you.  You might get sick.  You might get fired or get a promotion.  The economy might crash.  Inflation might accelerate.  Taxes might increase.  You might have a car accident and buy a new car.

You do your best to make educated guesses about your life.  After all, who knows you better than you?  You then hit the sum button and you get a number.  Is that number a “fact”?  Or is it an educated guess, indeed the best educated guess you could make after lots of effort?  At the end of the day, you would surely realize that there are a hundred things that could happen to throw off your calculation.  Additionally, how do you think your calculations would compare to those of the financial experts?  How would the financial experts compare to each other?  You would not be surprised if everyone had a different estimate.  Only time will tell whose educated guesses will come closest to reality.

This is what scientists have done.  They have created incredibly sophisticated models to predict how much the earth’s temp will increase over the next century.  We don’t have just one model.  We have a lot of models.  And the scientists building the models are incredibly credentialed, hardworking, and well-funded.  And different models make hundreds of different assumptions and not surprisingly reach different conclusions.

Let’s conduct a thought experiment on change over time.  Modelers want to make an educated guess about the earth’s climate about 100 years from now.  Take half that time period (50 years) and answer this question.  If you were to predict in 1965 what life would be like in 2015 (50 years), how close to reality do you think you would be?  A century ago the most challenging environmental problem was horse poop in burgeoning urban centers.  Indeed, this calls for some humility.  It seems simply preposterous to anticipate all the technological changes that will happen over a 50-year period much less a century.  You may call it naïve but isn’t it likely that we will find an innovative technological solution to climate change if indeed carbon concentration turns out to be as serious a problem as the Alarmists believe?

Nonetheless, the models seem to generally support a projection that temp will increase with some correlation to our burning of fossil fuels, but with significant variation as to how much temp will increase and the degree of the increase attributable to burning fossil fuels. To be fair, there are some credentialed critics (at MIT, Princeton, and Harvard) who are concerned that there is a bias in the models, reflecting the need for ever increasing funding for the modelers’ scientific efforts.  The concern is that academic success comes from winning large grants, and that demonstrating a serious problem leads to large funding.  If there is no serious climate change problem, funds will likely dry up for research.  There are also claims that scientists who are skeptical about the seriousness of climate change do not get funded on an equal basis, if at all.  But this may be an unfair criticism.  Some of the scientists no doubt care about doing sound scientific research.

So how well have the models done so far?  Well, not a single model predicted that temp would remain relatively stable for almost the last two decades.  It is fair to ask: if the models cannot accurately predict the easy stuff (how much will you spend next month versus 3 years from now?), how much confidence should we have in their predicting the hard stuff a century from now?  This is especially true given that many of the conditions embedded in the models are the subject of substantial debate and uncertainty.  Indeed, there are some instances where scientists use “plug” numbers to make sure that the models can be reconciled with historic climate patterns.  It is not unusual for modelers, even outside the arena of climate modeling, to use a variety of techniques to accommodate uncertainties.  So track records have to matter in whether our confidence in the results of models should be increased or decreased.  And indeed it seems somewhat surprising that when “adjustments” to the data or models are made, they all too often seem to be in the direction of increasing projections of warming.

The climate change literature is now replete with explanations of why the models failed to anticipate the “pause.”  One wag has actually counted 66 different explanations.  There is certainly no 97% consensus here.

In 2015, there was a good example of the gap between engineering models and their ability to accurately predict future reality.  There is an engineering model that is nearly universally used to predict the costs and benefits of making various energy efficiency investments in a given residential home.  For example, if you invest in insulation, more efficient windows, and weather stripping and it costs you $5,000, the model will show how quickly that investment will save you enough in lower energy bills to pay back the investment.  The key calculation is projecting the anticipated energy savings.  A study by professors at the University of Chicago and Berkeley did a very detailed analysis of the projected energy savings and compared them to the actual energy savings in 30,000 homes that were part of a federal program for funding such investments.  The study found that the model systematically overestimated savings by more than half.  Thus, investments that the model predicted would be cost effective were in fact bad investments.

Another example of the difficulty of making even much more focused computer projections is the famous Ehrlich-Simon and Simmons-Tierney bets (put your money where your mouth is).  In both instances, a bet was made between experts who were alarmist about the future scarcity of natural resources and those who thought they were, well, being alarmists.  The bets consisted of predicting natural resource prices over relatively short periods (10 and 5 years respectively).  In both instances the alarmists were, indeed, alarmists: wrong in their predictions and losers of the bets (both paid up).  Yet alarmists are much more likely to get media coverage than those who claim that the alarmists’ claims are overblown.

The point of discussing the personal finance thought experiment, the nearly two decade pause, the energy efficiency study, and the natural resource bets is to raise a cautionary concern about relying on computer models for making projections far into the future.  Computer models are no doubt helpful to our understanding of what may happen in the future but the results of these models are NOT FACTS.  They are best-guess estimates that are subject to a variety of flaws and biases.      

OK so there is a high degree of consensus on some issues and a lot of debate among scientists on other issues.  Where does that leave us?

Let’s assume that we magically develop a high degree of confidence that the earth’s temp will increase in the future by something like 5 degrees Fahrenheit and that man’s use of fossil fuels is a very significant reason for the increase. (Alert: There certainly is no such consensus today but for the sake of argument let’s assume there is.)

The question then is: what will happen?  Surprisingly, there is no clear consensus on what the earth looks like in 2100 if temp increases by 5 degrees.  Botanists pump carbon dioxide into greenhouses to help plants and flowers grow better.  Carbon dioxide is essential to life.  We breathe in oxygen and exhale carbon dioxide.  Plants absorb carbon dioxide and release oxygen.  The earth has been both hotter and colder than it would be after our assumed 5 degree increase, and it has had both higher and lower concentrations of carbon dioxide.  So will a temp increase be beneficial or catastrophic for humankind?

On the plus side is the fact that the global area producing food crops will substantially increase.  With higher CO2 concentrations, crop production in tropical and temperate areas can be expected to increase.  Fishing catches in the oceans of the world may increase.  Based on the current models of temperature increases, Arctic and Antarctic ice and glaciers may not melt enough to inundate coastal urban areas or islands.  There will be fewer deaths from frigid temperatures, which by far outnumber deaths from high temperatures. 

But some scientists predict the end of the world as we know it.  Storms will increase.  Floods will cover Manhattan. People will suffer from pestilence and starvation. Al Gore’s “An Inconvenient Truth” won an Oscar for shocking us.  The UN’s IPCC won a Nobel Peace Prize for raising concerns about the potential devastating impacts of climate change. There are clearly a lot of loud voices from both scientists and non-scientists claiming that the results will be catastrophic and that radical efforts must be undertaken to avert this outcome.

Some scientists believe just the opposite.  “The chief benefits of global warming include: fewer winter deaths; lower energy costs; better agricultural yields; probably fewer droughts; maybe richer biodiversity.”  Even some of those scientists that recognize that there will be some negative impacts from climate change believe the harms are exaggerated and that radical solutions are premature. Some Skeptics believe that changing the terminology from “global warming” to “climate change” was a deliberate attempt to claim that any weather anomaly could be attributable to man’s burning of fossil fuels.

Some Warmists seem to argue that any harm caused by weather is attributable to carbon. Indeed, many even argue that many harms that are not directly attributable to weather are caused by climate change.  One blogger has even compiled a web page of hundreds of horrific things that have been claimed as a result of climate change.  (My favorite is an increase in major league baseball home runs.) 

Additionally, there is the problem of how one would disprove climate change by looking at weather anomalies.  If everything proves climate change, nothing can disprove climate change.  It seems that every weather event is blamed on climate change.

The Oxford Dictionary defines the scientific method as: “A method of procedure that has characterized natural science since the 17th century, consisting in systematic observation, measurement, and experiment, and the formulation, testing, and modification of hypotheses.”

If the scientific method is the ability to test the truth or falsity of a hypothesis, what evidence would you want to see that proved that carbon emissions had at best a de minimis effect on temp or weather conditions? If drought or rain, storms or lack of storms, snow or no snow, cold or hot, temp increase or no such increase, all prove the existence of climate change, then what would disprove it? Suppose that we would still have storms in 2100 but they would be only 2% more severe?  Not even the most ardent Warmist claims there will never again be weather events even if we completely weaned ourselves off carbon.

Additionally, it harms the credibility of those genuinely concerned with the scientific analysis of climate change that many of the most ardent advocates of radical carbon reduction are also harsh critics of capitalism. Is it at least possible that some are overstating the problem to further a more nefarious/hidden agenda?  Similarly, some of the most ardent Warmists have made predictions before that have turned out to be abysmally incorrect:

In 1971, John Holdren edited and contributed an essay to a book entitled Global Ecology: Readings Toward a Rational Strategy for Man. He wrote …the book’s sixth chapter, called “Overpopulation and the Potential for Ecocide.”  … In their chapter, Holdren and Ehrlich speculate about various environmental catastrophes, and on pages 76 and 77 Holdren the climate scientist speaks about the probable likelihood of a “new ice age” caused by human activity (air pollution, dust from farming, jet exhaust, desertification, etc).

John Holdren is now not only the “Science Czar” for the United States, but he’s also one of the original leaders of the “alarmist” wing of the Global Warming debate and he now promotes the notion that the current climate data points to a looming planetary overheating catastrophe of unimaginable dimensions (he helped make the charts and graphs for Al Gore’s film An Inconvenient Truth, for example).

As of July 2016, “Dr. John P. Holdren is Assistant to the President for Science and Technology, Director of the White House Office of Science and Technology Policy, and Co-Chair of the President's Council of Advisors on Science and Technology (PCAST).”  Dr. Holdren was one of the alarmist participants in the Ehrlich-Simon bet.  Apparently, being consistently wrong in making alarming projections does not harm one’s career.

Ok, but let’s assume that we become convinced that more bad things than good things will result from a significant increase in temp over the next century.  Then the big question is: what is the right policy to address that situation?  Now the question is one of policy, not science. Broadly, there are three strategies: radically reduce carbon emissions, develop new technologies that will mitigate carbon concentration in the atmosphere, or adapt to a new reality; and there are lots of combinations in between.

What should we do?  What actions should we take?

Let me draw on an analogy from the past.  In 1949, polio was an epidemic in the US.  The consensus treatment for some polio victims was called an iron lung, a bulky contraption that helped polio victims breathe.  Scientists projected a lot more cases of polio.  The consensus policy solution might have been that we should order millions of iron lung machines for all the potential victims that would develop polio.  But then Drs. Salk and Sabin developed vaccines for polio over the next several years.  Today, polio is nearly eradicated around the globe, with fewer than 300 cases reported globally in 2012.  All in about 60 years...  So, what are we going to do with all those iron lungs we ordered?

Another example of how quickly things can change.  The Wright Brothers first flew about 200 feet in 1903.  Air power dominated World War II in the 1940s.  And we put a man on the moon in 1969.  All in 66 years!  Keeping in mind the pace of change in computer technology and communication, how wise is it to make policy now on uncertain predictions of what will happen a century from now?

Still, some people believe that climate change is the single most important issue facing the world and must be addressed by dramatically reducing our reliance on fossil fuels and transitioning to renewables and more efficient use of energy as soon as possible, irrespective of costs or quality of life.   

Such Warmists have not been successful in convincing the Congress of their position.  Even when President Obama had a filibuster-proof Democratic Senate and a Democratic majority in the House, Congress did not pass climate change legislation.  Needless to say, legislation that will satisfy Warmists is not likely to pass a Republican House and Senate.

Even the American people seem skeptical.  Poll after poll ranks climate change or global warming near the bottom of the priorities that should be addressed.  Even a United Nations poll of seven million people worldwide ranked “action taken on climate change” dead last in a list of proposed priorities.

Warmists have been more successful in Departments of the Executive Branch, some States, and courts, including the Supreme Court.  Most Warmists’ preferred solution to the “problem” of climate change is radical reduction of carbon emissions.  They have been successful in convincing President Obama and the Environmental Protection Agency to issue two rules that would have a dramatic effect on reducing the use of coal for electric generation.  To say these rules are controversial is an understatement.  You will see these rules referred to as the Clean Power Plan.  (The Supreme Court has stayed the Clean Power Plan pending a full review.)

Skeptics obviously oppose the Warmists’ agenda.  There are lots of criticisms made but they boil down to the belief that there is not enough evidence to support policies that would have a profoundly negative impact on our economy and quality of life for very little real impact on future temperatures.  Additionally, some are concerned that the developing countries will never achieve a higher standard of living without using fossil fuels for electricity and growth.

The Copenhagen Consensus Center has run a very interesting experiment several times over the last decade.  Every couple of years, they bring together experts, some with Nobel Prizes to their credit, who are asked to allocate $75 billion to the projects that would result in the greatest benefit to mankind (most benefits for least costs).  As the Center states:

The Expert Panel was presented with nearly 40 investment proposals designed by experts to reduce the challenges of Armed Conflict, Biodiversity Destruction, Chronic Disease, Climate Change, Education Shortages, Hunger and Malnutrition, Infectious Disease, Natural Disasters, Population Growth, and Water and Sanitation Shortages. They found that fighting malnourishment should be the top priority for policy-makers and philanthropists.

Given the budget constraints, they found 16 investments worthy of funding (in descending order of desirability):

  1. Bundled micronutrient interventions to fight hunger and improve education
  2. Expanding the Subsidy for Malaria Combination Treatment
  3. Expanded Childhood Immunization Coverage
  4. Deworming of Schoolchildren, to improve educational and health outcomes
  5. Expanding Tuberculosis Treatment
  6. R&D to Increase Yield Enhancements, to decrease hunger, fight biodiversity destruction, and lessen the effects of climate change
  7. Investing in Effective Early Warning Systems to protect populations against natural disaster
  8. Strengthening Surgical Capacity
  9. Hepatitis B Immunization
  10. Using Low Cost Drugs in the case of Acute Heart Attacks in poorer nations (these are already available in developed countries)
  11. Salt Reduction Campaign to reduce chronic disease
  12. Geo Engineering R&D into the feasibility of solar radiation management
  13. Conditional Cash Transfers for School Attendance
  14. Accelerated HIV Vaccine R&D
  15. Extended Field Trial of Information Campaigns on the Benefits From Schooling
  16. Borehole and Public Hand Pump Intervention

(emphasis added).

It turns out that the panel of experts believed the costs of trying to reduce fossil fuel use do not result in enough benefits to merit immediate attention.  (One estimate is that there is only 10 cents of benefit for each dollar spent reducing a ton of carbon.  But others have found much higher benefit to cost ratios.)  Like the polio example, if we continue to study the problem and improve our understanding of climate, fossil fuel use, mitigation, and adaptation, it is likely that we will find a solution that is far more cost effective in the future and aimed at the real magnitude of the problem.  As noted in the quote at the beginning of the Commentary, even one of the founders of Greenpeace is skeptical that radical reductions in carbon emissions will be beneficial to the earth and the economy. The concept of Geoengineering (item 12 in the list above) posits that we will discover a mechanism for neutralizing carbon in the atmosphere if indeed it turns out to be as serious a problem as some believe, a “vaccine” if you will.

The solution advocated most aggressively by Warmists is to dramatically reduce carbon emissions from using fossil fuels for our energy needs.  They advocate replacing fossil fuels with renewables and more efficient use of energy (light bulbs, more miles per gallon, better windows etc.) and some (but not many) advocate greater use of nuclear energy.  Many energy experts believe this strategy is not only costly (the US average for electricity is about 11 cents per kWh and Germany’s is 33 cents), but dangerous.  Renewable energy is simply not as reliable as fossil energy.  The challenge is something called intermittency.  Sometimes the wind doesn’t blow or the sun doesn’t shine.  Regrettably, there is not yet a cost effective means for storing electricity for this intermittency problem.  Additionally, major changes to the electric grid would be necessary to accommodate renewable energy on the scale that would be required to substitute it for fossil fuels.  There is no doubt room for debate about what the right mix of renewable and fossil energy should be but the main point is that there is nothing close to a consensus on this point.

Regarding the impacts of radical carbon emission reduction, some economists make the point that a prosperous economy is the best defense against the potential challenge of climate change.  It is simply a fact that richer countries are more environmentally sensitive than poorer countries.  Given that there will always be natural disasters even if carbon is reduced, some advocate that scarce resources are better spent more broadly on contingency planning and adaptation.  Venice and Amsterdam built canals to adapt to water levels in order to improve the quality of life in those cities.  Warmists often point out that recent hurricanes have caused much more property damage than previous storms.  But this is true because more people now live and build near water than in the past.  Adaptation would ensure that buildings would be constructed to withstand the inevitable tests of natural disasters.

Lastly, we have the problem of efficacy.  If nothing I do solves the problem, then I am wasting my time and money by focusing on my response to that problem.  Flushing money down the toilet as the saying goes.  Even assuming all the worst case uncertainties, the United States would not even make a dent in the problem by zeroing out its carbon emissions.  To be efficacious, the entire world would have to cooperate by reducing carbon emissions.  To be sure, the US could “lead the way.”  But at what cost?  Surely China and Russia would be giddy to strike such a blow to the US economy, while China builds a coal plant a week to fuel competition with the US.   

Finally, let’s deal with another aspect of the issue of climate change: public discourse. There are issues on which scientists have reached a “consensus.”  But as you can see above there are many important issues on which they disagree. 

At one time there was a consensus that the earth was the center of the universe.  If your best response is that I don’t need to look and discuss new evidence because there is a consensus on the current view, then I think that undermines the essence of not only the scientific method, but critical thinking capacity.  Many consensuses have turned out to be mistaken.  (We no longer have a consensus on whether eggs and the FDA’s Food Pyramid are healthy!)  And the nub of it is that there is NOT a consensus on many of the critically important issues on climate change.  In 1949, there was a consensus that polio was caused by a virus and that it was likely to infect millions in the future.  But that consensus did not and should not dictate what the public policy response should be.

As described above, a lot of issues remain uncertain relating to climate change.  One would think that so consequential a matter should result in a vigorous, civil discussion of how to reconcile climate change actions with other priorities in society.  For example, is it a better use of funds to lower our standard of living to try to ameliorate climate change, or to better educate the next generation of scientists who will find a “vaccine” for carbon, if it turns out that the carbon problem is serious?  I don’t know the answer to all these questions, but I certainly think we need to discuss them in a serious and civil manner.

Rather than debate the issues on which there is disagreement, some Warmists shout down debate by stating that the “argument is over,” “the science is in,” and there is a “consensus of 97% of scientists.”   Unfortunately, only one side stands ready, willing, and able to debate the climate change issue in all its dimensions.  Many on the other side have adopted a most unscientific position, indeed an antediluvian and Luddite position, of refusing to discuss the issue and trying to shut down debate.  “The debate is over.”  “All responsible scientists agree, so there is nothing left to debate.”  Indeed, climate Skeptics have been begging for an open debate for years now.  But the Warmists seem to have adopted a strategy of refusing to debate.

The “public debate” issue has recently taken an even more odious turn.  On March 13, 2015, Al Gore publicly called for skeptics to be “punished.”  There has been a growing drumbeat that anyone who disagrees with the Warmists’ position should suffer dire consequences for such heresy (a very small minority have even called for the death penalty, but one would hope this is merely hyperbolic rhetoric).  Recently, there has been a particularly nasty and vindictive campaign against Dr. Willie Soon, a solar physicist at the Harvard-Smithsonian Center for Astrophysics, for a study which he co-authored.  Lastly, in 2016, the US Department of Justice indicated that it was looking at the issue of whether climate change denial violated the law, as were various state attorneys general.

Why is that?  

So there it is.  I don’t have all the answers.  I hope that this provides a reasonable explanation for why the climate debate is so contentious and confusing.  The areas of a lack of consensus do not necessarily suggest that climate change is not an important issue that merits ongoing attention and additional scientific research.  One hopes that we can return to a day and time when we can openly debate the scientific basis for projections of what climate will likely be a century from now and fashion public policies appropriate to the scientific facts.

Tags: Climate Consensus

Sea Level Change

One of the greatest climate change concerns expressed by low-lying coastal and island nations is the threat which would result from rising sea levels, caused by a combination of thermal expansion and the melting of glaciers and land-based ice sheets. Sea level, like near-surface temperatures, has been generally increasing over the ~200 year instrumental record, as would be expected during the warming after a major ice age as well as the recovery from the Little Ice Age just prior to the beginning of the instrumental record.

The surfaces of the global oceans are never truly at rest, but rather constantly in motion, which makes the accurate measurement of sea level challenging. The equipment used to measure sea levels has changed progressively over the period of the instrumental record, as has the extent of the ocean surfaces measured. Until 1993, the beginning of the satellite measurement era, almost all sea level measurements were taken at coastal stations, usually by instruments which were actually in contact with the ocean surface. NOAA currently uses satellites to measure sea level at 10 day intervals, with a claimed uncertainty of 3-4 millimeters, or more than twice the reported long term trend in annual sea level rise and approximately equal to the reported annual rate of sea level rise since 1993.

The satellite measurements are made from satellites in orbits 1336 kilometers above the oceans’ surfaces. The measurements are used to report annual changes in sea level to a precision of 0.1 millimeters, or one part in 13.36 billion. Arguably, since the measurements are based on the time for microwave signals to travel from the satellite to the ocean surface and return, 0.1 millimeter precision might actually represent one part in 26.72 billion, since the microwaves travel a round trip of 2.672 billion millimeters from the satellite to the ocean surface and back, of which 0.1 millimeters is one part in 26.72 billion.
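
The parts-per arithmetic is easy to verify. Here is a minimal Python sketch using only the figures quoted above (a 1336 km orbit and 0.1 mm reporting precision):

```python
# Check the precision-to-distance ratios for satellite sea level measurement,
# using the figures quoted above: a 1336 km orbit and 0.1 mm reporting precision.
orbit_mm     = 1336 * 1000 * 1000   # 1336 km expressed in millimeters (~1.336e9)
precision_mm = 0.1                  # reported precision of annual sea level change

one_way    = orbit_mm / precision_mm       # ~1.336e10 -> one part in 13.36 billion
round_trip = 2 * orbit_mm / precision_mm   # ~2.672e10 -> one part in 26.72 billion

print(f"One-way:    one part in {one_way:.3e}")
print(f"Round trip: one part in {round_trip:.3e}")

# A corollary noted in the text: a 1 mm rise in sea level shortens the
# satellite-to-surface round trip by 2 mm, since the signal travels the
# shortened path twice.
```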

There is no single, generally accepted explanation for the apparent doubling of the annual rate of sea level rise beginning in 1993, essentially the beginning of the satellite measurement era, nor is this apparent doubling reflected in the measurements in geologically stable coastal areas. One has to wonder whether the explanation is as simple as the failure to recognize that a 1 millimeter rise in sea level results in a 2 millimeter reduction in the round trip distance between the ocean surface and the satellite.

Tags: Sea Level Change, Sea Level Rise

The Gates Math Formula

Bill Gates has propounded what he asserts is the math formula which will solve climate change.

P * S * E * C = CO2

where:  P is the population of the globe;
        S is the services demanded per person;
        E is the energy required to provide each service; and
        C is the carbon released per unit of that energy.

Gates’s point is that global population and the population’s demand for services are growing faster than the growth can be offset by increases in energy efficiency and transitions to lower carbon fuels. He is certainly correct in that assessment.
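
To see why efficiency and fuel-mix gains struggle against growth in P and S, consider the identity with purely hypothetical numbers; none of the figures in this sketch come from Gates:

```python
# Illustrative (hypothetical) numbers for the P*S*E*C identity.
# None of these values come from Gates; they only show the structure of the argument.
def co2(P, S, E, C):
    """Annual CO2 = population * services/person * energy/service * carbon/energy."""
    return P * S * E * C

baseline = co2(P=7.4e9, S=1.0, E=1.0, C=1.0)      # arbitrary units

# Suppose population grows 35% and services per person double by mid-century,
# while efficiency (E) improves 30% and the fuel mix (C) gets 30% cleaner:
future = co2(P=1.35 * 7.4e9, S=2.0, E=0.7, C=0.7)

print(f"Future / baseline emissions: {future / baseline:.2f}x")  # ~1.32x: still rising

# To cut emissions 80% against that growth, C alone would have to fall to
# roughly 11% of today's value (an ~89% cut):
print(f"Required C multiplier: {0.2 * baseline / co2(1.35 * 7.4e9, 2.0, 0.7, 1.0):.3f}")
```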

His message is that, if global annual CO2 emissions are to be reduced by 80% by 2050 and to zero by the end of the century, reliance on increased energy efficiency and a transition to lower carbon fuels will not be sufficient to achieve the desired result. Rather, there is the need for a massive increase in R&D funding in search of breakthrough technologies which could achieve the desired result. One such breakthrough would be a source which is always available. Advanced, modular nuclear generators could be one example. Another breakthrough might be low cost, high capacity, high charge rate and discharge rate energy storage systems. Such storage systems, combined with lower cost, higher efficiency solar and wind systems could broaden the potential of intermittent energy generators to provide reliable grid power.

Unstated in the Gates message is the realization that it is far more beneficial to invest limited capital in R&D on technologies which could be effective, rather than spending that capital attempting to commercialize technologies which are incapable of being effective. This is clearly not the approach currently being pursued by the globe’s governments.

Obviously, Gates’s position is based on the premises that: climate change is caused by human activities; climate change is undesirable; climate change can be eliminated; and, it is urgent that climate change be eliminated. Given these premises, Gates’s position makes eminently good sense; far more sense than the programs being pursued and proposed by the globe’s governments.

However, the first premise ignores the historical fact that the climate of the globe has been changing over the entire period of the earth’s history we have been able to study. The second premise ignores the benefits currently resulting from the increase in atmospheric CO2 concentrations, as documented by the greening of the planet observed by satellites and the expansion of global growing seasons and of tillable land to higher latitudes. The third premise relies on the belief that halting the increase in atmospheric CO2 concentrations would halt the recently observed warming, though that belief relies on models which have not been validated. Finally, the fourth premise relies on the sensitivities and feedbacks input to the climate models, which are unmeasured and currently unmeasurable.

Gates asserts that what he perceives as vast problems will require solutions based on vast ideas. Governments today are attempting to solve what they assert to be vast problems with half-vast ideas.

Tags:

Hottest Year Evah

2015 – The Warmest Year in the Near-surface Instrumental Temperature Record

The US National Centers for Environmental Information (NCEI), the US National Aeronautics and Space Administration Goddard Institute for Space Studies (NASA GISS) and the UK Hadley Centre / University of East Anglia Climatic Research Unit (HadCRUT) have proclaimed 2015 to be the warmest year in the instrumental record of global near-surface temperatures. They reported that 2015 was 0.16 +/- 0.09°C (NCEI), 0.13 +/- 0.10°C (NASA GISS) and 0.18 +/- 0.10°C (HadCRUT) warmer than 2014.

Obviously, the increase in global average near-surface temperature from 2014 to 2015 could not be precisely 0.16°C and precisely 0.13°C and precisely 0.18°C, though it might fall within the range of those calculated figures (0.13 – 0.18°C). However, based on the confidence limits expressed by the producers of each of these global temperature anomaly products, the estimated global near-surface temperature difference between 2014 and 2015 would more likely fall within the range of 0.03°C (0.13°C – 0.10°C) to 0.28°C (0.18°C + 0.10°C), a range 5 times as wide as that of the calculated figures (0.25°C vs. 0.05°C).
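
The arithmetic is simple enough to reproduce. Here is a minimal sketch using the central estimates and confidence limits quoted above:

```python
# Reproduce the 2014-to-2015 warming ranges quoted above.
# (central estimate, confidence limit) in deg C for each anomaly product:
products = {"NCEI": (0.16, 0.09), "NASA GISS": (0.13, 0.10), "HadCRUT": (0.18, 0.10)}

centers = [c for c, _ in products.values()]
lows    = [c - u for c, u in products.values()]
highs   = [c + u for c, u in products.values()]

central_range  = max(centers) - min(centers)   # 0.18 - 0.13 = 0.05 deg C
expanded_low   = min(lows)                     # 0.13 - 0.10 = 0.03 deg C
expanded_high  = max(highs)                    # 0.18 + 0.10 = 0.28 deg C
expanded_range = expanded_high - expanded_low  # 0.25 deg C

print(f"Central estimates span: {central_range:.2f} C")
print(f"With confidence limits: {expanded_low:.2f} to {expanded_high:.2f} C "
      f"({expanded_range / central_range:.0f} times wider)")
```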

Assuming that the confidence limits on the temperature increases reported by the producers of these near-surface temperature anomaly products for 2014 were the same as the confidence limits reported for 2015, it is statistically possible that 2014 was actually warmer than 2015. However, the linked paper suggests that the global near-surface temperature anomaly calculations are based on temperature readings with an estimated +/- 0.2°C station error, which has been incorrectly assessed as random error; and, that there is also systematic error from uncontrolled variables. The author calculates a representative lower limit of uncertainty in the calculated temperature anomalies of +/- 0.46°C; and, based on this lower limit of uncertainty, the global near-surface anomaly trend is statistically indistinguishable from zero.

The Law of Large Numbers, relied upon by the global near-surface temperature anomaly producers to report global anomalies to greater precision than the underlying “adjusted” temperatures, requires that the errors in the underlying temperatures be random. Assessments of the errors introduced by the temperature measuring instruments, their enclosures, their siting and changes in the characteristics of their surroundings suggest strongly that the measurement errors are not random; and, that the “adjustments” made to the temperature readings are not random and do not make the errors in the resulting “adjusted” readings random either.
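
The distinction matters because averaging reduces only random error. A small simulation illustrates the point; the error magnitudes here are assumptions for illustration, not actual station statistics:

```python
# Averaging N readings shrinks random error roughly as 1/sqrt(N),
# but a shared systematic bias survives averaging untouched.
# Error magnitudes are assumptions for illustration, not actual station data.
import numpy as np

rng = np.random.default_rng(0)
true_anomaly = 0.10          # deg C, the value we are trying to estimate
n_stations   = 2500

random_err     = rng.normal(0.0, 0.2, n_stations)  # +/-0.2 C random station error
systematic_err = 0.05                              # shared bias (siting, enclosure, ...)

readings_random_only = true_anomaly + random_err
readings_with_bias   = true_anomaly + random_err + systematic_err

print(f"Mean, random error only: {readings_random_only.mean():+.4f} C")  # ~ +0.10
print(f"Mean, with shared bias:  {readings_with_bias.mean():+.4f} C")    # ~ +0.15
# The random component averages toward zero (0.2 / sqrt(2500) = 0.004 C),
# while the 0.05 C systematic bias passes straight through to the mean.
```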

Tags: Warmest, Hottest, Temperature Record

Highlighted Article: A Guide to Understanding Global Temperature Data

Dr. Roy Spencer just published this booklet.

A Guide to Understanding Global Temperature Data

This is a pretty basic, balanced view of the global temperature issue.

"Whether we use thermometers, weather balloons, or Earth-orbiting satellites, the measurements must be adjusted for known sources of error. This is difficult if not impossible to do accurately. As a result, different scientists come up with different global warming trends—or no warming trend at all."

Tags: Highlighted Article

The “Pause” Returns

The climate science community had been troubled by an extended “pause” in global warming prior to two events in the Spring of 2015: the onset of a major El Nino in the NINO regions of the Pacific; and, the publication of Karl et al 2015 (“Possible Artifacts of Data Biases in the Recent Global Surface Warming Hiatus”), the “pause buster” reanalysis of global sea surface temperatures (ERSSTv4). As the result of one or both of those two events, the “Pause” paused, though it was frequently said to have ended.

However, the end of the 2015/2016 El Nino and the disappearance of the Pacific “Warm Blob” off the West coast of North America have restored the pause in the satellite anomaly products produced by UAH and RSS, to 23 years and 1 month and 22 years and 8 months respectively, through May 2016. The pause has also been restored in the HadCRUT near-surface temperature anomaly product, to 11 years and 2 months; and, in the HadSST sea surface temperature anomaly product to 19 years and 10 months, through April 2016. The pause has not yet been restored in the NASA GISS LOTI (land/ocean temperature index) temperature anomaly product through May 2016, nor in the NOAA NCEI combined anomaly product through April 2016. This is likely the result of the sea surface temperature revisions in NCEI’s ERSSTv4 sea surface temperature product, as well as near-surface data “adjustments”.
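Pause lengths of this kind are generally computed by finding the earliest month from which the least-squares trend through the latest month is zero or negative. A minimal sketch of that calculation in Python on synthetic data; the 120-month minimum and the toy series are illustrative assumptions, not the method of any particular anomaly producer:

```python
def ols_slope(y):
    """Ordinary least-squares slope of y against its index (per step)."""
    n = len(y)
    xbar = (n - 1) / 2
    ybar = sum(y) / n
    num = sum((i - xbar) * (yi - ybar) for i, yi in enumerate(y))
    den = sum((i - xbar) ** 2 for i in range(n))
    return num / den

def pause_length(anoms, min_months=120):
    """Months in the earliest period, ending at the latest month, whose
    trend is <= 0; returns 0 if no period of at least min_months qualifies."""
    n = len(anoms)
    for start in range(n - min_months + 1):
        if ols_slope(anoms[start:]) <= 0:
            return n - start
    return 0

flat = [0.2] * 240                                 # 20 flat years
ramp = flat + [0.2 + 0.01 * t for t in range(24)]  # 2 warming years appended
print(pause_length(flat))  # 240: the whole record qualifies
print(pause_length(ramp))  # 0: a warm tail (e.g., an El Nino) erases the pause
```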

It appears likely that the pause will ultimately be restored in the NASA GISS and NOAA NCEI combined temperature anomaly products as both near-surface and sea surface temperatures continue to drop with the end of the 2015/2016 El Nino and the “Blob”; and, with the anticipated onset of the 2016/2017 La Nina, though the increased sea surface temperatures resulting from the Karl et al 2015 reanalysis will likely delay the restoration by several months in both combined anomaly products.

Tags: Pause, El Nino

Goodbye, El Nino

The major El Nino of 2015-2016 is over. Sea surface temperatures in the Nino regions have dropped to normal or below normal levels.

All of the temperature anomaly products are showing cooling from the peak of the 2015-2016 El Nino. The NASA GISS near-surface temperature anomaly product, the only near-surface product currently available through May 2016, has declined by 0.4°C from its peak. The UAH tropospheric temperature anomaly has dropped by 0.3°C from its peak, while the RSS tropospheric temperature anomaly has dropped by 0.5°C. The NCEI and HadCRUT temperature anomaly products have also cooled from their peaks through April 2016, though the drops are not as large, since they do not yet include the May 2016 changes.

The Pacific “warm blob”, which has been affecting global temperatures, has also disappeared, according to NASA. This should further reduce the sea surface temperature anomalies; and, thus, all of the global integrated temperature anomalies.

NOAA and the Australian Bureau of Meteorology have issued La Nina alerts for 2016-2017. Both organizations are anticipating a moderate to strong La Nina. However, there is no basis on which to project the magnitude and extent of the sea surface temperature cooling which will result.

The news releases from the producers of the near-surface temperature anomaly products tended to minimize the impact of the El Nino on their surface temperature anomalies. However, the end of the El Nino has already produced a very significant reduction in the NASA GISS anomaly products. There is no reason to expect that similar reductions will not appear in the NCEI and HadCRUT anomaly products when the May 2016 anomalies are announced. This should make it quite clear that the minimal attribution of the 2015 warming to the El Nino was political spin, rather than scientific assessment.

Tags: El Nino

COP 21 Agreement

The Obama Administration insisted that the Agreement reached at the conclusion of COP 21 in Paris, France not be legally binding on the parties because of the near-absolute certainty that the US Senate would not ratify the Agreement as a treaty; and, because of the very limited likelihood that the Congress would appropriate the anticipated level of funding which would have been required by the UN Green Climate Fund.

The US has reportedly pledged $3 billion of the ~$10 billion currently pledged to the UN Green Climate Fund, with the stipulation that the US pledge not exceed 30% of the total funds pledged. The COP 21 Agreement calls for funding at a “floor” of $100 billion per year, beginning in 2020, apparently in perpetuity. The Group of 77 + China have stated that the $100 billion is insufficient and must be increased substantially. Assuming that the US maintains its 30% stipulation, the US would be expected to pledge ~$30 billion per year, or more, to the Green Climate Fund.
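The arithmetic behind the ~$30 billion expectation is simple; a minimal sketch, using only the figures stated above:

```python
# Pledge arithmetic from the text: a $100 billion/year floor from 2020
# and a US stipulation of at most 30% of total pledges.
annual_floor = 100e9   # dollars per year
us_cap_share = 0.30

print(f"US share at the 30% cap: ${us_cap_share * annual_floor / 1e9:.0f} billion/year")

current_us_pledge = 3e9  # current pledge, spread over 4 years
print(f"current US pledge: ${current_us_pledge / 4 / 1e9:.2f} billion/year")
```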

The current US pledge of $3 billion is to be provided over a period of 4 years; and, it is not certain whether it is intended to be “new money”, or funds reallocated from other appropriations. However, meeting the expected US share of the $100 billion per year would require a new congressional appropriation, which is highly unlikely in the current Congress. That appropriation would have to be funded by a new tax, such as a carbon tax, which would have to be adjusted upward “Progressively”, as CO2 emissions declined, to maintain the pledged funding stream.

There is currently very little definition regarding the criteria to be used to determine allocations from the UN Green Climate Fund. Much of the funding would likely be distributed to known kleptocracies; and, predictably, very little of the funding would actually reach the citizens of those kleptocracies purportedly adversely impacted by climate change. Based on the UN’s history, in which the Iraq “Oil for Food” program degenerated into the Iraq “Oil for Palaces, Payloads and Payoffs” program, a high percentage of the $100 billion per year would likely be consumed by waste, fraud and abuse.

Tags: COP - Conference of Parties

US EPA Clean Power Plan

The primary weapon in the US Administration’s crusade to save the world from anthropogenic global warming (climate change) is the US EPA Clean Power Plan, which sets CO2 emissions limits for coal-fired electric generators that could be met only by installing carbon capture and sequestration (CCS) systems. The plan requires CO2 emissions from power generation to be reduced by 32% from 2005 levels by 2030.

The Plan effectively requires the retirement of older coal-fired generators which are either technologically or economically unsuitable for the installation of carbon capture systems. The Plan also effectively discourages (prevents?) the construction of new coal-fired generators, since CCS is currently not demonstrated technology at commercial scale; and, the technology does not appear to be economically viable, even on new plants designed to accommodate CCS. The Plan also encourages expanded transition from economic dispatch of power generators (lowest cost first) to environmental dispatch (renewables and lowest emissions first), thus inexorably increasing wholesale power costs.

A coalition of 26 states and one coal company have filed suits against EPA in federal court, seeking to have the court overturn the EPA rule; and, to stay its execution until the legal challenges are resolved. The stay is viewed as essential, given EPA’s history of “slow walking” the appeals process, thus forcing those affected by the regulations to prepare to comply with the contested regulations, even as they challenge them in the courts. This is particularly important for electric generators, which require long lead times for planning, regulatory approval, equipment procurement and installation, construction and commissioning.

Members of the US Senate and House of Representatives have also filed challenges to the Plan under the Congressional Review Act, challenging EPA rules for both existing and new power plants. Congress considered the inclusion of CO2 under the Clean Air Act, but chose not to include CO2 as a criteria pollutant under the Act. Thus, certain members of Congress believe that EPA has effectively rewritten the Act in the issuance of the Clean Power Plan.

Tags: EPA, Clean Power Plan, Coal

The Many Costs of COP 21

“The time has come,” the Walrus said, “to talk of many things: of shoes and ships and sealing wax, of cabbages and kings.”

The pledges of GHG emissions reductions made by the developed country participants at COP 21 will impose a variety of costs on individuals and businesses in those countries. Premature replacement of existing coal-fired electric generating facilities will result in economic dead losses, both for the owners of the generation facilities and for the owners of the coal mines which have provided their fuel. The extent of these economic dead losses cannot be accurately determined until the specific generators to be abandoned and coal mines to be closed have been identified; and, that process is ongoing. The closure of these plants and mines will also result in job losses in both industries, as well as job losses in the transportation industries which move the coal from the mines to the generators.

Closure of these existing facilities will require the construction of new facilities to replace their electricity output. New natural gas combined-cycle generators will be more efficient than the coal generators they replace; and, in the current market, will use less expensive fuel. In areas without available natural gas transmission and storage capacity, major investments will also be required to deliver and store the additional natural gas required to fuel the new generators.

New renewable generating facilities will be more expensive per megawatt hour generated than either the closed coal plants or the new natural gas plants; and, will require significant expansion of the electric transmission grid and the installation of massive quantities of grid storage, increasing electricity rates as has occurred in several countries in Europe, including England and Germany.

Taxpayers in the developed countries will also be required to provide the $100 billion per year pledged to the UN Green Climate Fund to support energy development and climate change mitigation and remediation in the developing countries. The developing countries, to the extent that they implement renewable generation, will find their development impeded by the higher costs of the renewable generation and the construction costs of the storage required to achieve acceptable reliability of service.

China, India and other developing nations have clearly expressed their unwillingness to impede their economic development in the interest of controlling the global climate. While these nations intend to increase their reliance on nuclear generation and renewables, they are also aggressively increasing their use of coal-fired generation. Once each of these countries reaches its peak CO2 emissions rate, at some undefined date in the future, their emissions will decline slowly over a period of approximately 40 years (expected plant life, absent life extension investments), assuming that old coal-fired plants being retired are not replaced by new coal plants. The COP 21 Agreement does not preclude such actions by these countries.
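A rough sense of that decline can be sketched by assuming a fleet of plants uniformly distributed in age, each retired at the end of a 40-year life and not replaced; the numbers below are illustrative assumptions, not projections for any country:

```python
PLANT_LIFE = 40  # years, the expected plant life assumed in the text

def fleet_emissions(years_after_peak):
    """Percent of peak emissions remaining, assuming a uniform age
    distribution, end-of-life retirement, and no replacement builds."""
    retired_fraction = min(years_after_peak / PLANT_LIFE, 1.0)
    return 100.0 * (1.0 - retired_fraction)

for year in (0, 10, 20, 30, 40):
    print(f"{year:>2} years after peak: {fleet_emissions(year):5.1f}% of peak")
```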

The COP 21 Agreement includes the intent to achieve net-zero global annual CO2 emissions by 2100. The global investments required to achieve this result are estimated to exceed $1 trillion per year. The Agreement also includes the desire to achieve net-zero emissions, at least among the developed nations, far sooner, with the hope of limiting global mean near-surface temperature increase to 1.5°C. The investments required to achieve that result would be far greater than $1 trillion per year, because of the shorter period over which the investments would be made and the state of development of the technologies required to achieve that result.

A recent analysis by the University of Alabama in Huntsville (Drs. John Christy and Roy Spencer) suggests that, at current warming rates, the global temperature anomaly would peak below 1.5°C regardless of the emissions reductions agreed to in the COP 21 Agreement. However, that analysis, like the continuing satellite temperature record, is likely to be ignored, since it does not fit the globalist narrative.

Tags:

Prosperity and Electricity In-Depth Article

The United States faces a prosperity crisis.  Increased prosperity will make it somewhat easier to resolve many of the problems that the next generation must face head-on.  Reviving prosperity in the United States poses some difficult challenges over the next decade.  Many articles explain the myriad issues that will need to be addressed to restore prosperity after 16 years of focus on other priorities, resulting in at best tepid regard for economic growth.  I believe one such subject is not fully appreciated for its potential to thwart prosperity if left on its projected course or to be a catalyst for prosperity if we include it as part of the dialogue for restoring prosperity: ELECTRICITY.

In energy policy discussions, oil receives a disproportionate amount of attention, especially as regards the threat to quality of life and prosperity.  I have thought long and hard as to why oil receives more attention than electricity and I can only conclude that oil issues are easy to understand and electricity issues are not.  (The ease of understanding oil has not, however, led to sound policy as is discussed in our earlier Commentary “In Praise of Global Oil Markets: Will the Idiocy End?”.)  In this Commentary I will lay out the case for paying more attention to electricity.

First, an important observation.  I have carefully perused the websites of prominent national think tanks and I have not found one such organization that dedicates a full time person to electricity industry structural issues. 

  • There are environmental people who will dabble in the overlap of electricity and environment. 
  • There are energy people who will pay lip service every once in a while to an electric issue.  
  • There are national security people who will worry about the electricity infrastructure’s exposure to terrorism. 
  • There are regulatory reform people who every once in a while make a glancing blow at some electricity issue. 

But I have not found any broad-based think tank organization with a single person who on a full time basis and with the proper credentials concentrates on electricity issues as their exclusive focus.  The organizations that have full time people addressing electricity issues are all trade associations or environmental organizations, dripping with self-interest in their analysis and recommendations, or relatively small specialized organizations like the Institute for Energy Research.  This concerns me.  Either I am woefully misguided about the threat (as demonstrated in this article) or there is something missing in the think tank community.  If the latter, then I believe a grave risk exists that we will be caught with our pants down when the electric system begins to freeze up.

Why are think tanks asleep at the switch?  Part of my theory is that electric utilities are part of the corporate donor base of many of these think tanks and are part of the coalition on taxes, labor, health, environment, etc.  I perceive (but I could be wrong) that there is some reluctance to dedicate resources to electricity policy because it would offend the electric utility portion of the donor base, which, if the analysis were done right, it almost certainly would.

A second preliminary point.  There is a rich history of reforming industries that have similar network characteristics to the electric industry.  Ironically, there is no consensus name in economics about these types of industries and that may be part of the problem. 

Since the mid-1970s, the US has massively and successfully restructured a series of industries that have several similar characteristics, though many would fail to see the commonality.  First, they are network or grid type industries, i.e., industries that move people, goods, or digits from point A to point B without fundamentally changing the physical properties of the “transported” item.  Second, the reforms transitioned the industry from heavy-handed governmental control to a more competitive, market based policy.  Third, all required preemption of some traditional State authorities that became anachronistic as the maturation of the industry increasingly implicated more interstate commerce.  The industries include airlines (1978), trucks (1978), railroads (1980), telecommunications (1982-84), natural gas (1985-92), cable (1984), internet (1986), GPS (1996), Microsoft Windows (2001), and oil pipelines (1994).  Interestingly, the Supreme Court adopted a similar analytical framework for the movie theatre industry in 1948.  Most recently, Google has become a target of European regulators on much the same theory as these other industries (using monopoly assets to advantage affiliated competitive assets of their company and disadvantage independent competitors).

For lack of a commonly understood term for these types of industries (and because “network” industries would be confused with computer networks), I have coined the term “plexus” industries.  The dictionary definition of plexus is “an intricate network or web-like formation.”  Plexus industries are the connective technology that allows a person, good, or digit to be moved from a lower value condition to a higher value condition but without changing its fundamental characteristics.  This is not the place to explore the technical arguments for plexus but a couple of examples will help understand the point.

The easiest plexus industry to understand is a gas pipeline.  Gas goes into a pipe in Texas and comes out in Boston.  The chemical properties of the molecule of gas are unchanged.  It has just been moved from Texas to Boston.  The process of such movement is well understood within the gas industry, but if you asked someone to compare that “process” to the Windows operating system they would look at you confused.  At a certain level of abstraction, pipelines and Windows perform the same function.  Windows has a point at which something is put into the system.  We call it an application, e.g., Chrome, Firefox, Adobe Acrobat, AOL, etc.  The application is then “delivered” to a user who then creates value by using the combination of the application and the operating system.  The Department of Justice used the same basic rationale in suing Microsoft: using a “monopoly” facility (the Windows operating system) to impede competition in products that need the monopoly facility and compete with other Microsoft products.  Microsoft still operates under a consent decree with the Department of Justice.  At a certain level of abstraction, this is identical to the reforms in the natural gas industry separating pipeline functions from commodity functions.

The key insight in all these largely successful policy reforms is that the “plexus” facility was recognized as having monopoly characteristics that would distort markets if left unchecked, while the goods being moved through the plexus facility were capable of being subjected to market principles even if the plexus facility was not.  Rather, the plexus facility had to be “regulated” in such a way as to promote efficient input and output markets.  This all sounds rather abstract, but the concepts apply to a wide array of goods and services, and a widely varying terminology is used by different plexus industries.

My point is that pro-market reform of fundamentally interstate plexus industries has been done before; we have a template for reform that has been enormously supportive of prosperity policies.

Now onto the main stage of how this applies to electricity, but starting with a bit of boring history.

Electricity was originally introduced to the US on a city by city basis, indeed sometimes neighborhood by neighborhood.  Thomas Edison’s first foray into electricity was the famous Pearl Street Station.  The Wiki entry gives a sense of how local its introduction was:

Pearl Street Station was the first central power plant in the United States. It was located at 255-257 Pearl Street in Manhattan on a site measuring 50 by 100 feet (15 by 30 m), just south of Fulton Street and fired by coal. It began with one direct current generator, and it started generating electricity on September 4, 1882, serving an initial load of 400 lamps at 85 customers. By 1884, Pearl Street Station was serving 508 customers with 10,164 lamps. The station was built by the Edison Illuminating Company, which was headed by Thomas Edison.

Without recounting the tortured history of electricity regulation, we arrived at the current allocation of jurisdiction of regulatory authority by 1935.  The States had regulatory authority over generation, constructing transmission and distribution grids, sales to the consumer by the distribution company, billing, and metering -- in essence, the whole system from generation to consumption.  The feds had a very limited role in transmission and wholesale sales in interstate commerce.[1]  But in 1935, there was precious little interstate commerce in electricity, especially compared to today.

That simple framework has been under nearly constant assault since 1978.  Today, the electricity system[2] is a mess, bordering at times on chaos and calamity. 

Much like the AT&T phone monopoly came under scrutiny for the extent to which it had developed into a very broad monopoly, electric utilities were put under a microscope to examine the continuing vibrancy of the assumption that they should be permitted as a comprehensive monopoly.

The first crack in the dam came in 1978 when utilities were required by Federal law to purchase electricity from third parties who used certain technologies.  The Federal law required that the State commissions set the price at which the purchase would take place.  In 1992, Congress broadened who could sell electricity to the utility.  Over the next decade, the Federal Energy Regulatory Commission enacted rules to encourage even more competition in generation.  At the same time a number of states began to experiment with programs to allow customers to purchase electricity from competitive marketers.  But then the Enron scandal and the catastrophic California electricity crisis created a more hostile climate for competitive electricity and progress slowed.  Today the US finds itself in a situation where there is a confusing mixture of models of electricity competition and traditional regulation.    

Does the current situation create risk of failure of the system?  By what standard should we judge adequacy of the existing system?  When the basic jurisdictional allocation was solidified, electricity was virtually a luxury.  Many homes did not have electricity and we certainly didn’t have ubiquitous air conditioning, labor saving appliances, electric cars, and the magical world of digital technology.  I suggest that our test should be how well the US electric system serves the world we envision in 2050.  Luckily, the US is not the only country asking this question.  Indeed, we are laggards compared to how aggressively some other countries are addressing the electric system of the future.  Maybe not so surprisingly, China, for example, has been far more thoughtful and aggressive than the US in addressing electric system structural issues.  Australia has also grappled with some of the challenging issues of electric market reform.

So what is wrong with the US electric system today?

Climate Change:  Far and away, the most confounding variable in the electric system today is the issue of carbon emissions.  Let’s put to one side the degree to which carbon emissions are problematic and just take a look at the current situation.  The US Congress has steadfastly refused to pass comprehensive legislation regulating carbon.  Environmentalists have thus turned their attention to the States and the Federal executive branch.  Accordingly, there are literally hundreds of different approaches being taken to deal with carbon, mostly by discouraging coal and nuclear generation (in a variety of ways and forums) and encouraging renewable energy and efficiency (in a variety of ways and forums).  Whatever may be said of this approach, one thing is certain.  There is absolutely no reason to believe that this chaotic approach will achieve cost effective carbon emissions reductions.  Yet billions are being spent debating, analyzing, and executing these programs, many of which are cosmetic.      

Jurisdictional Allocation: As noted above, the allocation of jurisdiction over the electric industry is an historical accident that has not been formally reconsidered since its inception in 1935.  Rather there has been a nipping at the heels over the years to address the absurdities that result from the current allocation of jurisdiction between the states and the feds.  In ALL the plexus industry reforms, one key, indeed essential, element was the reconsideration of the allocation of jurisdiction between the feds and the States.  Railroads, air travel, trucking, natural gas, and telecommunications all shifted the allocation of jurisdiction from the states to the feds as these industries became more entwined in interstate commerce.  Just as importantly, when the feds exercised control under all of these reallocations, they did so by adopting more competitive, market based policies.  Some will no doubt resist the feds taking a stronger role in the electric industry on historical, ideological, policy, economic, or constitutional grounds. But these arguments are weak given the historical success of the reforms to other plexus industries, the inherent impact of electricity on prosperity, and the increasing impact on interstate commerce.

Regulatory Chaos: While overlapping with the issue of jurisdictional allocation, it is necessary to analyze the regulatory chaos that exists when literally hundreds of governmental organizations have control over different pieces of the electric system.  FERC has done yeoman’s work in trying to work around this chaos with its establishment of Regional Transmission Organizations (RTOs), but there is substantial opinion that these organizations have become bureaucratic, expensive, political, and inefficient (though admittedly more efficient than the status quo ante).  Moreover, having each State make a utility by utility decision on each and every issue is expensive, confusing, and inefficient.  Indeed, even within a single State, policies can be very different depending on which utility service territory one lives in.  Not only is this costly, it has the added disadvantage of making the electric system more fragile than would be the case with more coherent integration.

Technological Innovation: There is a joke told at electric meetings.  If Thomas Edison came back he would recognize the electric industry.  Nothing’s changed!  LOL (not).  But it’s true that both digital technology and just plain old innovation have created a situation where there is a possibility that many new technologies exist that might (?) improve the operation of the electric system.  The reason I say “might” is that I take Hayek’s admonition about Fatal Conceit seriously.  I don’t pretend to know what new technologies make sense and which don’t.  But I also know that most State regulators know less than I do about which technologies make sense, and they are in the driver’s seat. 

One of the more general critiques of regulation is that it impedes innovation.  But that is true in spades in the electric industry, given the dispersed regulatory authority.    

Aging Infrastructure: The electric system is victim to the same malady as much of America’s infrastructure.  It is simply a fact that no matter which part of the industry you look at there are issues of the need for modernization of infrastructure.  Putting new infrastructure in place to satisfy increasing demand is relatively easy.  Investments will result in more customers and greater revenue.  But that is not what we are talking about.  The investment in replacing aging electric system infrastructure only results in higher costs but not necessarily increased demand for service.  Thus commissions and consumers can be quite beggarly when they are asked to support higher rates needed for modernization. Compounding this issue is the disaggregated nature of decisions to require renewable energy in the grid.  Arguably, dollars are being wasted on this endeavor that would do more good in promoting modernization.

Reliability: Some of the aforementioned drivers already hint at the issue of reliability.  Reliable electricity is essential for a prosperous Nation.  While reliability was always an important concern, digitalization makes reliability more imperative.  Food will not spoil if electricity goes out for an hour.  But if electricity ceases for even the blink of an eye it can cause damage to some electronic equipment.  Measuring the risk of reliability is difficult.  We have had two major reliability failures in the US, in 1965 and 2003.  Additionally, California suffered debilitating blackouts in 2001-02.  While it is stating the obvious, a catastrophic failure of the US electric system would result in catastrophic property loss and, in some cases, loss of life.  Congress has formalized the issue of electric reliability in legislation; it has not taken steps to put in place a regulatory model that will stand the test of 2050.

Reliability is threatened in ways too numerous and technical to list here, but several can be highlighted.  First, I doubt there is anyone who has not heard there is a “war on coal.”  The Environmental Protection Agency (EPA) has proposed regulations that would make it difficult bordering on impossible to build new coal plants and would also force the closure of some existing plants.  Less familiar is the “war on nuclear.”  A new nuclear plant has not been built in the US since the late 1970s because of Three Mile Island, Chernobyl, and unanticipated cost overruns due to safety regulations.  Just as the US was on track for a small nuclear renaissance, an earthquake and tsunami hit the Fukushima nuclear plants in Japan, causing immediate devastation and long term harm.  Fukushima was undoubtedly a setback for nuclear power in the US.  Environmentalists a few years ago were very positive on natural gas to replace coal and nuclear in the generation mix as a short term strategy to transition to renewables.  But as it became clear in recent years that natural gas would be more plentiful (measured in centuries), environmentalists soured on natural gas and anti-fracking has become part of the extremist mantra.  So where will the base of generation come from to supply our electric needs?  Some believe that renewables and policies encouraging less need for electricity will fill the gap.  But this is a pipedream (pun intended).

This leads to the second point about reliability.  Renewables are an intermittent source of electricity.  If the sun don’t shine and the wind don’t blow, you don’t have reliable renewable energy.  The technology for storage of electricity (to smooth out intermittency) is still not there, although it has been “10 years away” for the last 30 years.

Terrorism: Few targets would cause as much disruption as would a major terrorist attack on the electric system.  And yet few targets are as exposed as the electric system.  By definition, the electric system must be spread out all across the Nation and has been called the largest machine in the history of the world. 

Indeed, one has to wonder why there has not been a comprehensive attack on the US electric system.  Recently, there was an event that put a scare into those that worry about this type of attack.  In 2013, there was a small arms attack on an electric switching station in San Jose, California.  As of the writing of this Commentary, the FBI has not made any arrests. 

Utilities and State and Federal authorities are working behind the scenes to prepare for such an attack and, as with many terrorist threats, we may never know what attacks were prevented by these actions.  Nonetheless, two points must be made.  First, protecting the electric system from attack is no doubt expensive.  Unfortunately, these efforts must compete with many of the other priorities being placed (many indeed misplaced) on the electric system.  Second, our protections have to be successful 100% of the time, while the terrorist only has to be successful once.  The magnitude of the potential harm to the Nation is unimaginable, yet must be understood and dealt with within an atrophied regulatory framework.

Electromagnetic Disruption: Saving the best for last, the end is near!!  It has become widely recognized in the esoteric world of electricity, and has started to spill over into popular culture, that an electromagnetic pulse could bring down all or part of the electric system.  Literally, we are talking end-of-the-world type catastrophe, with millions dying in months.  Such shocks can arise in three ways: a nuclear bomb detonated above the earth, a solar geomagnetic disturbance, or a ground-based weapon.  There is currently an active debate in the US Congress about how to deal with this vulnerability, and FERC has begun to issue rules for electric utilities on developing contingency plans for solar threats.  As with all the other issues, this will cost money to address, and there are competing priorities for dollars to be spent on the electric system.

Conclusions and Recommendation

Scared yet?  I hope so because I am. 

I usually rail against many left wing arguments as “alarmism.”  I might be accused of alarmism in this Commentary.  The difference between “alarmism” and “alarm” is evidence and sound analysis.  My goal in this Commentary was to convince you that there is a sleeping giant of a threat to economic prosperity.  I honestly cannot think of a major failure of any other system in our economy that would wreak as much havoc, or cause as much harm to prosperity, as a major failure of the electric system.

Significantly, you are probably thinking of a “catastrophic” failure of the system as a unique, obvious, one-time event.  That might happen: a 9-11 type event.  But it will probably be more insidious than that.  Think of it like a deteriorating highway.  There is no major, obvious failure.  But every day there is a bit of damage to cars and trucks.  There is less efficient travel.  There is more political tension caused by consumer complaints and the need for more resources.  Some people, likely people with options and money, move away to avoid poor public services.  The tax base erodes and now everything is more difficult.  It is more like a cancer than a fractured skull.  That is an equally frightening scenario for electricity.  There is no singular measure of where our collective national electric system is on a spectrum from third-world to best in class.  Everyone will have a different opinion on how significant the threat of widespread interruption is.  I myself am not sure what the particular scenario is that will cause us to wake up and say “why didn’t we see this coming?”

So, what to do?

There are two dimensions to fixing the electric system: a policy and a plan to implement the policy. 

The policy is actually pretty simple to identify but very difficult to implement. 

First, we need to embrace a policy of reliance on market competition for ALL services that are capable of sustaining competition.  This was done in other plexus industries and it worked out either reasonably well or spectacularly well.  Thus all generation should operate in competitive markets and, for reasons discussed below, should not be owned by either transmission or distribution companies.  But recognize this will affect a lot of economic interests, so there will be wailing and gnashing of teeth.

Second, transmission (not a function easily capable of competition) must be reconceptualized.  Today, hundreds of business entities own electric transmission facilities and some of those facilities are operated by Regional Transmission Organizations.  The reason that RTOs operate the facilities owned by many other business organizations is that there is an inherent conflict of interest when the same business organization owns generation (competitive), transmission (monopoly), distribution (monopoly), and retail services.  We can agree on the principle that a business organization should not be permitted to advantage its potentially competitive operations by abusing its monopoly power over certain facilities by linking the competitive good to the monopoly good.  In antitrust law this is called a tying arrangement.  An easy way to understand the problem is to envision an umpire in a Little League game whose daughter is the pitcher for one of the teams.  Could you really blame the opposing coach for objecting to his being the umpire, no matter how solid his reputation for honesty?  Similarly, would it surprise you to find out that consumers believed they would get better service from the marketing affiliate of the utility than from an independent marketer?  Attempts to regulate this type of abuse are next to impossible, though FERC and many State commissions have tried.  Such regulation is burdensome, ineffective, and not trusted by potential independent competitors.  I was once hired to testify against a gas pipeline company that had abused its monopoly by advantaging its storage facilities in such a way as to drive an independent storage facility into bankruptcy.  The smoking gun was that the monopolist applied more favorable requirements to its affiliate than it did to independents.  It argued that this was reasonable because it made good business sense: many of the independents were poorly financed, unreliable, and untrustworthy, whereas its affiliate was none of these.

RTOs are not a natural business construct.  They were imposed because of the limitations caused by jurisdictional allocation and existing authorities.  In theory, a single company owning and operating transmission, and only transmission, over large regions, regulated by a Federal authority, would promote more efficiency and have incentives for technological innovation and more efficient decision-making.

Third, distribution companies should be regulated by States under a Federal policy that promotes competition.  Retail services should be competitive and not performed by the distribution company, again avoiding conflict of interest complications.  There should be strong encouragement for massive consolidation of distribution companies, since that would dramatically simplify the regulation of such companies and make it easier for competitive independent marketers to do business across larger regions.

Fourth, all retail customers should be served by a competitive entity that is independent of the regulated entities.  I don’t have a clue what the aggregation model would be.  Would Google or Apple or Walmart end up as the aggregators of many of our electric services?  Maybe it will be Visa or MasterCard?  Maybe it will be tied to a bundle of cable, internet, telecommunications, gas, water, and other in-home services?  The beauty of the competitive market is that we don’t know what will develop by 2050.  The key is to put the right institutions in place with the right incentives and then let the market innovate.  When I was working on reforming natural gas policies, I could scarcely have imagined the role that natural gas and oil would play 30 years later.  I can guarantee you that none of us thought that we would cripple OPEC and create headaches for Russia.  It may sound naïve but it is nonetheless true.  Markets can be magnificent tools for progress if they are not distorted by policies that inhibit price signals to consumers.

So that’s the big picture.  I could write a book on the many tedious and technical economic, legal, regulatory, political changes that need to be made to prepare the US electric system for 2050.  But there are several such books already on the market and few get the attention they deserve. 

So how do we get there?  We need a game changer.

Frankly, I wish I had a bold, exciting, innovative recommendation on accomplishing a radical restructuring of the electric system that would cause you to sit back in your chair and say “WOW!”  But I don’t.  There are hundreds of reports but it has not resulted in anything more than tinkering at the periphery, often for the benefit of special interests.

Maybe one of you reading this will email me such a recommendation.  But for now, all I can come up with is the recommendation that we take this issue more seriously than we have to date.  While we don’t need another Congressional hearing, or DOE report, or industry association sponsored strategy, we do need to develop a compelling plan and to build the consensus to execute it, led by people who have the gravitas to make it a game changer.  Unless I missed it, such an effort is not underway and nowhere in sight.


[1] It is a bit different when it comes to the environment.  The feds have taken a stronger role in environment than in industrial organization issues.

[2] I struggle to find the right words to describe the whole of the problem.  If I said “electric industry” some would construe that to mean electric utilities, clearly too limited a concept.  If I use the term “electric policy” it might be perceived as being limited to the world of “wonks.”  So I use the term “electric system” to include the widest possible look at the challenge of delivering reasonably priced, reliable, environmentally responsible electricity.

 

Tags:

Financial Mechanism of the Convention

The United Nations Framework Convention on Climate Change (UNFCCC) established the Green Climate Fund (GCF) at COP 16 in 2010, as an operating entity of the Financial Mechanism of the Convention. The original intent was for the developed countries to provide a fund of $100 billion for use by the developing nations in climate adaptation and remediation, of which only about $10 billion has actually been pledged.

COP 21 set a new “collective quantified goal” of $100 billion per year for GCF funding, beginning in 2020. However, the Group of 77 plus China argued that this base funding level must be substantially increased if those nations are both to contribute to the goals of the COP 21 Agreement and to meet their adaptation and remediation needs.

It is interesting that the number 1 and number 3 GHG emitters are members of this group; and, that neither of these countries has submitted an Intended Nationally Determined Contribution (INDC) document which makes any commitment to emissions reductions.

The UNFCCC COP 21 Agreement expresses some degree of urgency with regard to this funding, despite the fact that there are no demonstrated adverse impacts of anthropogenic global warming (AGW); and, significant evidence of positive climate impacts on crop production and general vegetation growth.

The Board of the Green Climate Fund has determined that the pledged funds should be allocated approximately 50% to adaptation and 50% to remediation over time; and, that at least 50% of the adaptation funds should be devoted to meeting the needs of the most vulnerable nations.

There is currently no formula which determines the contributions of individual nations to the Green Climate Fund. Currently, relatively few nations have announced commitments.

Tags: