
In the Wake of the News

More Anomalous Anomalies

The three primary producers of global near-surface temperature anomalies, NASA GISS (Goddard Institute for Space Studies), NOAA NCEI (National Centers for Environmental Information) and HadCRUT, all begin the process with access to the same data. They then select a subset of the data, “adjust” the data to what they believe the data should have been; and, calculate the current “adjusted” anomalies from the previous “adjusted” anomalies. NASA GISS alone “infills” temperature estimates where no data exist. Each producer then independently prepares its global temperature anomaly product.

The calculated anomaly in any given month relates directly to a global average temperature for that month. The difference between the calculated anomalies in any pair of months is thus the same as the difference between the calculated global average temperatures for those months. However, the differences reported by the three primary producers of global average temperature anomaly products from month to month, or year to year, are rarely the same; and, the changes are not always even in the same direction, warming or cooling.

The global average temperature anomaly differences reported by the three primary producers of global near-surface temperature anomalies for the months of November and December 2016 are an interesting case in point. NASA GISS reported a decrease of 0.12°C for the period. NOAA NCEI reported an increase of 0.04°C. HadCRUT reported an increase of 0.07°C. Each producer estimates a confidence range of ±0.10°C for its reported anomalies. (NASA GISS: -0.22°C / -0.12°C / -0.02°C; NOAA NCEI: -0.06°C / +0.04°C / +0.14°C; HadCRUT: -0.03°C / +0.07°C / +0.17°C) Therefore, the confidence ranges overlap, suggesting that the differences among the anomaly estimates are not statistically significant. However, it is clear that the global average near-surface temperature did not both increase and decrease from November to December.
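
For readers who want to check the overlap claim, here is a short Python sketch using only the values quoted above (the ±0.10°C half-width is the producers' stated confidence range, not an independent estimate):

```python
# Reported November-to-December 2016 anomaly changes (°C), as quoted above,
# each with a claimed confidence range of ±0.10 °C.
changes = {
    "NASA GISS": -0.12,
    "NOAA NCEI": +0.04,
    "HadCRUT":   +0.07,
}
CONF = 0.10  # ± half-width of each producer's stated confidence range

intervals = {name: (c - CONF, c + CONF) for name, c in changes.items()}

def overlaps(a, b):
    """Two closed intervals overlap if each starts before the other ends."""
    return a[0] <= b[1] and b[0] <= a[1]

names = list(intervals)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        a, b = names[i], names[j]
        print(a, "vs", b, "-> overlap:", overlaps(intervals[a], intervals[b]))
```

All three pairwise comparisons report an overlap, which is the basis for saying the differences are not statistically significant.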

The second decimal place in each of these reported anomalies is not the result of temperature measurement accuracy, but rather of numerical averaging of less accurate “adjusted” temperature estimates resulting from data “adjustment”. Statistical theory (the behavior of the standard error of the mean, which underlies the Law of Large Numbers) permits a calculated average to be expressed to greater precision than the numbers being averaged, but only if the errors in the individual numbers are random. However, the nature of the factors which cause individual temperature measurements to be inaccurate, and thus require “adjustment”, suggests that the resulting errors are not random. Certainly, the “adjustments” made to the data are not random. Therefore, it is highly unlikely that reporting global average near-surface temperatures to greater precision than the underlying “adjusted” temperature data is appropriate.
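
The distinction drawn above between random and systematic error can be illustrated with a toy simulation. The numbers here are purely hypothetical (a made-up true temperature and a made-up shared bias), not actual station data:

```python
import random

random.seed(42)
TRUE_TEMP = 15.0   # hypothetical true temperature, °C
N = 10_000         # number of hypothetical readings

# Case 1: purely random errors, ±0.5 °C. Averaging cancels them, so the
# mean can legitimately be quoted to finer precision than any one reading.
random_readings = [TRUE_TEMP + random.uniform(-0.5, 0.5) for _ in range(N)]
random_mean_error = abs(sum(random_readings) / N - TRUE_TEMP)

# Case 2: the same random errors plus a shared systematic bias of +0.3 °C
# (standing in for a common siting or adjustment effect). Averaging does
# NOT remove the bias, no matter how many readings are averaged.
biased_readings = [r + 0.3 for r in random_readings]
biased_mean_error = abs(sum(biased_readings) / N - TRUE_TEMP)

print(f"mean error, random errors only: {random_mean_error:.4f} °C")
print(f"mean error, with shared bias:   {biased_mean_error:.4f} °C")
```

With 10,000 readings the random-error case averages out to a few thousandths of a degree, while the biased case stays near 0.3°C: more averaging buys nothing against a non-random error.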

Tags: Temperature Record, Global Temperature

Hottest Year Time Again

The near-surface global temperature anomaly data for 2016 have been collected, selected, infilled, “adjusted” and analyzed. The results are in; and, again, they are anomalous. NASA GISS (Goddard Institute for Space Studies) and NOAA NCEI (National Centers for Environmental Information) both report the average anomaly for 2016 as 0.99°C. This represents an increase of 0.13°C for the NASA GISS anomaly, compared with 2015; but, an increase of 0.09°C for the NOAA NCEI anomaly, compared with 2015. Both NASA GISS and NOAA NCEI place the confidence limits on their reported anomaly at ±0.10°C, or approximately the same magnitude as the reported year-to-year global average anomaly change. Both agencies estimate that the influence of the 2015/2016 El Nino contributed 0.12°C to the increase in the reported anomaly for 2016: 0.01°C less than the global average anomaly increase reported by NASA GISS and 0.03°C more than the increase reported by NOAA NCEI. That is, essentially all of the 2016 global average temperature anomaly increase reported by both agencies was the result of the influence of the 2015/2016 El Nino, which was very similar in magnitude to the 1997/1998 El Nino; these are the two strongest El Ninos recorded in the instrumental temperature record. HadCRUT reported an average anomaly of 0.774°C, an increase of 0.14°C from the 2015 average anomaly, with similar estimated confidence limits and a similar estimated El Nino contribution.

All of the near-surface temperature anomaly products reported dramatic drops in their anomalies beginning in the spring of 2016, though these drops were from record high monthly peaks driven by the El Nino. The NASA GISS anomaly dropped from a high of 1.36°C to a December 2016 level of 0.81°C, a change of -0.55°C. The NOAA NCEI anomaly dropped from a high of 1.22°C to 0.79°C, a change of -0.43°C. The HadCRUT4 anomaly dropped from a high of 1.08°C to 0.59°C, a change of -0.49°C. This is a variance of 0.12°C between the near-surface temperature anomaly products: approximately equal to the magnitude of the reported 2016 anomaly increases and the estimated impact of the 2015/2016 El Nino, and half the confidence range claimed for the reported anomalies.

University of Alabama in Huntsville (UAH) and Remote Sensing Systems (RSS) both reported that 2016 was 0.02°C warmer than 1998, which both sources still report as the previous warmest year in the satellite temperature record. Dr. Roy Spencer of UAH stated that the increase in the reported temperature anomaly between 1998 and 2016 would have had to be ~0.10°C to be statistically significant. The UAH anomaly dropped from a 2016 high of 0.83°C to a December 2016 level of 0.24°C, a change of -0.59°C. The RSS anomaly dropped from a 2016 high of 1.0°C to a December level of 0.23°C, a change of -0.77°C. Both the UAH and RSS anomalies show the dramatic impact of the 2015/2016 El Nino. Both anomalies suggest that the “Pause” has returned, since they show no statistically significant warming since at least 1998.

The question now is whether there will be a La Nina in 2017; and, if so, the extent to which it will further reduce the post El Nino anomalies.

Tags: Warmest, Global Temperature, Temperature Record

Highlighted Article: Climate Models for the Layman

Here is an excellent paper on climate models by Dr. Judith Curry and The Global Warming Policy Foundation (GWPF).

Climate Models for the Layman

"This report attempts to describe the debate surrounding GCMs to an educated but nontechnical audience."

Tags: Highlighted Article

Opening the Kimono

Steve Goreham, the author of the book Climatism! Science, Common Sense, and the 21st Century's Hottest Topic, coined the term Climatism, which he defines as “the belief that man-made greenhouse gas emissions are destroying the Earth's climate”. Arguably, the definition should include the assertion “that man-made greenhouse gas emissions are destroying the Earth's climate”, even absent belief in the assertion.

Ari Halperin manages the blog Climatism. (https://climatism.wordpress.com/) Early in 2016, Halperin authored a guest essay on Watts Up With That (http://wattsupwiththat.com) entitled Who unleashed Climatism (https://wattsupwiththat.com/2016/01/17/who-unleashed-climatism/) in which he discusses the origins of climate alarmism at length. He concludes that: “Climatism is a foreign assault on America. The aggressor is not another nation-state, but an alliance of UN agencies and environmental NGOs.”

Leo Goldstein (http://defyccc.com/) has recently posted two guest essays on Watts Up With That: https://wattsupwiththat.com/2016/12/23/the-command-control-center-of-climate-alarmism/; and, https://wattsupwiththat.com/2017/01/05/is-climate-alarmism-governance-at-war-with-the-usa/. Goldstein has coined two new terms in these essays: Climate Alarmism Governance (CAG); and, Climintern (Climatist International). Goldstein traces the history of CAG from the founding of the Climate Action Network (CAN) in 1989. CAN now has 1100+ members globally. CAN, the UN and numerous foundations provide CAG; and, are referred to collectively as the Climintern.

The Climintern refers to the analyses provided by Halperin and Goldstein as conspiracy theories. Others might refer to them as conspiracy exposés, or histories. The documentation they provide in these essays clearly establishes the nature and scope of CAG and the influence of the Climintern. Their analyses are well worth reading. They provide a historical record of how climatism has proceeded from its earliest days to today; and, how it works to influence global and national climate policy through the UNFCCC and the UN IPCC.

“If it looks like a duck, swims like a duck, and quacks like a duck, then it probably is a duck.” (HT: James Whitcomb Riley)

Tags: Climate Skeptics

Another One Bites the Dust – Judith Curry Resigns

Dr. Judith Curry has resigned her position at Georgia Tech, in frustration over “the politics and propaganda that beset climate science”. Dr. Curry explained,

“the deeper reasons have to do with my growing disenchantment with universities, the academic field of climate science and scientists… I no longer know what to say to students and postdocs regarding how to navigate the CRAZINESS in the field of climate science. Research and other professional activities are professionally rewarded only if they are channeled in certain directions approved by a politicized academic establishment — funding, ease of getting your papers published, getting hired in prestigious positions, appointments to prestigious committees and boards, professional recognition, etc.”
“How young scientists are to navigate all this is beyond me, and it often becomes a battle of scientific integrity versus career suicide.” (https://judithcurry.com/2017/01/03/jc-in-transition/)

Dr. Curry’s concerns regarding university climate science education do not bode well for the future of climate science over the next 30-40 years.

Dr. Curry follows Dr. Roger Pielke, Jr., who did not resign his faculty position at the University of Colorado, but has redirected his research efforts away from climate science. (http://www.wsj.com/articles/my-unhappy-life-as-a-climate-heretic-1480723518)

Other climate scientists, including Dr. David Legates (formerly Delaware State Climatologist) and Dr. Noelle Metting (formerly US DOE) were terminated for failing to adhere to the political climate change narrative. (http://www.delawareonline.com/story/news/local/2015/02/26/university-delaware-professor-caught-climate-changecontroversy/24047281/)

(http://freebeacon.com/politics/congress-obama-admin-fired-top-scientist-advance-climate-change-plans/)

Dr. Wei-Hock Soon and Dr. Sallie Baliunas, both of the Harvard-Smithsonian Center for Astrophysics have been under attack since 2003 for their work on the solar contribution to climate change. (Dr. Baliunas has since retired.)

The Climategate e-mails released in 2009 and 2010 exposed efforts on the part of members of the consensed climate science community to destroy the careers of other climate scientists, including Dr. Chris de Freitas, Dr. Christopher Landsea and Dr. Patrick Michaels. Fortunately, these efforts were unsuccessful.

The life of a non-consensed climate scientist is hardly a bed of roses.

Tags: Climate Consensus

Trump’s Corruption Mandate

Donald Trump’s astonishing election victory was in part a backlash against increasingly corrupt American politics.

Transparency International publishes an annual Corruption Perceptions Index, ranking all nations from most to least clean in their political conduct. The United States entered the twenty-first century by falling out of the top ten. Scandinavian nations such as Finland, Denmark, and Sweden along with Commonwealth nations such as New Zealand, Canada, and the United Kingdom dominated the top spots, while the USA was ranked fourteenth.

Since then the USA has declined further in the Index's rankings.

Both corruption and the perception of corruption increased during the tenures of Bill and Hillary Clinton, George W. Bush, and Barack Obama. Examples included the use of the IRS to bully political enemies, government bailout funds going to politically-connected crony businesses, the use of high office to enrich one’s private foundation, and presidents and their appointees to regulatory bodies using their discretionary power indiscriminately.

Given this, one understands the joke that the wildly popular television show House of Cards is really a documentary.

Yet there is a danger in that joke. While corruption occurs in all governments, there is a huge difference between cultures in which corruption is normalized and implicitly tolerated and those in which corruption is condemned, vigilantly monitored, and forced to go underground.

So Trump’s astonishing election victory may be a healthy reaction against the increasing corruption -- and therefore even more astonishing because his own character seems imbued with significant elements of personal and business-political corruption -- and because the combination of his presidency with his personal financial holdings is fraught with conflicts of interest (as this Wall Street Journal graphic shows). Yet since conflict-of-interest rules apply differently to the president and vice-president, according to 18 U.S. Code § 208, it is unclear how many conflicts will actually be avoided.

Of course some Trump supporters argue that it takes a beast to fight a beast, but what we really need is a political culture that does not lend itself to amoral animal metaphors.

Political leadership is a human endeavor, and effective human leadership in the free and open democratic republic we aspire to be requires both integrity and the widespread perception of integrity. We are a rich country economically, so we can recover from billions of dollars of loss. But the erosion of character among our leadership is much more expensive. It encourages cynicism among the citizenry. It imposes demoralization and disengagement costs upon them. It discourages the morally principled from seeking political office. And it attracts the even-more-corrupt to the corridors of power. No democratic republic can survive that downward cycle for long.

So while I did not vote for Trump, I am encouraged that his administration is following up on at least one of his campaign promises: for example, a five-year ban on lobbying for all transition and administration officials. We can debate the morality and likely effectiveness of that particular anti-corruption policy, but as a post-election statement of intent its seriousness is evident and positive.

Nations always have a choice. A century ago Argentina was among the top ten most prosperous and clean nations in the world, but it has declined precipitously and is now relatively much poorer and ranked in the bottom half of nations for bribery and related corruptions. South Africa was only moderately corrupt a generation ago and has also declined sadly.

Yet some countries have cleaned up their corruption impressively. Botswana improved dramatically in one generation, as did Chile -- both overcoming the widespread stereotype of irredeemable business-as-usual-corruption in African and Latin American politics.

A banana-republic destiny is avoidable for the USA. President Trump’s character -- with its odd mix of obviousness and unpredictability -- will be decisive, as will the vigilance of the rest of us and our commitment to putting the animals back in their cages and cleaning up their messes.

Tags: Corruption, Politics, Donald Trump

“Mann Overboard”

Much has been written this year about the 2015/2016 El Nino and about the apparent record global temperature anomalies. Professor Michael Mann of Penn State University was quick to provide his opinion that the El Nino contributed only about 0.1°C, or about 15%, to the 2015/2016 global average temperature anomaly increase. Others provided estimates ranging from 0.07°C to 0.2°C. The balance of the temperature anomaly increases was attributed to the continuing increase in atmospheric CO2 concentrations as the result of human fossil fuel combustion.

However, the 2015/2016 El Nino is now over; and, global temperature anomalies have dropped sharply: by approximately 0.4°C overall; and, by approximately 1.0°C over land only. The sea surface temperature anomalies are expected to decrease further, although more slowly, especially if a significant La Nina develops in 2017. The equatorial Pacific is in a weak La Nina condition at present, but La Nina conditions appear to be weakening.

Regardless, Mann and others who minimized the potential contribution of the 2015/2016 El Nino to the rapid global temperature anomaly increases in those years are now faced with explaining the large, rapid decreases in the global average anomalies following the end of the El Nino. It would be difficult enough to explain rapid anomaly increases in association with slow increases in atmospheric CO2 concentrations; but, even more difficult to explain rapid anomaly decreases in association with slow increases in atmospheric CO2 concentrations.

Tags: Temperature Record, Global Temperature

Political protest in a "post-fact era"

“Everyone is entitled to his own opinion, but not to his own facts” (Senator Daniel Patrick Moynihan)

 

A protester was shot at the University of Washington during a clash between rival factions — one faction physically blocking an audience from hearing a speech, the other faction seeking to hear a rabble-rousing orator.

The orator was Milo Yiannopoulos, a leading spokesman for the alt-right movement, a revitalized and muscularized version of nationalist and populist politics long submerged in American politics.

Outside the auditorium, blocs of red-wearing Trump supporters and black-wearing anarchists and others faced each other (unconsciously updating Stendhal's novel The Red and the Black). The man who was shot was apparently a peacemaker, placing himself in the middle of the verbally-abusing and pushing-and-shoving factions.

The victim's positioning was unfortunate, as there is little "middle" left in our polarized political times.

And it is symbolic that the shooting took place at a university, because it was precisely at universities where the battle for civility has been lost.

A generation ago in universities we had vigorous debates about truth, justice, freedom, and equality. The governing premise was that through argument rational people could fine-tune their grasp of the facts and test the logic of their theories. The process would often be contentious. Yet with professors and students committed to a baseline civility, it would be cognitively progressive.

But the leading professors of the new era — Michel Foucault, Jacques Derrida, and Richard Rorty among them -- undercut that entire process. Facts, they argued, are merely subjective constructs and masks for hidden power agendas. Over the next generation the words "truth," "justice," "freedom," and "equality" began to appear exclusively in ironic scare quotes.

"Everything," declared post-modern professor Fredric Jameson, "is political." And absent facts, argued post-modernist Frank Lentricchia, the professor's task is transformed from truth-seeker to political activist: in the classroom he should "exercise power for the purpose of social change."

We live in the resulting postmodern intellectual culture, with an entire generation (mis-)educated to see politics not as a cooperative quest to solve economic problems and protect human rights — but as a ceaseless clash of adversarial groups each committed to its own subjectivist values. Feminist groups versus racial groups versus wealth groups versus ethnic groups versus sexuality groups versus an open-ended number of increasingly hostile and Balkanized subdivisions.

Thus we have a generation populated with biologically mature people who lack the psychological maturity to handle debate and occasional political loss — at the same time convinced of the absolute subjective necessity of asserting their goals in a hostile, victimizing social reality.

As reasonable discussion declined in universities, physicalist tactics quickly replaced it. Arguments about principles were replaced with routine ad hominem attacks. Letters of invitation to guest lecturers prompted threats of violence. The heckling of speakers turned to shouting them down. Picketing protests became intentional obstruction.

And now we get the inevitable backlash as other, rival factions learn the new rules and steel themselves for engagement.

Yiannopoulos himself is a product of post-modern culture, as it was he who exultingly coined the phrase "post-fact era" to describe how politics now works. He is proving himself to be an effective player of that brand of political activism.

Yet the governing ethic of our political culture is not a lost cause, as large swathes of the American populace are still committed to the core democratic-republican civic virtues of intellectually honest debate, free speech, tolerance — and of being both a good loser and a good winner. A fractious election brought out many of the worst among us. But journalistic headlines aside, our choice is not only between the tactics of post-modern political correctness and those of alt-right populism. Our leading intellectuals, especially those within universities who are nurturing the next generation of leaders, must also teach the genuinely liberal-education alternative.

Tags: Free Speech, College Students, Protests

“Just the facts, ma’am.”

“Politics is a battle of ideas; in the course of a healthy debate, we’ll prioritize different goals, and the different means of reaching them. But without some common baseline of facts; without a willingness to admit new information, and concede that your opponent is making a fair point, and that science and reason matter, we’ll keep talking past each other, making common ground and compromise impossible.”

– President Obama (Jan. 10, 2017)

Simple Definition of fact

  • : something that truly exists or happens : something that has actual existence
  • : a true piece of information

Source: Merriam-Webster's Learner's Dictionary

 

“Just the facts, ma’am.” #1

Some simple words are often used imprecisely. In discussions related to climate science, the simple word “fact” is a case in point. For example, a temperature measurement taken by an observer from a particular instrument, in a particular enclosure, at a particular location and at a particular time is frequently referred to as a “fact”. However, it is only a “fact”, as defined above, under those particular circumstances. It is not necessarily, and not even likely, “a true piece of information” in the broader sense, since it is affected by those circumstances.

Temperature measurements taken “near-surface” are “selected” for inclusion in the temperature record; and, are then “adjusted” to account for the particular instrument, enclosure, location and time of observation. These “adjusted” measurements are not “something that truly exists or happens”, but rather an estimate of something that might “truly exist or happen”.

An “ideal” near-surface temperature measurement site is identified as follows: Climate Reference Network (CRN) Site Information Handbook

Class 1 – Flat and horizontal ground surrounded by a clear surface with a slope below 1/3 (<19°). Grass/low vegetation ground cover <10 centimeters high. Sensors located at least 100 meters from artificial heating or reflecting surfaces, such as buildings, concrete surfaces, and parking lots. Far from large bodies of water, except if it is representative of the area, and then located at least 100 meters away. No shading when the sun elevation >3 degrees.

Such a site is estimated to be able to produce a near-surface temperature measurement with an error of less than 1°C, assuming proper instrument selection and calibration, proper enclosure and timely reading. Such a measurement is a “fact”, subject to those limitations.

Climate science deals with these errors of “fact” regarding near-surface temperature measurements by using temperature anomalies: the differences between temperature measurements taken at a particular site and a baseline (long-term average) for that site. These anomalies are “facts” only if there have been no changes in any of the circumstances which affect the measurements; and, they cease to be facts if the measurements are “adjusted”, rendering them merely estimates.
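
The reason anomalies are used at all is that a constant site bias cancels when differencing. A minimal sketch, with purely illustrative numbers (a made-up site baseline and a made-up fixed bias):

```python
# Hypothetical readings at one site with a constant instrument/siting
# bias of +1.2 °C. The absolute readings are wrong, but as long as the
# bias does not change, it cancels in the anomaly (the difference
# between a reading and the site's baseline average).
BIAS = 1.2
true_baseline = 14.0             # true long-term mean at the site, °C
true_temps = [14.3, 14.8, 13.9]  # true monthly temperatures, °C

measured_baseline = true_baseline + BIAS
measured = [t + BIAS for t in true_temps]

true_anomalies = [t - true_baseline for t in true_temps]
measured_anomalies = [m - measured_baseline for m in measured]

print(true_anomalies)      # anomalies computed from true values
print(measured_anomalies)  # essentially identical: the constant bias cancels
```

If the bias changes between the baseline period and the measurement (a new enclosure, a station move, an “adjustment”), the cancellation fails; this is the circumstance-dependence the paragraph above describes.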


“Just the facts, ma’am.” #2

Above I discussed the limitations on “facts”; and, the difference between facts and estimates related to individual temperature measurements, whether analyzed as discrete temperatures or temperature anomalies.

Once near-surface temperature measurements have been recorded, selected and “adjusted”, the next step in the process is to combine these selected, “adjusted” temperature estimates, expressed as anomalies from previous selected, “adjusted” temperature estimates, into an estimated global average near-surface temperature anomaly. While it might be argued that the errors in the recorded temperature measurements are random, it cannot be argued that the selection of the temperature measurements to be included in the global average calculation, or the “adjustments” made to those measurements, are random. There could be no rational explanation for making random “adjustments” to measurements.

The estimated global average surface temperature anomaly is reported to two decimal place “precision”; and, used to calculate decadal rates of temperature increase to three decimal place “precision”. This level of “precision” is highly questionable, bordering on ridiculous, considering the inaccuracy of the underlying temperature measurements. The underlying temperature measurements are estimated to be in absolute error by an average of more than 2°C in the US, where they have been surveyed and their siting compared to the US CRN1 siting requirements. The expected inaccuracy of the remaining global temperature measuring stations is assumed to be similar, though they have not been surveyed and their siting compared to the US CRN1 siting requirements.

Finally, the estimated “adjusted” global average temperature is reported to two decimal place “precision”. This estimate is reported as a “fact”, though the particular circumstances under which the estimate might have been a “fact” are ignored.


“Just the facts, ma’am.” #3

“Just the facts, ma’am.” (1 & 2) discussed “facts” in the context of individual near-surface temperature measurements and global average temperature anomaly calculations. The final step in the climate change analysis process is the creation of future climate change scenarios using climate models.

There are numerous climate models, none of which have been verified. Therefore, it cannot be said that there is a climate model which is a “fact”, in the sense that it accurately models “something that truly exists or happens”, rather than hypothesizes something which might happen if the model were accurate.

The climate models are run using a range of inputs for climate sensitivity and climate forcings, because there are no specific, verified values for the various sensitivities and forcings. Therefore, not only are the climate models not “facts” (“something that truly exists or happens”), the inputs which feed the models are not “facts” either, in the sense that they are “a true piece of information”. It is not even a “fact” that the actual climate sensitivity or actual climate forcings are within the ranges of the values used as inputs to the models.

Therefore, the modeled scenarios of future climate change (temperature change) are not “facts”, or arguably even based on facts. Rather, they are estimates, based on estimates, of the potential change in current estimates over some future period.

Based on the “facts”, as discussed in these commentaries, there is only a tenuous basis for concern about catastrophic anthropogenic climate change.

That’s “Just the facts, ma’am.” (HT: Sgt. Joe Friday, LAPD)

Tags: Temperature Record, Estimates as Facts, Global Temperature, Adjusted Data

Energy Efficiency as a Climate Change Reform Strategy: Are we just throwing money at the problem?

(For an introduction to this E3 blog series click here)

Put to one side for a moment whether we need to do anything about climate change.  Assume it is real and we need to do “something.”  There are a wide variety of “somethings” that we can do.  Indeed, right now we are in the “throw spaghetti on the wall and see what sticks” phase.

But let’s face it, we have many more problems than climate change (even assuming it is a real problem).  There is, if you will, strong competition for scarce public resources to solve problems.  I would think it hardly controversial to state that we should spend public tax dollars in the most cost effective way possible.  Bjorn Lomborg makes the point that we don’t want to just feel good, we want to DO good!

For example, if there are two competing proposals to reduce a certain amount of greenhouse gases, all other things being equal, the one that does it cheapest should be chosen.  Similarly, if there are two competing proposals that will save lives, we should choose the one that saves lives for lower costs.  What if one proposal is to save a life a century from now and one to save it today?  More difficult, what if one is to buy mosquito netting for developing countries to mitigate malaria and another to slow the increase in global temperature in 50 years?  These all involve difficult trade-offs on how to use scarce resources.

Nearly every discussion of remedies for climate change mentions the enormous opportunity for energy efficiency.  Based on engineering models, rather optimistic claims are made for the potential for investments in energy efficiency to cure a variety of what ails us, often called a win-win-win situation.  We would use less energy.  Our total energy bills would be reduced.  We would emit fewer greenhouse gases.  We would need to build fewer electric power plants.  And best of all, the return on investment would rival Bernie Madoff's and it would be tax free.

To be fair, the US does have an energy efficiency problem.  If energy prices (either gasoline or electricity) are distorted, then by definition we are not using energy efficiently.  This has possible environmental, energy security, and economic implications.

Both liberal and conservative energy analysts agree that the way that electricity is priced in the US leaves lots of room for improvement.  Broadly, we set prices that are too low in the peak period and too high in the off-peak period.  Additionally, many states set electricity rates in a way that gives electric utilities incentives to build new power plants rather than to invest in electric efficiency technologies.

The key disagreement between market oriented analysts and many liberals is the technique that should be adopted to correct these problems.  Market analysts promote the use of competition and market forces and market-based regulatory approaches to achieve better pricing signals for consumers.  Once prices are “efficient,” then let the consumer choose how to make the myriad trade-offs as to how to spend their money.  Many liberals promote a much more command-and-control regulatory approach to rectify the distortions created by bad pricing.  In essence, they don’t believe the consumer will make the “right” choices and thus adopt policies that force correct choices.

In 2015, a dramatic study was released by several professors from the University of Chicago and the University of California at Berkeley.  The study found that the engineering model that is most often used to project the costs and benefits of energy efficiency technologies was seriously flawed. It found that the model's projections seriously overestimated the energy savings that would result from investing in a given technology. This is important because public monies are often used to fund investments in energy efficiency. If the study is correct, many projects that are funded do not meet the standard of being cost beneficial.

The study caused quite a stir in the energy policy community. It threatened to slaughter one of the sacred cows of progressives. But the study is significant because it is the first of its kind to comprehensively compare projected energy savings against actual, after-the-fact savings. If it is correct, it severely undercuts one of the main arguments used to justify significant public investment in energy efficiency.

Some climate skeptics will no doubt tout the study as evidence for a variety of propositions, e.g., wasteful government programs, the unreliability of engineering models, the triumph of good intentions over good policy.  But even for those genuinely concerned with climate change, if the study is correct and it turns out we have a real climate change problem, energy efficiency strategies will fail to address the intended problem.  That means we are not really addressing climate change; we are just throwing scarce public resources at the problem.

Tags: Efficiency Standards

Priorities

One of the principal concerns raised regarding climate change is its potential effects on agriculture. There is continuing discussion that the potential combination of increased temperatures with drought or increased rainfall might result in reduced crop yields or crop failure in some or all of the traditional crop production regions. There is also continuing discussion of the perceived need to reduce meat consumption, so that grazing land could be converted to food production.

There is little discussion of the likelihood that production of these food crops would move to areas which have been too cold or had too short growing seasons in the past. There is also little recognition of the contribution of plant genetics to increased plant tolerance and yield.

However, in the face of all of this concern about food production, at least in the United States, the number one cash crop in ten US states is marijuana, as shown in the attached table. Marijuana is among the top five cash crops in a total of 39 of the 50 states; and, among the top ten cash crops in all but 2 states.

This is not to suggest that marijuana is a major crop in any of these states by volume or weight, or that it is crowding out production of other crops for human or animal consumption, including export. Rather, it is to suggest that a very high value has been placed on a crop which has no food value (even when baked into brownies), in the face of vocal concern about the adequacy of future food production.

Current efforts to legalize the consumption of marijuana for other than medicinal purposes will likely increase the demand for the product, increasing the acreage dedicated to its production, though legalization might also result in corresponding reductions in its commercial value. One has to question whether this agricultural product should have such a high priority.

 

Marijuana Rank as Cash Crop, by State

Alabama 1 Louisiana 6 Ohio 4
Alaska NA Maine 1 Oklahoma 3
Arizona 3 Maryland 5 Oregon 4
Arkansas 4 Massachusetts 2 Pennsylvania 5
California 1 Michigan 5 Rhode Island 1
Colorado 4 Minnesota 6 South Carolina 3
Connecticut 1 Mississippi 3 South Dakota 9
Delaware 3 Missouri 4 Tennessee 1
Florida 2 Montana 4 Texas 6
Georgia 3 Nebraska 9 Utah 2
Hawaii 1 Nevada 2 Vermont 2
Idaho 5 New Hampshire 2 Virginia 1
Illinois 4 New Jersey 7 Washington 5
Indiana 3 New Mexico 2 West Virginia 1
Iowa 4 New York 2 Wisconsin 6
Kansas 6 North Carolina 5 Wyoming 8
Kentucky 1 North Dakota ?    

 

Source: NORML (USDA data)

Tags: Agriculture

Energy Policy and the Presidential Election

(For an introduction to this E3 blog series click here)

Professor Richard Muller of the University of California, Berkeley, a PhD in physics, wrote a fascinating book in 2012, Energy for Future Presidents: The Science Behind the Headlines.  He wrote it as if it were a memo to the next president.  Though written for the 2012 presidency, it stands the test of time and is still a good read for the next president, assuming the president also reads my critiques of some of its conclusions.

I just found the book but I wish I had found it sooner.  Several things are fascinating.

First, the book is comprehensive in its discussion of current energy policy issues.  He is remarkably lucid considering the complexity of the subject matter.  To be completely honest, I learned a lot about the underlying physics of energy that will prove helpful in making future E3 policy recommendations.

Second, four years have passed, so we have more information now than Professor Muller had.  But he was remarkably prescient about many energy issues.  For example, fracking for shale oil was in its infancy, while fracking for natural gas was going gangbusters.  He correctly predicted that if fracking for shale oil played out, it would radically change global dynamics related to energy as well as foreign policy.  He even predicted that it could mean the end of OPEC.  I am ready to concede that he is correct on this one.

Third, he predicates much of his analysis of energy on two issues: oil security and climate change.  I call each of these issues a Golden Thread.  If his Threads are correct, his recommendations create an elegant garment.  But if you pull both of the Golden Threads from the garment, it unravels into rags. Future blog postings will discuss these Golden Threads and the attendant recommendations.  Since I disagree with the importance of the Golden Threads, much of my analysis will be critical of an otherwise very insightful book.

Fourth, I highly recommend the book because I am very impressed with Professor Muller’s intellectual integrity.  My career has been in energy policy specifically and domestic economic policy broadly.  I have been at it for almost 4 decades.  I am a lawyer by training but have mostly “practiced” economics and the implementation of free market policies in energy.  I worked in the Reagan and Bush Administrations for nearly 11 years on radically changing policy relating to natural gas.  In hindsight, we succeeded beyond our wildest imaginations, as natural gas transitioned from energy basket case in the 1970s to the sharpest arrow in the energy quiver in the 2000s.  Many predicted at the time of the reforms that this paradigm based on market reliance would result in complete failure. (The late Senator Howard Metzenbaum called the Wellhead Decontrol Act of 1989 “one of the most anti-consumer pieces of legislation we have produced in a long time.”)

Accordingly, I have had my share of arguments and disagreements about energy policy.  It is rare to find someone whose arguments do not correspond with their self-interest.  Self-interest doesn’t necessarily mean you are wrong; it just means your arguments have to be taken with a grain of salt.  I have had only a few discussions with persons with no self-interest, a deep understanding of the issues, and intellectual integrity.  Thus it was a pleasant surprise to find Professor Muller’s book.

He does not share my world view.  First, he is by his own admission a physicist and concedes that he does not have training in the many other disciplines that are required for sound energy policy.  But even I would concede that my recommendations must be consistent with sound physics; thus, his insights must be taken seriously.  Second, he has much more faith in government solutions than I do, though he is thoughtful in his consideration of policy alternatives as compared to conventional liberal wisdom.  Third, as noted above, he believes in the Golden Threads, and I disagree with him on a number of points relating to those Threads.

Despite our disagreements, I have found his ability to decimate some of the Left’s sacred cows absolutely compelling and brutally honest.  Given that he is a member of their tribe and I am not, he presumably has some credibility in his observations that a “hack” like me would lack.  Several examples follow:

  1. Nuclear power is safe.
  2. Energy accidents are overblown.
  3. A radical carbon reduction strategy is unsound.
  4. Electric cars are not a sound solution to any of our energy problems.
  5. Many assumptions made about solar, wind, geothermal, and electric storage are pie in the sky from a physics perspective and should be viewed skeptically.

Thus, his book is an excellent starting point for deeper discussions of energy policy in future blog postings as we enter the cycle of re-envisioning energy policy in a new Administration.

(Truth in advertising: I was Dr. Ben Carson’s energy and environment advisor.)

Tags: Book Review

A Little Perspective

The angst-ridden, consensed climate science community is focused on an increase in global average near-surface temperature of approximately 0.7°C (1.3°F) per century, or a total increase of approximately 0.9°C (1.6°F) since 1880, according to NOAA.

To provide some perspective on the cause of this angst, I have selected Wichita, Kansas, a city located very near the geographic center of the contiguous 48 states of the US. The data source for this analysis is weather.com.

The record high temperature in Wichita is 114°F. The record low is -22°F. That is a difference of 136°F between the record high and low temperatures over the same period that NOAA reports a global average near-surface temperature increase of approximately 1.6°F.

The typical spread between the daily high and low temperatures in Wichita is approximately 20°F throughout the year. Assuming that the transition from the daily low temperature to the daily high temperature occurs over a period of approximately 12 hours, the rate of diurnal temperature change in Wichita is approximately 1.7°F per hour, or approximately the same as the total change in global average near-surface temperature over the 136 years since 1880.

NOAA reports the global average near-surface temperature as approximately 57°F. Wichita monthly average temperatures range from approximately 32°F in January to approximately 80°F in July, a midpoint relatively close to the global average near-surface temperature. That seasonal swing of approximately 48°F over roughly half the year is a local average temperature change of approximately 0.3°F per day, or approximately one fifth of the total reported change in global average near-surface temperature over the 136 years since 1880.

It is also interesting to compare the rates of change of temperature. The approximately 0.3°F daily rate of local average seasonal temperature change in Wichita is approximately 10 thousand times the reported rate of global average near-surface temperature change over the 136 year period since 1880. The approximately 1.7°F per hour rate of change of diurnal temperature in Wichita is approximately 1.2 million times the reported rate of change of global average near-surface temperature over the same period.
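The rate comparisons above can be checked with a quick back-of-envelope calculation, using only the approximate figures quoted in this post. The diurnal ratio comes out near 1.2 million, and the seasonal ratio lands in the high thousands, consistent with the "approximately 10 thousand" figure once rounding is accounted for.

```python
# Back-of-envelope check of the rate comparisons, using the
# approximate figures quoted in the post.

total_global_rise_F = 1.6            # NOAA-reported rise since 1880, deg F
years = 136
hours = years * 365.25 * 24          # hours in 136 years

# Global warming rate, per hour and per day
global_per_hour = total_global_rise_F / hours
global_per_day = global_per_hour * 24

# Wichita diurnal swing: ~20 F over ~12 hours
diurnal_per_hour = 20 / 12

# Wichita seasonal change: ~48 F (32 F in January to 80 F in July)
# spread over roughly half a year
seasonal_per_day = (80 - 32) / (365.25 / 2)

print(f"diurnal ratio:  {diurnal_per_hour / global_per_hour:,.0f}")   # ~1.2 million
print(f"seasonal ratio: {seasonal_per_day / global_per_day:,.0f}")    # ~8,000
```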

Similar analyses in other areas of the globe would produce similar, though not identical, results. Clearly, all life forms on earth experience far more rapid temperature changes on a daily and seasonal basis than the earth has experienced on a global basis over the past 136 years. Also, the global change has manifested predominantly as warmer nights and milder winters, rather than as increased maximum temperatures, thus reducing the stress imposed by the increase.

Tags: Global Temperature, Temperature Record

Introductory Energy Post and the Clean Power Plan

Hi.  My name is Ken Malloy and I am a Senior Scholar with the Mark H.  Berens Family Charitable Foundation, the non-profit organization that publishes this website, TheRightInsight.org.

 

My expertise is in the integration of energy, environmental, and economic (E3 for short) policy.  I hesitate to use the term “energy policy” alone to describe my expertise.  I have found that energy policy issues have come to intersect so significantly with economics and the environment that the term can become too limiting.  I have also found that, too often, experts are organized into silos of one discipline and are only marginally qualified in the other disciplines needed to make sound energy policy decisions with strong environmental and economic implications.  I have worked at the intersection of these three disciplines for three decades, especially in electricity and the radical restructuring of natural gas markets to promote wellhead and retail competition.

 

A good example of this type of confusion relates to oil imports.  From an energy-security policy perspective, many analysts argue that imports of oil from hostile regions are a bad thing, and thus they support various policies to reduce reliance on oil: vehicle efficiency standards, ethanol requirements, petroleum reserves, etc.  An environmentalist might regard using oil as a problem because it depletes a finite resource or causes pollution, and thus support policies that either reduce demand for oil (e.g., vehicle efficiency standards) or establish technology standards to reduce pollution (e.g., the catalytic converter).  Most economists, assuming reasonably competitive global markets, would not be very worried about oil imports or about consuming a “finite” resource, and few economists support technology standards as the most efficient means of dealing with the third-party effects of pollution.  Thus, I have concentrated my efforts on understanding the integration of policy in order to promote a sound and efficient energy industry.

 

In addition to the integrated-analysis issue, I also believe the term “energy policy” can be misleading, because most people immediately think of “fuels” (such as oil, natural gas, or coal) when they hear it.  In my view, the most important “energy” policy issue is the electric system.  Yet, at least at the federal level, the electric system receives decidedly less attention than do fuel issues.  But as you will see in future blog posts, I think the electric system presents more challenges for the future than do the “fuels” industries.

 

So, who am I?  I have been analyzing energy issues since 1978.  I was a law professor who taught several courses defined in large part by the line between economic activity that would ordinarily be limited only by competition and free markets, and the interest of state or federal government in “regulating” or affecting the competitive rules for such activity.  The period between 1978 and 1981 was a surprisingly fertile time for this focus, since the Federal Government was deregulating airlines (1978), railroads (1980), and trucking (1980), while heavily regulating energy (1978).

 

In 1981, I joined the Reagan Administration at the Federal Energy Regulatory Commission and began working on the rules related to price controls of natural gas under the Natural Gas Policy Act of 1978.  I eventually continued that work at the US Department of Energy until 1992, where I picked up issues relating to oil pipeline deregulation.  The radical reforms adopted for natural gas in the 1980s and early 1990s, and the dramatic success of those reforms, inform much of my analysis of E3 policy. After 1992, I also began working on issues related to competition in the electric industry.  For reasons detailed in other sections of TheRightInsight.org, the reform of the electric industry has not been as successful as other network industry reforms.

 

The work I did promoting competition and deregulation of natural gas for 11 years turned out to be very successful.  The nation increasingly relies on natural gas as an abundant energy source that is the cleanest-burning fossil fuel, is plentiful, is reasonably priced, and is responsive to market forces.  Not bad for an energy resource that both the Ford and Carter Administrations had declared in short supply.  Even Richard Muller, a liberal Berkeley professor with a PhD in physics, concluded:

Natural-gas use will grow rapidly, not just in the United States but around the world. This fuel is going to be so important that [the President] might consider launching a nationwide program, called something like The Natural Gas Economy, that recognizes the value of the new gas source and develops a coherent policy and infrastructure to encourage its exploitation.[1]

While I don’t agree with his conclusion regarding the need for a “nationwide program,” I share his sentiment that we have experienced a remarkable transformation in natural gas markets over the last two decades and that natural gas will continue to play an increasingly significant role in the US’s energy future.

 

After leaving DOE in 1996, I worked for an international consulting firm for three years, helping companies understand the transition from previously heavily regulated natural gas and electric markets to policies relying on competition in those markets.  Then, from 1999 to 2006, I ran a think tank on issues of competition in the electric industry.  In 2009, I started another think tank, CRISIS & energy markets!, of which I am the Executive Director, to focus more broadly on E3 policy issues related to that enigmatic line between free markets and government policy.

 

Two issues led to this broader scope for the think tank: the impact of the BP Deepwater Horizon oil spill on the resurgence of the debate about oil policy, and the growing impact of global warming/climate change on E3 policy.  I realized that, increasingly, a “crisis” was too often used to justify ill-advised interventions into energy markets.  (Full disclosure: most recently I was the energy and environment advisor to the presidential campaign of Dr. Ben Carson.  You can find the document I worked on for the campaign here.)

 

TheRightInsight has asked me to write three types of documents.  The first is a comprehensive summary of “energy policy.”  Energy Policy 1.0 has been completed and can be found here.  The goal of Energy Policy 1.0 is to provide a broad, market-oriented view of the current state of E3 policy for an audience that is not expert in energy policy, a Wikipedia on E3 policy but from a free-market perspective.  I will provide two types of updates to this article.  The first type of update will be minor technical corrections or changes as the underlying facts change.  These changes will be highlighted in the document on the website so that you can see the evolution of the document (for example Energy Policy 1.1).  The second type of change will possibly be an Energy Policy 2.0 if at some point in the future it becomes necessary to publish a new edition of the article, as for example might be the case with new legislation or dramatically new policies in President Trump’s Administration.

 

The second type of document is a Commentary.  Commentaries will be six- to ten-page analyses of a single E3 issue, more in-depth and analytical than the more general Energy Policy article.  We anticipate publishing a Commentary about once a month.  So far we have published three Commentaries: oil markets, electricity, and the consensus on climate change.

 

The third type of document is a Blog Post.  The document you are currently reading is the first Blog Post.  Our goal is to publish a Blog Post on an energy issue once a week.  A Blog Post is typically about 600 words, though the nature of this first Blog Post exceeds that length.


-----------------------------------------------------------------------------------------------------------------

That completes my introduction to this effort; now let’s get on to substance.

 

Right now, the most important E3 issue is the Clean Power Plan (CPP) issued by the US Environmental Protection Agency (EPA).  Ed Reid, another Scholar with the Mark H.  Berens Family Charitable Foundation, has written a recent blog posting on the CPP broadly focusing on the impact on coal and the fact that both industry and Congress have requested that the Supreme Court issue a stay of the EPA’s CPP.  (Mr. Reid has also published a lengthy Article on the science of climate change for TheRightInsight.)

 

President Obama announced the final version of the CPP on August 3, 2015.  The Supreme Court issued a stay of the CPP on February 9, 2016, thereby temporarily preventing it from being implemented until the Supreme Court has an opportunity to review the Plan after the courts below had completed their review.

 

This is the setup to possibly the most significant E3 decision in the history of the United States (dramatic music playing in the background).  Both the Democratic and Republican Parties had specific language in their 2016 party platforms on the CPP.  Not surprisingly, the Republican Platform advocates repeal of the CPP, while the Democratic Platform supports the CPP.

 

So in plain English what is the CPP?

There is considerable debate about the impact of carbon emissions on global warming and what should be done about it.  Energy Policy 1.0 has a broad discussion of climate change and E3 Commentary 3 is an expanded discussion of the climate change “consensus.”  This Blog Post is not the place to engage in that debate.  Rather, it merely explains the significance of the CPP’s role in the national climate change debate.

 

Electricity generated from coal and natural gas emits carbon dioxide and about 66% of the Nation’s electricity is generated from coal and natural gas.  Generating electricity with fossil fuels accounts for about 40% of manmade carbon dioxide emissions.  If one accepts that carbon emissions cause some global warming that will be catastrophic at some point in the future, then one strategy for dealing with global warming is to reduce carbon emissions from the generation of electricity.  (Recognize, however, that it is not the only possible strategy.  Even some strong believers in climate change harm recognize the limitations of this strategy.)

 

Congress has not declared a national policy on climate change.  It came pretty close in 2009 with the Waxman-Markey bill, which passed the House but not the Senate (even though the Democrats had enough votes to overcome a possible Republican filibuster).

 

Without a national policy, chaos has reigned in energy policy relating to climate change.  Even if one is an ardent believer in the likelihood that carbon emissions will inevitably have catastrophic consequences, a fair-minded person would have to admit that the current pattern of policy implementation is haphazard, dysfunctional, costly, and ineffective.

 

The CPP, if allowed to go into effect, will require each state to develop a plan for its electric utilities to meet certain carbon emissions targets set by the Environmental Protection Agency.  The CPP gives states some flexibility to meet the carbon emissions target.

 

The CPP permits a combination of three strategies to attain a state’s carbon reduction targets. The first is to improve the efficiency of existing coal plants, thus allowing the same amount of coal to produce more electricity thereby reducing carbon emissions per unit of electricity. The second is to increase the use of natural gas generation, thereby reducing carbon emissions since natural gas emits about half as much carbon dioxide as does coal. The third is to increase the use of renewable energy. In addition, states can also promote increased energy efficiency as a way of using less energy, thus reducing the demand for electric generation. If a state fails to file a plan that meets the target, the rule allows the EPA to develop and implement a federal plan for that state. (For some reason, the EPA does not encourage the use of new nuclear power plants to reduce carbon emissions.  Nuclear power does not emit any carbon.)

 

There are several key points to be made about the CPP:

  1. The CPP requires a 32% reduction in carbon emissions from 2005 levels by 2030.
  2. It is based on the premise that climate change is a serious problem.  While few scientists disagree that carbon emissions have an impact on the greenhouse effect or that the earth’s temperature increased modestly during the 20th Century, there is still considerable debate about whether the continuing impact of carbon emissions will be catastrophic.  For almost two decades, there has not been the significant warming predicted by the climate models, the so-called “pause.”
  3. If climate change will not cause catastrophic consequences, then actions such as the CPP are unnecessary and impose significant costs on the economy. Given that there are scarce resources available to address society’s needs, it would be wasteful to dedicate those resources to a problem that does not exist.
  4. Even if one assumes that climate change is a serious problem, there is a variety of policy strategies being debated.  Broadly, there are three competing strategies for dealing with climate change.
  • The first and most often discussed is a radical reduction in carbon emissions. The theory that supports this strategy is that by reducing carbon emissions we will be able to better control the temperature of the earth.  There are three ways to achieve such reductions: mandates such as the CPP, a permitting program such as cap-and-trade, and a carbon tax.
  • The second strategy is called geo-engineering. This strategy posits that we can develop technologies by mid-century to address warming of the atmosphere caused by carbon emissions, if such warming continues and it becomes clearer that the consequences would be catastrophic. For example, suppose we could develop algae to absorb carbon in the world’s oceans, or an aerosol that could be released into the atmosphere to block radiation and control the temperature of the earth.
  • The third strategy is adaptation. Weather conditions vary dramatically across the globe. Humans adjust to this variation in a wide variety of ways. For example, Amsterdam built a series of canals in the 17th Century to make the land more habitable.  Given that the projected impact of climate change will have both benefits and detriments, it may be more cost-effective to adapt to a changing climate than to try to control the climate.
  • These strategies are not mutually exclusive. Thus, a fourth strategy would be to combine pieces of all three strategies as a way of coping with the potential impact of climate change. 
  5. The CPP adopts a specific strategy of requiring dramatic reductions in carbon emissions.
  6. Few would argue that this strategy will not be expensive and require massive adjustments to the electric utility system.  For example, it is likely that no new coal plants will be built under the CPP unless a technology is developed that allows the sequestration of carbon dioxide emissions. Additionally, many existing coal plants with significant useful life remaining will have to be closed. To make up for the loss of coal, significant actions will have to be taken to enhance energy efficiency and to develop renewable resources. While this may be beneficial, there is no question that it will be expensive and even potentially disruptive to the electric system.
  7. Supporters of the CPP argue that such dramatic measures are required to address the serious consequences of climate change. Opponents of the CPP broadly argue either that climate change is not a serious problem or that the strategy of radical carbon reduction is ill-advised for a variety of reasons.

 

There is no question that the election of Donald Trump and the Supreme Court review of the CPP will have a major impact on the implementation of the CPP.



[1] Richard Muller, Energy for Future Presidents: The Science Behind the Headlines (2012), p. 294 (emphasis in original).

 

Tags: Clean Power Plan

Personal Precautions

One of the idyllic images cherished by many environmentalists concerned about the climate is life “off the grid”, free of utilities, living off the land, minimizing their impact on the planet. However, reality frequently rears its ugly head, blurring the idyllic image.

I recently had the opportunity to spend several days visiting friends who live on a quarter-section inholding (160 acres), completely surrounded by Bureau of Land Management land. They are not connected to the electric grid, to natural gas distribution, or to municipal or private water service. They do have radio-telephone service, and satellite service for internet and television.

They use both dual-axis tracking and fixed solar photovoltaic collectors to provide their electricity; and, they store excess electricity generated during the day in a battery bank to meet their needs when the sun isn’t shining. They also had, but have since removed, a wind turbine, which proved to be both inefficient and problematic. However, as a precaution, they also have a propane-powered generator, equipped with an automatic transfer switch to pick up the load when necessary.

They use solar thermal collectors to produce hot water, both for domestic use and as the heat source for the in-floor hydronic system, which provides the primary source of space heating for their home. However, as a precaution, they have both a propane-fueled instantaneous water heater and a propane-fueled furnace, as well as two wood stoves.

Their home is located in an area which receives relatively little rain and snow, so the availability of water is a prime concern. They collect their water from the roofs of their home and garages; and, store several thousand gallons of water in four large storage tanks. They also use composting toilets to reduce water consumption and avoid sanitary water (black water) disposal issues.

Their vehicles are all gasoline-fueled. Electric vehicles would require installation of additional solar or wind generation capacity; and, far greater useful vehicle range.

This is not to suggest that the idyllic life “off the grid” is not possible, but rather that it requires extensive and careful precautionary planning to assure continuous quality of life; and, technological evolution to “fill in the blanks”.

Tags: Backup Power