
In the Wake of the News

A Little Perspective

The angst-ridden, consensus-bound climate science community is focused on an increase in global average near-surface temperature of approximately 0.7°C (1.3°F) per century, or a total increase of approximately 0.9°C (1.6°F) since 1880, according to NOAA.

To provide some perspective on the cause of this angst, I have selected Wichita, Kansas, a city located very near the geographic center of the contiguous 48 states of the US. The data source for this analysis is

The record high temperature in Wichita is 114°F. The record low is -22°F. That is a difference of 136°F between the record high and low temperatures over the same period that NOAA reports a global average near-surface temperature increase of approximately 1.6°F.

The typical range between daily high and low temperatures in Wichita is approximately 20°F throughout the year. Assuming that the transition from the daily low temperature to the daily high temperature occurs over a period of approximately 12 hours, the rate of diurnal temperature change in Wichita is approximately 1.7°F per hour, or approximately the same as the total change in global average near-surface temperature over the 136 years since 1880.

NOAA reports the global average near-surface temperature as approximately 57°F. Wichita average temperatures range from approximately 32°F in January to approximately 80°F in July, a span that brackets the global average near-surface temperature. That seasonal swing of approximately 48°F over roughly six months is a local average temperature change of approximately 0.3°F per day, or approximately one fifth of the total reported change of global average near-surface temperature over the 136 years since 1880.

It is also interesting to compare the rates of temperature change. The approximately 0.3°F per day rate of local average seasonal temperature change in Wichita is approximately 10 thousand times the reported rate of global average near-surface temperature change over the 136-year period since 1880. The approximately 1.7°F per hour rate of change of diurnal temperature in Wichita is approximately 1.2 million times the reported rate of change of global average near-surface temperature over the same period.
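The rate comparisons above reduce to a few lines of arithmetic. The inputs are the figures quoted in the text, together with the same simplifying assumptions used in the prose (a 12-hour low-to-high transition and a half-year January-to-July swing):

```python
# Back-of-the-envelope check of the temperature-rate comparisons above.
# All inputs are figures quoted in the text; the 12-hour diurnal
# transition and half-year seasonal swing are the text's own assumptions.

GLOBAL_RISE_F = 1.6                  # reported global rise since 1880, °F
DAYS = 136 * 365.25                  # 136 years, in days
HOURS = DAYS * 24

diurnal_rate = 20 / 12               # °F per hour (20 °F swing over ~12 h)
seasonal_rate = (80 - 32) / (365.25 / 2)   # °F per day, January to July

global_rate_per_day = GLOBAL_RISE_F / DAYS
global_rate_per_hour = GLOBAL_RISE_F / HOURS

print(round(diurnal_rate, 1))        # ~1.7 °F per hour
print(round(seasonal_rate, 2))       # ~0.26 °F per day
print(round(seasonal_rate / global_rate_per_day, -3))    # on the order of 10^4
print(round(diurnal_rate / global_rate_per_hour, -5))    # on the order of 10^6
```

Even with somewhat different assumptions about the transition periods, the ratios stay in the same orders of magnitude cited above.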

Similar analyses in other areas of the globe would produce similar, though not identical, results. Clearly, all life forms on earth experience far more rapid temperature changes on a daily and seasonal basis than the earth has experienced on a global basis over the past 136 years. Also, the global change has manifested predominantly as warmer nights and milder winters, rather than as increased maximum temperatures, thus reducing the stress imposed by the increase.

Tags: Global Temperature, Temperature Record

Introductory Energy Post and the Clean Power Plan

Hi. My name is Ken Malloy and I am a Senior Scholar with the Mark H. Berens Family Charitable Foundation, the non-profit organization that publishes this website, TheRightInsight.


My expertise is in the integration of energy, environmental, and economic (E3 for short) policy. I hesitate to use the term “energy policy” alone to describe my expertise. I have found that energy policy issues have come to intersect so significantly with economics and the environment that the term energy policy can become too limiting. I have also found that, too often, experts are organized into silos of one discipline and are only marginally qualified in the other disciplines to make sound energy policy decisions that have strong environmental and economic implications. I have worked at the intersection of these three disciplines for three decades, especially in electricity and in the radical restructuring of natural gas markets to promote wellhead and retail competition.


A good example of this type of confusion relates to oil imports. From an energy/security policy perspective, many analysts argue that imports of oil from hostile regions are a bad thing, and thus they support various policies to reduce reliance on oil: vehicle efficiency standards, ethanol requirements, petroleum reserves, etc. An environmentalist might regard using oil as a problem because it depletes a finite resource or causes pollution, and thus support policies that either reduce demand for oil (e.g., vehicle efficiency standards) or establish technology standards to reduce pollution (e.g., the catalytic converter). Most economists, assuming reasonably competitive global markets, would not be very worried about oil imports or consuming a “finite” resource, and few economists support technology standards as the most efficient means of dealing with the third-party effects of pollution. Thus, I have concentrated my efforts on understanding the integration of policy in order to promote a sound and efficient energy industry.


In addition to the integrated analysis issue, I believe the term energy policy can be misleading in that most people immediately think of “fuels” (such as oil, natural gas, or coal) when they hear “energy policy.” In my view, the most important “energy” policy issue is the electric system. Yet, at least at the federal level, the electric system receives decidedly less attention than do fuel issues. But as you will see in future blog posts, I think the electric system presents more challenges for the future than do the “fuels” industries.


So, who am I? I have been analyzing energy issues since 1978. I was a law professor who taught several courses whose defining characteristic was the line between economic activity that would ordinarily be limited only by competition and free markets, and the interest of state or federal government in “regulating” or affecting the competitive rules governing such activity. The period between 1978 and 1981 was a surprisingly fertile time for this focus, since the Federal Government was deregulating airlines (1978), railroads (1980), and trucking (1980), while heavily regulating energy (1978).


In 1981, I joined the Reagan Administration at the Federal Energy Regulatory Commission and began working on the rules related to price controls of natural gas under the Natural Gas Policy Act of 1978. I eventually continued that work at the US Department of Energy until 1992, where I picked up issues relating to oil pipeline deregulation. The radical reforms adopted for natural gas in the 1980s and early 1990s, and the dramatic success of those reforms, inform much of my analysis of E3 policy. After 1992, I also began working on issues related to electric industry competition. For reasons detailed in other sections of this website, the reform of the electric industry has not been as successful as other network industry reforms.


The work I did promoting competition and deregulation of natural gas for 11 years turned out to be very successful. The nation increasingly relies on natural gas as an abundant energy source that is the cleanest-burning fossil fuel, is reasonably priced, and is responsive to market forces. Not bad for an energy resource that both the Ford and Carter Administrations had declared in short supply. Even a liberal Berkeley professor, Richard Muller, who holds a PhD in physics, concluded:

Natural-gas use will grow rapidly, not just in the United States but around the world. This fuel is going to be so important that [the President] might consider launching a nationwide program, called something like The Natural Gas Economy, that recognizes the value of the new gas source and develops a coherent policy and infrastructure to encourage its exploitation.[1]

While I don’t agree with his conclusion regarding the need for a “nationwide program,” I share his sentiment that we have experienced a remarkable transformation in natural gas markets over the last two decades and that natural gas will continue to play an increasingly significant role in the US’s energy future.


After leaving DOE in 1996, I worked for an international consulting firm for three years, helping companies understand the significance of the transition from previously heavily regulated natural gas and electric markets to policies relying on competition in those markets. Then, from 1999 to 2006, I ran a think tank on issues of competition in the electric industry. In 2009, I started another think tank, CRISIS & energy markets!, of which I am the Executive Director, to focus more broadly on “E3 policy” issues related to that enigmatic line between free markets and government policy.


Two issues led to this broader scope for the think tank: the impact of the BP Deepwater Horizon oil spill on the resurgence of the debate about oil policy, and the growing impact of global warming/climate change on E3 policy. I realized that, increasingly, a “crisis” was too often used to justify ill-advised interventions into energy markets. (Full disclosure: most recently, I was the energy and environment advisor to the presidential campaign of Dr. Ben Carson. You can find the document I worked on for the campaign here.)


TheRightInsight has asked me to write three types of documents. The first is a comprehensive summary of “energy policy.” Energy Policy 1.0 has been completed and can be found here. Its goal is to provide a broad, market-oriented view of the current state of E3 policy for an audience that is not expert in energy policy: a Wikipedia on E3 policy, but from a free-market perspective. I will provide two types of updates to this article. The first will be minor technical corrections or changes as the underlying facts change; these changes will be highlighted in the document on the website so that you can see its evolution (for example, Energy Policy 1.1). The second will possibly be an Energy Policy 2.0, if at some point it becomes necessary to publish a new edition of the article, as might be the case with new legislation or dramatically new policies in President Trump’s Administration.


The second type of document is a Commentary. Commentaries are 6- to 10-page analyses of a single E3 issue, more in-depth and analytical than the more general Energy Policy article. We anticipate publishing a Commentary about once a month. So far we have published three Commentaries: on oil markets, on electricity, and on the consensus on climate change.


The third type of document is a Blog Post. The document you are currently reading is the first Blog Post. Our goal is to publish a Blog Post on an energy issue once a week. A Blog Post is typically about 600 words, though this first one, by its nature, exceeds that length.


That completes my introduction to this effort; now let’s get on to substance.


Right now, the most important E3 issue is the Clean Power Plan (CPP) issued by the US Environmental Protection Agency (EPA). Ed Reid, another Scholar with the Mark H. Berens Family Charitable Foundation, has written a recent blog post on the CPP, broadly focusing on its impact on coal and the fact that both industry and Congress have requested that the Supreme Court issue a stay of the EPA’s CPP. (Mr. Reid has also published a lengthy Article on the science of climate change for TheRightInsight.)


President Obama announced the final version of the CPP on August 3, 2015. The Supreme Court issued a stay of the CPP on February 9, 2016, temporarily preventing it from being implemented until the Supreme Court has an opportunity to review the Plan after the lower courts have completed their review.


This is the setup to possibly the most significant E3 decision in the history of the United States (dramatic music playing in the background).  Both the Democratic and Republican Parties had specific language in their 2016 party platforms on the CPP.  Not surprisingly, the Republican Platform advocates repeal of the CPP, while the Democratic Platform supports the CPP.


So in plain English what is the CPP?

There is considerable debate about the impact of carbon emissions on global warming and what should be done about it.  Energy Policy 1.0 has a broad discussion of climate change and E3 Commentary 3 is an expanded discussion of the climate change “consensus.”  This Blog Post is not the place to engage in that debate.  Rather, it merely explains the significance of the CPP’s role in the national climate change debate.


Electricity generated from coal and natural gas emits carbon dioxide and about 66% of the Nation’s electricity is generated from coal and natural gas.  Generating electricity with fossil fuels accounts for about 40% of manmade carbon dioxide emissions.  If one accepts that carbon emissions cause some global warming that will be catastrophic at some point in the future, then one strategy for dealing with global warming is to reduce carbon emissions from the generation of electricity.  (Recognize, however, that it is not the only possible strategy.  Even some strong believers in climate change harm recognize the limitations of this strategy.)


Congress has not declared a national policy on climate change. It came pretty close in 2009 with the Waxman-Markey bill, which passed the House but not the Senate (even though the Democrats had enough votes to overcome a possible Republican filibuster).


Without a national policy, chaos has reigned in energy policy relating to climate change.  Even if one is an ardent believer in the likelihood that carbon emissions will inevitably have catastrophic consequences, a fair-minded person would have to admit that the current pattern of policy implementation is haphazard, dysfunctional, costly, and ineffective.


The CPP, if allowed to go into effect, will require each state to develop a plan for its electric utilities to meet certain carbon emissions targets set by the Environmental Protection Agency.  The CPP gives states some flexibility to meet the carbon emissions target.


The CPP permits a combination of three strategies to attain a state’s carbon reduction targets. The first is to improve the efficiency of existing coal plants, thus allowing the same amount of coal to produce more electricity thereby reducing carbon emissions per unit of electricity. The second is to increase the use of natural gas generation, thereby reducing carbon emissions since natural gas emits about half as much carbon dioxide as does coal. The third is to increase the use of renewable energy. In addition, states can also promote increased energy efficiency as a way of using less energy, thus reducing the demand for electric generation. If a state fails to file a plan that meets the target, the rule allows the EPA to develop and implement a federal plan for that state. (For some reason, the EPA does not encourage the use of new nuclear power plants to reduce carbon emissions.  Nuclear power does not emit any carbon.)
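The second strategy rests on the roughly two-to-one emissions difference between coal and natural gas generation. A minimal sketch of that arithmetic, using approximate, assumed emission factors (close to commonly published figures, but not taken from the text) and a hypothetical quantity of shifted generation:

```python
# Illustrative fuel-switching arithmetic. The emission factors are
# assumed round numbers (approximately in line with published US
# figures), not values from the text.
COAL_LB_PER_KWH = 2.2          # assumed lb CO2 per kWh from coal
GAS_LB_PER_KWH = 1.0           # assumed lb CO2 per kWh from natural gas

mwh_shifted = 1_000_000        # hypothetical generation moved from coal to gas
kwh = mwh_shifted * 1000

coal_emissions = kwh * COAL_LB_PER_KWH
gas_emissions = kwh * GAS_LB_PER_KWH
reduction_pct = 100 * (coal_emissions - gas_emissions) / coal_emissions

print(round(reduction_pct))    # prints 55: roughly half, as the text notes
```

The exact percentage depends on the plants involved, but any plausible factors give a reduction near the "about half" the text describes.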


There are several key points to be made about the CPP:

  1. The CPP requires a 32% reduction in carbon emissions over 2005 levels by 2030.
  2. It  is based on the premise that climate change is a serious problem.  While few scientists disagree that carbon emissions have an impact on the greenhouse effect or that the earth’s temperature has had a small increase during the 20th Century, there is still considerable debate about whether the continuing impact of carbon emissions will be catastrophic.  For almost two decades, there has not been any significant warming as predicted by the climate models, the so-called “pause.”
  3. If climate change will not cause catastrophic consequences, then actions such as the CPP are unnecessary and impose significant costs on the economy. Given that there are scarce resources available to address society’s needs, it would be wasteful to dedicate those resources to a problem that does not exist.
  4. Even if one assumes that climate change is a serious problem, there are a variety of policy strategies that are being debated as to what to do about climate change.  Broadly, there are three competing strategies for dealing with climate change.
  • The first and most often discussed is a radical reduction in carbon emissions. The theory that supports this strategy is that by reducing carbon emissions we will be able to better control the temperature of the earth.  There are three ways to achieve such reductions: mandates such as the CPP, a permitting program such as cap-and-trade, and a carbon tax.
  • The second strategy is called geo-engineering. This strategy posits that we can develop technologies by mid-century that will address the issue of the warming of the atmosphere caused by carbon emissions if such warming continues and it becomes clearer that the consequences would be catastrophic. For example, suppose that we could develop algae that would absorb carbon in the world’s oceans or that an aerosol could be developed that could be emitted into the atmosphere that would block radiation and control the temperature of the earth.
  • The third strategy is adaptation. Weather conditions vary dramatically across the globe. Humans adjust to this variation in a wide variety of ways. For example, Amsterdam built a series of canals in the 17th Century to make the land more habitable.  Given that the projected impact of climate change will have both benefits and detriments, it may be more cost-effective to adapt to a changing climate than to try to control the climate.
  • These strategies are not mutually exclusive. Thus, a fourth strategy would be to combine pieces of all three strategies as a way of coping with the potential impact of climate change. 
  5. The CPP adopts a specific strategy of requiring dramatic reductions in carbon emissions.
  6. Few would argue that this strategy will not be expensive and require massive adjustments to the electric utility system.  For example, it is likely that no new coal plants will be able to be built under the CPP unless a technology is developed that allows the sequestration of carbon dioxide emissions. Additionally, many existing coal plants that have significant useful life remaining will have to be closed. To make up for the loss of coal, there will have to be significant actions taken to enhance energy efficiency and to develop renewable resources. While this may be beneficial, there is no question that it will be expensive and even potentially disruptive to the electric system.
  7. Supporters of the CPP argue that such dramatic measures are required to address the serious consequences of climate change. Opponents of the CPP broadly argue either that climate change is not a serious problem or that the strategy of radical carbon reduction is ill advised for a variety of reasons.


There is no question that the election of Donald Trump and the Supreme Court review of the CPP will have a major impact on the implementation of the CPP.

[1] Richard Muller, Energy for Future Presidents: The Science Behind the Headlines (2012), p. 294 (emphasis in original).


Tags: Clean Power Plan

Personal Precautions

One of the idyllic images cherished by many environmentalists concerned about the climate is life “off the grid”, free of utilities, living off the land, minimizing their impact on the planet. However, reality frequently rears its ugly head, blurring the idyllic image.

I recently had the opportunity to spend several days visiting with friends who live on a quarter-section inholding (160 acres), completely surrounded by Bureau of Land Management land. They are not connected to the electric grid, to natural gas distribution, or to municipal or private water service. They do have radio-telephone service, and satellite service for internet and television.

They use both dual-axis tracking and fixed solar photovoltaic collectors to provide their electricity; and, they store excess electricity generated during the day in a battery bank to meet their needs when the sun isn’t shining. They also had, but have since removed, a wind turbine, which proved to be both inefficient and problematic. However, as a precaution, they also have a propane-powered generator, equipped with an automatic transfer switch to pick up the load when necessary.
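A system like this implicitly involves a sizing decision: how much battery storage to install before the propane generator must take over. A minimal sketch of that calculation, in which every input is an assumed, hypothetical value rather than anything reported about this particular home:

```python
# Rough battery-bank sizing sketch for an off-grid home like the one
# described above. All inputs are hypothetical assumptions.
DAILY_LOAD_KWH = 10.0      # assumed household consumption per day, kWh
AUTONOMY_DAYS = 2          # assumed days of storage before the generator starts
DEPTH_OF_DISCHARGE = 0.5   # assumed usable fraction of battery capacity
SYSTEM_EFFICIENCY = 0.85   # assumed inverter and wiring efficiency

bank_kwh = DAILY_LOAD_KWH * AUTONOMY_DAYS / (DEPTH_OF_DISCHARGE * SYSTEM_EFFICIENCY)
print(round(bank_kwh, 1))  # required nameplate battery capacity, kWh
```

The point of the sketch is simply that usable storage is well below nameplate capacity, which is part of why a backup generator remains a sensible precaution.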

They use solar thermal collectors to produce hot water, both for domestic use and as the heat source for the in-floor hydronic system, which provides the primary source of space heating for their home. However, as a precaution, they have both a propane-fueled instantaneous water heater and a propane-fueled furnace, as well as two wood stoves.

Their home is located in an area which receives relatively little rain and snow, so the availability of water is a prime concern. They collect their water from the roofs of their home and garages; and, store several thousand gallons of water in four large storage tanks. They also use composting toilets to reduce water consumption and avoid sanitary water (black water) disposal issues.
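Roof catchment sizing follows simple arithmetic: collection area times rainfall times a conversion factor of about 0.623 gallons per square foot per inch of rain. A sketch with hypothetical inputs (the roof area, annual rainfall, and collection efficiency below are assumptions, not figures from the text):

```python
# Rough rainwater-harvest estimate for a dry-climate roof catchment.
# Roof area, rainfall, and efficiency are hypothetical assumptions.
ROOF_SQFT = 2500          # assumed combined roof area, house plus garages
ANNUAL_RAIN_IN = 12       # assumed annual precipitation, inches
GAL_PER_SQFT_IN = 0.623   # gallons per sq ft per inch of rain (144/231)
EFFICIENCY = 0.8          # assumed losses to first-flush diversion, etc.

annual_gallons = ROOF_SQFT * ANNUAL_RAIN_IN * GAL_PER_SQFT_IN * EFFICIENCY
print(round(annual_gallons))   # estimated collectable gallons per year
```

Under these assumptions the yield is on the order of fifteen thousand gallons a year, which is why several thousand gallons of tank storage is needed to bridge dry spells.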

Their vehicles are all gasoline-fueled. Electric vehicles would require installation of additional solar or wind generation capacity; and, far greater useful vehicle range.

This is not to suggest that the idyllic life “off the grid” is not possible, but rather that it requires extensive and careful precautionary planning to assure continuous quality of life; and, technological evolution to “fill in the blanks”.

Tags: Backup Power

Common Precaution

Many climate activists claim that the Precautionary Principle requires that humanity cease using fossil fuels to avoid the potential of catastrophic anthropogenic climate change. That is a relatively absolutist interpretation of the Precautionary Principle, particularly in light of the uncertainties surrounding the purported adverse effects of increased atmospheric CO2 concentrations.

Society, in general, applies the Precautionary Principle quite differently. Humanity does not cease flying, or taking trains, or driving vehicles, or walking because of the potential of injury or death in an accident. Planes, trains and vehicles are inspected to minimize the likelihood of failure leading to an accident. People are careful how and where they walk to minimize the possibility of an accident.

People do not decide against building homes or apartments near the ocean because storms might damage their property. Rather, they design and build their homes to minimize the potential for damage from wind or storm surge. Insurers, likewise, do not refuse to insure these dwellings; rather, they charge higher rates to reflect the higher risk.

People do not refuse to live in areas where tornadoes are a possibility, but they do install tornado shelters to protect themselves from possible harm. Likewise, people do not refuse to live in areas subject to earthquakes, but they do design and build their homes to minimize potential damage. Insurers, again, do not refuse to insure buildings in such areas, but do charge higher rates to insure these properties because of the higher risk.

People even participate voluntarily in risky activities, such as skydiving, bungee jumping and various other activities and sports, though they attempt to assure that the risks of their participation are minimized.


“I’m from the government … and, I’m here to help you.”

“The whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary.”

H. L. Mencken


We hear much about the Precautionary Principle as believers in Catastrophic Anthropogenic Global Warming (CAGW) would have it applied to avoid the possibility of catastrophic climate change. However, we hear little about it when applied to everyday issues, such as drought; and, particularly when government has failed to apply it to predictable problems, such as periodic drought events in desert regions.

The State of California is a prime example of the failure of government to apply the Precautionary Principle in the case of the current drought. The population of California has approximately doubled since the most recent water supply dam was commissioned in the state. Caution would have suggested the need to increase water impoundment to meet the needs of this growing population while continuing to provide the water required for agricultural irrigation in the state.

While the hobgoblin of drought induced by climate change might be imaginary, as Mencken would suggest, the hobgoblin of drought itself is all too real; and, its adverse effects are magnified by the state’s failure to prepare for a fully foreseeable future event. These effects will manifest themselves nationally in limited availability and increased costs of the vegetables, fruits and nuts grown in the state. The central valley of California is littered with abandoned fields and orchards deprived of contracted quantities of irrigation water.

Government would have the populace depend on it for a broad variety of services. However, this is one case in which the government has clearly failed to plan adequately for the provision of those services. Interestingly, in the face of the current drought, California is planning a high speed rail system, but not new water impoundments.

It is not clear what the proposed high speed rail system is a precaution against, but it is certainly not the effects of prolonged drought.

Tags: Policy, Drought

Climate Experiment

Many people will be traveling over the coming Christmas and New Year’s holidays. Most probably have all the equipment required to conduct a rudimentary climate experiment: a vehicle with a digital thermometer which measures outside air temperature. Don’t be concerned about the accuracy of this digital thermometer, since it was probably calibrated more recently than most of the thermometers which contribute temperature data to the Global Historical Climatology Network (GHCN); and, you will be measuring temperature anomalies rather than actual temperatures, so absolute accuracy is not essential.

The vehicle, particularly if stored in a garage, must be driven for several minutes to allow the digital thermometer to come to equilibrium. Then the experiment can begin. Observe the temperature as you travel and note your surroundings. This can be a wonderful learning experience for children, as well as a way to keep them busy and avoid: “Are we there yet?”

The most interesting experiment, from a climate science perspective, is the demonstration of the Urban Heat Island (UHI) effect. This experiment is best done either in the middle of the day or in the middle of the night, when ambient temperatures are not changing rapidly due to the morning warmup or the late afternoon or evening cool down period.

Driving from the exurbs through the suburbs to the city center, and then back out through the suburbs and exurbs on the other side, completes this experiment. Observed temperatures will change significantly, warming as you drive toward the city center and cooling as you return to the exurbs. Depending on weather conditions, a change of 5-10°F is typical. The change may be even greater in very densely populated urban areas. This experiment also clearly demonstrates the potential for human activity to influence climate in limited areas.
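The transect reduces to a simple anomaly calculation. The readings below are hypothetical, chosen only to fall inside the 5-10°F range typical of this effect; the baseline is taken as the average of the two exurban readings:

```python
# Hypothetical urban-heat-island transect, °F, exurb -> city -> exurb.
# These readings are illustrative assumptions, not recorded data.
readings = {
    "exurb_in": 40.0,
    "suburb_in": 43.0,
    "city_center": 48.0,
    "suburb_out": 44.0,
    "exurb_out": 40.5,
}

baseline = (readings["exurb_in"] + readings["exurb_out"]) / 2
uhi_anomaly = readings["city_center"] - baseline

REPORTED_WARMING_F = 1.6   # global rise since 1880, per the earlier post
print(round(uhi_anomaly, 2))                       # city-center anomaly, °F
print(round(uhi_anomaly / REPORTED_WARMING_F, 1))  # multiple of reported rise
```

With these made-up numbers the urban anomaly is several times the reported century-scale warming, which is the comparison the experiment is meant to make vivid.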

Passing over, or even near, a large body of water can cause a change of 2-3°F, depending on the season. In summer, the temperature will decrease as you cross or pass the water. In winter, the temperature will increase, unless the surface of the water is covered with ice.

Climbing or descending through several thousand feet of altitude can also cause multiple degree changes in measured temperature.

These simple experiments, conducted in conjunction with holiday travel, demonstrate the significance of the location of temperature measuring stations, since these local changes are at least as large as the global warming reported over the past ~150 years, and frequently several times greater.

Tags: Temperature Record, Urban Heat Island

Seriousness of the Charge

Precautionary Principle

“When the health of humans and the environment is at stake, it may not be necessary to wait for scientific certainty to take protective action.”

--Science and Environmental Health Network


"Even though there is no evidence, the seriousness of the charge is what matters. The seriousness of the charge mandates that we investigate this."

--Thomas Foley (D, WA), Speaker, US House of Representatives


In many ways, the above statements rationalize the global governmental “rush to judgement” regarding climate change and the ongoing efforts to end debate and punish skeptics.

There is certainly evidence that atmospheric CO2 concentrations have increased since the Industrial Revolution; that atmospheric temperatures have increased; that sea level has risen; and that glaciers have lost mass. However, there is no evidence that any of these changes, with the exception of the increase in atmospheric CO2 concentrations, has been exclusively, or even primarily, the result of human activity. There is also no evidence that these changes have had an adverse impact on human health or the environment. Finally, there is no evidence that these changes would have an adverse impact on human health or the environment in the future, were they to continue unabated.

However, the Precautionary Principle is frequently used to argue that there is no need to wait for evidence, since the adverse impacts portrayed in the scenarios produced by the climate models are perceived to be so potentially devastating. The seriousness of the charge of obstructing movement toward controlling climate change, by even questioning the absence of evidence or the failure to validate the climate models, is asserted as the rationalization for stifling debate and threatening skeptics.

Similarly, the seriousness of the charge of defrauding shareholders and the public is perceived as “mandating investigation” of energy companies and consulting companies which have not accepted the essential nature of this governmental “rush to judgement”, communicated it to their shareholders and the public-at-large, and publicly donned “sackcloth and ashes” to atone for their previous “sins”.

Tags: Precautionary Principle

“Fake News”

There has been much discussion recently about “fake news”, a concept which has nearly as many definitions as it has observers and commenters. Some “fake news” is totally made up, with no basis in fact. Some “fake news” is actually very clever satire. Some “fake news” is actually real news, blown totally out of proportion. Some “fake news” is real news, partially reported, slanted or skewed. Arguably, some “fake news” replaces real news which remains unreported as a result, for a variety of reasons. All “fake news” is intended primarily to influence, rather than to inform. It is stealth commentary.

Much of what causes some observers to refer to the purported threat of catastrophic anthropogenic climate change as a “hoax” is the result of various types of “fake news”, typically intended to portray “estimates” as “facts”; to portray what is merely “believed” as “known”; and to portray modeled potential future scenarios as climate projections. Some, however, consists of factual misstatements and distortions intended to deceive. Reporting regarding increasing frequency and intensity of hurricanes, tornados, flooding and droughts falls into the latter category.

The most obvious example of “fake news” which attempts to portray estimates as facts is the near-surface temperature anomaly records. These records are based on readings taken from measuring instruments estimated to be in error by an average of more than 2°C, read to a precision of 0.1°C, reported as anomalies to 0.01°C, and as decadal trends to 0.001°C. The actual temperature measurements are “facts”, though of questionable accuracy. These “facts” are then “adjusted”, converting them into estimates, still of questionable accuracy. Since the errors in the temperature measurements appear not to be random, and the “adjustments” made to those measurements are definitely not random, the Law of Large Numbers cannot be invoked to justify expressing a mean to greater precision than the underlying estimates. Therefore, global temperature anomalies expressed to two decimal places are “fake news”, as are decadal anomaly trends expressed to three decimal places. Consequently, announcements of “the warmest year ever” are also “fake news”, since they are based on estimates which are either inaccurately precise or precisely inaccurate, or both.
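The statistical claim here, that averaging many readings reduces random error but cannot remove a shared systematic bias, can be illustrated with a small simulation. The true temperature, error spread, and bias below are all assumed values for illustration only:

```python
# Demonstration: averaging removes random noise but not shared bias.
# All numeric values are assumptions chosen for illustration.
import random

random.seed(0)
TRUE_T = 57.0        # assumed true temperature, °F
N = 100_000          # number of simulated readings

# Case 1: purely random errors. The mean converges toward the truth.
random_readings = [TRUE_T + random.gauss(0, 2.0) for _ in range(N)]
random_mean_error = abs(sum(random_readings) / N - TRUE_T)

# Case 2: the same noise plus a shared systematic bias.
# Averaging cannot remove the bias, no matter how many readings.
BIAS = 1.5           # hypothetical shared instrument bias, °F
biased_readings = [TRUE_T + BIAS + random.gauss(0, 2.0) for _ in range(N)]
biased_mean_error = abs(sum(biased_readings) / N - TRUE_T)

print(round(random_mean_error, 3))   # small: random noise averages out
print(round(biased_mean_error, 3))   # near 1.5: the bias survives averaging
```

The simulation illustrates the general statistical point only; whether, and to what extent, the real station networks and adjustments are biased is exactly what is in dispute.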

The most obvious examples of portraying what is merely “believed” as “known” are statements about the predominance of human influence, particularly human CO2 emissions, on recent climate change. No data exist to confirm that relationship. There are also no data which permit quantifying the impact of natural variability on climate change. Assertions to the contrary are “fake news”.

All studies which rely on unverified climate models to “predict” future trends in temperature, rainfall, storms, species extinctions, etc. are also “fake news”, because none of these models has been verified, much less established to have any predictive skill.

The political science of climate change ignores this “fake news” and disparages those who question it. Interestingly, in another time and in another place, “fake news” used to be referred to as propaganda. In the more common vernacular, it was referred to as the “mushroom treatment”. It is quite dark in the climate science community; and, it stinks.

Tags: Bad Science, Estimates as Facts

A Look Back: The Coming Ice Age

The Coming Ice Age - Harper's Magazine - Sept. 1958

A friend brought this to my attention today. It is almost 60 years old. (Note the name of the authoress of the article.)

Very interesting. I had not seen it previously. I don’t think it is inconsistent with other things I have read.

Note, though, that the previous warming and subsequent ice age occurred without a human-induced increase in atmospheric CO2; and, that there was no discernible human contribution to the emergence from the ice age. It is only in the past ~65 years that human CO2 emissions are thought to have had any influence; and, the extent of human influence is neither discernible nor quantifiable.

It is research like this which causes me to question the assertions that man is the sole, or even the principal, cause of the recent warming. The very rapid increase in temperature in 2015, followed by the very rapid decrease in 2016, suggests that the principal cause of those events was not a slow increase in atmospheric CO2 concentrations.

Tags: History, Global Temperature

The Party’s Over – US Commitments at COP22

The 22nd annual global ecotourism conference, officially known as the UNFCCC Conference of the Parties 22, in Marrakech, Morocco has ended. William Shakespeare might have described the conference results as: “full of sound and fury, signifying nothing”.

However, COP22 presented an interesting study in political polarization, at both the national and international levels.

US President Obama, represented by US Secretary of State John Kerry, presented a new “commitment” to reduce US annual CO2 emissions by 80% by 2050, compared with 2005 emission levels. This “commitment” was almost certainly developed in anticipation of a Hillary Clinton presidency, which would have been expected to preserve, enhance and expand upon the Obama climate “legacy”. This “commitment” was presented to COP22 after it was obvious that there would be no Clinton presidency, but rather a Donald Trump presidency in combination with a Republican controlled Congress.

There is little doubt that this “commitment” was presented, not only to enhance President Obama’s climate “legacy”, but also to attempt to embarrass President-elect Trump, who is not supportive of the previous US “commitment” to reduce US CO2 emissions by 26-28% by 2025, made at COP21 in Paris, France in 2015; and, would not be expected to be supportive of an even more far-reaching “commitment”. Both of these US “commitments” were made as “Executive Agreements”, on President Obama’s sole authority, rather than as treaty commitments ratified by a two-thirds vote of the US Senate. Therefore, both of these “commitments” are also subject to abrogation by executive action.

The Secretary General of the UN and the Chair of the UNFCCC both stated essentially that the movement toward clean energy committed to at COP21 was “unstoppable”, suggesting that President-elect Trump might somehow try to stop this movement. However, there is no indication of any intent on the part of President-elect Trump to stop such a global movement. Rather, the President-elect has merely indicated that he does not support the existing US “commitment”; and, almost certainly, would not support this additional “commitment”. There is no reason to believe that Trump’s lack of support would stop other nations from pursuing current and potential future “commitments” to reduce annual CO2 emissions.

President-elect Trump has stated that he opposes the Obama EPA “Clean Power Plan”, which effectively requires utilities to shutter many existing coal-fired power plants and replace their capacity with natural gas, solar, wind, biomass or other capacity as required. Trump has not suggested that he would interfere in utility decisions to replace existing coal-fired generating capacity, but merely that he was opposed to forcing those decisions by EPA regulation. The environmental community has been quick to label Trump a “climate denier” or “climate change denier”, though he believes neither that the earth has no climate nor that the earth’s climate has not changed and does not change.

Meanwhile, at COP22, as has been the case since COP15, the developing and undeveloped nations have continued the refrain: “Show me the money.” They demand a “commitment” from the developed nations of $100 billion per year, apparently in perpetuity, to assist them in adapting to the adverse effects of global climate change, none of which have yet occurred. The current level of global funding toward this “commitment” is far less than the level demanded; and, is likely to remain so for the foreseeable future. President Obama has provided some funding, without congressional approval, though it appears unlikely that President-elect Trump will seek to expand that funding to the level demanded; and, even less likely that he would attempt to do so without congressional authorization.

Tags: United Nations, Clean Power Plan

An Engineer’s Observations

Several aspects of government-funded climate science appear both curious and disturbing. The current level of government funding of climate science is certainly adequate to support rigorous scientific investigation, data gathering and data analysis. Regrettably, it is not doing so uniformly, comprehensively and consistently. That situation cannot be allowed to persist if we are to significantly expand our understanding of the earth’s climate; and, of our potential impact on that climate.

The current concern regarding climate change is based on two principal factors: temperature change and sea level rise. Therefore, the two foundational focuses of climate research should be accurate temperature measurement and accurate measurement of the rate of sea level rise. However, it appears that unjustified precision in reporting results is given greater priority than accuracy of physical measurement.

Near-surface temperature anomaly calculations are produced by multiple agencies, all using subsets of the same suspect data but differing approaches to “adjusting” that data; and, in some cases, “infilling” missing data. Tropospheric temperature anomaly calculations are also produced by multiple agencies, using the same data from the same satellites. There is a complex physical relationship between the tropospheric temperatures and the near-surface temperatures, but there appears to be little effort to understand this relationship, though that understanding appears to be critical to a thorough understanding of earth’s climate.

Sea level rise is measured directly at numerous locations along the sea shore, as well as from satellites. The rate of sea level rise reported from the satellite observations is approximately twice the rate measured by the shore-based instruments. The satellites and the shore-based instruments are measuring the rate of sea level rise of the same oceans, though the satellites measure virtually the entire ocean surfaces, rather than just the levels at the shore.  It is important to know which, if either, of these measurements is correct.

Also, the concern for the future of climate change is based on numerous unverified models, which are used to generate potential future temperature scenarios based on uncertain input factors, which are then used in other unverified models to generate potential future extreme weather, crop failure, disease, habitat loss and species extinction scenarios. Climate science would do well to focus on verifying one model with one set of accurate input conditions; and, then determining its predictive abilities.

Tags: Sea Level Rise, Sea Level Change, Temperature Record, Satellites

Tough Love - Open Letter to Trump Transition Team

Open letter to the President-elect Trump Transition Team – Climate Change

Ladies and Gentlemen:

Serious research focused on understanding the climate and the forces which cause it to change is worthwhile and important. However, it is long past time to apply a strong dose of “tough love” to US climate change research.

The United States is currently spending approximately $2.5 billion per year on climate change research, out of a total climate change related budget of approximately $20 billion per year. Much of this climate change research budget is being expended on duplicative and/or speculative activities, rather than on resolving several fundamental issues involving climate change. The US can hardly afford to waste federal research funding while ignoring these fundamental issues.

This letter addresses four fundamental issues of climate change research:  (i) data collection and analysis; (ii) understanding relationships and resolving differences between surface and satellite sources; (iii) determining the correct values of climate sensitivity and climate forcing factors used as inputs to climate models; and (iv) identifying or developing and then verifying a single climate model which actually models the global climate. It seems incredible that these fundamental issues have not been resolved, if the “science is settled”.


(i) Data Collection and Analysis

The instrumental global temperature data which underlie the concerns regarding global warming and global climate change are collected from near-surface temperature sensors, sea surface floating buoys, ships “passing in the night”, balloon-borne radiosondes and satellites. The near-surface temperatures are collected, aggregated, “adjusted” and analyzed by numerous government agencies around the globe. These agencies each produce monthly anomaly calculations, which differ among themselves as the result of differing selection, “adjustment” and analysis protocols. The satellite and radiosonde temperatures are analyzed by two organizations (UAH and RSS), which produce monthly anomaly calculations, which also differ one from the other. The US also operates a network of 114 state-of-the-art near-surface measuring stations: the Climate Reference Network. However, the data from the CRN are not included in the collection and analysis of the temperatures from the other sources, though it is not clear why that is the case.

There is no need for continuing analysis of near-surface temperatures by multiple agencies. However, the reasons for the differences among the several analyses must be understood and resolved before any single agency is tasked with continuing the near-surface temperature analysis effort, if that effort is to be continued. The quality of the temperature data collected, aggregated, “adjusted” and analyzed by NCEI, NASA GISS and The Hadley Center is significantly lower than the quality of the data from the US CRN. However, rather than improve the quality of the temperature data, the agencies “adjust” the data, producing estimates of what the data should have been.

Satellite temperature data are far more comprehensive than the near-surface temperature data. The satellite temperature data and radiosonde data are also used to produce two different and differing monthly temperature anomaly products. Again, there is no need for continuing analysis of this data by multiple organizations. However, the reasons for the differences between the analyses must be understood and resolved before any single organization is tasked with continuing the satellite temperature analysis effort.

The recent surface sea level rise measurements, taken both with tide gauges in contact with the sea surface and microwave radar systems mounted above the sea surface near the shore, show a relatively stable rate of sea level rise over a period of approximately 200 years. Recent satellite-based sea level rise measurements, taken over a period of approximately 23 years, also show a relatively stable rate of sea level rise, though at about twice the rate measured by the surface-based sensors. Again, the satellite measurements are far more comprehensive than the land-based measurements, though they might not be more accurate. 

(ii) Understanding Relationships and Resolving Differences

There are significant differences between the near-surface temperature anomalies and the satellite temperature anomalies. The reasons for these differences must be clearly understood, should the decision be made to abandon the near-surface temperature anomaly products in favor of the far more comprehensive satellite measurements.

There are significant differences between the surface sea level rise data and the satellite sea level rise data. The reasons for these differences must also be clearly understood, should the decision be made to abandon the land-based sea surface measurements in favor of the satellite measurements.

(iii) Determining the Correct Values of Climate Sensitivities and Climate Forcing Factors

The scenarios for future climate produced by the climate models are driven by data on the rate of increase of global atmospheric carbon dioxide and other “greenhouse gas” concentrations and assumptions regarding the sensitivity of the climate to these increases. The climate models are also driven by assumptions regarding several other climate forcing factors. These sensitivities and forcing factors are not well understood, so climate modelers use a range of values in their model runs. The result is a range of potential future scenarios. It is not known whether any one of these scenarios is correct, or even if the actual future scenario falls within the range of scenarios output by the models.

Developing an accurate understanding of how the climate responds to human influences on the atmosphere requires determination of the actual climate sensitivity and the actual magnitude and direction of the forcing factors. This is a fundamental issue.
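How the assumed sensitivity alone spreads the model output can be shown with a toy calculation (this is a deliberately simplified sketch, not any agency's model; the 1.5–4.5 °C figures are the IPCC's often-cited "likely" range for equilibrium climate sensitivity, and the concentrations are assumed values chosen to give exactly one doubling):

```python
import math

# Assumed equilibrium climate sensitivity values, degrees C per CO2 doubling.
# 1.5-4.5 C is the often-cited "likely" range; none of these is a known value.
sensitivities = [1.5, 3.0, 4.5]

C0 = 280.0  # assumed pre-industrial CO2 concentration, ppm
C = 560.0   # hypothetical future concentration: exactly one doubling

for ecs in sensitivities:
    # Equilibrium warming is commonly approximated as scaling with the
    # logarithm of the concentration ratio.
    delta_t = ecs * math.log2(C / C0)
    print(f"assumed ECS {ecs} C/doubling -> equilibrium warming {delta_t:.1f} C")
```

With an identical concentration input, the projected warming spans a factor of three purely from the choice of the sensitivity assumption, which is the point made above about ranges of inputs producing ranges of scenarios.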

(iv) Verify a Single Climate Model Which Actually Models the Global Climate

There is currently no climate model which has been verified to accurately and comprehensively model the earth’s climate. Therefore, there is no climate model which can reasonably be expected to predict the future responses of the global climate. As a result, all of the climate research studies which are being used to create scenarios of various types of potential future climate catastrophes are highly speculative. These highly speculative studies are consuming significant climate research resources, to no demonstrably useful scientific purpose. Those resources could be used instead to improve the climate models; and, ultimately, to verify a single climate model.


It is clear that the science is hardly settled, since at least the above four fundamental issues remain unresolved. However, it appears that the practical politics have been settled, until very recently, largely in line with H. L. Mencken’s perception.

“The whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary.” H. L. Mencken

The national and international political class which has been driving and funding the climate change issue is desperately in need of some tough love and the imposition of some priorities to address fundamental issues, rather than continuing to fund the production of “hobgoblins”.

Tags: Politics, Policy, Satellites, Temperature Record, Donald Trump

Ultimate Goal

The Movement toward the ultimate goal of a global vegan commune continues apace.

Global Governance:

Zero GHG Emissions:

Global Veganism:

Wealth and Income Redistribution:

Population Control:


The Adjustocene

Adjustocene - Cartoons by Josh

The earth is currently experiencing the Holocene, the period of 11,700 years following the last major ice age. Some in the climate science community have begun referring to the most recent years of this period as the Anthropocene, suggesting that human activity is responsible for much of the change which has occurred since the Industrial Revolution, or since the Little Ice Age.

NASA has recently published a study suggesting that errors in observational data in the early years of the instrumental temperature record caused approximately 20% of the global warming which has occurred over that period to be “missed” due to quirks in the measurements. They have concluded that it is necessary to “adjust” the observations to correct these quirks; and, that these “adjustments” bring the observations more in line with the scenarios produced by the climate models. These “adjustments” are in addition to the adjustments which had already been made by NCEI, NASA and Hadley Center/UEA CRU in the process of constructing the temperature anomaly products, both currently and retrospectively. The combination of all of these “adjustments” led one wag to rename this period the “Adjustocene”.

Two quotations regarding this issue bear repeating here, one serious and the other in jest:

       --“It doesn't matter how beautiful your theory is, it doesn't matter how smart you are. If it doesn't agree with experiment, it's wrong.”  Richard P. Feynman

       --“95% of Climate Models Agree: The Observations Must Be Wrong.” Roy Spencer

NASA’s recent study seems to suggest that NASA has ignored Feynman and taken Spencer seriously.

Fundamentally, the temperature data are not “adjusted” because of their superb quality, accuracy or precision. Rather, they are adjusted because they lack those attributes. However, once adjusted, they are merely estimates of what the data might have been, had they been collected in a timely manner from properly selected, calibrated, sited, installed and maintained instruments. What the climate science community prefers to refer to as “datasets” are therefore, in fact, estimate sets. The climate science community acknowledges that the data are inaccurate, but insists that the estimates are both accurate and precise.

Tags: Temperature Record

Anomalous Anomalies

NASA GISS and NCEI have released their June climate anomalies; and, they are anomalous. GISS and NCEI both have access to all of the same near-surface temperature data; and, both use the same sea surface temperature anomaly product, ERSSTv4, developed by NCEI and frequently referred to as the “pause buster” reconstruction. Therefore, any difference between the NASA GISS and NCEI land plus ocean anomalies must be based on the near-surface temperatures.

In June, the GISS temperature anomaly declined by 0.14°C to 0.79°C, a decline of more than 0.5°C since its peak during the 2015/2016 El Nino in February 2016. However, the NCEI temperature anomaly increased by 0.02°C to 0.90°C, though still a decline of more than 0.3°C since its El Nino peak. Therefore, the two anomaly changes for the month of June differ by 0.16°C, or more than 20% of the residual GISS anomaly. Also, the declines in the two anomalies since the 2016 El Nino peak differ by 0.2°C, or approximately 40% of the decline in the GISS anomaly. These are very large differences for anomalies produced from the same dataset.

These differences highlight the significance of the subsets of the monthly near-surface temperature data selected for “adjustment”; and, the significance of the “adjustments” made to the data by the various producers of the near-surface temperature anomaly products.

The near-surface temperature anomaly product from the Hadley Center and the University of East Anglia Climate Research Unit is not yet available for June, 2016. HadCRUT4 had decreased by approximately 0.4°C from its El Nino peak through May 2016, to 0.68°C.

The tropospheric temperature anomalies produced by the University of Alabama – Huntsville and by Remote Sensing Systems are produced from the data collected by the same satellite-based instruments. The changes in these anomalies are also notably anomalous for June, 2016. The UAH anomaly decreased by 0.21°C, to +0.34°C. The RSS anomaly decreased by 0.06°C, to +0.47°C. Therefore, the difference in the calculated anomaly changes between UAH and RSS is almost as large as the difference between the GISS and NCEI anomalies for June, 2016.
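The arithmetic behind these comparisons can be checked directly from the figures quoted above (a simple verification using the rounded values as given):

```python
# Monthly changes and levels quoted above, degrees C.
giss_change, giss_level = -0.14, 0.79   # GISS: fell 0.14 to 0.79
ncei_change = +0.02                      # NCEI: rose 0.02 to 0.90

# Spread between the two June changes: 0.02 - (-0.14) = 0.16 C,
# roughly 20% of the residual GISS anomaly of 0.79 C.
monthly_spread = ncei_change - giss_change
print(round(monthly_spread, 2), round(monthly_spread / giss_level, 2))

# Declines since the El Nino peak: >0.5 C (GISS) vs >0.3 C (NCEI),
# a 0.2 C difference, i.e. about 40% of the GISS decline.
giss_decline, ncei_decline = 0.5, 0.3
print(round((giss_decline - ncei_decline) / giss_decline, 2))

# UAH vs RSS June changes: -0.21 vs -0.06, a 0.15 C spread,
# nearly as large as the 0.16 C GISS/NCEI spread.
uah_change, rss_change = -0.21, -0.06
print(round(abs(rss_change - uah_change), 2))
```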

These anomalous anomalies suggest that reporting global temperature anomalies to two decimal place precision represents either inaccurate precision or precise inaccuracy.

Tags: Temperature Record, Global Temperature