
Climate and Climate Change

Climate Change

Two days before Halloween, 2011, New England was struck by a freak winter storm. Heavy snow descended onto trees still covered with leaves. Overloaded branches fell on power lines. Blue flashes of light in the sky marked exploding transformers. Electricity was out for days in some areas and for weeks in others. Damage to property and disruption of lives were widespread.

That disastrous restriction on human energy supplies was produced by Nature. Current and future energy curtailments, however, are being imposed on the populace by Federal policies in the name of dangerous “climate change/global warming”. Yet, despite the contradictions between what people are being told and what they have seen and can see of the weather and the climate, they continue to be effectively steered away from those contradictions and toward the claimed disastrous effects of “climate change/global warming” (AGW, “Anthropogenic Global Warming”).

People are seldom told HOW MUCH temperatures have increased, or that there has been no increase in globally averaged temperature for over 18 years. They are seldom told how minuscule that increase is compared with the swings in daily temperatures. They are seldom told about the dangerous effects of government policies on their supply of “base load” energy — the uninterrupted energy that citizens depend on 24/7 — or about the consequences of forced curtailment of industry-wide energy production, which hinders the production of their and their families’ food, shelter, and clothing. People are, in essence, kept largely ignorant of the OTHER SIDE of the AGW debate.

Major scientific organizations — once devoted to the consistent pursuit of understanding the natural world — have compromised their integrity and diverted membership dues in support of some administrators’ AGW agenda.   Schools throughout the United States continue to engage in relentless AGW indoctrination of  students, from kindergarten through university.  Governments worldwide have been appropriating vast sums for “scientific” research, attempting to convince the populace that the use of fossil fuels must be severely curtailed to “save the planet.”  Prominent businesses — in league with various politicians who pour ever more citizen earnings into schemes such as ethanol in gasoline, solar panels, and wind turbines — continue to tilt against imaginary threats of AGW.  And even religious leaders and organizations have joined in to proclaim such threats.   As a consequence, AGW propaganda is proving to be an extraordinary vehicle for the exponential expansion of government power over the lives of its citizens. 

Reasoning is hindered when minds are frequently in a state of alarm. The object of this website is to promote a reasoned approach: to let people know of the issues pertaining to the other side of the AGW debate and the ways in which it conflicts with the widespread state of AGW alarm (AGWA, for short). In that way, it is hoped, all members of society can make informed decisions.

Highlighted Article: Presidential Executive Order on Promoting Energy Independence and Economic Growth

  • 3/29/17 at 10:12 AM
The White House
Office of the Press Secretary
For Immediate Release


Presidential Executive Order on Promoting Energy Independence and Economic Growth

- - - - - - -

By the authority vested in me as President by the Constitution and the laws of the United States of America, it is hereby ordered as follows:

(b)  It is further in the national interest to ensure that the Nation's electricity is affordable, reliable, safe, secure, and clean, and that it can be produced from coal, natural gas, nuclear material, flowing water, and other domestic sources, including renewable sources. 

(d)  It further is the policy of the United States that, to the extent permitted by law, all agencies should take appropriate actions to promote clean air and clean water for the American people, while also respecting the proper roles of the Congress and the States concerning these matters in our constitutional republic.

Sec. 2.  Immediate Review of All Agency Actions that Potentially Burden the Safe, Efficient Development of Domestic Energy Resources.  (a)  The heads of agencies shall review all existing regulations, orders, guidance documents, policies, and any other similar agency actions (collectively, agency actions) that potentially burden the development or use of domestically produced energy resources, with particular attention to oil, natural gas, coal, and nuclear energy resources.  Such review shall not include agency actions that are mandated by law, necessary for the public interest, and consistent with the policy set forth in section 1 of this order. 

(c)  Within 45 days of the date of this order, the head of each agency with agency actions described in subsection (a) of this section shall develop and submit to the Director of the Office of Management and Budget (OMB Director) a plan to carry out the review required by subsection (a) of this section.  The plans shall also be sent to the Vice President, the Assistant to the President for Economic Policy, the Assistant to the President for Domestic Policy, and the Chair of the Council on Environmental Quality.  The head of any agency who determines that such agency does not have agency actions described in subsection (a) of this section shall submit to the OMB Director a written statement to that effect and, absent a determination by the OMB Director that such agency does have agency actions described in subsection (a) of this section, shall have no further responsibilities under this section.

(e)  The report shall be finalized within 180 days of the date of this order, unless the OMB Director, in consultation with the other officials who receive the draft final reports, extends that deadline.  

(g)  With respect to any agency action for which specific recommendations are made in a final report pursuant to subsection (e) of this section, the head of the relevant agency shall, as soon as practicable, suspend, revise, or rescind, or publish for notice and comment proposed rules suspending, revising, or rescinding, those actions, as appropriate and consistent with law.  Agencies shall endeavor to coordinate such regulatory reforms with their activities undertaken in compliance with Executive Order 13771 of January 30, 2017 (Reducing Regulation and Controlling Regulatory Costs).

(i)    Executive Order 13653 of November 1, 2013 (Preparing the United States for the Impacts of Climate Change); 

(iii)  The Presidential Memorandum of November 3, 2015 (Mitigating Impacts on Natural Resources from Development and Encouraging Related Private Investment); and

(b)  The following reports shall be rescinded: 

(ii)  The Report of the Executive Office of the President of March 2014 (Climate Action Plan Strategy to Reduce Methane Emissions).

(d)  The heads of all agencies shall identify existing agency actions related to or arising from the Presidential actions listed in subsection (a) of this section, the reports listed in subsection (b) of this section, or the final guidance listed in subsection (c) of this section.  Each agency shall, as soon as practicable, suspend, revise, or rescind, or publish for notice and comment proposed rules suspending, revising, or rescinding any such actions, as appropriate and consistent with law and with the policies set forth in section 1 of this order.  

(b)  This section applies to the following final or proposed rules:

(ii)   The final rule entitled "Standards of Performance for Greenhouse Gas Emissions from New, Modified, and Reconstructed Stationary Sources: Electric Utility Generating Units," 80 Fed. Reg. 64509 (October 23, 2015); and

(c)  The Administrator shall review and, if appropriate, as soon as practicable, take lawful action to suspend, revise, or rescind, as appropriate and consistent with law, the "Legal Memorandum Accompanying Clean Power Plan for Certain Issues," which was published in conjunction with the Clean Power Plan.  

Sec. 5.  Review of Estimates of the Social Cost of Carbon, Nitrous Oxide, and Methane for Regulatory Impact Analysis.  (a)  In order to ensure sound regulatory decision making, it is essential that agencies use estimates of costs and benefits in their regulatory analyses that are based on the best available science and economics.  

(i)    Technical Support Document:  Social Cost of Carbon for Regulatory Impact Analysis Under Executive Order 12866 (February 2010); 

(iii)  Technical Update of the Social Cost of Carbon for Regulatory Impact Analysis (November 2013); 

(v)    Addendum to the Technical Support Document for Social Cost of Carbon:  Application of the Methodology to Estimate the Social Cost of Methane and the Social Cost of Nitrous Oxide (August 2016); and

(c)  Effective immediately, when monetizing the value of changes in greenhouse gas emissions resulting from regulations, including with respect to the consideration of domestic versus international impacts and the consideration of appropriate discount rates, agencies shall ensure, to the extent permitted by law, that any such estimates are consistent with the guidance contained in OMB Circular A-4 of September 17, 2003 (Regulatory Analysis), which was issued after peer review and public comment and has been widely accepted for more than a decade as embodying the best practices for conducting regulatory cost-benefit analysis.

Sec. 7.  Review of Regulations Related to United States Oil and Gas Development.  (a)  The Administrator shall review the final rule entitled "Oil and Natural Gas Sector:  Emission Standards for New, Reconstructed, and Modified Sources," 81 Fed. Reg. 35824 (June 3, 2016), and any rules and guidance issued pursuant to it, for consistency with the policy set forth in section 1 of this order and, if appropriate, shall, as soon as practicable, suspend, revise, or rescind the guidance, or publish for notice and comment proposed rules suspending, revising, or rescinding those rules. 

(i)    The final rule entitled "Oil and Gas; Hydraulic Fracturing on Federal and Indian Lands," 80 Fed. Reg. 16128 (March 26, 2015);

(iii)  The final rule entitled "Management of Non Federal Oil and Gas Rights," 81 Fed. Reg. 79948 (November 14, 2016); and

(c)  The Administrator or the Secretary of the Interior, as applicable, shall promptly notify the Attorney General of any actions taken by them related to the rules identified in subsections (a) and (b) of this section so that the Attorney General may, as appropriate, provide notice of this order and any such action to any court with jurisdiction over pending litigation related to those rules, and may, in his discretion, request that the court stay the litigation or otherwise delay further litigation, or seek other appropriate relief consistent with this order, until the completion of the administrative actions described in subsections (a) and (b) of this section.  

(i)   the authority granted by law to an executive department or agency, or the head thereof; or 

(b)  This order shall be implemented consistent with applicable law and subject to the availability of appropriations. 


Tags: Clean Power Plan, Coal, Regulation, Executive Order

Additional Perspective - Temperature Anomaly Record

It is generally acknowledged that human influence on global climate was minimal prior to approximately 1950. Therefore, virtually all climate change prior to 1950 is considered to have been the result of natural climate variation or a climate response to natural causes. However, the consensus climate science community asserts that climate changes since 1950 are significantly / largely / predominantly / exclusively the result of human activity.

Prior to the period of the instrumental temperature record, our knowledge of global temperature changes relies on general reconstructions, based on analysis of ice cores, ocean sediments and an increasing number of additional sources. These reconstructions indicate that global temperatures have varied continuously, but over a relatively narrow range, for the past several thousand years.

The period of the global instrumental temperature record is generally agreed to begin in 1880, though the Central England Temperature (CET) record dates back to 1659. The graph below displays the global annual average temperature anomaly product prepared by the British Met Office.

Global Average Temperature

The black diagonal line on the graph illustrates the slope of the change in the global annual temperature anomaly over the entire period of the global instrumental temperature record. The orange diagonal line illustrates the slope of the change in the global annual average temperature anomaly over the period since 1950, when human influence on global average temperature is generally considered to have begun. The slope of the black line is approximately 0.07°C per decade; the slope of the orange line is approximately 0.14°C per decade. Note that, according to the Hadley Centre, 2016 is the warmest year in the instrumental temperature record, and that 2016 is also an El Niño year.

It is not possible to separate the natural variation from any human influence in this temperature anomaly record. The rate of global annual temperature increase over the period since 1950 is approximately double the rate of increase over the entire period of the instrumental temperature record. However, since the increase in atmospheric CO2 concentrations over the period has been relatively steady, and the use of annual anomalies renders the seasonal variation in atmospheric CO2 concentrations moot, the irregularity of the record indicates significant natural variation in global annual temperature over the period.
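The decadal slopes quoted here can be computed by an ordinary least-squares fit over an annual anomaly series. A minimal sketch, using a synthetic series (the HadCRUT data themselves are not reproduced here), assuming a hypothetical `decadal_trend` helper:

```python
import numpy as np

def decadal_trend(years, anomalies):
    """Least-squares slope of an annual anomaly series, in degrees C per decade."""
    slope_per_year = np.polyfit(years, anomalies, 1)[0]
    return slope_per_year * 10.0

# Illustrative series: warming at exactly 0.07 C/decade plus random "weather" noise
years = np.arange(1880, 2017)
rng = np.random.default_rng(0)
anoms = -0.3 + 0.007 * (years - 1880) + rng.normal(0.0, 0.05, years.size)

print(round(decadal_trend(years, anoms), 2))  # close to 0.07
```

The same function applied to only the post-1950 portion of a real anomaly series would yield the steeper slope of the orange line.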

The natural fluctuations in the global temperature anomaly are easier to visualize over shorter time periods, using the monthly global temperature anomaly estimate produced by the Hadley Centre.

HADCRUT4 Temperature Anomalies

The slope of the red diagonal line on the graph, drawn from the first to the last monthly global temperature anomaly estimate, is approximately 0.16°C per decade. Again, it is not possible to separate the natural variation from any human influence in this temperature anomaly record. However, it is obvious that there is significant natural variation in the temperature anomalies. This is particularly obvious in the cases of the 1997/1998 and 2015/2016 El Niño events, both of which demonstrate natural, event-driven anomaly changes of 0.5–0.6°C.

The natural variations in the global temperature anomalies are also obvious in the satellite temperature anomaly records, as illustrated by the UAH global lower atmosphere temperature anomaly record (below). The diagonal black line on the graph connects the first and last temperature anomaly estimates in the satellite record. The slope of the line is approximately 0.16°C per decade. As in the previous examples, it is not possible to separate the natural variation from any human influence in this temperature anomaly record. However, the natural variation caused by the 1997/1998 and 2015/2016 El Niño events is very clearly visible.

UAH Satellite-Based Temperature

While it might be possible to attribute a relatively smooth, gradual increase in the global temperature anomaly to the relatively smooth, gradual increase in atmospheric CO2 concentrations, it is clearly not rational to attribute both the rapid increases and the rapid decreases in the anomalies around El Niño events to the gradual increase in atmospheric CO2 concentrations.

The illustrations above of the instrumental and satellite temperature anomaly records are all plotted with a “Y” axis range of 2°C or less. This truncation of the “Y” axis makes the relatively small changes in the temperature anomaly estimates far easier to see. However, the truncated “Y” axis also distorts the significance of the changes in the temperature anomalies. The graph below displays the global annual temperature anomaly over most of the instrumental temperature record, using essentially the same temperature anomaly records as those in the truncated illustrations above. The “Y” axis in this graph ranges from -10°F to +110°F (-23°C to +43°C), a representative annual temperature range for the mid-latitudes. While the increase in the global annual temperature is visible in the orange line on close inspection, its relatively minimal significance is also far more clearly displayed.
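The visual effect of axis truncation can be quantified: the same anomaly change occupies a very different fraction of the plot height depending on the “Y” axis range. A sketch, using the axis ranges mentioned above:

```python
def fraction_of_axis(change, axis_min, axis_max):
    """Fraction of the plot height occupied by a given temperature change."""
    return change / (axis_max - axis_min)

change_c = 1.0  # approximate anomaly increase over the instrumental record, in C

# Truncated anomaly plot: a 2 C tall axis
print(f"{fraction_of_axis(change_c, -1.0, 1.0):.0%}")    # 50%

# Mid-latitude annual range: -23 C to +43 C
print(f"{fraction_of_axis(change_c, -23.0, 43.0):.1%}")  # 1.5%
```

The same 1°C change fills half of a truncated anomaly plot but only about a sixty-sixth of a plot scaled to the mid-latitude annual temperature range.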

Wichita, Kansas is a typical mid-latitude city located close to the geographic center of the contiguous United States. The average temperature plotted by the orange line in the above graph is very close to the current average temperature of 57°F reported by the US National Weather Service for Wichita. The red shaded area in the graph above illustrates the typical daily average temperature range during the month of July in Wichita. The blue shaded area in the graph illustrates the typical daily average temperature range during the month of January in Wichita.

The reported increase in global annual temperature over the period of the instrumental temperature record is: approximately 8% of the daily average temperature variation experienced in Wichita throughout the year; approximately 3% of the seasonal average temperature variation; and, approximately 1.2% of the range of maximum and minimum temperature records.
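The percentages above follow from simple ratios. A sketch of the arithmetic, with the caveat that the Wichita ranges below are illustrative assumptions chosen to be roughly consistent with the text, not official NWS figures:

```python
# Rough reproduction of the percentage comparisons in the text.
increase_c = 0.9          # approx. rise over the instrumental record, in C
daily_range_c = 11.0      # assumed typical day/night swing in Wichita
seasonal_range_c = 28.0   # assumed January-to-July average swing
record_range_c = 75.0     # assumed span between record max and record min

for label, rng in [("daily", daily_range_c),
                   ("seasonal", seasonal_range_c),
                   ("record", record_range_c)]:
    print(f"{label}: {increase_c / rng:.1%}")
```

With these assumed inputs the ratios come out near 8%, 3% and 1.2%, matching the figures quoted in the text.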

The IPCC AR5 report expresses 95% confidence that more than half of the reported increase in global average temperature is the result of human influences, including gaseous and particulate emissions, land use changes and temperature “adjustments”. It is not currently possible to measure the relative effects of natural variation and human influences on global average temperature; and, it is not currently possible to measure the relative effects of the multiple potential human influences. Regardless, it is difficult to see imminent global catastrophe in the data we currently have available.

Tags: Temperature Record

Climate Science “The time has come, …”

"The time has come," the Walrus said,
"To talk of many things:
Of shoes--and ships--and sealing-wax--
Of cabbages--and kings--
And why the sea is boiling hot--
And whether pigs have wings."

The Walrus and The Carpenter, Lewis Carroll

Climate science is important. Climate science is a mess. Climate science must be fixed. The new Environmental Protection Agency (EPA) Administrator and the soon to be appointed new National Oceanic and Atmospheric Administration (NOAA) and National Aeronautics and Space Administration (NASA) Administrators have an obligation, either to make the science worth the money being spent on it, or to reduce the money being spent on it to a level commensurate with its apparent value.

The Climategate scandals of 2009 and 2011 began the very public questioning of climate scientists and the conduct of climate science. Climategate exposed Frantic Researchers Adjusting Unsuitable Data, attempting to control peer review, attempting to control or discredit peer reviewed journals, attempting to prevent certain scientists and their research from being published and attempting to ruin the careers of certain scientists. The recent revelations about the conduct of NOAA researchers in the preparation and publication of ‘Possible Artifacts of Data Biases in the Recent Global Surface Warming Hiatus’ by Karl et al., Science 4 June 2015, and the refusal of the NOAA Administrator to provide materials to a US House committee have rekindled the issue.

The new EPA, NOAA and NASA Administrators should immediately initiate a thorough, joint investigation of the acquisition, analysis and application of climate data by their agencies. This investigation should then expand to include those activities by other nations, the Intergovernmental Panel on Climate Change (IPCC) and the United Nations Framework Convention on Climate Change (UNFCCC). The initial focus of the investigation should be on data quality and data integrity. This should be followed by a focus on climate modeling and the application of climate models, particularly the need to verify climate models before using them as vehicles for producing questionable future scenarios.       

All the agencies using US and global near-surface temperature data to track global warming “adjust” the data because they either KNOW, or have strong reason to BELIEVE, that the data are flawed. The new NOAA and NASA Administrators should immediately question why the agencies have chosen to continue adjusting flawed data, rather than taking steps to ensure that the data they collect for analysis are accurate. Instrument calibration schedules and instrument enclosure and site maintenance schedules should be reviewed and amended as necessary to ensure data quality. Data archiving and storage procedures should also be reviewed and amended as required to ensure data integrity and accessibility. The new Administrators should establish firm policies regarding data and information access, with the express intent of eliminating the need for FOIA requests to obtain access.

These investigations and reviews should not be conducted by agency personnel, but rather by “tiger teams” of outside experts in all of the technical fields of concern, with support from agency personnel. Agency personnel should be notified that refusal to cooperate honestly, fully and in a timely manner with the tiger teams would be considered a notice of resignation from the agency and processed accordingly.

There is general recognition that the data available from satellites are more comprehensive than the data available from surface and near-surface sources. However, there is continuing disagreement regarding the relationship between satellite data and surface and near-surface data. The Administrators should redirect funds currently spent on studies applying unverified models to produce future scenarios of questionable value. These funds should be used instead to fund studies to resolve the differences between the various data sources, since such studies would likely result in an enhanced understanding of the atmosphere and the processes occurring within it.

Tags: Bad Science, Climate Science, 2016 election, Temperature Record, Climate Models

Oxymoron Alert (Carbon Taxes)

The Climate Leadership Council has just released a new study, “The Conservative Case for Carbon Dividends,” and presented it to the Trump Administration for its consideration. The study identifies Four Pillars of a Carbon Dividends Plan:

1. A gradually increasing carbon tax;
2. Carbon dividends paid to all Americans;
3. Border carbon adjustments; and,
4. Significant regulatory rollback.

Interestingly, the study focuses on the carbon dividend in its title, rather than on the carbon tax. I would argue that a carbon tax is hardly “conservative”. I would also argue that it is not a “market mechanism”, though it would rely on market mechanisms to adapt to the market distortions caused by the tax. The magnitude of the market distortion which would be caused by a carbon tax of ~$40 per ton of CO2 emissions is estimated to be ~$500 per US social security card holder. It would manifest in the economy as an increase in the cost of every good and service: a tax-driven, cost-push inflation of prices.

The study begins with the premise that global CO2 emissions are undesirable and must be reduced; that is, it assumes that continued CO2 emissions would lead to catastrophic anthropogenic global warming (CAGW). The study also begins with the premise, shared by many economists, that a carbon tax is the most efficient approach to reducing carbon emissions. The study suggests that the recommended carbon tax would be a Pigouvian tax; that is, a tax intended to discourage activities which lead to negative externalities. However, this suggestion ignores any positive externalities associated with CO2 emissions, though positive externalities exist and might well dominate at present. It is more likely that the recommended tax is simply a “sin” tax, particularly since it is intended to increase over time until the identified “sin” (CO2 emissions) is eliminated.

A carbon tax at a given level does not reliably produce a CO2 emissions reduction of a specific magnitude or percentage. This is particularly true in the short term, as the market response to the tax is affected both by the availability of economically viable alternative technologies and by the remaining economic useful life of existing equipment in service. The carbon tax does have the flexibility to be increased progressively, as required, to drive the desired emissions reduction. Clearly, the ultimate goal of the tax is to drive CO2 emissions in the US to zero, which would essentially require a transition to an all-electric energy economy, with all electricity provided by nuclear and / or renewables. It is unclear how high the carbon tax would have to be to achieve that objective. It is clear the tax would penalize the current transition from coal to natural gas as the predominant fuel for electric power generation, which has been largely responsible for the reductions in US CO2 emissions over the past several years. It might also constrain that transition in the short term, by discouraging investment in the additional natural gas pipeline and storage capacity required to supply the growing natural gas generation base.

The carbon dividend is a thinly disguised form of income redistribution, since it would provide a uniform quarterly dividend to all US social security cardholders, regardless of the amount of carbon tax, if any, directly or indirectly paid by that social security cardholder. The carbon dividend would likely be popular, particularly among the ~70% of social security cardholders expected to receive a dividend greater than their tax expense. The carbon tax would be imposed upstream of the consumer and would probably appear to be a price increase imposed by suppliers and service providers, rather than as a flow-through tax. Therefore, many dividend recipients would blame suppliers and service providers for the price increases, though they would thank the federal government for the dividend.

The border carbon adjustments process would be extremely complex and costly. There would likely be massive disagreements over the “carbon content” of manufactured goods and the impacts of any carbon emissions reduction efforts on the part of the government of the country in which the imported goods were manufactured. The number of products and the number of countries from which they are imported, combined with technology changes over time, would make this a long-term, ongoing process.

The “significant regulatory rollback” would fly in the face of the regulatory agencies’ imperative to survive and grow. Certainly the tax would be preferable to the current and growing command-and-control “rat’s nest” which is environmental regulation. However, in this instance, neither the regulatory morass nor the tax appears to be necessary, absent near-religious belief in the scenarios produced by unverified climate models.

In my early youth, I believed in Santa Claus, the Easter Bunny and the Tooth Fairy. In my advancing maturity, I find it impossible to believe in a revenue neutral tax. The history of “cap and trade” (cap and tax) legislation in the US strongly suggests that the Congress is incapable of producing such a tax.



Tags: Carbon Tax

Highlighted Article: At What Cost? Examining The Social Cost Of Carbon

  • 3/9/17 at 05:49 AM

Here is an excellent article we would like to highlight from Cato Institute's Patrick J. Michaels.

At What Cost? Examining The Social Cost Of Carbon

"My testimony concerns the selective science that underlies the existing federal determination of the Social Cost of Carbon and how a more inclusive and considered process would have resulted in a lower value for the social cost of carbon."

Tags: Cost of Carbon, Cato Institute

More Anomalous Anomalies

The three primary producers of global near-surface temperature anomalies, NASA GISS (Goddard Institute for Space Studies), NOAA NCEI (National Centers for Environmental Information) and HadCRUT, all begin the process with access to the same data. They then select a subset of the data, “adjust” the data to what they believe the data should have been, and calculate the current “adjusted” anomalies from the previous “adjusted” anomalies. NASA GISS alone “infills” temperature estimates where no data exist. Each producer then independently prepares its global temperature anomaly product.

The calculated anomaly in any given month relates directly to a global average temperature for that month. The difference between the calculated anomalies in any pair of months is thus the same as the difference between the calculated global average temperatures for those months. However, the differences reported by the three primary producers of global average temperature anomaly products from month to month, or year to year, are rarely the same; and, the changes are not always even in the same direction, warming or cooling.

The global average temperature anomaly differences reported by the three primary producers of global near-surface temperature anomalies for the months of November and December 2016 are an interesting case in point. NASA GISS reported a decrease of 0.12°C for the period. NOAA NCEI reported an increase of 0.04°C. HadCRUT reported an increase of 0.07°C. Each producer estimates a confidence range of +/- 0.10°C for its reported anomalies (NASA GISS: -0.22°C / -0.12°C / -0.02°C; NOAA NCEI: -0.06°C / +0.04°C / +0.14°C; HadCRUT: -0.03°C / +0.07°C / +0.17°C). Therefore, the confidence ranges overlap, suggesting that the differences among the anomaly estimates are not statistically significant. However, it is clear that the global average near-surface temperature did not both increase and decrease from November to December.
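Whether two reported changes are statistically distinguishable, given the ±0.10°C confidence ranges, reduces to checking whether the two intervals overlap. A sketch using the figures quoted above:

```python
def intervals_overlap(a, b, half_width=0.10):
    """True if the +/- half_width confidence intervals around a and b overlap."""
    return abs(a - b) <= 2 * half_width

# November-to-December 2016 anomaly changes, in degrees C, as reported
changes = {"GISS": -0.12, "NCEI": 0.04, "HadCRUT": 0.07}

for x, y in [("GISS", "NCEI"), ("GISS", "HadCRUT"), ("NCEI", "HadCRUT")]:
    print(x, y, intervals_overlap(changes[x], changes[y]))  # all True
```

All three pairs overlap, consistent with the observation that the differences are not statistically significant, although the GISS/HadCRUT pair overlaps by only 0.01°C.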

The second decimal place in each of these reported anomalies is not the result of temperature measurement accuracy, but rather of numerical averaging of less accurate “adjusted” temperature estimates resulting from data “adjustment”. When the errors in the individual measurements are random, averaging many of them can justify expressing the calculated average to greater precision than the individual numbers. However, the nature of the factors which cause individual temperature measurements to be inaccurate, and thus require “adjustment”, suggests that the resulting errors are not random. Certainly, the “adjustments” made to the data are not random. Therefore, it is highly unlikely that reporting global average near-surface temperatures to greater precision than the underlying “adjusted” temperature data is appropriate.
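The distinction matters because averaging cancels only random error; a shared systematic offset survives any amount of averaging. A small simulation on synthetic data, purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
true_temp = 15.0  # hypothetical true value, in C
n = 10_000

# Random errors: the mean converges toward the true value as n grows.
random_readings = true_temp + rng.normal(0.0, 0.5, n)

# Systematic bias: averaging does not remove a shared 0.3 C offset.
biased_readings = true_temp + 0.3 + rng.normal(0.0, 0.5, n)

print(round(random_readings.mean() - true_temp, 2))  # near 0.0
print(round(biased_readings.mean() - true_temp, 2))  # near 0.3
```

With random errors the residual shrinks roughly as 1/√n, which is what would license extra precision in the average; the systematic offset is untouched.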

Tags: Temperature Record, Global Temperature

Hottest Year Time Again

The near-surface global temperature anomaly data for 2016 have been collected, selected, infilled, “adjusted” and analyzed. The results are in; and, again, they are anomalous. NASA GISS (Goddard Institute for Space Studies) and NOAA NCEI (National Centers for Environmental Information) both report the average anomaly for 2016 as 0.99°C. This represents an increase of 0.13°C for the NASA GISS anomaly compared with 2015, but an increase of 0.09°C for the NOAA NCEI anomaly compared with 2015. Both NASA GISS and NOAA NCEI place the confidence limits on their reported anomalies at +/- 0.10°C, approximately the same magnitude as the reported year-to-year global average anomaly change. Both NASA GISS and NOAA NCEI estimate that the influence of the 2015/2016 El Niño contributed 0.12°C to the increase in the reported anomaly for 2016: 0.01°C less than the global average anomaly increase reported by NASA GISS and 0.03°C more than the increase reported by NOAA NCEI. That is, essentially all of the 2016 global average temperature anomaly increase reported by both agencies was the result of the influence of the 2015/2016 El Niño, which was very similar in magnitude to the 1997/1998 El Niño; these are the two strongest El Niños in the instrumental temperature record. HadCRUT reported an average anomaly of 0.774°C, an increase of 0.14°C from the 2015 average anomaly, with similar estimated confidence limits and a similar El Niño contribution.

All of the near-surface temperature anomaly products reported dramatic drops in their anomalies beginning in the Spring of 2016, though these drops were from record high monthly peaks driven by the El Nino. The NASA GISS anomaly dropped from a high of 1.36°C to a December 2016 level of 0.81°C, a change of -0.55°C. The NOAA NCEI anomaly dropped from a high of 1.22°C to 0.79°C, a change of -0.43°C. The HadCRUT4 anomaly dropped from a high of 1.08°C to 0.59°C, a change of -0.49°C. This is a variance of 0.12°C between the near-surface temperature anomaly products, approximately equal to the magnitude of the reported 2016 anomaly increases and the estimated impact of the 2015/2016 El Nino, and half the confidence range claimed for the reported anomalies.

University of Alabama Huntsville (UAH) and Remote Sensing Systems (RSS) both reported that 2016 was 0.02°C warmer than 1998, which both sources still report as the previous warmest year in the satellite temperature record. Dr. Roy Spencer of UAH stated that the increase in the reported temperature anomaly between 1998 and 2016 would have had to be ~0.10°C to be statistically significant. The UAH anomaly dropped from a 2016 high of 0.83°C to a December 2016 level of 0.24°C, a change of -0.59°C. The RSS anomaly dropped from a 2016 high of 1.0°C to a December level of 0.23°C, a change of -0.77°C. Both the UAH and RSS anomalies show the dramatic impact of the 2015/2016 El Nino. Both anomalies suggest that the "Pause" has returned, since they show no statistically significant warming since at least 1998.

The question now is whether there will be a La Nina in 2017; and, if so, the extent to which it will further reduce the post El Nino anomalies.

Tags: Warmest, Global Temperature, Temperature Record

Highlighted Article: Climate Models for the Layman

  • 2/22/17 at 07:29 AM

Here is an excellent paper on climate models by Dr. Judith Curry and The Global Warming Policy Foundation (GWPF).

Climate Models for the Layman

"This report attempts to describe the debate surrounding GCMs to an educated but nontechnical audience."

Tags: Climate Models

Opening the Kimono

Steve Goreham, author of the book Climatism! Science, Common Sense, and the 21st Century's Hottest Topic, coined the term Climatism, which he defines as "the belief that man-made greenhouse gas emissions are destroying the Earth's climate". Arguably, the definition should also include the assertion "that man-made greenhouse gas emissions are destroying the Earth's climate", even absent belief in the assertion.

Ari Halperin manages the blog Climatism. Early in 2016, Halperin authored a guest essay on Watts Up With That entitled Who unleashed Climatism, in which he discusses the origins of climate alarmism at length. He concludes that: "Climatism is a foreign assault on America. The aggressor is not another nation-state, but an alliance of UN agencies and environmental NGOs."

Leo Goldstein has recently posted two guest essays on Watts Up With That, in which he coins two new terms: Climate Alarmism Governance (CAG); and, Climintern (Climatist International). Goldstein traces the history of CAG from the founding of the Climate Action Network (CAN) in 1989. CAN now has 1100+ members globally. CAN, the UN and numerous foundations provide CAG; and, are referred to collectively as the Climintern.

The Climintern refers to the analyses provided by Halperin and Goldstein as conspiracy theories. Others might refer to them as conspiracy exposés, or histories. The documentation they provide in these essays clearly establishes the nature and scope of CAG and the influence of the Climintern. Their analyses are well worth reading. They provide a historical record of how climatism has proceeded from its earliest days to today; and, of how it works to influence global and national climate policy through the UNFCCC and the UN IPCC.

“If it looks like a duck, swims like a duck, and quacks like a duck, then it probably is a duck.” (HT: James Whitcomb Riley)

Tags: Climate Skeptics

Another One Bites the Dust – Judith Curry Resigns

Dr. Judith Curry has resigned her position at Georgia Tech, in frustration over “the politics and propaganda that beset climate science”. Dr. Curry explained,

“the deeper reasons have to do with my growing disenchantment with universities, the academic field of climate science and scientists… I no longer know what to say to students and postdocs regarding how to navigate the CRAZINESS in the field of climate science. Research and other professional activities are professionally rewarded only if they are channeled in certain directions approved by a politicized academic establishment — funding, ease of getting your papers published, getting hired in prestigious positions, appointments to prestigious committees and boards, professional recognition, etc.”
“How young scientists are to navigate all this is beyond me, and it often becomes a battle of scientific integrity versus career suicide.”

Dr. Curry’s concerns regarding university climate science education do not bode well for the future of climate science over the next 30-40 years.

Dr. Curry follows Dr. Roger Pielke, Jr., who did not resign his faculty position at the University of Colorado, but has redirected his research efforts away from climate science.

Other climate scientists, including Dr. David Legates (formerly Delaware State Climatologist) and Dr. Noelle Metting (formerly US DOE), were terminated for failing to adhere to the political climate change narrative.


Dr. Wei-Hock Soon and Dr. Sallie Baliunas, both of the Harvard-Smithsonian Center for Astrophysics, have been under attack since 2003 for their work on the solar contribution to climate change. (Dr. Baliunas has since retired.)

The Climategate e-mails released in 2009 and 2010 exposed efforts on the part of members of the consensed climate science community to destroy the careers of other climate scientists, including Dr. Chris de Freitas, Dr. Christopher Landsea and Dr. Patrick Michaels. Fortunately, these efforts were unsuccessful.

The life of a non-consensed climate scientist is hardly a bed of roses.

Tags: Consensus

“Mann Overboard”

Much has been written this year about the 2015/2016 El Nino and about the apparent record global temperature anomalies. Professor Michael Mann of Penn State University was quick to offer his opinion that the El Nino contributed only about 0.1°C, or about 15%, to the 2015/2016 global average temperature anomaly increase. Others provided estimates ranging from 0.07°C to 0.2°C. The balance of the temperature anomaly increase was attributed to the continuing increase in atmospheric CO2 concentrations as the result of human fossil fuel combustion.

However, the 2015/2016 El Nino is now over; and, global temperature anomalies have dropped sharply: by approximately 0.4°C overall; and, by approximately 1.0°C over land only. The sea surface temperature anomalies are expected to decrease further, although more slowly, especially if a significant La Nina develops in 2017. The equatorial Pacific is in a weak La Nina condition at present, but La Nina conditions appear to be weakening.

Regardless, Mann and others who minimized the potential contribution of the 2015/2016 El Nino to the rapid global temperature anomaly increases in those years are now faced with explaining the large, rapid decreases in the global average anomalies following the end of the El Nino. It is difficult enough to explain rapid anomaly increases in association with slow increases in atmospheric CO2 concentrations; it is even more difficult to explain rapid anomaly decreases in association with slow increases in atmospheric CO2 concentrations.

Tags: Temperature Record, Global Temperature

“Just the facts, ma’am.”

“Politics is a battle of ideas; in the course of a healthy debate, we’ll prioritize different goals, and the different means of reaching them. But without some common baseline of facts; without a willingness to admit new information, and concede that your opponent is making a fair point, and that science and reason matter, we’ll keep talking past each other, making common ground and compromise impossible.”

– President Obama (Jan. 10, 2017)

Simple Definition of fact

  • : something that truly exists or happens : something that has actual existence
  • : a true piece of information

Source: Merriam-Webster's Learner's Dictionary


“Just the facts, ma’am.” #1

Some simple words are often used imprecisely. In discussions related to climate science, the simple word “fact” is a case in point. For example, a temperature measurement taken by an observer from a particular instrument, in a particular enclosure, at a particular location and at a particular time is frequently referred to as a “fact”. However, it is only a “fact”, as defined above, in those particular circumstances. It is not necessarily and not even likely “a true piece of information”, in the broader sense, since it is affected by those circumstances.

Temperature measurements taken “near-surface” are “selected” for inclusion in the temperature record; and, are then “adjusted” to account for the particular instrument, enclosure, location and time of observation. These “adjusted” measurements are not “something that truly exists or happens”, but rather an estimate of something that might “truly exist or happen”.

An “ideal” near-surface temperature measurement site is defined in the Climate Reference Network (CRN) Site Information Handbook as follows:

Class 1 – Flat and horizontal ground surrounded by a clear surface with a slope below 1/3 (<19°). Grass/low vegetation ground cover <10 centimeters high. Sensors located at least 100 meters from artificial heating or reflecting surfaces, such as buildings, concrete surfaces, and parking lots. Far from large bodies of water, except if it is representative of the area, and then located at least 100 meters away. No shading when the sun elevation >3 degrees.

Such a site is estimated to be able to produce a near-surface temperature measurement with an error of less than 1°C, assuming proper instrument selection and calibration, proper enclosure and timely reading. Such a measurement is a “fact”, subject to those limitations.

Climate science deals with these errors of “fact” regarding near-surface temperature measurements by using temperature anomalies: the differences between temperature measurements taken at a particular site and a reference average for that site. These anomalies are “facts” only if there have been no changes in any of the circumstances which affect the measurements; and, they cease to be facts if the measurements are “adjusted”, rendering them merely estimates.



“Just the facts, ma’am.” #2

Above I discussed the limitations on “facts”; and, the difference between facts and estimates related to individual temperature measurements, whether analyzed as discrete temperatures or temperature anomalies.

Once near-surface temperature measurements have been recorded, selected and “adjusted”, the next step in the process is to combine these selected, “adjusted” temperature estimates, expressed as anomalies from previous selected, “adjusted” temperature estimates, into an estimated global average near-surface temperature anomaly. While it might be argued that the errors in the recorded temperature measurements are random, it cannot be argued that the selection of the temperature measurements included in the global average calculation, or the “adjustments” made to those measurements, are random. There could be no rational explanation for making random “adjustments” to measurements.

The estimated global average surface temperature anomaly is reported to two decimal place “precision”; and, used to calculate decadal rates of temperature increase to three decimal place “precision”. This level of “precision” is highly questionable, bordering on ridiculous, considering the inaccuracy of the underlying temperature measurements. The underlying temperature measurements are estimated to be in absolute error by an average of more than 2°C in the US, where the stations have been surveyed and their siting compared to the US CRN1 siting requirements. The inaccuracy of the remaining global temperature measuring stations is assumed to be similar, though they have not been surveyed and their siting compared to the US CRN1 siting requirements.

Finally, the estimated “adjusted” global average temperature is reported to two decimal place “precision”. This estimate is reported as a “fact”, though the particular circumstances under which the estimate might have been a “fact” are ignored.



“Just the facts, ma’am.” #3

“Just the facts, ma’am.” (1 & 2) discussed “facts” in the context of individual near-surface temperature measurements and global average temperature anomaly calculations. The final step in the climate change analysis process is the creation of future climate change scenarios using climate models.

There are numerous climate models, none of which have been verified. Therefore, it cannot be said that there is a climate model which is a “fact”, in the sense that it accurately models “something that truly exists or happens”, rather than hypothesizes something which might happen if the model were accurate.

The climate models are run using a range of inputs for climate sensitivity and climate forcings, because there are no specific, verified values for the various sensitivities and forcings. Therefore, not only are the climate models not “facts” (“something that truly exists or happens”), the inputs which feed the models are not “facts” either, in the sense that they are “a true piece of information”. It is not even a “fact” that the actual climate sensitivity or actual climate forcings are within the ranges of the values used as inputs to the models.

Therefore, the modeled scenarios of future climate change (temperature change) are not “facts”, or arguably even based on facts. Rather, they are estimates, based on estimates, of the potential change in current estimates over some future period.

Based on the “facts”, as discussed in these commentaries, there is only a tenuous basis for concern about catastrophic anthropogenic climate change.

That’s “Just the facts, ma’am.” (HT: Sgt. Joe Friday, LAPD)

Tags: Temperature Record, Estimates as Facts, Global Temperature


One of the principal concerns raised regarding climate change is its potential effects on agriculture. There is continuing discussion that the potential combination of increased temperatures with drought or increased rainfall might result in reduced crop yields or crop failure in some or all of the traditional crop production regions. There is also continuing discussion of the perceived need to reduce meat consumption, so that grazing land could be converted to food production.

There is little discussion of the likelihood that production of these food crops would move to areas which have been too cold or had too short growing seasons in the past. There is also little recognition of the contribution of plant genetics to increased plant tolerance and yield.

However, in the face of all of this concern about food production, at least in the United States, the number one cash crop in ten US states is marijuana, as shown in the table below. Marijuana is among the top five cash crops in 39 of the 50 states; and, among the top ten cash crops in all but two states.

This is not to suggest that marijuana is a major crop in any of these states by volume or weight, or that it is crowding out production of other crops for human or animal consumption, including export. Rather, it is to suggest that a very high value has been placed on a crop which has no food value (even when baked into brownies), in the face of vocal concern about the adequacy of future food production.

Current efforts to legalize the consumption of marijuana for other than medicinal purposes will likely increase the demand for the product, increasing the productive acreage dedicated to its production, though it might also result in corresponding reductions in its commercial value. One has to question whether this agricultural product should have such a high priority.


Marijuana Rank as Cash Crop, by State

Alabama 1 Louisiana 6 Ohio 4
Alaska NA Maine 1 Oklahoma 3
Arizona 3 Maryland 5 Oregon 4
Arkansas 4 Massachusetts 2 Pennsylvania 5
California 1 Michigan 5 Rhode Island 1
Colorado 4 Minnesota 6 South Carolina 3
Connecticut 1 Mississippi 3 South Dakota 9
Delaware 3 Missouri 4 Tennessee 1
Florida 2 Montana 4 Texas 6
Georgia 3 Nebraska 9 Utah 2
Hawaii 1 Nevada 2 Vermont 2
Idaho 5 New Hampshire 2 Virginia 1
Illinois 4 New Jersey 7 Washington 5
Indiana 3 New Mexico 2 West Virginia 1
Iowa 4 New York 2 Wisconsin 6
Kansas 6 North Carolina 5 Wyoming 8
Kentucky 1 North Dakota ?    


Source: NORML (USDA data)

Tags: Agriculture

A Little Perspective

The angst-ridden, consensed climate science community is focused on an increase in global average near-surface temperature of approximately 0.7°C (1.3°F) per century, or a total increase of approximately 0.9°C (1.6°F) since 1880, according to NOAA.

To provide some perspective on the cause of this angst, I have selected Wichita, Kansas, a city located very near the geographic center of the contiguous 48 states of the US. The data source for this analysis is

The record high temperature in Wichita is 114°F. The record low is -22°F. That is a difference of 136°F between the record high and low temperatures over the same period in which NOAA reports a global average near-surface temperature increase of approximately 1.6°F.

The typical range between the daily high and low temperatures in Wichita is approximately 20°F throughout the year. Assuming that the transition from the daily low temperature to the daily high temperature occurs over a period of approximately 12 hours, the rate of diurnal temperature change in Wichita is approximately 1.7°F per hour, or approximately the same as the total change in global average near-surface temperature over the 136 years since 1880.

NOAA reports the global average near-surface temperature as approximately 57°F. Wichita average temperatures range from approximately 32°F in January to approximately 80°F in July, a range centered relatively close to the global average near-surface temperature. This seasonal transition represents a change in local average temperature of approximately 0.3°F per day, or approximately one fifth of the total reported change in global average near-surface temperature over the 136 years since 1880.

It is also interesting to compare the rates of temperature change. The approximately 0.3°F per day rate of local average seasonal temperature change in Wichita is approximately 10 thousand times the reported rate of global average near-surface temperature change over the 136 years since 1880. The approximately 1.7°F per hour rate of diurnal temperature change in Wichita is approximately 1.2 million times the reported rate of change of global average near-surface temperature over the same period.
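As a rough check, the rate comparisons above can be reproduced from the approximate figures quoted in the text (a back-of-the-envelope sketch, not measured data):

```python
# Approximate figures from the text
global_change_F = 1.6            # total global change since 1880, degrees F
years = 136

# Global rates, expressed per day and per hour
global_rate_per_day = global_change_F / (years * 365)
global_rate_per_hour = global_change_F / (years * 365 * 24)

# Wichita local rates
seasonal_rate_per_day = 0.3      # degrees F per day, winter-to-summer transition
diurnal_rate_per_hour = 20 / 12  # ~20 degree F daily swing over ~12 hours

seasonal_ratio = seasonal_rate_per_day / global_rate_per_day
diurnal_ratio = diurnal_rate_per_hour / global_rate_per_hour

print(round(seasonal_ratio))  # on the order of 10 thousand
print(round(diurnal_ratio))   # on the order of 1.2 million
```

The ratios come out at roughly 9,300 and 1.24 million, consistent with the "10 thousand" and "1.2 million" figures cited above.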

Similar analyses in other areas of the globe would produce similar, though not identical, results. Clearly, all life forms on earth experience far more rapid temperature changes on a daily and seasonal basis than the earth has experienced on a global basis over the past 136 years. Also, the global change has manifested predominantly as warmer nights and milder winters, rather than as increased maximum temperatures, thus reducing the stress imposed by the increase.

Tags: Global Temperature, Temperature Record

Personal Precautions

One of the idyllic images cherished by many environmentalists concerned about the climate is life “off the grid”, free of utilities, living off the land, minimizing their impact on the planet. However, reality frequently rears its ugly head, blurring the idyllic image.

I recently had the opportunity to spend several days visiting with friends who live on a quarter-section inholding (160 acres), completely surrounded by Bureau of Land Management land. They are not connected to the electric grid, to natural gas distribution, or to municipal or private water service. They do have radio-telephone service; and, satellite service for the internet and television.

They use both dual-axis tracking and fixed solar photovoltaic collectors to provide their electricity; and, they store excess electricity generated during the day in a battery bank to meet their needs when the sun isn’t shining. They also had, but have since removed, a wind turbine, which proved to be both inefficient and problematic. However, as a precaution, they also have a propane-powered generator, equipped with an automatic transfer switch to pick up the load when necessary.

They use solar thermal collectors to produce hot water, both for domestic use and as the heat source for the in-floor hydronic system, which provides the primary source of space heating for their home. However, as a precaution, they have both a propane-fueled instantaneous water heater and a propane-fueled furnace, as well as two wood stoves.

Their home is located in an area which receives relatively little rain and snow, so the availability of water is a prime concern. They collect their water from the roofs of their home and garages; and, store several thousand gallons of water in four large storage tanks. They also use composting toilets to reduce water consumption and avoid sanitary water (black water) disposal issues.

Their vehicles are all gasoline-fueled. Electric vehicles would require installation of additional solar or wind generation capacity; and, far greater useful vehicle range.

This is not to suggest that the idyllic life “off the grid” is not possible, but rather that it requires extensive and careful precautionary planning to assure continuous quality of life; and, technological evolution to “fill in the blanks”.

Tags: Backup Power
Search Older Blog Posts