In the Wake of the News

A Look Back: The Coming Ice Age

The Coming Ice Age - Harper's Magazine - Sept. 1958

A friend brought this to my attention today. It is almost 60 years old. (Note the name of the authoress of the article.)

Very interesting. I had not seen it previously. I don’t think it is inconsistent with other things I have read.

Note, though, that the previous warming and subsequent ice age occurred without a human-induced increase in atmospheric CO2; and, that there was no discernible human contribution to the emergence from the ice age. It is only in the past ~65 years that human CO2 emissions are thought to have had any influence; and, even then, the extent of that influence is neither discernible nor quantifiable.

It is research like this which causes me to question the assertions that man is the sole, or even the principal, cause of the recent warming. The very rapid increase in temperature in 2015, followed by the very rapid decrease in 2016, suggests that the principal cause of those events was not a slow increase in atmospheric CO2 concentrations.

Tags: History, Global Temperature

The Party’s Over – US Commitments at COP22

The 22nd annual global ecotourism conference, officially known as the UNFCCC Conference of the Parties 22, in Marrakech, Morocco has ended. William Shakespeare might have described the conference results as: “full of sound and fury, signifying nothing”.

However, COP22 presented an interesting study in political polarization, at both the national and international levels.

US President Obama, represented by US Secretary of State John Kerry, presented a new “commitment” to reduce US annual CO2 emissions by 80% by 2050, compared with 2005 emission levels. This “commitment” was almost certainly developed in anticipation of a Hillary Clinton presidency, which would have been expected to preserve, enhance and expand upon the Obama climate “legacy”. This “commitment” was presented to COP22 after it was obvious that there would be no Clinton presidency, but rather a Donald Trump presidency in combination with a Republican-controlled Congress.

There is little doubt that this “commitment” was presented, not only to enhance President Obama’s climate “legacy”, but also to attempt to embarrass President-elect Trump, who is not supportive of the previous US “commitment” to reduce US CO2 emissions by 26-28% by 2025, made at COP21 in Paris, France in 2015; and, would not be expected to be supportive of an even more far-reaching “commitment”. Both of these US “commitments” were made as “Executive Agreements”, on President Obama’s sole authority, rather than as treaty commitments ratified by a two-thirds vote of the US Senate. Therefore, both of these “commitments” are also subject to abrogation by executive action.

The Secretary General of the UN and the Chair of the UNFCCC both stated essentially that the movement toward clean energy committed to at COP21 was “unstoppable”, suggesting that President-elect Trump might somehow try to stop this movement. However, there is no indication of any intent on the part of President-elect Trump to stop such a global movement. Rather, the President-elect has merely indicated that he does not support the existing US “commitment”; and, almost certainly, would not support this additional “commitment”. There is no reason to believe that Trump’s lack of support would stop other nations from pursuing current and potential future “commitments” to reduce annual CO2 emissions.

President-elect Trump has stated that he opposes the Obama EPA “Clean Power Plan”, which effectively requires utilities to shutter many existing coal-fired power plants and replace their capacity with natural gas, solar, wind, biomass or other capacity as required. Trump has not suggested that he would interfere in utility decisions to replace existing coal-fired generating capacity, but merely that he was opposed to forcing those decisions by EPA regulation. The environmental community has been quick to label Trump as a “climate denier” or “climate change denier”, though he neither believes that the earth has no climate, nor that the earth’s climate has not changed and does not change.

Meanwhile, at COP22, as has been the case since COP15, the developing and undeveloped nations have continued the refrain: “Show me the money.” They demand “commitment” from the developed nations of $100 billion per year, apparently in perpetuity, to assist them in adapting to the adverse effects of global climate change, none of which have yet occurred. The current level of global funding toward this “commitment” is far less than the level demanded; and, is likely to remain so for the foreseeable future. President Obama has provided some funding, without congressional approval, though it appears unlikely that President-elect Trump will seek to expand that funding to the level demanded; and, even less likely that he would attempt to do so without congressional authorization.

Tags: United Nations, Clean Power Plan

An Engineer’s Observations

Several aspects of government-funded climate science appear both curious and disturbing. The current level of government funding of climate science is certainly adequate to support rigorous scientific investigation, data gathering and data analysis. Regrettably, it is not doing so uniformly, comprehensively and consistently. That situation cannot be allowed to persist if we are to significantly expand our understanding of the earth’s climate; and, of our potential impact on that climate.

The current concern regarding climate change is based on two principal factors: temperature change and sea level rise. Therefore, the two foundational focuses of climate research should be accurate temperature measurement and accurate measurement of the rate of sea level rise. However, it appears that unjustified precision in reporting results is given greater priority than accuracy of physical measurement.

Near-surface temperature anomaly calculations are produced by multiple agencies, all using subsets of the same suspect data but differing approaches to “adjusting” that data; and, in some cases, “infilling” missing data. Tropospheric temperature anomaly calculations are also produced by multiple agencies, using the same data from the same satellites. There is a complex physical relationship between the tropospheric temperatures and the near-surface temperatures, but there appears to be little effort to understand this relationship, though that understanding appears to be critical to a thorough understanding of earth’s climate.

Sea level rise is measured directly at numerous locations along the sea shore, as well as from satellites. The rate of sea level rise reported from the satellite observations is approximately twice the rate measured by the shore-based instruments. The satellites and the shore-based instruments are measuring the rate of sea level rise of the same oceans, though the satellites measure virtually the entire ocean surfaces, rather than just the levels at the shore.  It is important to know which, if either, of these measurements is correct.

Also, the concern for the future of climate change is based on numerous unverified models, which are used to generate potential future temperature scenarios based on uncertain input factors, which are then used in other unverified models to generate potential future extreme weather, crop failure, disease, habitat loss and species extinction scenarios. Climate science would do well to focus on verifying one model with one set of accurate input conditions; and, then determining its predictive abilities.

Tags: Sea Level Rise, Sea Level Change, Temperature Record, Satellites

Tough Love - Open Letter to Trump Transition Team

Open letter to the President-elect Trump Transition Team – Climate Change

Ladies and Gentlemen:

Serious research focused on understanding the climate and the forces which cause it to change is worthwhile and important. However, it is long past time to apply a strong dose of “tough love” to US climate change research.

The United States is currently spending approximately $2.5 billion per year on climate change research, out of a total climate change related budget of approximately $20 billion per year. Much of this climate change research budget is being expended on duplicative and/or speculative activities, rather than on resolving several fundamental issues involving climate change. The US can hardly afford to waste federal research funding while ignoring these fundamental issues.

This letter addresses four fundamental issues of climate change research:  (i) data collection and analysis; (ii) understanding relationships and resolving differences between surface and satellite sources; (iii) determining the correct values of climate sensitivity and climate forcing factors used as inputs to climate models; and (iv) identifying or developing and then verifying a single climate model which actually models the global climate. It seems incredible that these fundamental issues have not been resolved, if the “science is settled”.


(i) Data Collection and Analysis

The instrumental global temperature data which underlies the concerns regarding global warming and global climate change are collected from near-surface temperature sensors, sea surface floating buoys, ships “passing in the night”, balloon-borne radiosondes and satellites. The near-surface temperatures are collected, aggregated, “adjusted” and analyzed by numerous government agencies around the globe. These agencies each produce monthly anomaly calculations, which differ among themselves as the result of differing selection, “adjustment” and analysis protocols. The satellite and radiosonde temperatures are analyzed by two organizations (UAH and RSS), which produce monthly anomaly calculations, which also differ one from the other. The US also operates a network of 114 state-of-the-art near-surface measuring stations: the Climate Reference Network. However, the data from the CRN is not included in the collection and analysis of the temperatures from other sources, though it is not clear why that is the case.

There is no need for continuing analysis of near-surface temperatures by multiple agencies. However, the reasons for the differences among the several analyses must be understood and resolved before any single agency is tasked with continuing the near-surface temperature analysis effort, if that effort is to be continued. The quality of the temperature data collected, aggregated, “adjusted” and analyzed by NCEI, NASA GISS and The Hadley Center is significantly lower than the quality of the data from the US CRN. However, rather than improve the quality of the temperature data, the agencies “adjust” the data, producing estimates of what the data should have been.

Satellite temperature data are far more comprehensive than the near-surface temperature data. The satellite temperature data and radiosonde data are also used to produce two different and differing monthly temperature anomaly products. Again, there is no need for continuing analysis of this data by multiple organizations. However, the reasons for the differences between the analyses must be understood and resolved before any single organization is tasked with continuing the satellite temperature analysis effort.

The recent surface sea level rise measurements, taken both with tide gauges in contact with the sea surface and microwave radar systems mounted above the sea surface near the shore, show a relatively stable rate of sea level rise over a period of approximately 200 years. Recent satellite-based sea level rise measurements, taken over a period of approximately 23 years, also show a relatively stable rate of sea level rise, though at about twice the rate measured by the surface-based sensors. Again, the satellite measurements are far more comprehensive than the land-based measurements, though they might not be more accurate. 
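The rate comparison described above can be sketched numerically. The snippet below is a hypothetical illustration, not an analysis of any agency’s actual data: the “rate of sea level rise” each group reports is essentially the slope of a linear fit to its time series of sea level measurements, and the illustrative rates used here (roughly 1.7 mm/yr for tide gauges, roughly twice that for satellite altimetry) are assumed for the sake of the example.

```python
import numpy as np

# Hypothetical, idealized series (no noise) over the satellite altimetry era.
# The quoted "rates" are the slopes of linear fits to series like these.
years = np.arange(1993, 2016)
gauge_level = 1.7 * (years - years[0])      # mm, assumed tide-gauge trend
satellite_level = 3.3 * (years - years[0])  # mm, assumed altimetry trend

# Fit a first-degree polynomial; the leading coefficient is the rate in mm/yr.
gauge_rate = np.polyfit(years, gauge_level, 1)[0]
satellite_rate = np.polyfit(years, satellite_level, 1)[0]

print(round(gauge_rate, 2), round(satellite_rate, 2))  # 1.7 3.3
print(round(satellite_rate / gauge_rate, 2))           # ~2x discrepancy
```

With real, noisy records the fitted slopes carry uncertainty, which is one reason reconciling the two measurement systems matters before either is relied upon exclusively.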

(ii) Understanding Relationships and Resolving Differences

There are significant differences between the near-surface temperature anomalies and the satellite temperature anomalies. The reasons for these differences must be clearly understood, should the decision be made to abandon the near-surface temperature anomaly products in favor of the far more comprehensive satellite measurements.

There are significant differences between the surface sea level rise data and the satellite sea level rise data. The reasons for these differences must also be clearly understood, should the decision be made to abandon the land-based sea surface measurements in favor of the satellite measurements.

(iii) Determining the Correct Values of Climate Sensitivities and Climate Forcing Factors

The scenarios for future climate produced by the climate models are driven by data on the rate of increase of global atmospheric carbon dioxide and other “greenhouse gas” concentrations and assumptions regarding the sensitivity of the climate to these increases. The climate models are also driven by assumptions regarding several other climate forcing factors. These sensitivities and forcing factors are not well understood, so climate modelers use a range of values in their model runs. The result is a range of potential future scenarios. It is not known whether any one of these scenarios is correct, or even if the actual future scenario falls within the range of scenarios output by the models.
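How a range of assumed sensitivities produces a range of scenarios can be shown with a toy calculation. This is a deliberately simplified sketch, not any modeling group’s method: it uses the common approximation that equilibrium warming scales with the sensitivity per CO2 doubling, and the CO2 concentrations and sensitivity values are illustrative assumptions.

```python
import math

# Toy approximation: dT = S * log2(C / C0), where S is the assumed
# equilibrium sensitivity (degC per CO2 doubling).
C0 = 280.0  # ppm, assumed pre-industrial concentration
C = 560.0   # ppm, a doubling of CO2

# An illustrative range of sensitivity assumptions (degC per doubling).
sensitivities = [1.5, 3.0, 4.5]

# The same CO2 trajectory, run through different assumed sensitivities,
# yields a spread of projected warming values rather than one answer.
scenarios = [s * math.log2(C / C0) for s in sensitivities]
print(scenarios)  # [1.5, 3.0, 4.5]
```

Since log2(560/280) = 1, each scenario here simply equals its assumed sensitivity; the point is that the spread in outputs is driven entirely by the spread in the assumed input, which is the situation the paragraph above describes.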

Developing an accurate understanding of how the climate responds to human influences on the atmosphere requires determination of the actual climate sensitivity and the actual magnitude and direction of the forcing factors. This is a fundamental issue.

(iv) Verify a Single Climate Model Which Actually Models the Global Climate

There is currently no climate model which has been verified to accurately and comprehensively model the earth’s climate. Therefore, there is no climate model which can reasonably be expected to predict the future responses of the global climate. As a result, all of the climate research studies which are being used to create scenarios of various types of potential future climate catastrophes are highly speculative. These highly speculative studies are consuming significant climate research resources, to no demonstrably useful scientific purpose. Those resources could be used instead to improve the climate models; and, ultimately, to verify a single climate model.


It is clear that the science is hardly settled, since at least the above four fundamental issues remain unresolved. However, it appears that the practical politics have been settled, until very recently, largely in line with H. L. Mencken’s perception.

“The whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary.” H. L. Mencken

The national and international political class which has been driving and funding the climate change issue is desperately in need of some tough love and the imposition of some priorities to address fundamental issues, rather than continuing to fund the production of “hobgoblins”.

Tags: Politics, Policy, Satellites, Temperature Record, Donald Trump

Ultimate Goal

The movement toward the ultimate goal of a global vegan commune continues apace.

Global Governance:

Zero GHG Emissions:

Global Veganism:

Wealth and Income Redistribution:

Population Control:


The Adjustocene

The earth is currently experiencing the Holocene, the period of 11,700 years following the last major ice age. Some in the climate science community have begun referring to the most recent years of this period as the Anthropocene, suggesting that human activity is responsible for much of the change which has occurred since the Industrial Revolution, or since the Little Ice Age.

NASA has recently published a study suggesting that errors in observational data in the early years of the instrumental temperature record caused approximately 20% of the global warming which has occurred over that period to be “missed” due to quirks in the measurements. They have concluded that it is necessary to “adjust” the observations to correct these quirks; and, that these “adjustments” bring the observations more in line with the scenarios produced by the climate models. These “adjustments” are in addition to the adjustments which had already been made by NCEI, NASA and Hadley Center/UEA CRU in the process of constructing the temperature anomaly products, both currently and retrospectively. The combination of all of these “adjustments” led one wag to rename this period the “Adjustocene”.

Two quotations regarding this issue bear repeating here, one serious and the other in jest:

       --“It doesn't matter how beautiful your theory is, it doesn't matter how smart you are. If it doesn't agree with experiment, it's wrong.”  Richard P. Feynman

       --“95% of Climate Models Agree: The Observations Must Be Wrong.” Roy Spencer

NASA’s recent study seems to suggest that NASA has ignored Feynman and taken Spencer seriously.

Fundamentally, the temperature data are not “adjusted” because of their superb quality, accuracy or precision. Rather, they are adjusted because they lack those attributes. However, once adjusted, they are merely estimates of what the data might have been, had they been collected timely from properly selected, calibrated, sited, installed and maintained instruments. What the climate science community prefers to refer to as “datasets” are therefore, in fact, estimate sets. The climate science community acknowledges that the data are inaccurate, but insists that the estimates are both accurate and precise.

Tags: Temperature Record

Anomalous Anomalies

NASA GISS and NCEI have released their June temperature anomalies; and, they are anomalous. GISS and NCEI both have access to all of the same near-surface temperature data; and, both use the same sea surface temperature anomaly product, ERSSTv4, developed by NCEI and frequently referred to as the “pause buster” reconstruction. Therefore, any difference between the NASA GISS and NCEI land plus ocean anomalies must be based on the near-surface temperatures.

In June, the GISS temperature anomaly declined by 0.14°C to 0.79°C, a decline of more than 0.5°C since its peak during the 2015/2016 El Nino in February 2016. However, the NCEI temperature anomaly increased by 0.02°C to 0.90°C, though still a decline of more than 0.3°C since its El Nino peak. Therefore, the two anomaly changes for the month of June vary by 0.16°C, or more than 20% of the residual GISS anomaly. Also, the declines in the two anomalies since the 2016 El Nino peak vary by 0.2°C, or approximately 40% of the decline in the GISS anomaly. These are very large differences for anomalies produced from the same dataset.

These differences highlight the significance of the subsets of the monthly near-surface temperature data selected for “adjustment”; and, the significance of the “adjustments” made to the data by the various producers of the near-surface temperature anomaly products.

The near-surface temperature anomaly product from the Hadley Center and the University of East Anglia Climate Research Unit (HadCRUT4) is not yet available for June, 2016. HadCRUT4 had decreased by approximately 0.4°C from its El Nino peak through May 2016, to 0.68°C.

The tropospheric temperature anomalies produced by the University of Alabama – Huntsville and by Remote Sensing Systems are produced from the data collected by the same satellite-based instruments. The changes in these anomalies are also notably anomalous for June, 2016. The UAH anomaly decreased by 0.21°C, to +0.34°C. The RSS anomaly decreased by 0.06°C, to +0.47°C. Therefore, the difference in the calculated anomaly changes between UAH and RSS is almost as large as the difference between the GISS and NCEI anomalies for June, 2016.
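The arithmetic behind these comparisons is simple enough to check directly. The figures below are the ones quoted above for June, 2016; the calculation merely confirms the gaps between products built from the same underlying data.

```python
# June 2016 monthly changes and resulting anomaly values, as quoted above.
giss_change, giss_value = -0.14, 0.79   # degC
ncei_change, ncei_value = +0.02, 0.90   # degC

# Gap between the two near-surface products' monthly changes.
surface_gap = ncei_change - giss_change          # 0.16 degC
share_of_giss = surface_gap / giss_value         # > 20% of residual anomaly

# Gap between the two tropospheric (satellite) products' monthly changes.
uah_change, rss_change = -0.21, -0.06            # degC
satellite_gap = rss_change - uah_change          # 0.15 degC

print(round(surface_gap, 2), round(share_of_giss, 2), round(satellite_gap, 2))
```

The 0.15°C satellite-product gap is indeed almost as large as the 0.16°C gap between GISS and NCEI, despite each pair being derived from a common data source.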

These anomalous anomalies suggest that reporting global temperature anomalies to two decimal place precision represents either inaccurate precision or precise inaccuracy.

Tags: Temperature Record, Global Temperature

Narratives and Coincidences

Global temperatures began rising in 2014, leading ultimately to the proclamation of 2014 and later 2015 as the warmest years in the instrumental record. The narrative propounded by the consensed climate science community was that this temperature increase was a continuation of the CO2-driven warming documented in the global temperature anomalies, particularly since approximately 1950. There was acknowledgement that there was also an El Nino underway, though its importance in the observed warming was largely ignored or downplayed in the ongoing narrative, which included projections that 2016 would likely displace 2015 as the warmest year in the instrumental record.

However, global temperatures began falling in February, 2016; and, falling very rapidly in May and June, 2016. The temperature trend is moving toward a moderate to strong La Nina. This temperature pattern conflicts with the narrative attribution to continuation of the CO2-driven warming. Rather, it would seem to suggest that the primary driver of the 2014-2015 rapid warming was the strong El Nino, since there has been no accompanying rapid reduction in atmospheric CO2 concentration.

Coincidentally (?), one of the most recognized names in climate science, Dr. Michael Mann, recently attempted to deflect attention from this deviation from the narrative by declaring to a hearing of the Democratic Platform Drafting Committee that: “What is disconcerting to me and so many of my colleagues is that these tools that we’ve spent years developing increasingly are unnecessary because we can see climate change, the impacts of climate change, now, playing out in real time, on our television screens, in the 24-hour news cycle.”

Mann might more accurately have said that we see extreme weather events being played up in real time in the media; and, attributed, without any scientific basis, to climate change. This is particularly true since the actual frequency and severity of “extreme weather events” has been stable or declining in recent decades.

I am reminded of a quote from Carl Sandburg: “If the facts are against you, argue the law. If the law is against you, argue the facts. If the law and the facts are against you, pound the table and yell like hell”. Dr. Mann’s Twitter history would suggest that he is a screamer.

Tags: La Nina, Michael Mann

Not So Fast!

In my commentary from September 6, 2016 “Climate Deniers”, I discussed some of the numerous efforts currently underway to silence and punish “Climate Change Deniers” (catastrophic anthropogenic climate change skeptics) and those who fund “Climate Change Denial” (big energy). Interestingly, some of the targets of these efforts have begun to fight back against what they see as efforts to deny them their First Amendment rights under the US Constitution; and, as harassment, intended to impede their research and/or impose substantial legal costs to defend themselves. Others, regrettably, have chosen to abandon the field to the harassers.

The first target to throw down the gauntlet was Canadian climatologist Dr. Tim Ball, followed closely by Canadian-born author and commentator Mark Steyn; both were sued for defamation by Pennsylvania State University Professor Michael Mann after commenting that Mann’s “Hockey Stick” was “fraudulent”. Ball and Steyn have each since countersued Mann for $10 million. Mann continues to resist discovery in his suit against Steyn, as he did in his suit against Ball and in the suit filed by then Commonwealth of Virginia Attorney General Kenneth Cuccinelli. Steyn has been far more aggressive in his response to Mann than either National Review or the Competitive Enterprise Institute, both of which were co-defendants in the Mann suit against Steyn.

The Competitive Enterprise Institute has been more aggressive in response both to the letter from the “RICO 20” to President Obama, Attorney General Loretta Lynch and Presidential Science Advisor John Holdren requesting prosecution of “Climate Change Deniers” under the federal Racketeer Influenced Corrupt Organizations law; and, to US Virgin Island Attorney General Walker’s subpoena for 10 years of documents related to CEI’s climate change policy. CEI recently succeeded in an FOIA request for internal e-mails among the “RICO 20”, which reveal discussions among the “RICO 20”, several environmental groups which receive federal funding and the “Green 20” group of Democrat state Attorneys General. Walker has since abandoned his subpoena, though he has threatened the possibility of future action. CEI has asked the DC Superior Court to fine US Virgin Islands AG Walker for his attempt to undermine CEI’s First Amendment rights.

CEI also sponsored a full page ad in the New York Times on May 18th to call attention to the efforts by New York Attorney General Schneiderman, US Virgin Islands Attorney General Walker and the other “AGs United for Clean Power” (“Green 20”) to stifle skeptical research and commentary.

The Republican members of the US House of Representatives Committee on Science, Space and Technology have sent a letter to the 17 Democrat Attorneys General and 8 environmental activist organizations requesting documents related to their coordinated efforts to deprive corporations and individuals of their First Amendment rights and interfere in their ability to fund and conduct climate research and comment on the climate research of others.

Jagadish Shukla, a professor at George Mason University, President of the government co-funded Institute of Global Environment and Society (IGES) and the principal author of the “RICO 20” letter, is currently being investigated by both George Mason University and the US House of Representatives Committee on Science, Space and Technology for “double dipping”: working “full time” for both the university and IGES.

Texas Attorney General Ken Paxton is calling for an end to the US Virgin Island probe of ExxonMobil regarding the alleged suppression of internal research regarding climate change.

Professor Wei-Hock (Willie) Soon of the Harvard-Smithsonian Center for Astrophysics has been cleared of conflict of interest charges regarding his research on the influence of the sun on climate. Soon had been supported by a petition signed by 500 scientists, colleagues and friends submitted to the Smithsonian.

Regrettably, as a result of the efforts by Representative Raul Grijalva (D, AZ) to harass skeptical climate researchers, Professor Roger Pielke, Jr. of the University of Colorado has decided to redirect his efforts to research in areas other than climate change.

All of these efforts have been far more public than the efforts revealed by the Climategate e-mails to destroy the careers of Patrick Michaels, Chris Landsea and Chris de Freitas; and, the subsequent effort to remove Delaware State Climatologist David Legates.

Tags: RICO, Mann v Steyn, Silencing the Skeptics

“Enquiring minds …”

“Enquiring minds want to know.”

Why, if “the science is settled”, is it still necessary to fund ongoing climate science research at the same funding levels which prevailed before the science was settled?

Why, if “the science is settled”, are there multiple groups producing global near-surface temperature anomalies, using different subsets of the available data, differing data “adjustment” approaches, differing approaches to missing data and producing differing results?

Why, if “the science is settled”, are there multiple groups producing global sea surface temperature anomaly products, using differing subsets of the available data, differing “adjustment” approaches, differing approaches to missing data and producing differing results?

Why, if “the science is settled”, are there multiple climate models producing differing potential future scenarios using the same input from “settled science”?

Why, if “the science is settled”, are the individual climate models being run with differing climate sensitivity, climate feedback and climate forcing assumptions producing differing potential future scenarios?

Why, if “the science is settled”, are virtually all of the climate models, fed with the full range of differing climate sensitivity, climate feedback and climate forcing assumptions, producing scenarios which exceed the “adjusted” temperature anomalies produced by the several producers of near-surface temperature anomaly products?

Why, if “the science is settled” and the warming effects of increasing concentrations of atmospheric gases such as CO2 and CH4 appear first in the atmosphere, specifically in the tropical tropospheric “hot spot”, are the satellites measuring tropospheric temperature anomalies unable to find the tropical tropospheric “hot spot”?

Why, if “the science is settled”, are there multiple groups analyzing the satellite temperature data and producing differing results?

Why, if “the science is settled” and satellite sea surface temperature data are comprehensive for the liquid oceans and the satellite sea surface temperature measurements are regularly calibrated against readings from purpose-built floating buoys, are climate scientists still “adjusting” sea surface temperature measurements relative to readings taken by “ships passing in the night”, which are known to be fraught with error?

Why, if “the science is settled”, are the rates of sea level rise reported based on the satellite measurements approximately twice the rates measured by ground-mounted instruments?

Why, if “the science is settled” and global warming is believed to be causing increased rates of sea level rise, is the rate of sea level rise not increasing?

Why, if “the science is settled” and global warming is believed to be causing droughts and crop failures, does the satellite data show that the globe is greening, largely as the result of the effects of increased atmospheric CO2 concentrations?

Why, if “the science is settled” and global warming is believed to be causing increased extreme weather frequency and intensity, does the observational record not show these effects?

Why, if “the science is settled”, do I find all of these inconsistencies so unsettling?

Tags: Settled Science

Satellites and Climate

The near-surface instrumental temperature record began with the Central England Temperature (CET) record in 1659, which roughly coincided with the trough of the Little Ice Age. The application of temperature measuring instruments spread over the following 200 years, leading to the collection of a “global” temperature record. However, even today, this “global” temperature record involves sparse and uneven coverage in much of the southern hemisphere and over the global oceans. The satellite era has offered the opportunity to greatly expand the scope and coverage of climate observation.

Satellites were first used to measure the temperature of the earth’s troposphere beginning in 1979. The satellites are equipped with carefully calibrated platinum resistance temperature devices (RTDs). Their calibration is confirmed regularly by comparing the sensor readings with readings taken from balloon-borne radiosondes. The original Microwave Sounding Units (MSUs) have been replaced by more advanced units as newer satellites have been developed and launched. While these satellite-borne instruments do not measure near-surface temperatures, they do measure the atmosphere in which infrared absorption by atmospheric gases and vapors occurs. The atmospheric response to absorption and reflection of incoming and outgoing infrared radiation is the primary influence on the near-surface temperature of the earth.

Satellites were first used to measure sea surface temperatures beginning in the 1980s. Prior to the satellite era, sea surface temperatures were measured by near-shore sensors, ship-borne sensors and floating buoys. While these earlier measurement approaches are still in use, the satellites provide far greater and more consistent coverage. The satellites use both infrared and microwave radiometers to measure sea surface temperatures. The satellites’ sensor readings are regularly compared to readings taken by selected buoy-borne sensors. More recently, the ARGO buoys have made it possible to measure ocean temperatures at depth, providing additional information unavailable from any of the previous ocean temperature measurement approaches.

Satellites have also been used, beginning in the 1990s, to measure sea level rise over the entire surface of the global oceans. Until the satellite era, sea level measurement was limited to shore-based measuring stations (tide gauges), some of which have been measuring sea level since the late 1800s. The satellite measurements of sea level rise are approximately twice the measurements reported by tide gauges. There is currently no explanation for this discrepancy.
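As a rough illustration of how such rates are derived, the sketch below fits an ordinary least-squares trend to two hypothetical sea level series. The series and the ~1.7 mm/yr and ~3.3 mm/yr rates are assumptions for illustration, roughly in line with figures commonly cited for tide gauges and satellite altimetry, not actual measurements.

```python
# Illustrative sketch only: hypothetical annual mean sea level series (mm),
# constructed to show how a linear rate of rise is estimated from each
# record type. The rates are assumptions, not measured values.

def linear_rate(years, levels):
    """Ordinary least-squares slope: mm of sea level change per year."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(levels) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, levels))
    var = sum((x - mean_x) ** 2 for x in years)
    return cov / var

years = list(range(1993, 2016))
gauge = [1.7 * (y - 1993) for y in years]      # assumed ~1.7 mm/yr trend
satellite = [3.3 * (y - 1993) for y in years]  # assumed ~3.3 mm/yr trend

print(round(linear_rate(years, gauge), 2))      # 1.7
print(round(linear_rate(years, satellite), 2))  # 3.3
```

In practice both record types are noisy, so the fitted slopes carry uncertainty ranges; the point here is only that the two measurement systems yield trends differing by roughly a factor of two.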

More recently, satellites are being used to measure the greening of the earth. A recent report by NASA states that the earth has been “greening” over the last 35 years, largely as the result of the increased atmospheric CO2 concentration. The increased CO2 not only increases plant growth, but also increases plant water use efficiency.  This result confirms the results of previous studies of more limited regions.

Finally, satellites are now being used to track the emissions and dispersion of CO2 in the atmosphere. These CO2 visualizations clearly demonstrate that, while CO2 might eventually become a “well mixed trace gas”, it is initially concentrated downwind of its emissions sources, primarily in the northern hemisphere.

The enthusiasm of the climate science community for the far more comprehensive data available from the satellites has been predictably uneven. The climate science community has remained focused on the gap-laden, error-prone, multiply-“adjusted”, UHI-corrupted near surface temperature anomaly records while virtually ignoring the satellite temperature anomaly records, which have recorded more limited anomalies. The climate science community has also focused on the buoy and ship-collected sea surface temperature anomalies, to the virtual exclusion of the satellite SST anomaly records.

On the other hand, the climate science community appears to prefer the satellite sea level rise data, which shows rates of sea level rise approximately double the rates measured by traditional tidal gauges. These satellite sea level rise data are more consistent with the sea level rise concerns put forward by the UNFCCC with support from the climate science community. However, even the satellite sea level rise records do not support the “increasing rate of sea level rise” meme put forward by the IPCC.

The climate science community has yet to provide any strong indication of its acceptance of the satellite-measured greening of the planet. Certainly, all other things being equal, a general “greening” of the planet would be expected as the result of increased atmospheric CO2. However, the climate catastrophe meme would suggest that the CO2 benefits would be offset by drought or flooding.

Similarly, the climate science community has yet to provide any strong indication of its reaction to the CO2 concentration visualizations provided by the NASA Orbiting Carbon Observatory.

The satellite era has provided multiple tools for collecting comprehensive global data with potential application to the analysis of climate and climate change. It is crucial that this data be collected diligently, analyzed objectively and reported openly, to maximize the advancement of climate science.

Science must inform policy, rather than policy leading science.

Tags: Temperature Record, Satellites

Perspective is Important

The graph below displays the entire history of the satellite temperature record, as analyzed by Drs. John Christy and Roy Spencer at the University of Alabama in Huntsville on behalf of NASA.

UAH Satellite-Based Temperature

The graph clearly shows the very strong El Nino events in 1997-1998 and 2015-2016, as well as the somewhat weaker El Nino event in 2009-2010. The graph suggests that the current El Nino has peaked; and, that temperatures will likely fall over the next several months. There is no reliable way to predict how far temperatures will fall. However, the transition toward La Nina conditions appears to have begun, with NOAA forecasting a moderate to strong La Nina, beginning in the late Spring to early Summer of 2016.

The current El Nino has, at least temporarily, halted the temperature “hiatus” or “pause” experienced over the past ~18 years. Discussions of the global average near-surface temperature increases provided by NCEI, NASA GISS and the Hadley Centre have focused almost exclusively on the “warmest year ever” narrative, largely to the exclusion of discussion of the impact of the major El Nino event on those temperatures. However, as can be seen in the above graph, the temperature increases associated with the 1997-1998 and 2009-2010 El Ninos were essentially sharp spikes of relatively short duration, after which temperatures returned to very nearly the levels reported before the onset of the El Ninos. The anomaly increase in response to the current El Nino has followed the same characteristic sharp spike pattern; and, the fall toward the transition appears to be following the same pattern as well.
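The sharp-spike-and-return pattern described above can be characterized numerically by comparing pre-event, peak and post-event anomalies. The series below is stylized and invented for illustration; it is not UAH data.

```python
# Illustrative sketch, not real data: a stylized monthly temperature anomaly
# series (deg C) showing the sharp-spike-and-return pattern described for
# major El Nino events. All values are invented.

baseline = [0.1] * 12                            # pre-event anomalies
spike    = [0.3, 0.5, 0.7, 0.8, 0.7, 0.5, 0.3]   # El Nino spike
recovery = [0.15, 0.1, 0.1]                      # return toward pre-event level

series = baseline + spike + recovery

pre_event_mean  = sum(baseline) / len(baseline)
peak            = max(series)
post_event_mean = sum(recovery) / len(recovery)

print(round(peak - pre_event_mean, 2))             # spike amplitude: 0.7
print(round(post_event_mean - pre_event_mean, 2))  # residual after event: 0.02
```

A small post-event residual relative to the spike amplitude is what distinguishes a transient El Nino spike from a lasting step change in the anomaly record.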

The graph below displays the El Nino to La Nina transitions as reflected in sea surface temperature anomalies in the eastern Pacific. Note that SSTs for both the 1997-1998 and the 2015-2016 El Nino events appear to have peaked in November of the year in which they developed. The 1997-1998 El Nino transitioned to La Nina conditions in June; and, the La Nina reached its negative peak in December of 1998. The current El Nino has, so far, followed a very similar pattern. The pace and extent of the decline in the SST anomaly will determine whether 2016 will replace 2015 as “the warmest year ever”.

Nino 3.4 Sea Surface Temperature Anomalies


The Conscience of a Skeptic

One of the principal foundations of science is measurement – the ability to determine what has happened or what is happening. The accuracy required of the measurement is frequently determined by the technical or economic significance of the phenomenon being measured. The precision required of the measurement is frequently determined by the magnitude or volatility of the phenomenon being measured. The reliability of the measuring system becomes a critical factor when the phenomenon being measured cannot be reproduced; or, can be reproduced only at great expense or over a long period of time.
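The distinction between accuracy (closeness to the true value) and precision (spread of repeated readings) drawn above can be illustrated with two simulated sensors. The “true” temperature, bias and scatter values are assumptions chosen purely for illustration.

```python
import random
import statistics

# Illustrative sketch: accuracy vs. precision. Sensor characteristics
# below are invented assumptions, not real instrument specifications.

random.seed(42)
TRUE_TEMP = 15.0  # assumed "true" value (deg C)

# Sensor A: accurate but imprecise (no bias, large scatter)
sensor_a = [TRUE_TEMP + random.gauss(0.0, 0.5) for _ in range(1000)]
# Sensor B: precise but inaccurate (constant bias, small scatter)
sensor_b = [TRUE_TEMP + 0.8 + random.gauss(0.0, 0.05) for _ in range(1000)]

bias_a = statistics.mean(sensor_a) - TRUE_TEMP
bias_b = statistics.mean(sensor_b) - TRUE_TEMP

print(abs(bias_a) < abs(bias_b))                                # A is more accurate
print(statistics.stdev(sensor_b) < statistics.stdev(sensor_a))  # B is more precise
```

Averaging many readings can reduce the scatter of Sensor A but cannot remove the bias of Sensor B, which is why calibration against a trusted reference matters as much as instrument precision.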

Climate science involves the study of a number of phenomena which are of potentially great technical and economic significance, occur over relatively long periods of time, are of relatively limited magnitude and cannot be reproduced. Therefore, climate science would appear to be a field in which measurement accuracy, precision and reliability are all critical to an accurate understanding of the phenomena being studied. It is difficult not to be skeptical of a field of science in which measurement accuracy, precision and reliability are not taken seriously, despite the potential impacts of the phenomena being studied.

Global governments are spending billions of dollars each year on climate research, yet there is still no global network of accurate and reliable near-surface temperature measuring stations or sea surface temperature measuring stations capable of providing accurate and reliable measurement of the temperature change referred to as global warming. The US government has been operating a network of more than 100 highly accurate and reliable near-surface temperature measuring stations (the US Climate Reference Network) for more than 10 years. However, this network has yet to be replicated globally. The state-of-the-art floats and buoys used for sea surface and sub-surface temperature measurement are very limited in number; and, are not uniformly distributed to assure proper global coverage. Continued reliance on ship engine cooling water inlet temperature measurements and temperatures measured in buckets of sea water lifted to the ships’ decks is hardly consistent with the rigorous pursuit of important science.

The satellite era has brought new measurement technologies for temperature measurement, sea level measurement, land and sea ice measurement and other factors of interest to climate science. These satellite-based measurements offer nearly global coverage. However, the focus of the climate science community remains on the surface and near-surface measurements, despite their documented inaccuracies and non-uniform coverage. The exception is the measurement of sea level change, for which the satellite-based measurements are the primary focus, despite the very large and unexplained differences in the rate of sea level rise measured by the satellite-based systems compared to the ongoing measurements provided by coastal measuring stations, some of which have been providing measurements for more than 100 years.

The conscience of this skeptical engineer cannot condone basing a vast scientific “consensus”, with worldwide implications for the lives and livelihoods of billions of people, on half-vast measurements.

Tags: Temperature Record

“Climate Deniers”

Punishing “Climate Change Deniers” – Legality Be Damned

There is a growing clamor in some circles to do something about “Climate Deniers”. The clamor arguably began more than 10 years ago, as documented here. However, it has reached a “fever pitch” in the United States over the past year. Repeated searches have not uncovered any demands for “stoning”, or “burning at the stake”, or “drawing and quartering”, or “crucifixion”. However, it is still early days.

The focus of those experiencing this “fever” is not actually “climate denial”, or “climate change denial”, or “anthropogenic climate change denial”, or even “catastrophic anthropogenic climate change denial”. Rather, their focus is simple catastrophic anthropogenic climate change skepticism; that is, merely questioning whether climate change, especially the anthropogenic contribution to climate change, is likely to result in a catastrophic change in the climate of the earth.


Pursuing the Evil Sources Funding “Climate Change Deniers”

On November 5th, 2015, New York State Attorney General Eric Schneiderman subpoenaed 40 years of records regarding climate change from Exxon Mobil. This is a “What did they know?”, “When did they know it?”, “What did they tell the SEC?”, “What did they tell shareholders?” and “What did they tell the public?” fishing expedition. The period of interest extends back to before the mid-70s “global cooling” scare.

On April 4th, 2016 the Attorney General of the US Virgin Islands subpoenaed the Competitive Enterprise Institute in connection with the Exxon Mobil investigation being conducted by the Attorney General of New York State. This subpoena covers only the period from 1997 to 2007.


Cooler(?) Heads Prevail – Potential Legal Approach – RICO

There has been much recent discussion, from a number of sources, regarding potential prosecution of “Climate Change Deniers” under the federal Racketeer Influenced and Corrupt Organizations (RICO) Act. The most vocal of the advocates of RICO prosecution is Senator Sheldon Whitehouse (D, RI). He has been joined by Senators Dianne Feinstein (D, CA) and Edward Markey (D, MA). The most recent development occurred during congressional testimony by US Attorney General Loretta Lynch, who stated that she had referred the issue to the FBI to determine whether an adequate basis existed for action.


The RICO Statute

The RICO statute was passed into law in 1970, primarily as a vehicle to deal with the activities of organized crime syndicates involved in the commission of a variety of actions which were already in violation of existing law. The statute was clearly not intended to apply to individuals and organizations which questioned the validity of scientific studies or the positions taken by government based on those studies; or, to individuals or groups which conducted scientific research which reached conclusions different from the conclusions of previous studies, or questioned the validity of positions taken by government based on previous scientific studies. Stated differently, the RICO statute did not and does not render the practice of science and the pursuit of the scientific method illegal.


When RICO Applies – What Must Be Proven

Applying RICO in cases of scientific disagreement would appear to require that the individuals or organizations funding the scientific studies and/or the individuals or groups conducting the scientific studies, funded or conducted those studies with the intent of producing fraudulent results. It would not be sufficient merely to demonstrate that these studies produced results which differed from the results of studies conducted by other scientists, since falsifiability of results is one of the principal foundations of science.

Applying RICO against companies which failed to alert their stockholders to the potential adverse impacts of the companies’ activities, or the potential adverse impact of government actions based on the government’s assessment of the impacts of the companies’ activities would appear to require that the companies KNEW the nature and extent of the adverse impacts or KNEW the nature and extent of the potential government actions. However, it is clearly not possible to KNOW the future, though it is possible to hypothesize about the future.

Applying RICO against companies or organizations which funded research studies which reached conclusions differing from the conclusions of previous research would appear to require that these studies were funded and performed with the intention to defraud the public regarding the implications of their activities. In this case, it would appear to be necessary to prove fraudulent intent, rather than merely demonstrating that the companies’ or organizations’ interests would be supported or advanced by the results of the studies.


So Who’s on the “Hit List”?

Climate Change Deniers Hit List:

  • Companies involved in exploration, production and distribution of fossil energy
  • Organizations involved in climate change which have received fossil energy industry funding
  • Researchers in climate change who have received fossil energy industry funding
  • Researchers who have performed studies with results inconsistent with the “consensus”
  • Researchers and statisticians who have questioned studies consistent with the “consensus”
  • Scientific journals which have published studies with results inconsistent with the “consensus”
  • Web bloggers critical of the “consensus”
  • Journalists critical of the “consensus”


Why Limit RICO to “Deniers”? – Potential “Climate Affirmer” Targets

Interestingly, though not surprisingly, there has been no such discussion regarding individuals and organizations which might be referred to as “Climate Change Affirmers” for purposes of symmetry; or, more accurately, as catastrophic anthropogenic climate change promoters. These include US EPA, NOAA, NASA, NCAR, NCEI and a host of other federal bureaucracies; and, individuals such as James Hansen, Thomas Karl, Kevin Trenberth, Michael Mann, Al Gore, etc.

The RICO statute apparently does not conceive of the possibility that the federal government, or an agency of the federal government, could participate in racketeering or be a corrupt organization; or, that an individual or organization conducting scientific research funded by the federal government could be corrupt, or acting under the influence of racketeers. Federal government funding of climate change research is apparently assumed to be “as pure as new fallen snow”, while private funding of similar research is apparently viewed as day old slush.

Former US Vice President Al Gore is a consistent “climate change affirmer”. He fronted a movie entitled “An Inconvenient Truth” regarding climate change. The movie, which has been used to propagandize school children in the US and other countries, was found by a British High Court to contain scientific errors. The film has not been revised to correct those errors; and, therefore, showings of the film in British schools must be accompanied by a review of the noted errors. Gore’s efforts regarding climate change have enriched him significantly, while propagating inaccurate science.

Professor Michael Mann is the creator of the “Hockey Stick” representation of historical temperature change. While Mann still defends the “Hockey Stick”, it has been broadly criticized regarding the source of some of its data and the statistical techniques used to create it. The “Hockey Stick” was prominently featured in the IPCC AR3 and AR4 reports, but was reduced in prominence in the AR5 report. Mann is currently suing the Competitive Enterprise Institute, National Review and author Mark Steyn regarding Steyn’s description of the “Hockey Stick” as a fraud. Mann is currently “slow walking” the legal process, apparently in an effort to delay discovery, increase the defendants’ legal costs and force a settlement. Steyn has since countersued Mann for $10 million, just to make it interesting.

Climategate illustrates one potential situation in which government agencies conspired to exclude certain researchers and research studies from inclusion in the reports produced under the auspices of the United Nations Framework Convention on Climate Change (UNFCCC), by the Intergovernmental Panel on Climate Change (IPCC). While the UN representatives and their staffs appear to be protected by diplomatic immunity, scientists employed by government agencies and scientists working on government-funded research do not appear to have such immunity.

The reports produced by the IPCC were then relied upon by US EPA in developing the CO2 Endangerment Finding, despite the requirement in the Clean Air Act that EPA fund and conduct its own research. Arguably, this reliance constitutes influence by a corrupt organization.

Government also funds numerous environmental NGOs, many of which then lobby EPA for more restrictive environmental regulations. In some cases, EPA assists the NGOs in their lobbying efforts. In other cases, the NGOs sue EPA to achieve their desired results; and, EPA settles the suits by developing and implementing the desired regulations. Certainly no hint of corruption in these activities.


The Essential Difference between Denying FACTS and Questioning Hypotheses

The primary, potentially catastrophic, manifestations of climate change of concern to those experiencing this “fever” are increasing near-surface temperatures, rising sea levels and an increase in “extreme weather events”. The singular focus regarding causation is on emissions of “greenhouse gases”, primarily carbon dioxide and, to a lesser extent, methane. The primary focus regarding emissions sources is on the fossil fuel industry, with secondary focus on agriculture and animal husbandry.

The factual evidence for global warming is limited to the record of temperature change over time, which consists of early proxy records and the instrumental temperature record. There is also factual evidence of changes in atmospheric chemistry and sea level, which are correlated with temperature change, though no causative relationship has been proven.

The primary focus of the climate science community regarding temperature change is on the near-surface temperature records and the sea surface temperature records, largely to the exclusion of the high altitude balloon/radiosonde and satellite temperature records. This primary focus is quite strange, since the effects of increased infrared absorption resulting from increased concentrations of infrared absorbers in the atmosphere would appear first in the atmosphere, which the balloon/radiosondes and satellites measure directly.

The data provided by the near-surface and sea surface temperature records is tainted by a number of factors, including non-uniformity of distribution, lack of comprehensiveness of coverage, instrument selection, instrument degradation and siting issues. The siting issues with the near-surface temperature evidence include installation of the measuring stations in non-ideal locations and encroachment of urban environments on measuring stations installed in previously ideal or near-ideal locations.

The data provided by the near-surface temperature records has been “adjusted” for the stated purpose of correcting for the issues which taint the data, as listed above. This action constitutes data tampering. The data has been adjusted by multiple producers of global near-surface temperature anomaly records, using differing “adjustment” methods and producing differing results. The same is true of the sea surface temperature records. As the result of these “adjustments”, we are no longer dealing with data, but rather with estimates of what the data might have been, had they been collected in a timely manner from properly selected, calibrated, sited, installed and maintained instruments.

Further, NASA GISS “infills” temperature estimates for areas for which data does not exist, making coverage appear more comprehensive. This action constitutes fabrication of “data”.

The questionable quality of the tainted climate change data and the questionable treatment of that data by the organizations collecting, tampering with, fabricating and analyzing it would appear to justify skepticism on the part of climate scientists and others frequently referred to as “Climate Change Deniers”. It might also justify investigation of the actions of the organizations and individuals collecting the tainted data, tampering with that data, fabricating missing data; and, using that data to support scenarios of catastrophic climate change which appear to demand national and international action.

The concerns regarding climate catastrophe are based on the scenarios output by the various climate models, which vary by model and by the assumptions input to drive the modeled scenarios. These models have clearly demonstrated that they are not accurately modeling the real environment. Rather, they are used to create numerous hypothetical future outcome scenarios. None of the models have been verified, so their various output scenarios cannot be construed as FACTS, nor can the future(s) they output be considered as KNOWN. Therefore, those who question the modeled scenarios cannot be fairly accused of denying FACTS or KNOWN outcomes, but merely of being skeptical of the hypothetical scenarios output by the models.

Today, rather than being falsified by research studies conducted by skeptical scientists funded by skeptical funding sources, the climate models are being falsified by the passage of time and their failure to model even the “adjusted” temperature data collected by the government-funded climate science community. Meanwhile, the IPCC continues to profess increasing certainty regarding their conclusions, in the face of increasing divergence of even the “adjusted” data and the modeled scenarios. Therefore, despite the feverish efforts to discredit and potentially punish the skeptics and those who fund their research, the skeptics and their funders are merely guilty of pointing out that “the emperor has no clothes”. That is hardly justification for RICO prosecution of the observing skeptics, though it might be grounds for such prosecutions of those who “fabricated” the emperor’s new clothes.

Tags: ExxonMobil, RICO
Search Older Blog Posts