
In the Wake of the News

Seriousness of the Charge

Precautionary Principle

“When the health of humans and the environment is at stake, it may not be necessary to wait for scientific certainty to take protective action.”

--Science and Environmental Health Network

 

"Even though there is no evidence, the seriousness of the charge is what matters. The seriousness of the charge mandates that we investigate this."

--Thomas Foley (D, WA), Speaker, US House of Representatives

 

In many ways, the above statements rationalize the global governmental “rush to judgement” regarding climate change and the ongoing efforts to end debate and punish skeptics.

There is certainly evidence that: atmospheric CO2 concentrations have increased since the Industrial Revolution; atmospheric temperatures have increased; sea level has risen; and, glaciers have lost mass. However, there is no evidence that any of these changes, with the exception of the increase in atmospheric CO2 concentrations, have been exclusively, or even primarily, the result of human activity. There is also no evidence that these changes have had an adverse impact on human health or the environment. Finally, there is no evidence that these changes would have an adverse impact on human health or the environment in the future, were they to continue unabated.

However, the Precautionary Principle is frequently used to argue that there is no need to wait for evidence, since the potential adverse impacts portrayed in the scenarios produced by the climate models are perceived to be so potentially devastating. The seriousness of the charge of obstructing movement toward controlling climate change by even questioning the absence of evidence, or the failure to validate the climate models, is asserted as the rationalization for stifling debate and threatening skeptics.

Similarly, the seriousness of the charge of defrauding shareholders and the public is perceived as “mandating investigation” of those energy companies and consulting companies which have not accepted the governmental “rush to judgement”, communicated its essential nature to their shareholders and the public-at-large, and publicly donned “sackcloth and ashes” to atone for their previous “sins”.

Tags: Precautionary Principle

“Fake News”

There has been much discussion recently about “fake news”, a concept which has nearly as many definitions as it has observers and commenters. Some “fake news” is totally made up, with no basis in fact. Some “fake news” is actually very clever satire. Some “fake news” is actually real news, blown totally out of proportion. Some “fake news” is real news, partially reported, slanted or skewed. Arguably, some “fake news” replaces real news which remains unreported as a result, for a variety of reasons. All “fake news” is intended primarily to influence, rather than to inform. It is stealth commentary.

Much of what causes some observers to refer to the purported threat of catastrophic anthropogenic climate change as a “hoax” is the result of various types of “fake news”, typically intended: to portray “estimates” as “facts”;  to portray what is merely “believed” as “known”; and, to portray modeled potential future scenarios as climate projections. Some, however, is factual misstatements and distortions intended to deceive. Reporting regarding increasing frequency and intensity of hurricanes, tornados, flooding and droughts falls into the latter category.

The most obvious example of “fake news” which attempts to portray estimates as facts is the near-surface temperature anomaly records. These records are based on readings taken from measuring instruments estimated to be in error by an average of more than 2°C, read to a precision of 0.1°C, reported as anomalies to 0.01°C and as decadal trends to 0.001°C. The actual temperature measurements are “facts”, though of questionable accuracy. These “facts” are then “adjusted”, converting them into estimates, still of questionable accuracy. Since the errors in the temperature measurements appear not to be random, and the “adjustments” made to those measurements are definitely not random, the Law of Large Numbers cannot be applied to produce a mean expressed to greater precision than the underlying estimates. Therefore, global temperature anomalies expressed to two-digit precision are “fake news”, as are decadal anomaly trends expressed to three-digit precision. Consequently, announcements of “the warmest year ever” are also “fake news”, since they are based on estimates which are either inaccurately precise or precisely inaccurate, or both.
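A minimal simulation illustrates the statistical point being made here: averaging many readings does drive down random, zero-mean error, but a shared systematic bias survives averaging untouched. The temperature, error and bias values below are arbitrary assumptions chosen purely for illustration:

```python
import random

random.seed(42)

TRUE_TEMP = 15.0   # hypothetical true temperature, °C (assumed)
N = 10_000         # number of readings averaged

# Case 1: purely random, zero-mean errors (sd = 2 °C).
# Averaging drives the error of the mean toward zero.
random_readings = [TRUE_TEMP + random.gauss(0.0, 2.0) for _ in range(N)]
mean_random = sum(random_readings) / N

# Case 2: the same random noise plus a shared systematic bias of +0.5 °C.
# No amount of averaging removes an error common to every reading.
BIAS = 0.5
biased_readings = [r + BIAS for r in random_readings]
mean_biased = sum(biased_readings) / N

print(f"error of mean, random noise only : {mean_random - TRUE_TEMP:+.3f} °C")
print(f"error of mean, with shared bias  : {mean_biased - TRUE_TEMP:+.3f} °C")
```

The first error shrinks toward zero as N grows; the second converges to the bias itself, which is why non-random errors defeat the precision-by-averaging argument.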

The most obvious examples of portraying what is merely “believed” as “known” are statements about the predominance of human influence, particularly human CO2 emissions, on recent climate change. No data exist to confirm that relationship; nor do any data permit quantifying the impact of natural variability on climate change. Assertions to the contrary are “fake news”.


All studies which rely on unverified climate models to “predict” future trends in temperature, rainfall, storms, species extinctions, etc. are also “fake news”, because none of these models has been verified, let alone established to have any predictive skill.

The political science of climate change ignores this “fake news” and disparages those who question it. Interestingly, in another time and in another place, “fake news” used to be referred to as propaganda. In the more common vernacular, it was referred to as the “mushroom treatment”. It is quite dark in the climate science community; and, it stinks.

Tags: Bad Science, Estimates as Facts

A Look Back: The Coming Ice Age

The Coming Ice Age - Harper's Magazine - Sept. 1958

A friend brought this to my attention today. It is almost 60 years old. (Note the name of the authoress of the article.)

Very interesting. I had not seen it previously. I don’t think it is inconsistent with other things I have read.

Note, though, that the previous warming and subsequent ice age occurred without a human-induced increase in atmospheric CO2; and, that there was no discernible human contribution to the emergence from the ice age. It is only in the past ~65 years that human CO2 emissions are thought to have had any influence; and, even then, the extent of human influence is neither discernible nor quantifiable.

It is research like this which causes me to question the assertions that man is the sole, or even the principal, cause of the recent warming. The very rapid increase in temperature in 2015, followed by the very rapid decrease in 2016, suggests that the principal cause of those events was not a slow increase in atmospheric CO2 concentrations.

Tags: History, Global Temperature

The Party’s Over – US Commitments at COP22

The 22nd annual global ecotourism conference, officially known as the UNFCCC Conference of the Parties 22, in Marrakech, Morocco has ended. William Shakespeare might have described the conference results as: “full of sound and fury, signifying nothing”.

However, COP22 presented an interesting study in political polarization, at both the national and international levels.

US President Obama, represented by US Secretary of State John Kerry, presented a new “commitment” to reduce US annual CO2 emissions by 80% by 2050, compared with 2005 emission levels. This “commitment” was almost certainly developed in anticipation of a Hillary Clinton presidency, which would have been anticipated to preserve, enhance and expand upon the Obama climate “legacy”. This “commitment” was presented to COP22 after it was obvious that there would be no Clinton presidency, but rather a Donald Trump presidency in combination with a Republican controlled Congress.

There is little doubt that this “commitment” was presented, not only to enhance President Obama’s climate “legacy”, but also to attempt to embarrass President-elect Trump, who is not supportive of the previous US “commitment” to reduce US CO2 emissions by 26-28% by 2025, made at COP21 in Paris, France in 2015; and, would not be expected to be supportive of an even more far reaching “commitment”. Both of these US “commitments” were made as “Executive Agreements”, on President Obama’s sole authority, rather than as treaty commitments ratified by a two-thirds vote of the US Senate. Therefore, both of these “commitments” are also subject to abrogation by executive action.

The Secretary General of the UN and the Chair of the UNFCCC both stated essentially that the movement toward clean energy committed to at COP21 was “unstoppable”, suggesting that President-elect Trump might somehow try to stop this movement. However, there is no indication of any intent on the part of President-elect Trump to stop such a global movement. Rather, the President-elect has merely indicated that he does not support the existing US “commitment”; and, almost certainly, would not support this additional “commitment”. There is no reason to believe that Trump’s lack of support would stop other nations from pursuing current and potential future “commitments” to reduce annual CO2 emissions.

President-elect Trump has stated that he opposes the Obama EPA “Clean Power Plan”, which effectively requires utilities to shutter many existing coal-fired power plants and replace their capacity with natural gas, solar, wind, biomass or other capacity as required. Trump has not suggested that he would interfere in utility decisions to replace existing coal-fired generating capacity, but merely that he was opposed to forcing those decisions by EPA regulation. The environmental community has been quick to label Trump as a “climate denier” or “climate change denier”, though he neither believes that the earth has no climate, nor that the earth’s climate has not and does not change.

Meanwhile, at COP22, as has been the case since COP15, the developing and undeveloped nations have continued the refrain: “Show me the money.” They demand “commitment” from the developed nations of $100 billion per year, apparently in perpetuity, to assist them in adapting to the adverse effects of global climate change, none of which have yet occurred. The current level of global funding toward this “commitment” is far less than the level demanded; and, is likely to remain so for the foreseeable future. President Obama has provided some funding, without congressional approval, though it appears unlikely that President-elect Trump will seek to expand that funding to the level demanded; and, even less likely that he would attempt to do so without congressional authorization.

Tags: United Nations, Clean Power Plan

An Engineer’s Observations

Several aspects of government-funded climate science appear both curious and disturbing. The current level of government funding of climate science is certainly adequate to support rigorous scientific investigation, data gathering and data analysis. Regrettably, it is not doing so uniformly, comprehensively and consistently. That situation cannot be allowed to persist if we are to significantly expand our understanding of the earth’s climate; and, of our potential impact on that climate.

The current concern regarding climate change is based on two principal factors: temperature change and sea level rise. Therefore, the two foundational focuses of climate research should be accurate temperature measurement and accurate measurement of the rate of sea level rise. However, it appears that unjustified precision in reporting results is given greater priority than accuracy of physical measurement.

Near-surface temperature anomaly calculations are produced by multiple agencies, all using subsets of the same suspect data but differing approaches to “adjusting” that data; and, in some cases, “infilling” missing data. Tropospheric temperature anomaly calculations are also produced by multiple agencies, using the same data from the same satellites. There is a complex physical relationship between the tropospheric temperatures and the near-surface temperatures, but there appears to be little effort to understand this relationship, though that understanding appears to be critical to a thorough understanding of earth’s climate.

Sea level rise is measured directly at numerous locations along the sea shore, as well as from satellites. The rate of sea level rise reported from the satellite observations is approximately twice the rate measured by the shore-based instruments. The satellites and the shore-based instruments are measuring the rate of sea level rise of the same oceans, though the satellites measure virtually the entire ocean surfaces, rather than just the levels at the shore.  It is important to know which, if either, of these measurements is correct.

Also, the concern for the future of climate change is based on numerous unverified models, which are used to generate potential future temperature scenarios based on uncertain input factors, which are then used in other unverified models to generate potential future extreme weather, crop failure, disease, habitat loss and species extinction scenarios. Climate science would do well to focus on verifying one model with one set of accurate input conditions; and, then determining its predictive abilities.

Tags: Sea Level Rise, Sea Level Change, Temperature Record, Satellites

Tough Love - Open Letter to Trump Transition Team

Open letter to the President-elect Trump Transition Team – Climate Change

Ladies and Gentlemen:

Serious research focused on understanding the climate and the forces which cause it to change is worthwhile and important. However, it is long past time to apply a strong dose of “tough love” to US climate change research.

The United States is currently spending approximately $2.5 billion per year on climate change research, out of a total climate change related budget of approximately $20 billion per year. Much of this climate change research budget is being expended on duplicative and/or speculative activities, rather than on resolving several fundamental issues involving climate change. The US can hardly afford to waste federal research funding while ignoring these fundamental issues.

This letter addresses four fundamental issues of climate change research:  (i) data collection and analysis; (ii) understanding relationships and resolving differences between surface and satellite sources; (iii) determining the correct values of climate sensitivity and climate forcing factors used as inputs to climate models; and (iv) identifying or developing and then verifying a single climate model which actually models the global climate. It seems incredible that these fundamental issues have not been resolved, if the “science is settled”.

 

(i) Data Collection and Analysis

The instrumental global temperature data which underlies the concerns regarding global warming and global climate change are collected from near-surface temperature sensors, sea surface floating buoys, ships “passing in the night”, balloon-borne radiosondes and satellites. The near-surface temperatures are collected, aggregated, “adjusted” and analyzed by numerous government agencies around the globe. These agencies each produce monthly anomaly calculations, which differ among themselves as the result of differing selection, “adjustment” and analysis protocols. The satellite and radiosonde temperatures are analyzed by two organizations (UAH and RSS), which produce monthly anomaly calculations, which also differ one from the other. The US also operates a network of 114 state-of-the-art near-surface measuring stations: the Climate Reference Network. However, the data from the CRN is not included in the collection and analysis of the temperatures from other sources, though it is not clear why that is the case.

There is no need for continuing analysis of near-surface temperatures by multiple agencies. However, the reasons for the differences among the several analyses must be understood and resolved before any single agency is tasked with continuing the near-surface temperature analysis effort, if that effort is to be continued. The quality of the temperature data collected, aggregated, “adjusted” and analyzed by NCEI, NASA GISS and the Hadley Center is significantly lower than the quality of the data from the US CRN. However, rather than improve the quality of the temperature data, the agencies “adjust” the data, producing estimates of what the data should have been.

Satellite temperature data are far more comprehensive than the near-surface temperature data. The satellite temperature data and radiosonde data are also used to produce two different and differing monthly temperature anomaly products. Again, there is no need for continuing analysis of this data by multiple organizations. However, the reasons for the differences between the analyses must be understood and resolved before any single organization is tasked with continuing the satellite temperature analysis effort.

The recent surface sea level rise measurements, taken both with tide gauges in contact with the sea surface and microwave radar systems mounted above the sea surface near the shore, show a relatively stable rate of sea level rise over a period of approximately 200 years. Recent satellite-based sea level rise measurements, taken over a period of approximately 23 years, also show a relatively stable rate of sea level rise, though at about twice the rate measured by the surface-based sensors. Again, the satellite measurements are far more comprehensive than the land-based measurements, though they might not be more accurate. 
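The practical consequence of a factor-of-two disagreement in rates compounds over time. The sketch below uses commonly cited approximate rates (~1.7 mm/yr for tide gauges, ~3.3 mm/yr for satellite altimetry); these specific numbers are assumptions for illustration, not figures from the text:

```python
# Assumed approximate rates of sea level rise (mm/yr) -- illustrative only
tide_gauge_rate = 1.7   # shore-based instruments, ~200-year record
satellite_rate = 3.3    # satellite altimetry, ~23-year record

years = 84  # e.g., from 2016 out to 2100

# Simple linear extrapolation of each rate, converted to metres
rise_gauge = tide_gauge_rate * years / 1000.0
rise_sat = satellite_rate * years / 1000.0

print(f"tide-gauge rate extrapolated: {rise_gauge:.2f} m by 2100")
print(f"satellite rate extrapolated:  {rise_sat:.2f} m by 2100")
print(f"ratio of rates: {satellite_rate / tide_gauge_rate:.1f}x")
```

Even without acceleration, the choice of which measurement to trust roughly doubles the projected rise, which is why resolving the discrepancy matters.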

(ii) Understanding Relationships and Resolving Differences

There are significant differences between the near-surface temperature anomalies and the satellite temperature anomalies. The reasons for these differences must be clearly understood, should the decision be made to abandon the near-surface temperature anomaly products in favor of the far more comprehensive satellite measurements.

There are significant differences between the surface sea level rise data and the satellite sea level rise data. The reasons for these differences must also be clearly understood, should the decision be made to abandon the land-based sea surface measurements in favor of the satellite measurements.

(iii) Determining the Correct Values of Climate Sensitivities and Climate Forcing Factors

The scenarios for future climate produced by the climate models are driven by data on the rate of increase of global atmospheric carbon dioxide and other “greenhouse gas” concentrations and assumptions regarding the sensitivity of the climate to these increases. The climate models are also driven by assumptions regarding several other climate forcing factors. These sensitivities and forcing factors are not well understood, so climate modelers use a range of values in their model runs. The result is a range of potential future scenarios. It is not known whether any one of these scenarios is correct, or even if the actual future scenario falls within the range of scenarios output by the models.
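The way a range of sensitivity assumptions fans out into a range of scenarios can be sketched with a toy equilibrium relation, ΔT = ECS × log2(C/C0). This is a deliberately simplified formula, not any actual climate model, and the ECS values are merely the kind of spread modelers sweep, assumed here for illustration:

```python
import math

C0 = 280.0   # assumed pre-industrial CO2 concentration, ppm
C = 560.0    # doubled CO2, ppm
ecs_range = [1.5, 3.0, 4.5]   # assumed equilibrium climate sensitivities, °C per doubling

# Equilibrium warming at doubled CO2 under each sensitivity assumption
warming = {ecs: ecs * math.log2(C / C0) for ecs in ecs_range}

for ecs, dt in warming.items():
    print(f"ECS = {ecs:.1f} °C/doubling -> dT at 2xCO2 = {dt:.1f} °C")
```

A three-fold uncertainty in the sensitivity input produces a three-fold spread in the output, before any other forcing assumptions are varied, which is the point the paragraph makes about scenario ranges.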

Developing an accurate understanding of how the climate responds to human influences on the atmosphere requires determination of the actual climate sensitivity and the actual magnitude and direction of the forcing factors. This is a fundamental issue.

(iv) Verify a Single Climate Model Which Actually Models the Global Climate

There is currently no climate model which has been verified to accurately and comprehensively model the earth’s climate. Therefore, there is no climate model which can reasonably be expected to predict the future responses of the global climate. As a result, all of the climate research studies which are being used to create scenarios of various types of potential future climate catastrophes are highly speculative. These highly speculative studies are consuming significant climate research resources, to no demonstrably useful scientific purpose. Those resources could be used instead to improve the climate models; and, ultimately, to verify a single climate model.

 

It is clear that the science is hardly settled, since at least the above four fundamental issues remain unresolved. However, it appears that the practical politics have been settled, until very recently, largely in line with H. L. Mencken’s perception.

“The whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary.” H. L. Mencken

The national and international political class which has been driving and funding the climate change issue is desperately in need of some tough love and the imposition of some priorities to address fundamental issues, rather than continuing to fund the production of “hobgoblins”.

Tags: Politics, Policy, Satellites, Temperature Record, Donald Trump

Ultimate Goal

The Movement toward the ultimate goal of a global vegan commune continues apace.

Global Governance:

Zero GHG Emissions:

Global Veganism:

Wealth and Income Redistribution:

Population Control:

Tags:

The Adjustocene

Adjustocene - Cartoons by Josh

The earth is currently experiencing the Holocene, the period of 11,700 years following the last major ice age. Some in the climate science community have begun referring to the most recent years of this period as the Anthropocene, suggesting that human activity is responsible for much of the change which has occurred since the Industrial Revolution, or since the Little Ice Age.

NASA has recently published a study suggesting that errors in observational data in the early years of the instrumental temperature record caused approximately 20% of the global warming which has occurred over that period to be “missed” due to quirks in the measurements. They have concluded that it is necessary to “adjust” the observations to correct these quirks; and, that these “adjustments” bring the observations more in line with the scenarios produced by the climate models. These “adjustments” are in addition to the adjustments which had already been made by NCEI, NASA and Hadley Center/UEA CRU in the process of constructing the temperature anomaly products, both currently and retrospectively. The combination of all of these “adjustments” led one wag to rename this period the “Adjustocene”.

Two quotations regarding this issue bear repeating here, one serious and the other in jest:

       --“It doesn't matter how beautiful your theory is, it doesn't matter how smart you are. If it doesn't agree with experiment, it's wrong.”  Richard P. Feynman

       --“95% of Climate Models Agree: The Observations Must Be Wrong.” Roy Spencer

NASA’s recent study seems to suggest that NASA has ignored Feynman and taken Spencer seriously.

Fundamentally, the temperature data are not “adjusted” because of their superb quality, accuracy or precision. Rather, they are adjusted because they lack those attributes. However, once adjusted, they are merely estimates of what the data might have been, had they been collected timely from properly selected, calibrated, sited, installed and maintained instruments. What the climate science community prefers to refer to as “datasets” are therefore, in fact, estimate sets. The climate science community acknowledges that the data are inaccurate, but insists that the estimates are both accurate and precise.

Tags: Temperature Record

Anomalous Anomalies

NASA GISS and NCEI have released their June Climate anomalies; and, they are anomalous. GISS and NCEI both have access to all of the same near-surface temperature data; and, both use the same sea surface temperature anomaly product, ERSSTv4, developed by NCEI and frequently referred to as the “pause buster” reconstruction. Therefore, any difference between the NASA GISS and NCEI land plus ocean anomalies must be based on the near-surface temperatures.

In June, the GISS temperature anomaly declined by 0.14°C to 0.79°C, a decline of more than 0.5°C since its peak during the 2015/2016 El Nino in February 2016. However, the NCEI temperature anomaly increased by 0.02°C to 0.90°C, though still a decline of more than 0.3°C since its El Nino peak. Therefore, the two anomaly changes for the month of June vary by 0.16°C, or more than 20% of the residual GISS anomaly. Also, the declines in the two anomalies since the 2016 El Nino peak vary by 0.2°C, or approximately 40% of the decline in the GISS anomaly. These are very large differences for anomalies produced from the same dataset.
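The arithmetic behind these percentages can be checked directly from the anomaly values quoted above:

```python
# June 2016 land+ocean anomaly changes as quoted in the text (°C)
giss_change = -0.14   # GISS declined to 0.79 °C
ncei_change = +0.02   # NCEI increased to 0.90 °C
giss_anomaly = 0.79   # residual GISS anomaly

# Difference between the two monthly changes, and its share of the GISS anomaly
diff = abs(giss_change - ncei_change)
share = diff / giss_anomaly

print(f"difference in monthly changes: {diff:.2f} °C")
print(f"as a share of the GISS anomaly: {share:.0%}")

# Approximate declines since the 2015/2016 El Nino peak, per the text (°C)
giss_decline = 0.5
ncei_decline = 0.3
decline_diff = giss_decline - ncei_decline

print(f"difference in declines since peak: {decline_diff:.1f} °C "
      f"({decline_diff / giss_decline:.0%} of the GISS decline)")
```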

These differences highlight the significance of the subsets of the monthly near-surface temperature data selected for “adjustment”; and, the significance of the “adjustments” made to the data by the various producers of the near-surface temperature anomaly products.

The near-surface temperature anomaly product from the Hadley Center and the University of East Anglia Climatic Research Unit is not yet available for June, 2016. HadCRUT4 had decreased by approximately 0.4°C from its El Nino peak through May 2016, to 0.68°C.

The tropospheric temperature anomalies produced by the University of Alabama – Huntsville and by Remote Sensing Systems are produced from the data collected by the same satellite-based instruments. The changes in these anomalies are also notably anomalous for June, 2016. The UAH anomaly decreased by 0.21°C, to +0.34°C. The RSS anomaly decreased by 0.06°C, to +0.47°C. Therefore, the difference in the calculated anomaly changes between UAH and RSS is almost as large as the difference between the GISS and NCEI anomalies for June, 2016.

These anomalous anomalies suggest that reporting global temperature anomalies to two decimal place precision represents either inaccurate precision or precise inaccuracy.

Tags: Temperature Record, Global Temperature

Narratives and Coincidences

Global temperatures began rising in 2014, leading ultimately to the proclamation of 2014 and later 2015 as the warmest years in the instrumental record. The narrative propounded by the consensed climate science community was that this temperature increase was a continuation of the CO2-driven warming documented in the global temperature anomalies, particularly since approximately 1950. There was acknowledgement that there was also an El Nino underway, though its importance in the observed warming was largely ignored or downplayed in the ongoing narrative, which included projections that 2016 would likely displace 2015 as the warmest year in the instrumental record.

However, global temperatures began falling in February, 2016; and, falling very rapidly in May and June, 2016. The temperature trend is moving toward a moderate to strong La Nina. This temperature pattern conflicts with the narrative attribution to continuation of the CO2-driven warming. Rather, it would seem to suggest that the primary driver of the 2014-2015 rapid warming was the strong El Nino, since there has been no accompanying rapid reduction in atmospheric CO2 concentration.

Coincidentally (?), one of the most recognized names in climate science, Dr. Michael Mann, recently attempted to deflect attention from this deviation from the narrative by declaring to a hearing of the Democratic Platform Drafting Committee that: “What is disconcerting to me and so many of my colleagues is that these tools that we’ve spent years developing increasingly are unnecessary because we can see climate change, the impacts of climate change, now, playing out in real time, on our television screens, in the 24-hour news cycle.”

Mann might more accurately have said that we see extreme weather events being played up in real time in the media and attributed, without any scientific basis, to climate change. This is particularly true since the actual frequency and severity of “extreme weather events” have been stable or declining in recent decades.

I am reminded of a quote from Carl Sandburg: “If the facts are against you, argue the law. If the law is against you, argue the facts. If the law and the facts are against you, pound the table and yell like hell”. Dr. Mann’s Twitter history would suggest that he is a screamer.

Tags: La Nina, Michael Mann

Not So Fast!

In my commentary from September 6, 2016 “Climate Deniers”, I discussed some of the numerous efforts currently underway to silence and punish “Climate Change Deniers” (catastrophic anthropogenic climate change skeptics) and those who fund “Climate Change Denial” (big energy). Interestingly, some of the targets of these efforts have begun to fight back against what they see as efforts to deny them their First Amendment rights under the US Constitution; and, as harassment, intended to impede their research and/or impose substantial legal costs to defend themselves. Others, regrettably, have chosen to abandon the field to the harassers.

The first target to throw down the gauntlet was Canadian climatologist Dr. Tim Ball, followed closely by Canadian-born author and comedian Mark Steyn, who were sued for defamation by Pennsylvania State University Professor Michael Mann after commenting that Mann’s “Hockey Stick” was “fraudulent”. Ball and Steyn have each since countersued Mann for $10 million. Mann continues to resist discovery in his suit against Steyn, as he did in his suit against Ball and in the suit filed by then Commonwealth of Virginia Attorney General Kenneth Cuccinelli. Steyn has been far more aggressive in his response to Mann than either National Review or the Competitive Enterprise Institute, both of which were co-defendants in the Mann suit against Steyn.

The Competitive Enterprise Institute has been more aggressive in response both to the letter from the “RICO 20” to President Obama, Attorney General Loretta Lynch and Presidential Science Advisor John Holdren requesting prosecution of “Climate Change Deniers” under the federal Racketeer Influenced Corrupt Organizations law; and, to US Virgin Islands Attorney General Walker’s subpoena for 10 years of documents related to CEI’s climate change policy. CEI recently succeeded in an FOIA request for internal e-mails among the “RICO 20”, which reveal discussions among the “RICO 20”, several environmental groups which receive federal funding and the “Green 20” group of Democrat state Attorneys General. Walker has since abandoned his subpoena, though he has threatened the possibility of future action. CEI has asked the DC Superior Court to fine US Virgin Islands AG Walker for his attempt to undermine CEI’s First Amendment rights.

CEI also sponsored a full page ad in the New York Times on May 18th to call attention to the efforts by New York Attorney General Schneiderman, US Virgin Islands Attorney General Walker and the other “AGs United for Clean Power” (“Green 20”) to stifle skeptical research and commentary.

The Republican members of the US House of Representatives Committee on Science, Space and Technology have sent a letter to the 17 Democrat Attorneys General and 8 environmental activist organizations requesting documents related to their coordinated efforts to deprive corporations and individuals of their First Amendment rights and interfere in their ability to fund and conduct climate research and comment on the climate research of others.

Jagadish Shukla, a professor at George Mason University, President of the government co-funded Institute of Global Environment and Society (IGES) and the principal author of the “RICO 20” letter, is currently being investigated by both George Mason University and the US House of Representatives Committee on Science, Space and Technology for “double dipping”, for working “full time” for both the university and IGES.

Texas Attorney General Ken Paxton is calling for an end to the US Virgin Islands probe of ExxonMobil regarding the alleged suppression of internal research regarding climate change.

Professor Wei-Hock (Willie) Soon of the Harvard-Smithsonian Center for Astrophysics has been cleared of conflict of interest charges regarding his research on the influence of the sun on climate. Soon had been supported by a petition, signed by 500 scientists, colleagues and friends, submitted to the Smithsonian.

Regrettably, as a result of the efforts by Representative Raul Grijalva (D, AZ) to harass skeptical climate researchers, Professor Roger Pielke, Jr. of the University of Colorado has decided to redirect his efforts to research in areas other than climate change.

All of these efforts have been far more public than the efforts revealed by the Climategate e-mails to destroy the careers of Patrick Michaels, Chris Landsea and Chris de Freitas; and, the subsequent effort to remove Delaware State Climatologist David Legates.

Tags: RICO, Mann v Steyn, silencing the sceptics

“Enquiring minds …”

“Enquiring minds want to know.”

Why, if “the science is settled”, is it still necessary to fund ongoing climate science research at the same funding levels which prevailed before the science was settled?

Why, if “the science is settled”, are there multiple groups producing global near-surface temperature anomalies, using different subsets of the available data, differing data “adjustment” approaches, differing approaches to missing data and producing differing results?

Why, if “the science is settled”, are there multiple groups producing global sea surface temperature anomaly products, using differing subsets of the available data, differing “adjustment” approaches, differing approaches to missing data and producing differing results?

Why, if “the science is settled”, are there multiple climate models producing differing potential future scenarios using the same input from “settled science”?

Why, if “the science is settled”, are the individual climate models, run with differing climate sensitivity, climate feedback and climate forcing assumptions, producing differing potential future scenarios?

Why, if “the science is settled”, are virtually all of the climate models, fed with the full range of differing climate sensitivity, climate feedback and climate forcing assumptions, producing scenarios which exceed the “adjusted” temperature anomalies produced by the several producers of near-surface temperature anomaly products?

Why, if “the science is settled” and the warming effects of increasing concentrations of atmospheric gases such as CO2 and CH4 appear first in the atmosphere, specifically in the tropical tropospheric “hot spot”, are the satellites measuring tropospheric temperature anomalies unable to find the tropical tropospheric “hot spot”?

Why, if “the science is settled”, are there multiple groups analyzing the satellite temperature data and producing differing results?

Why, if “the science is settled” and satellite sea surface temperature data are comprehensive for the liquid oceans and the satellite sea surface temperature measurements are regularly calibrated against readings from purpose-built floating buoys, are climate scientists still “adjusting” sea surface temperature measurements relative to readings taken by “ships passing in the night”, which are known to be fraught with error?

Why, if “the science is settled”, are the rates of sea level rise reported based on the satellite measurements approximately twice the rates measured by ground-mounted instruments?

Why, if “the science is settled” and global warming is believed to be causing increased rates of sea level rise, is the rate of sea level rise not increasing?

Why, if “the science is settled” and global warming is believed to be causing droughts and crop failures, does the satellite data show that the globe is greening, largely as the result of the effects of increased atmospheric CO2 concentrations?

Why, if “the science is settled” and global warming is believed to be causing increased extreme weather frequency and intensity, does the observational record not show these effects?

Why, if “the science is settled”, do I find all of these inconsistencies so unsettling?

Tags: Settled Science

Satellites and Climate

The near-surface instrumental temperature record began with the Central England Temperature (CET) record in 1659, which roughly coincided with the trough of the Little Ice Age. The application of temperature measuring instruments spread over the following 200 years, leading to the collection of a “global” temperature record. However, even today, this “global” temperature record involves sparse and uneven coverage in much of the southern hemisphere and over the global oceans. The satellite era has offered the opportunity to greatly expand the scope and coverage of climate observation.
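The “global” temperature record discussed here is typically reported as anomalies relative to a baseline climatology rather than as absolute temperatures. A minimal sketch, using invented numbers (not real data), of the basic anomaly-from-baseline calculation; the choice of baseline period is one of the ways in which the various producers’ results can differ:

```python
# Illustrative only: invented annual mean temperatures (deg C) for one station.
temps = {1991: 14.1, 1992: 13.9, 1993: 14.0, 1994: 14.2,
         1995: 14.3, 1996: 14.1, 1997: 14.5, 1998: 14.9}

# Baseline climatology: the mean over a chosen reference period (here 1991-1996).
baseline_years = range(1991, 1997)
baseline = sum(temps[y] for y in baseline_years) / len(baseline_years)

# Anomaly = observed minus baseline; choosing a different baseline period
# shifts every anomaly in the series by a constant offset.
anomalies = {y: round(t - baseline, 2) for y, t in temps.items()}
```

The baseline choice only offsets the series; the differing results among producers arise mainly from the further “adjustment”, infilling and coverage choices discussed above.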

Satellites were first used to measure the temperature of the earth’s troposphere beginning in 1979. The satellites are equipped with carefully calibrated platinum resistance temperature devices (RTDs). Their calibration is confirmed regularly by comparing the sensor readings with readings taken from balloon-borne radiosondes. The original Microwave Sounding Units (MSUs) have been replaced by more advanced units as newer satellites have been developed and launched. While these satellite-borne instruments do not measure near-surface temperatures, they do measure the atmosphere in which infrared absorption by atmospheric gases and vapors occurs. The atmospheric response to absorption and reflection of incoming and outgoing infrared radiation are the primary influences on the near-surface temperature of the earth.

Satellites were first used to measure sea surface temperatures beginning in the 1980s. Prior to the satellite era, sea surface temperatures were measured by near-shore sensors, ship-borne sensors and floating buoys. While these earlier measurement approaches are still in use, the satellites provide far greater and more consistent coverage. The satellites use both infrared and microwave radiometers to measure sea surface temperatures. The satellites’ sensor readings are regularly compared to readings taken by selected buoy-borne sensors. More recently, the ARGO buoys have made it possible to measure ocean temperatures at depth, providing additional information unavailable from any of the previous ocean temperature measurement approaches.

Beginning in the 1990s, satellites have also been used to measure sea level rise over the entire surface of the global oceans. Until the satellite era, sea level measurement was limited to shore-based measuring stations (tide gauges), some of which have been measuring sea level since the late 1800s. The satellite-measured rates of sea level rise are approximately twice the rates reported by tide gauges. There is currently no explanation for this discrepancy.

More recently, satellites are being used to measure the greening of the earth. A recent report by NASA states that the earth has been “greening” over the last 35 years, largely as the result of the increased atmospheric CO2 concentration. The increased CO2 not only increases plant growth, but also increases plant water use efficiency. This result confirms the results of previous studies of more limited regions.

Finally, satellites are now being used to track the emissions and dispersion of CO2 in the atmosphere. These CO2 visualizations clearly demonstrate that, while CO2 might eventually become a “well mixed trace gas”, it is initially concentrated downwind of its emissions sources, primarily in the northern hemisphere.

The enthusiasm of the climate science community for the far more comprehensive data available from the satellites has been predictably uneven. The climate science community has remained focused on the gap-laden, error-prone, multiply-“adjusted”, UHI-corrupted near-surface temperature anomaly records while virtually ignoring the satellite temperature anomaly records, which have recorded more limited anomalies. The climate science community has also focused on the buoy and ship-collected sea surface temperature anomalies, to the virtual exclusion of the satellite SST anomaly records.

On the other hand, the climate science community appears to prefer the satellite sea level rise data, which shows rates of sea level rise approximately double the rates measured by traditional tidal gauges. These satellite sea level rise data are more consistent with the sea level rise concerns put forward by the UNFCCC with support from the climate science community. However, even the satellite sea level rise records do not support the “increasing rate of sea level rise” meme put forward by the IPCC.

The climate science community has yet to provide any strong indication of its acceptance of the satellite-measured greening of the planet. Certainly, all other things being equal, a general “greening” of the planet would be expected as the result of increased atmospheric CO2. However, the climate catastrophe meme would suggest that the CO2 benefits would be offset by drought or flooding.

Similarly, the climate science community has yet to provide any strong indication of its reaction to the CO2 concentration visualizations provided by the NASA Orbiting Carbon Observatory.

The satellite era has provided multiple tools for collecting comprehensive global data with potential application to the analysis of climate and climate change. It is crucial that this data be collected diligently, analyzed objectively and reported openly, to maximize the advancement of climate science.

Science must inform policy, rather than policy leading science.

Tags: Temperature Record, Satellites

Perspective is Important

The graph below displays the entire history of the satellite temperature record, as analyzed by Drs. John Christy and Roy Spencer at the University of Alabama-Huntsville on behalf of NASA.

UAH Satellite-Based Temperature

The graph clearly shows the very strong El Nino events in 1997-1998 and 2015-2016, as well as the somewhat weaker El Nino event in 2009-2010. The graph suggests that the current El Nino has peaked; and, that temperatures will likely fall over the next several months. There is no reliable way to predict how far temperatures will fall. However, the transition toward La Nina conditions appears to have begun, with NOAA forecasting a moderate to strong La Nina, beginning in the late Spring to early Summer of 2016.

The current El Nino has, at least temporarily, halted the temperature “hiatus” or “pause” experienced over the past ~18 years. Discussions of the global average near-surface temperature increases reported by NCEI, NASA GISS and The Hadley Center have focused almost exclusively on the “warmest year ever” narrative, largely to the exclusion of discussion of the impact of the major El Nino event on those temperatures. However, as can be seen in the above graph, the temperature increases associated with the 1997-1998 and 2009-2010 El Ninos were essentially sharp spikes of relatively short duration, after which temperatures returned to very nearly the levels reported before the onset of the El Ninos. The anomaly increase in response to the current El Nino has followed the same characteristic sharp spike pattern; and, the decline toward the transition appears to be following the same pattern as well.

The graph below displays the El Nino to La Nina transitions as reflected in sea surface temperature anomalies in the eastern Pacific. Note that SSTs for both the 1997-1998 and the 2015-2016 El Nino events appear to have peaked in November of the year in which they developed. The 1997-1998 El Nino transitioned to La Nina conditions in June; and, the La Nina reached its negative peak in December of 1998. The current El Nino has, so far, followed a very similar pattern. The pace and extent of the decline in the SST anomaly will determine whether 2016 will replace 2015 as “the warmest year ever”.

Nino 3.4 Sea Surface Temperature Anomalies
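Indices like the Nino 3.4 anomaly shown above are conventionally smoothed with a three-month running mean, and values at or above +0.5 °C (or at or below -0.5 °C) are commonly treated as El Nino (or La Nina) conditions. A rough sketch of that smoothing and classification, using invented anomaly values rather than actual Nino 3.4 data:

```python
# Invented monthly Nino 3.4 SST anomalies (deg C); illustrative only.
anoms = [0.3, 0.6, 0.9, 1.2, 1.0, 0.6, 0.1, -0.4, -0.7, -0.9]

def running_mean3(series):
    """Three-month running mean, as conventionally applied to ENSO indices."""
    return [sum(series[i:i + 3]) / 3 for i in range(len(series) - 2)]

def classify(value, threshold=0.5):
    """Apply the commonly used +/-0.5 deg C thresholds for ENSO conditions."""
    if value >= threshold:
        return "El Nino"
    if value <= -threshold:
        return "La Nina"
    return "neutral"

smoothed = running_mean3(anoms)         # smoothed index values
labels = [classify(v) for v in smoothed]  # ENSO state for each value
```

In this invented series, the smoothed index passes from El Nino through neutral to La Nina conditions, the same qualitative transition described for 1997-1998 and anticipated for 2016.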

Tags: