
In the Wake of the News

The Truth, the Whole Truth and… Climate Change

Thousands of near-surface temperature data points are collected globally each day. These data points are freely available to each of the global near-surface temperature anomaly producers: NOAA, NASA GISS, the Hadley Centre/UEA and the Japan Meteorological Agency. Each of these anomaly producers selects from among the data, “adjusts” the data, homogenizes the data and, in the case of NASA GISS, “infills” missing data. Each agency produces a monthly global temperature anomaly calculation based on this data.

The global temperature anomaly calculations differ among the agencies because of differences in the reference periods, the data selected for inclusion, the “adjustment” processes and possible “infilling” of missing data. The anomalies are reported to two-decimal-place “precision”, typically with an uncertainty band of ±0.10ºC. However, according to a study by Dr. Patrick Frank of the Stanford Synchrotron Radiation Lightsource/SLAC at Stanford University, the reported uncertainty bands “have not properly addressed measurement noise and have never addressed the uncontrolled environmental variables that impact sensor field resolution”.
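The mechanics are simple, and a minimal Python sketch (function and variable names hypothetical) shows how the choice of reference period enters directly into every reported anomaly:

```python
import statistics

def monthly_anomaly(temps, year, month, ref_start=1981, ref_end=2010):
    """Anomaly = observed monthly mean minus the mean of the same month
    over the chosen reference period. Agencies use different reference
    periods, so their anomalies differ by a constant offset per month."""
    baseline = statistics.mean(
        temps[(y, month)] for y in range(ref_start, ref_end + 1)
    )
    return temps[(year, month)] - baseline
```

Changing `ref_start`/`ref_end` shifts every anomaly by the difference in baselines, which is one reason the agencies' monthly numbers do not match even before data selection and "adjustment" are considered.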

The uncontrolled environmental variables which affect sensors in field locations in the United States were surveyed and reported upon by the Surface Stations Project. Dr. Frank’s analysis focuses on the effects of measurement noise. Dr. Frank concludes that, for the common MMTS sensor, the total noise plus resolution lower-limit 1σ measurement uncertainty in an annual temperature anomaly referenced to a 30-year mean is ±0.46ºC for a well sited and maintained installation. He further estimates that, given the issues with actual field installations in the Global Historical Climatology Network, “a globally complete assessment of current air temperature sensor field resolution seems likely to reveal a measurement uncertainty exceeding ±0.46ºC by at least a factor of 2.”

The significance of these uncertainties is illustrated in the following graph, in which the black line is the NASA GISS historical temperature anomaly, as reported in 2010; and, the gray area represents the measurement uncertainty of ±0.46ºC about the reported historical temperature anomaly. “The trend in averaged global surface air temperature from 1880 through 2000 is statistically indistinguishable from zero (0)º Celsius at the 1σ level when this lower limit uncertainty is included, and likewise indistinguishable at the 2σ level through 2009.”

Assuming Dr. Frank is correct that actual field measurement uncertainty is ≥2 × 0.46ºC, the gray area would be twice as wide as shown in the graph at the 1σ level, rendering the trend in averaged global near-surface air temperature indistinguishable from zero (0)º Celsius through 2017.
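The arithmetic behind "statistically indistinguishable from zero" can be sketched crudely in code. This is an illustration of the logic only, not Dr. Frank's actual statistical method: compare the total change implied by a least-squares trend against k times the stated uncertainty.

```python
def trend_distinguishable(years, anomalies, sigma=0.46, k=1):
    """Return True if the total change implied by the least-squares
    trend over the record exceeds k times the 1-sigma uncertainty."""
    n = len(years)
    ybar = sum(years) / n
    abar = sum(anomalies) / n
    slope = (sum((y - ybar) * (a - abar) for y, a in zip(years, anomalies))
             / sum((y - ybar) ** 2 for y in years))
    total_change = slope * (years[-1] - years[0])
    return abs(total_change) > k * sigma
```

In this crude sense, a total change smaller than k·σ is indistinguishable from zero at the kσ level; doubling σ doubles the change required before any trend can be claimed.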


Temperature Anomaly

Dr. Frank concludes as follows:

“Future noise uncertainty in monthly means would greatly diminish if the siting of surface stations is improved and the sensor noise variances become known, monitored, and empirically verified as stationary. The persistent uncertainty due to the effect of uncontrolled microclimatic variables on temperature sensor resolution has, until now, never been included in published assessments of global average surface air temperature. Average measurement noise and the lower limit of systematic sensor errors combined to yield a representative lower limit uncertainty of ±0.46ºC in a 30-year mean annual temperature anomaly. In view of the problematic siting record of USHCN sensors, a globally complete assessment of current air temperature sensor field resolution seems likely to reveal a measurement uncertainty exceeding ±0.46ºC by at least a factor of 2. The ±0.46ºC lower limit of uncertainty shows that between 1880 and 2000, the trend in averaged global surface air temperature anomalies is statistically indistinguishable from 0ºC at the 1σ level. One cannot, therefore, avoid the conclusion that it is presently impossible to quantify the warming trend in global climate since 1880.


“Finally, the relatively large uncertainty attending the global surface instrumental record means that the centennial temperature trend is not a precision target for validation tests of climate models. Likewise, the current surface instrumental record cannot credibly be used to train or renormalize any physically valid proxy reconstruction of paleo-temperature with sufficient precision to resolve any temperature difference less than at least 1ºC, to 95% confidence. It is thus impossible to know whether the rate of warming during the 20th century was climatologically unprecedented, or to know the differential magnitude of any air temperature warmer or cooler than the present, within ±1ºC, for any year prior to the satellite era. Therefore previous suggestions, that the rate or magnitude of present climate warming is recently or millennially unprecedented, must be vacated.”


These conclusions paint a very different picture of the state of our understanding of global temperature change than the anomaly changes reported to two decimal place “precision” by the producers of the global near-surface temperature anomaly products. They strongly support the arguments for development of a global land surface climate fiducial reference measurements network.


Tags: Global Historical Climate Network, Global Temperature, Temperature Record

Highlighted Article: Opening Up the Climate Policy Envelope

From: Issues in Science and Technology

By: Roger Pielke Jr.

Opening Up the Climate Policy Envelope

Fudged assumptions about the future are hampering efforts to deal with climate change in the present. It’s time to get real.


Tags: Highlighted Article

Replication and Climate Science

One of the potential approaches to dealing with the Irreproducibility Crisis of Modern Science is replication of the experiment. Replication of an experiment is far more expensive than merely attempting to reproduce its results by reanalyzing the existing data and the methods used to analyze that data, and reaching conclusions based upon them.

Climate science does not offer the opportunity to replicate the “experiment” which is the ongoing, chaotic climate. The fundamental data required to analyze the earth’s weather and ultimately its climate are collected “on-the-fly”, using a variety of different instruments to measure specific aspects of the current conditions. The scientist has no control over the ongoing “experiment”; and, thus, has no ability to replicate the conditions at any prior time to permit replication of data collection or collection of additional data which might improve the analysis. Once this minute, hour, day, week, month or year has passed, it is over, never to be repeated.

Climate science can improve its ability to analyze the ongoing “experiment” by expanding instrument coverage, utilizing more accurate instruments, deploying multiple sensors at each sensor location to monitor for instrument drift and failure, collecting data more frequently to increase the granularity of the data, etc. The US Climate Reference Network is an example of the deployment of more accurate, multiple sensor monitoring sites with more frequent data collection. The application of satellite-based technology to the measurement of temperatures and sea level is an example of expanding instrument coverage.

However, data which is not collected because there is no instrument deployed, or because an instrument has failed, cannot be replicated. It is forever unavailable. Similarly, data which is inaccurate because of changes in the site metadata or instrument drift is forever inaccurate. Climate science routinely “adjusts” data known or believed to be inaccurate, creating estimates of what the data might have been, had it been collected from properly selected, calibrated, sited, installed and maintained instruments. Some climate science also “infills” missing data, in instances where no sensor has been installed or an installed sensor has failed, again creating estimates of what the data might have been, had it existed. Even if “adjustments” and “infilling” produce accurate estimates, they cannot produce data.
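What "infilling" means in practice can be shown with a toy sketch. Linear interpolation between neighbors is used here purely for illustration; the real anomaly products use spatial methods based on nearby stations, so the point is only that the result is an estimate, not data.

```python
def infill_linear(series):
    """Toy infilling: estimate missing (None) values by linear
    interpolation between the nearest reported neighbors.
    The filled values look like measurements but are estimates."""
    filled = list(series)
    for i, v in enumerate(filled):
        if v is None:
            lo = next(j for j in range(i - 1, -1, -1) if filled[j] is not None)
            hi = next(j for j in range(i + 1, len(filled)) if filled[j] is not None)
            w = (i - lo) / (hi - lo)
            filled[i] = filled[lo] * (1 - w) + filled[hi] * w
    return filled
```

However plausible the filled values appear, their accuracy depends entirely on the interpolation assumptions, which is precisely the distinction drawn above between estimates and data.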

The satellite era has introduced yet another challenge to climate science, which bears aspects of both reproducibility and replication. Satellites are being used to measure atmospheric temperature, sea surface temperature and sea level. The two primary groups analyzing atmospheric temperature, the University of Alabama in Huntsville (UAH) and Remote Sensing Systems (RSS), have access to the same data from the same satellites, yet their analyses produce differing results. Climate science is currently unable to resolve differences between near-surface and satellite-based temperature measurements. Climate science is also currently unable to resolve the differences between the sea level rise measurements made by surface-based instruments and the contemporaneous measurements made by satellite-based instruments.

The one aspect of climate science which is capable of both reproducibility and replication is exercising of the ensemble of climate models. Climate scientists can analyze the outputs of individual climate model runs and arrive at reproducible conclusions. Climate scientists can also replicate individual model runs using the same inputs and replicate the conclusions. However, that is a trivial result.

Climate science has produced an ensemble of climate models. However, these models, provided the same inputs, will not produce the same results. Therefore, it is not possible that more than one of these models actually models the real climate; and, it is possible that none of the models actually models the real climate. Also, any individual model, provided the range of values of climate sensitivity, forcings and feedbacks, will not produce the same results. Therefore, it is not possible that all of the values of climate sensitivity, forcings and feedbacks are accurate; and, it is possible that none of the values within the ranges are accurate.

The current ensemble of climate models has not been in use for the 30-year period which defines climate according to the World Meteorological Organization, so it is not yet possible to determine whether any combination of model and model inputs is accurate or has any predictive ability. The best known climate model which has been in use for the 30-year climate period is the model by Dr. James E. Hansen, former Director of NASA GISS, which accompanied his 1988 presentation to Congress. This model projected future temperatures significantly higher than the temperature observations over the 30-year period.

In the shorter term, it has become obvious that the current ensemble of climate models is “running hot”, creating scenarios of potential future warming which are inconsistent with the observations made since the models were run. It does not reflect well upon climate scientists, climate science, or the organizations funding climate studies that these unverified models and their uncertain inputs are being used to produce “scary scenarios” of potential future climate catastrophes. Arguably, the creation of these “scary scenarios” is an exercise in climate research as a self-fulfilling prophecy.

             “In the last forty years governments have become interested in universities’ finding academic support for what they are proposing or have in place. We are in an era of ‘policy-based evidence’. We are also in an era of a particular political correctness, where it is very difficult indeed to get funds for research if the purpose of the research seems antithetical to current government policy. ‘Curiosity-directed research’ now comes with some serious barriers.” Don Aitkin, former Chairman of Australia’s National Capital Authority and former Vice-Chancellor and President of the University of Canberra


Tags: Climate Science, Climate Models, Estimates as Facts

Highlighted Article: Paris Lives! “Deep Decarbonization” at DOE

  • From: MasterResource
  • By: Mark Krebs

Paris Lives! “Deep Decarbonization” at DOE


"Sir Isaac Newton’s laws of motion include: Every object will remain at rest or in uniform motion in a straight line unless compelled to change its state by the action of an external force. This article examines that force within the “swamp” of climate change policies in DOE.

"Despite President Trump’s announcement that the U.S. would withdraw from the Paris Agreement, the basis of that agreement–“deep decarbonization” through “beneficial electrification”–is proceeding virtually unabated. This is because deep decarbonization serves the purposes of the electric utility industry and their environmentalist allies, e.g., the Natural Resources Defense Council (NRDC)."


Paris Lives! “Deep Decarbonization” at DOE

Tags: Highlighted Article

Some Irreverent Independent Thoughts on Independence Day 2018

The United States was on the “bleeding edge” of the end of colonialism in 1776. The US is the most successful of the ex-colonies, but has not attempted to use that success to become a colonial power. The US voluntarily participated in several wars in opposition to foreign nations seeking to take control of other nations by force. The US also played a major role in the breakup of the Soviet Union, which had “colonialized” Eastern Europe and had aspirations of “colonializing” the rest of the world. The citizens of the US have paid a heavy price to secure their own freedom and to assist the citizens of other nations to gain and secure their freedom.

Colonialism has begun to rear its ugly head again, though it now calls itself globalism and seeks global governance. The driving force for globalization and global governance is the United Nations, which was founded to assist nations in resolving conflicts peacefully. However, the UN has aggressively pursued “mission creep” and, at times, “mission leap”, as is common for unelected bureaucracies seeking greater influence and power.

The current “cause célèbre” for the UN is climate change. Its primary vehicle in pursuit of this cause is the UN Framework Convention on Climate Change (UNFCCC). Its primary tool is the Intergovernmental Panel on Climate Change (IPCC), which studies the science regarding anthropogenic influences on global climate and prepares a periodic Assessment Report and accompanying Summary for Policymakers. Their efforts have led to development and promulgation of a political / scientific “consensus” that emissions of CO2, CH4 and other “greenhouse gases” are the sole, or at least the principal, cause of recent global warming; that this anthropogenic warming and other potential climate changes caused or aggravated by the warming would inevitably lead to a global climate catastrophe if not abated; and, that urgent actions must be taken to reduce the emissions of these gases to avert the potential catastrophe.

The UN membership has developed a series of accords intended to unite the member nations in this effort, beginning with the Rio Accords, followed by the Kyoto Accords and most recently by the Paris Accords. The UN Secretariat and the UNFCCC have sought to cast these accords as binding treaties, but US administrations have chosen not to submit these accords to the US Senate for ratification, because they were certain that the Senate would not ratify them. The UN Secretariat and the UNFCCC remain convinced that the most recent Paris Accords must become binding on all parties, not only regarding commitments to reduce emissions, but also regarding technology transfers and funding commitments by the developed nations.

The UN Green Climate Fund was originally to begin transferring $100 billion per year from the developed nations to the developing and the not-yet-developing nations by 2020, of which approximately 25% was to be provided by the US. The UNFCCC envisions that commitment growing to $425 billion per year, again with the US providing approximately 25%. The UN Secretariat envisions that administration of compliance with the emissions reductions commitments of the parties and of the collection and distribution of the Green Climate Fund would require a level of global governance by the UN Secretariat. This global governance would obviously require surrender of a degree of national independence on the part of the participating nations, particularly those transferring technology and providing funding.

While the previous US administration appeared more than willing to surrender this degree of sovereignty to the UN, the current US administration appears totally unwilling to do so. US withdrawal from the Paris Accords and defunding of the UNFCCC and the Green Climate Fund is a symbolic declaration of the US intention to remain a sovereign nation. It is also a refusal to participate in the surrender of sovereignty by other nations to the UN bureaucracy. Regrettably, the Administration has not yet terminated its participation in the UNFCCC, as required by current US law as the result of the UNFCCC recognition of the Palestinian Authority as a “state level” participant. This step is essential to communicate to the UN that the US will function as a sovereign state and will participate in UN activities only to the extent that they are in compliance with US law.

US sovereignty and freedom were too hard won to be squandered on globalism.


Tags: Paris Agreement, Global Governance, Green Climate Fund, United Nations

Highlighted Article: William Niskanen on Climate Change

  • From: MasterResource
  • By: Robert Bradley Jr.

William Niskanen on Climate Change


A six-part series on the climate views of the late William Niskanen, taken from his Fall 1997 symposium essay, “Too Much, Too Soon: Is a Global Warming Treaty a Rush to Judgment?” as well as his 2008 postscript. Previous posts are:


William Niskanen on Climate Change


Tags: Highlighted Article

How shall I call thee? “catastrophic anthropogenic climate change skeptic”

The consensed climate science community has been searching for the most effective epithet to use to refer to those who question their consensus regarding the human causation of climate change and the climate catastrophe they believe will result. However, until now, none of the epithets they have elected to use have had the desired effect of humiliating and isolating their intended victims.

The most accurate term for describing those who question the consensus is skeptics. Some are skeptical that CO2 and other “greenhouse gases” are the cause of the recent climate warming. Others are skeptical that those gases are a significant cause of the current warming. Yet others are skeptical that the recent warming would lead to a climate catastrophe. Their skepticism is reasonable, since there is no observational evidence to support the assertions of which they are skeptical.

There is clear paleoclimatic evidence that global climate changed, both warming and cooling, prior to the existence of instrumental measurements of temperature and other aspects of climate; and, prior to the period in which human emissions of CO2 and other “greenhouse gases” were sufficient to have any measurable influence on global climate. There is also clear instrumental evidence that global climate has changed, both warming and cooling, since the advent of instrumental records. There is, however, no observational evidence which permits measurement of the fraction of these changes which is the result of human activity, whether emissions or land use changes.

There is also clear evidence in the instrumental temperature record of the persistence of natural changes in climate, particularly temperature. The rapid increases and decreases in global average temperature anomalies resulting from El Niño and La Niña events are perhaps the strongest evidence of the persistence of natural variation. The effects of longer time scale variations, such as the reversals of the Pacific Decadal Oscillation and the Atlantic Multidecadal Oscillation, are also becoming more obvious as they recur during the instrumental measurement period.

Arguably, the least accurate term used to describe skeptics is “climate denier”, since virtually none of the skeptics actually deny that the earth has a climate. The term “climate change denier” is almost as inaccurate, since very few of those skeptical of the climate change consensus actually deny that earth’s climate has changed in the past and continues to change in the present. However, the consensed climate science community apparently cannot be bothered to use the expression “catastrophic anthropogenic climate change skeptic”, which is probably the most accurate and comprehensive description of their intended victims.

Recently, climatologist Katharine Hayhoe has suggested that the consensed climate science community switch to the term “climate dismissives”. “I think that’s the perfect term,” Hayhoe said, “because a dismissive person will dismiss any evidence, any arguments with which they’re presented, because dismissing the reality of climate change and the necessity for action is such a core part of their identity that it’s like asking them to almost cut off an arm. That’s how profound the change would be for them to change their minds about climate change.” She believes that the term “skeptic” suggests a scientific willingness to learn and accept, while the term “denier” is almost toxic.

Hayhoe apparently believes that the science is “settled”, that the evidence for human causation is compelling and that the catastrophe is inevitable without drastic action. She appears to be “dismissive” of alternative opinions and the observations which support them. In light of the recent recognition of the poor quality of the climate data and the inaccuracy of the climate models, one wonders if Hayhoe is in “denial”.


Tags: Climate Skeptics, Climate Change Debate, Silencing the Skeptics

Full Disclosure of Climate Research

One of the hallmarks of scientific research is the reproducibility of research results by other researchers. However, reproducibility is extremely difficult, if not impossible, if all of the data, all of the analytical approaches, all of the assumptions and all of the computer models employed in the research project are not thoroughly documented and made available with the results of the research. Research is currently facing a reproducibility crisis.

This can be a very difficult issue with privately funded research intended to lead to commercial sale of products and/or services based on the research, since the individual or organization funding the research is seeking competitive advantage in its markets. There is no obvious benefit to assisting potential competitors in achieving the same research results and thus positioning them to compete at far lower research risk and cost. US patent law is intended to protect the results of such privately funded research.

However, this issue should not be at all difficult in the case of government funded research, since the results of the research become the property of the funding government. Government funding agencies should insist that all research program documentation be delivered by the contractor prior to payment for the research. That requirement would assure the opportunity for other researchers to reproduce the research results, or to falsify the research results.

Climate science has been plagued with a reproducibility issue, which was highlighted in the Climategate e-mails. Perhaps the most egregious example from that time was the suggestion by Dr. Phil Jones, Director of the Climatic Research Unit at the University of East Anglia, that he would destroy data rather than provide it to the team of Steve McIntyre and Ross McKitrick for analysis of the validity of the statistical analyses used in the research.

Climate scientists have frequently forced those seeking to reproduce or falsify their research results to resort to FOIA (Freedom of Information Act) requests and even lawsuits to obtain the documentation of their research. It seems unsupportable and ridiculous that such efforts are required to obtain documentation of research projects funded by government agencies. Perhaps the most egregious recent example is the ongoing efforts to obtain the documentation supporting the development of the hockey stick by Dr. Michael Mann and Dr. Mann’s ongoing efforts to delay discovery in his lawsuit against Rand Simberg, Mark Steyn, National Review and the Competitive Enterprise Institute.

This issue also extends to research conducted by government agencies, such as NOAA, NCEI and NASA GISS. Dr. Thomas Karl of NCEI has been accused of failure to properly archive the documentation supporting Karl et al 2015, “Possible artifacts of data biases in the recent global surface warming hiatus”. NCEI management initially resisted providing information regarding the study to a committee of the US House of Representatives, though NCEI is a federal government agency funded by congressional action.

The most recent related controversy regarding this issue involves the use of “secret science” by US EPA and Administrator Pruitt’s intent to end the use of such science in formulating EPA’s environmental regulations.

There appears to be no obvious justification for restricting access to the documentation supporting government funded research of any type, though it is reasonable to restrict access to the personal data of individuals who were the subjects of the research, which has been an issue in the recent EPA controversy.


Tags: Climate Science, Policy, Peer Review

Highlighted Article: Judith Curry - State of the Climate Debate

State of the Climate Debate

By: Judith Curry

  1. Cover
  2. Agreement / Disagreement
  3. Disagreement: Causes of climate change
  4. Elephant
  5. Disagreement: Cause of climate change
  6. Policy cart before scientific horse
  7. You find what you shine a light on
  8. The sea level rise alarm
  9. Is CO2 the control knob for global sea level rise?
  10. What is causing recent sea level rise?
  11. Variations in Greenland glacier mass balance
  12. IPCC AR5 quotes on sea level rise
  13. To what extent are man-made CO2 emissions contributing to climate change?
  14. Should we reduce emissions to prevent warming?
  15. Climate pragmatism
  16. Madhouse effect
  17. Personal statement

State of the Climate Debate


Tags: Highlighted Article

Climate Change Messaging

The consensed climate science community continues to search for an effective messaging approach which would convince the general public of the rightness and urgency of its cause and propel a concerted public effort to control the climate. The various messaging approaches pursued to date have been ineffective in achieving a sufficient level of climate hysteria to overcome continued apathy and skepticism.

Once the transition from the global cooling concerns of the 1970s to global warming concerns had been completed, it rapidly became obvious that the public was no more concerned about a little warming than they had been about a little cooling. Obviously, The Day After Tomorrow and An Inconvenient Truth were just not adequate to the task.

Global warming then became global climate change, which broadened the concept from temperature to inclusion of any and all abnormal or extreme weather events, including heat waves, cold waves, droughts, heavy rains, heavy snows, floods, hurricanes, typhoons, tornadoes, glacial retreat, rising sea level, ocean “acidification” and coral bleaching. Virtually all unusual weather events were attributed to climate change, which led to the frustrated observation that “Weather is only climate when it’s hot or when people die.”

It has been common practice for decades to assign names to tropical cyclones. However, the enhanced focus on weather and climate has now resulted in the assignment of names to winter storms. It has also introduced new terms to the weather lexicon, including Polar Vortex and Bomb Cyclone; and, the application of pejoratives, such as Snowmageddon. Climate change has also been referred to as Climate Weirding, Climate Apocalypse and Climategeddon.

We have been told of climate tipping points, beyond which recovery to “normal” conditions would be impossible. We have heard various brief periods of time referred to as deadlines for dramatic climate action to avoid imminent catastrophe. We have been regaled with aspirational “goals”, such as keeping warming below 2°C, or even better below 1.5°C.

We have been told that the science is settled, though it has recently become abundantly clear that it is very unsettled. Those who question the orthodoxy of the consensed climate science community are referred to with pejoratives such as climate denier, climate change denier, anti-science and climate zombie, though they typically deny nothing. There have even been calls to silence, institutionalize, prosecute, persecute and kill “deniers”, based on the assertion that they represent a danger to public health and safety.

There was great hype associated with the runup to the Paris Accords, which were proclaimed to be the last, best hope of salvation from the impending climate change catastrophe. However, following the US declaration of intent to withdraw from the Paris Accords, the consensed climate science community is now raising concerns that the commitments contained in the Paris Accords are insufficient to avoid climate catastrophe. Interestingly, these concerns are accompanied by assurances that the objectives of the Accords can be achieved without the participation of the United States.

The most recent changes to the messaging suggest that the climate change discussion must be divorced from politics, which is judged to be a divisive influence. The “elites” admired by the unconvinced must be enlisted to gain their acceptance and support. However, it is difficult to separate politics from climate change when the ultimate goal of the consensed political class is a transition from capitalism to a socialist/communist global cooperative.


Tags: Climate Change Debate

Highlighted Article: A Crusade in Pursuit of a Fantasy

A Crusade in Pursuit of a Fantasy


Executive Summary


The sun at the center of our solar system is the source of virtually all of the thermal energy the planets receive. Their distance from the sun determines the incident solar energy received by each of them. Their atmospheres determine the fraction of the incident solar radiation which reaches each planet’s surface; and, the fraction of the incident solar radiation which is reradiated by each planet’s surface. This energy balance determines the temperature near each planet’s surface. Our focus is on the Earth, the “water planet”.

The earth rotates around its own axis, ...

A Crusade in Pursuit of a Fantasy


Tags: Highlighted Article

Climate Change Attribution

Earth’s climate system is extremely complex, chaotic and not particularly well understood. The factors which have caused climate change over the past millennia, which continue to cause climate change today and will likely continue to do so in the future, do not operate independently, but rather interact with each other over differing timescales to produce the effects we are able to detect and measure.

The consensed climate science community is focused on the anthropogenic factors which it believes cause or contribute to climate change, primarily the increased atmospheric concentrations of CO2 and other greenhouse gases. However, these anthropogenic factors operate against a background of numerous, pre-existing, complex natural factors which also cause or contribute to climate change. Therefore, as complex as detecting climate change might be, attributing climate change to the plethora of natural and anthropogenic causative or contributing factors is far more complex.

Anthropogenic factors are not generally believed to cause hurricanes, typhoons, tornadoes, severe storms, droughts, floods, heat or cold waves, heavy snowfalls, sea level rise, etc., because all of these weather events occurred long before ~1950, the point after which increased atmospheric CO2 concentrations are believed to have begun affecting climate. Rather, anthropogenic effects are typically alleged to make these events more frequent, more severe or more damaging.

Since there are no identified climate change effects which demonstrably have anthropogenic factors as their sole cause, there is growing interest in attribution studies intended to identify or estimate the relative impacts of the various natural and anthropogenic factors which contribute to climate change. However, the primary weakness of this approach is its reliance on unverified climate models and the undefined climate sensitivities, forcings and feedbacks used as inputs to those models. It is not currently possible to measure or otherwise document anthropogenic impacts.

Attribution has recently been at issue regarding the extent to which anthropogenic climate change might have worsened the effects of Hurricanes Sandy, Harvey, Irma and Maria. Interestingly, no such issues were raised regarding the preceding 12-year period with no landfalling major hurricanes. Arguably, a 12-year period with no landfalling major hurricanes is at least as unusual as a year (2017) with three major landfalling hurricanes.

The weakness of the current state of attribution studies is highlighted by the range of estimates of anthropogenic impacts on various events studied. World Weather Attribution scientists have estimated that climate change made Hurricane Harvey 3 times more likely and its rainfalls 15% more intense. Other sources estimate that tropical cyclones might be 2-11% more intense by 2100.
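For context, attribution studies commonly translate such a probability ratio into a “fraction of attributable risk” (FAR). A minimal sketch, using the cited tripling purely as an illustrative input:

```python
# Fraction of attributable risk (FAR) from a probability ratio (PR).
# If an event is PR times more likely under anthropogenic forcing than
# without it, FAR = 1 - 1/PR is the fraction of the event's probability
# that the study attributes to that forcing.

def fraction_attributable_risk(probability_ratio: float) -> float:
    return 1.0 - 1.0 / probability_ratio

far = fraction_attributable_risk(3.0)  # the cited "3 times more likely"
print(round(far, 2))  # → 0.67
```

Note that the FAR figure is only as good as the modeled probability ratio behind it, which is the point made above.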

Attribution studies attempt to evaluate the likelihood that anthropogenic climate change might impact the frequency or severity of natural weather events; and, the likely magnitude of those impacts. However, these studies are severely limited by their reliance on unverified climate models for their estimates. This again emphasizes the importance of improving and verifying the climate models, to ensure that they are actually modeling the real climate. Estimates of the potential impacts of anthropogenic effects on weather events in a make-believe climate are worse than useless.


Tags: Climate Models, Climate Science

Climate Change: “Predictions are hard …”

            “Predictions are hard, especially about the future.” Yogi Berra, American philosopher

The consensed climate science community and its animated spokespersons have made numerous predictions regarding future climate change and its impacts on the earth and its population. Initially, many of these predictions were made for the near future, within the expected lifespans of the predictors and their audiences. Many of these short-time-frame predictions have proven to be erroneous; and, have become a significant embarrassment to those who made the predictions.

Some notable examples of such erroneous short-term predictions include:

  • an ice-free Arctic Ocean and an ice-free North Pole;
  • inundation of coastal areas and islands;
  • massive crop failures and starvation;
  • more frequent, longer and more severe droughts;
  • more frequent heavy rain events and more severe flooding;
  • more numerous and intense hurricanes, typhoons and tornados;
  • massive numbers of climate refugees; and,
  • widespread climate change induced deaths.

I suspect that only those who made the erroneous predictions might regret that they were erroneous.

The consensed climate science community has responded to this record of erroneous short-term predictions by vastly extending the time frame of its predictions to periods beyond the expected lifespans of the predictors and their audiences. Predictions of potential conditions or events in 2100 and beyond are far less likely to embarrass those making the predictions. However, these long-term predictions have also proven to be far less effective in inducing action on the part of their audiences.

All these predictions, regardless of time-frame, are based on various climate models using various climate sensitivity, climate forcing and climate feedback assumptions. However, members of the climate science community have recently acknowledged that the climate models are “running hot”; and, that the data against which the models have been hindcast to “tune” them are suspect. “Re-tuning” the climate models, even to the existing “adjusted” data, would reduce the magnitude of any predicted effects of climate change.

Recent research suggests that, at the current rate of increase of atmospheric CO2 concentrations, the feared doubling of atmospheric CO2 concentrations would not occur until approximately 2100. Most of the “scary scenarios” currently predicted by the climate models are based on the IPCC Representative Concentration Pathway 8.5 (RCP8.5), which projects a far more rapid increase in atmospheric CO2 concentrations. However, it is becoming increasingly obvious that RCP8.5 is unrealistic; and, perhaps impossible.
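The doubling arithmetic can be sketched as follows; the current concentration and growth rate below are assumed round figures for illustration, not measurements:

```python
# Back-of-the-envelope doubling time at a constant linear growth rate.
# "Doubling" is conventionally measured from the pre-industrial level.
PREINDUSTRIAL_PPM = 280.0     # widely cited pre-industrial CO2 concentration
current_ppm = 405.0           # assumed late-2010s order of magnitude
growth_ppm_per_year = 2.0     # assumed recent linear growth rate

doubled_ppm = 2 * PREINDUSTRIAL_PPM  # 560 ppm
years_to_doubling = (doubled_ppm - current_ppm) / growth_ppm_per_year
print(years_to_doubling)  # → 77.5, i.e. late in the 21st century
```

Even modest variations in the assumed growth rate shift the result by only a decade or two, which is why the doubling lands near 2100 rather than mid-century under a linear extrapolation.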

Regrettably, the climate models also failed to predict numerous positive climate-related events, including:

  • the ~20-year “hiatus” or “pause” in global warming;
  • the recent 12-year dearth of land-falling major hurricanes;
  • the declines in weather/climate related damage and death;
  • the documented greening of the globe; and,
  • the positive effects of increased CO2 on plant growth and production.

Hopefully, the recent recognition of the shortcomings of the climate data and the climate models will result in serious efforts to improve the comprehensiveness and quality of the climate data and to improve and eventually validate the climate models.


Tags: Climate Models, CO2 Emissions