
In the Wake of the News

Climate Change Questions

Judge William H. Alsup of the United States District Court for the Northern District of California, San Francisco Division, will hear the case “The People of the State of California v. B.P. P.L.C., et al.” This case is a civil suit against five major oil companies for alleged present and potential future damages caused by global warming and climate change.

The judge has invited a tutorial on the best science available on global warming and climate change as a prelude to the trial. This tutorial is to be provided by both the plaintiffs and the defendants. The court has also received an Administrative Motion for leave to submit a tutorial presentation from climate scientists William Happer, Steven Koonin and Richard Lindzen. The Science and Environmental Policy Project (SEPP) is pursuing the possibility of filing an amicus brief with the court as well, since it believes it has been slandered in the City of Oakland complaint which is part of the suit.

The Happer / Koonin / Lindzen (HKL) presentation responds point by point to the questions raised by the judge to be responded to in the tutorial. The overview section of the HKL presentation makes the following points:

  1. “the climate is always changing; changes like those of the past half century are common in the geological record, driven by powerful phenomena;
  2. human influences on the climate are a small (1%) perturbation to natural energy flows;
  3. it is not possible to tell how much of the modest recent warming can be ascribed to human influences; and,
  4. there have been no detrimental changes observed in most salient climate variables and projections of future changes are highly uncertain.”

 

The presentation then provides detailed responses to the questions posed by the judge.

 

The SEPP brief would likely focus on the “standards of evidence: direct or indirect; physical or bureaucratic.”

 

“Physical evidence is hard data showing CO2 is the primary cause of global warming. Increasing emissions, changing climate, etc. are not physical evidence of cause. Bureaucratic evidence includes global climate models that fail basic testing, and group think such as organizations that fail to address the key issue in their reports, etc. The key issue is: do carbon dioxide emissions cause dire warming of the atmosphere? SEPP’s answer is no, and CO2 emissions are beneficial to humanity and the environment.”

 

SEPP would be expected to stress that:

 

  1. there are no hard data on the contribution of anthropogenic CO2 emissions to recent climate warming or sea level rise;
  2. the available temperature data are of questionable accuracy and provide incomplete global coverage;
  3. the available climate models are unverified, are acknowledged to be unrepresentative of the actual climate and have demonstrated no predictive ability; and,
  4. the available sea level rise data show no acceleration during the period of interest (post 1950).

 

Viscount Christopher Monckton of Brenchley and a group of scientist colleagues also filed a brief with the court, asserting that: “the Court should reject Plaintiff’s case and should also reject those of Defendants’ submissions that assert that global warming is a serious problem requiring urgent mitigation.” This brief is most likely based on a recent scientific paper the group has submitted for publication.

 

Dr. Judith Curry has posted a detailed response to the eighth question posed by the judge for the tutorial at her website. It is unclear whether Dr. Curry intends to file a response with the court.

 

There is no current information regarding potential filings by other skeptical climate scientists, though that is certainly a possibility. Since much of the concern expressed by the plaintiff regards the potential future impacts of sea level rise, separately and in combination with more frequent and stronger storms, a presentation on these issues by Dr. Roger Pielke, Jr. might be a significant contribution.

 

“I don’t know of any judge who has asked for a tutorial like this,” said Steven E. Koonin, a physicist and former Energy Department undersecretary known for his contrarian views on global warming research. “I think it is a great idea. Anybody having to make a decision about climate science needs to understand the full spectrum of what we know and what we don’t know.”

 

“There are known knowns. These are things we know that we know. There are known unknowns. That is to say, there are things that we know we don't know. But there are also unknown unknowns. There are things we don't know we don't know.” Donald Rumsfeld

Tags: Climate Change Lawsuits

Climate Change Highlights and Lowlights from our first 100

Recent research suggests far lower climate sensitivity to CO2 doubling.

Temperature anomaly producers still “adjusting” flawed temperature measurements.

Climate modelers finally acknowledge that models are “running hot”.

No climate model has yet been verified or demonstrated predictive skill.

Satellites document global greening. CO2 improves growth rates and water management.

Large disparity between satellite and tide gauge sea level rise measurements persists.

Recent research shows no linkage between climate change and extreme weather events.

Climate model based “scary scenario” studies dominate media coverage of climate change.

Recent research documents solar influence on earth’s climate and cloud formation.

UNFCCC already claiming that the Paris Accords do not go far enough to reduce CO2.

The US has begun its exit from the Paris Accords and the Green Climate Fund (GCF).

The UNFCCC seeks to increase GCF funding from $100 Billion to $425 Billion per year.

The US has terminated contributions to the Green Climate Fund.

The US has not begun its exit from the UNFCCC, as required by US law.

US EPA has proposed an open debate on the status of climate science.

US EPA has not begun efforts to remove the 2009 Endangerment Finding.

US EPA has disavowed “Sue and Settle” approach to environmentalist lawsuits.

Environmentalists are threatening numerous lawsuits against US EPA over Clean Power Plan.

Recent research documents that renewables increase electricity costs, despite claims.

Meeting US energy demand with wind turbines would require ~2.3 million 3MW turbines.

Energy efficiency of US economy continues to increase.

Meeting US energy demand with solar would require ~12 million square miles of collectors.

Solar and wind equipment efficiency is increasing; and, equipment cost is decreasing.

Storage technology to support full renewable transition is not currently economically available.

 

Tags:

Climate Change Dissension

The consensed climate science community and the globalist political community have been partners in a mutual adoration society for several decades. This relationship culminated in the signing of the Paris Accords, which committed the signatories to a globalist response to the impending climate crises envisioned by the climate model scenarios generated by the consensed climate science community. The dream of decades appeared to be within reach.

However, the inauguration of an openly skeptical Administration in the US and its subsequent decision to withdraw the US from the Paris Accords has caused dissension in the ranks of the mutual adoration society. The US has been the primary source of funding for both the consensed climate science community and the globalist political community. However, the current US Administration has reduced funding for the advancement of both the climate science consensus and the globalist political consensus.

The new skepticism on the part of the major source of climate research funding has caused the consensed climate science community to acknowledge the technical weaknesses of both the Global Historical Climate Network (GHCN) process for collecting and analyzing near-surface temperatures and the use of the current ensemble of climate models for the production and analysis of potential future climate scenarios.

The consensed climate science community has recently called for the construction of a global near-surface temperature measurement network similar to the US Climate Reference Network. The consensed climate science community has also recently acknowledged that the current ensemble of climate models is “running hot”, producing potential future scenarios with temperature anomalies two to three times the anomalies present in the “adjusted” near-surface observations.

It is unlikely that the call for a global Climate Reference Network is totally altruistic, in that there has been no significant effort to extend the existing GHCN measurement program to areas where there is currently no measurement activity. It appears unlikely that nations which have been unwilling to install and operate the far less expensive GHCN measurement stations would be willing to install the far more expensive Climate Reference Network measurement stations. However, the pursuit of such a program would continue the efforts of the consensed climate science community to ignore or minimize the importance of the satellite tropospheric temperature measurements, which are made in the region of the atmosphere in which CO2 absorption of outgoing infrared radiation actually occurs.

It appears equally unlikely that the acknowledgment of the shortcomings of the current climate models is altruistic. Rather, it appears that this acknowledgment is a response to the growing realization that the models are not, in fact, modeling the real climate, largely as the result of the use of unrealistically large climate sensitivity estimates; and, perhaps also, the use of incorrect cloud forcings.

Recent research suggests that climate sensitivity to CO2 is at or perhaps even below the low end of the range of values used by the IPCC. Recent research also suggests that the Representative Concentration Pathway (RCP 8.5) used in the climate models to produce many of the failed “scary scenarios” is totally unrealistic.

 

Tags: Temperature Record, Global Temperature, Global Governance, Paris Agreement, Global Historical Climate Network (GHCN)

Sea Level Rise “Settled Science”

“There is a total absence of data supporting the notion of a present sea level rise; on the contrary all available facts indicate present sea level stability. On the centennial timescale, there was a +70 cm high level in the 16th and 17th centuries, a -50 cm low in the 18th century and a stability (with some oscillations) in the 19th, 20th and early 21st centuries.” Dr. Nils-Axel Mörner

“Over 1000 of the world’s Tide Gauges show pure linear trends, along with minimal (mostly thermal expansion and glacial melt) increases. There are none showing any acceleration of Sea Level rise rate in tectonically inert areas.” Thomas Wysmuller, former NASA meteorologist

“The rate of sea level change was found to be larger in the early part of last century (2.03 ± 0.35 mm/yr 1904–1953), in comparison with the latter part (1.45 ± 0.34 mm/yr 1954–2003).” Dr. Simon Holgate

“Global sea level has been rising over the past century, and the rate has increased in recent decades.” NOAA

“Satellite altimetry has shown that global mean sea level has been rising at a rate of 3 ± 0.4 mm/y since 1993. Using the altimeter record coupled with careful consideration of interannual and decadal variability as well as potential instrument errors, we show that this rate is accelerating at 0.084 ± 0.025 mm/y2, which agrees well with climate model projections.” Nerem et al

It is not unreasonable to assume that actual global sea level rise lies somewhere between zero and ~3 millimeters per year. It is also not unreasonable to assume that the rate of acceleration of global sea level rise lies somewhere between zero and ~0.084 millimeters per year per year. However, those ranges are quite large; and, they are hardly reflective of “settled science”.
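Just how wide those ranges are can be seen with a simple kinematic extrapolation of the quoted bounds. This is our own illustration, not a climate projection: the 2018–2100 horizon is an arbitrary choice, and the rate and acceleration values are simply the endpoints of the ranges cited above.

```python
# Illustrative extrapolation of the rate/acceleration ranges quoted above.
# s(t) = r*t + 0.5*a*t^2 -- a simple kinematic model, not a climate projection.

def projected_rise_mm(rate_mm_yr, accel_mm_yr2, years):
    """Total sea level rise (mm) after `years`, given an initial rate and
    a constant acceleration."""
    return rate_mm_yr * years + 0.5 * accel_mm_yr2 * years**2

years = 2100 - 2018  # 82-year horizon, chosen only for illustration

low = projected_rise_mm(0.0, 0.0, years)     # lower bounds of both ranges
high = projected_rise_mm(3.0, 0.084, years)  # upper bounds of both ranges

print(f"Implied rise by 2100: {low:.0f} mm to {high:.0f} mm")
```

The bounds span from zero to roughly half a meter by 2100, which underscores the point: ranges this broad are hardly the mark of settled science.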

Data from tide gauges document sea level rise of ~1.45 millimeters per year, only half as rapid as the ~3.0 millimeter per year rise calculated from satellite measurements. It is certainly possible that one of these rates of sea level rise is correct, but it is not possible that both are correct; and, it is possible that neither is correct and that the actual rate of sea level rise does not lie between these two rates.

This large discrepancy in estimates of global sea level rise would seem to argue for an expanded focus on resolving these differences. Research to resolve the differences is certainly more important and more valuable than numerous “scary scenario” studies based on unverified climate models, fed with the higher of the rate-of-rise estimates and the unrealistic IPCC Representative Concentration Pathway (RCP) 8.5 scenario. These “scary scenario” studies attract the interest of the media and the public, but do nothing to advance the state of the science.

This large discrepancy in rate of rise of sea level estimates should be a subject of the Red Team / Blue Team debate proposed by EPA Administrator Pruitt. The procedures used to convert the satellite data to estimates of the rate of sea level rise should also be subject to analysis by Tiger Teams at NOAA and NASA. It is long past time to settle this aspect of climate change science.

 

Tags: Settled Science, Sea Level Rise, Sea Level Change

Highlighted Article: State of the Climate 2017

State of the Climate 2017

The Global Warming Policy Foundation

By: Ole Humlum - former Professor of Physical Geography at the University Centre in Svalbard, Norway, and Emeritus Professor of Physical Geography, University of Oslo, Norway.

 

1. It is likely that 2017 was one of the warmest years, according to temperature records from the instrumental period (since about 1850). However, it was cooler than 2016.

2. At the end of 2017 the average global air temperature was dropping back towards the level characterising the years before the strong 2015–16 oceanographic El Niño episode. This underscores that the global surface temperature peak of 2015–16 was caused mainly by this Pacific oceanographic phenomenon. It also suggests … Complete Report

 

Tags: Highlighted Article

Toward A Vegan Future

One aspect of the Ultimate Goal of the Catastrophic Anthropogenic Global Warming movement is a global conversion to a vegan diet. The sustainability movement asserts that the growing global population is already stressing the available food supply; and, that reduction of the land area dedicated to grazing and growing grains to feed animals for both meat and milk production is essential to avoid mass food deprivation and, ultimately, starvation.

“Researchers from the Sustainability Research Institute at University of Leeds in England and the Mercator Research Institute on Global Commons and Climate Change in Berlin” assert that: “Right now, there's not a single country on Earth that provides its people a good, sustainable life.” Essentially, they conclude that those currently enjoying a “good life” are not doing so sustainably; and, that those currently living a sustainable life are not enjoying a “good life”. They suggest that, for all to live a good life sustainably, "we need to become two to six times more efficient at transforming resource use into human well-being." We would also need to massively redistribute wealth and income.

A global shift to veganism is unlikely to occur quickly or voluntarily. Despite concerns about existing stresses on the available food supply, meat consumption and dairy consumption continue to grow as nations become more prosperous. This has led researchers to focus on the various measures of the efficiency of land use for meat production and the efficiency of conversion of resources to edible meat protein. Beef is the least efficient in both land use and protein conversion efficiency, followed by lamb, pork and poultry. Beef production is also responsible for most of the greenhouse gas emissions associated with animal husbandry.

Politicians and environmental activists, always eager to maximize their contributions toward a sustainable future, have begun advocating for a meat tax. The environmental activists are concerned about land use, resource consumption and environmental emissions. The politicians envision another source of funding, since they have so far been unsuccessful in instituting a carbon tax. The meat tax would be a form of “sin tax”, intended ultimately to drive meat from the market. As with all “sin taxes”, the tax rate would be increased, as required, to maintain the revenue stream, until there was no meat left to tax.

Beef production has grown far less rapidly than poultry and pork production, likely because of the far greater land and feed requirements and the longer maturation period for beef cattle. Dairy milk production has remained relatively flat in recent years, while consumption of soy milk and nut and grain milks has increased.

Textured vegetable protein meat analogues have been available in the US market for at least 50 years, though their market penetration has been largely limited to those with religious and / or dietary constraints. However, major new public / private partnership efforts are underway in an attempt to increase the acceptability and acceptance of meat analogues as a climate-friendly alternative to commercial animal husbandry. The challenges of approximating taste, texture and chew have largely been overcome for fully cooked products. However, the challenge of successfully approximating the aesthetic qualities of a piece of medium rare beef steak, hamburger, leg of lamb, etc. remains.

 

Tags: Sustainability

Highlighted Article: Sea level rise acceleration (or not)

This is a multi-part series by Judith Curry posted on her blog, Climate Etc.

“I have several clients that are interested in the issue of sea level rise, from a range of perspectives (insurance, engineers, city and regional planning, liability). I am preparing a comprehensive assessment of the topic, with a focus on sea level rise in the U.S. I will be posting draft chapters on the blog for you to critique. I am also hoping that crowdsourcing will help me identify additional resources and information.”

  1. Introduction and context
  2. The Geological Record
  3. 19th & 20th Century Observations
  4. Satellite Era Record
  5. Detection & Attribution
  6. Projections for the 21st Century

 

(updated 4/2/18 - added new chapter 6 link and corrected chapter titles)

Tags: Highlighted Article, Sea Level Rise, Sea Level Change

The UN Has a Better Idea for Combating Climate Change

The UN Habitat for a Better Urban Future convened the World Urban Forum (WUF9) in Malaysia in February 2018. The broad focus of the Forum is on “sustainable urbanization”. The document which provides the focus for this effort is the “Quito Declaration of the New Urban Agenda”.

The Committee for a Constructive Tomorrow (CFACT) sent a delegation to the UN Habitat III meeting in 2016, at which the Quito Declaration of the New Urban Agenda was developed and adopted. CFACT partially summarizes the contents of the Quito declaration as follows:

  • Regulating businesses to change their profit-motive operations to ones “based on the principles of environmental sustainability and inclusive prosperity …” (i.e. socialism)
  • A commitment to promote “international, national, sub-national, and local climate action … consistent with the goals of the Paris Agreement on climate change.”
  • Creating new rights to government housing and redistributed wealth.
  • The establishment of a “UN-Habitat multi trust fund … in support of sustainable urban development, to be accessed by developing countries.”
  • Expanding “sexual and reproductive health-care services.” (i.e., population control)
  • Using cities as a vehicle to transfer great wealth from tax and ratepayers to "green energy" corporations that make no meaningful difference to the climate.

While the US has been a party to these proceedings in the past, it appears unlikely that the current US Administration would be supportive of any of the objectives listed above. The Administration has focused on easing regulations on businesses. The US has begun withdrawal from the Paris Accords and is obligated by US law to withdraw from the UN Framework Convention on Climate Change. The Administration has shown no inclination to establish new “rights” to government housing or to redistributed wealth. The US has already ceased funding the UN Green Climate Fund and would appear unlikely to commit to funding a UN Habitat multi-trust fund. The Administration has also halted funding to the UN for “sexual and reproductive health-care services” through the UN Population Fund.

The previous US Administration might well have found such programs admirable, but the current Administration would likely find them anathema. The UN maintains its focus on the various aspects of global governance, though still using existing government structures as its vehicle. However, it is unlikely that this approach would achieve the objectives the UN has set, leading ultimately to a push toward global governance through the UN.

A recent study published in the journal Nature, “A Good Life for All Within Planetary Boundaries”, amplifies some of the issues regarding cities and sustainable development as follows:

 

"We apply a top-down approach that distributes shares of each planetary boundary among nations based on current population (a per capita biophysical boundary approach)...If all people are to lead a good life within planetary boundaries, then our results suggest that provisioning systems must be fundamentally restructured to enable basic needs to be met at a much lower level of resource use."

 

This is clearly an agenda of imposed socialism and should be anathema to all free people.

 

“Socialism is a philosophy of failure, the creed of ignorance, and the gospel of envy, its inherent virtue is the equal sharing of misery.” --Winston Churchill

 

Tags: United Nations, Sustainability

Highlighted Article: California v. The Oil Companies - The Skeptics Response

In the lawsuit between California and the big oil companies, the federal district court has asked for a tutorial on global warming and climate change. Here is the response from Professors William Happer, Steven E. Koonin, and Richard S. Lindzen. Also included is commentary from Marc Morano at Climate Depot.

 

ADMINISTRATIVE MOTION OF WILLIAM HAPPER, STEVEN E. KOONIN, AND RICHARD S. LINDZEN FOR LEAVE TO SUBMIT PRESENTATION IN RESPONSE TO THE COURT’S TUTORIAL QUESTIONS

 

‘Global warming’ on trial: Prominent scientists submit climate skeptics’ case to federal court – Climate Depot, Marc Morano

 

Tags: Highlighted Article, Climate Change Lawsuits, ExxonMobil

Natural vs. Unnatural Temperature Change

The graph below shows the satellite lower troposphere temperature anomaly from inception through January 2018, as prepared by Drs. Roy Spencer and John Christy of the University of Alabama – Huntsville (UAH).

UAH Satellite-Based Temperature Graph

The graph has been highlighted to illustrate the numerous instances of natural variation in the running average temperature anomaly record. The instances of natural variation in the monthly anomalies are both more numerous and more rapid than those shown in the running average. These instances of natural variation are larger and more rapid than the longer term positive variation in the anomaly record, some or all of which is alleged to be anthropogenic.

The graph below shows the NASA GISS near-surface temperature anomaly from inception through the end years shown for the three anomaly plots graphed. Note that the three anomaly plots shown in the graph are based on the same temperature data through 1981; and that the red and blue plots are based on the same data through 2001. Clearly, there are numerous instances of natural variation evident in each of the three anomaly plots, although they do not appear to be as dramatic as in the graph above because of the longer time frame and larger anomaly range displayed.

The graph below displays two instances of unnatural variation in the anomaly record; that is, variation resulting from anthropogenic “adjustments” to, or “re-analysis” of, the global temperature record made by NASA GISS. The climate over the period from 1880 to 1980 and its actual anomaly from the reference period did not change. However, the reported anomaly over the period did change. The anomaly was reduced by as much as ~0.2°C early in the period, thus increasing the apparent rate of change of the anomaly over the period, as shown in the area highlighted in yellow in the graph.

The climate over the period from 1980 to 2001 and its actual anomaly from the reference period also did not change. However, the reported anomaly over the period did change. The anomaly was increased by as much as 0.2°C late in the period, as shown in the area highlighted in green in the graph, again increasing the apparent rate of increase of the anomaly over the period. We cannot determine from the information in the graph the number of times the anomalies were “adjusted” or “re-analyzed”. We can only determine the cumulative effects of the “adjustments” or “re-analyses”, which appear to total ~0.4°C, or approximately 1/3 of the reported anomaly change over the entire 136-year period.
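The back-of-envelope arithmetic behind that 1/3 figure can be checked directly. The two 0.2°C figures come from the paragraph above; the implied total anomaly change is simply what falls out if the cumulative adjustment is one third of it.

```python
# Checking the cumulative-adjustment arithmetic described above.
early_reduction = 0.2  # deg C, anomaly lowered early in 1880-1980 (yellow area)
late_increase = 0.2    # deg C, anomaly raised late in 1980-2001 (green area)

cumulative = early_reduction + late_increase     # ~0.4 deg C of net "adjustment"
implied_total_change = cumulative / (1.0 / 3.0)  # if that is ~1/3 of the total

print(f"Cumulative adjustments: ~{cumulative:.1f} deg C")
print(f"Implied reported anomaly change over 136 years: ~{implied_total_change:.1f} deg C")
```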

We do not know which, if any, of the anomaly plots contained in this graph is accurate. We do know, however, that they cannot all be accurate.

This issue would be far less significant if we were analyzing the results of a controlled experiment which could be rerun after recalibrating the temperature sensors to assure maximum accuracy. However, we are dealing with an ongoing, non-reproducible experiment. The issue would also be far less significant if the expenses of the experiment, or investments which might be made as a result of the experiment, were minimal. However, this ongoing experiment has involved expenses in the billions of dollars; and, the investments to be made as the result of the experiment would be in the tens of trillions of dollars and would affect the lives of billions of people.

This experiment is a vast program, which should not have been begun and should not be pursued with half-vast ideas. We continue to do so at our expense and at our peril.

 

Tags: Natural Variability

Highlighted Article: The Social Cost of Carbon

By: Reason Foundation - Julian Morris

CLIMATE CHANGE, CATASTROPHE, REGULATION AND THE SOCIAL COST OF CARBON

Executive Summary:

Federal agencies are required to calculate the costs and benefits of new regulations that have significant economic effects. Since a court ruling in 2008, agencies have included a measure of the cost of greenhouse gas emissions when evaluating regulations that affect such emissions. This measure is known as the “social cost of carbon” (SCC). Initially, different agencies applied different SCCs. To address this problem, the Office of Management and Budget and Council of Economic Advisors organized an Interagency Working Group (IWG) to develop a range of estimates of the SCC for use by all agencies. However, the IWG’s estimates were deeply flawed. In April 2017, President Trump issued an executive order rescinding the IWG’s estimates and disbanded the IWG. The question now is what value regulatory agencies should use for the SCC—if any—when evaluating rules that affect greenhouse gas emissions.

CLIMATE CHANGE, CATASTROPHE, REGULATION AND THE SOCIAL COST OF CARBON

 

Tags: Highlighted Article, Cost of Carbon

Climate Change Humpty Dumptyism

“When I use a word, it means just what I choose it to mean—neither more nor less.” Humpty Dumpty

Humpty Dumptyism: The practice of insisting that a word means whatever one wishes it to.

Fact: Something that has actual existence; an actual occurrence

Data: Factual information (such as measurements or statistics) used as a basis for reasoning, discussion, or calculation

I have previously written about facts in relation to global temperature measurements. Climate scientists recognize that not all of the data they collect are necessarily “facts”, because they have been collected at the wrong time, or are of questionable accuracy for other reasons. However, they are still data, in that they are measurements, even if they are known or suspected to be inaccurate, subject to the conditions surrounding their acquisition.

All of the data collected from measuring stations on a given day, or during a given month, constitute a dataset for that period. However, those datasets might not be complete, if data are not collected from all measurement sites for some reason. When a producer of temperature anomaly products selects data for analysis from a dataset, eliminating missing data, obvious data “outliers”, etc., they create a subset of the data which becomes their dataset for that period.

It is common practice among the producers of temperature anomaly products to “adjust” the data in their datasets to compensate for errors and suspected biases. In making these “adjustments”, their datasets become “estimate sets”, since the temperatures in the sets are no longer actual measurements. Rather, they are now estimates of what the data might have been if they had been collected in a timely manner from properly sited, calibrated, installed and maintained instruments. However, it is also common practice to continue to refer to these sets of temperatures as datasets.

NASA GISS typically “infills” their estimate sets with estimates of the likely temperatures in areas in which there are no measuring stations, or from which no data were collected for other reasons during the current period. It is typical to refer to the “infilled” estimate sets as datasets, despite the fact that most of the original data has been “adjusted” and missing data has been “infilled” with manufactured estimates.

The producers of the temperature anomaly products then compute a global average temperature value from their estimate sets for the period under analysis. This temperature value is then compared with the average estimated temperature value for the corresponding period (month or year) during a thirty-year reference period. The calculated temperature difference between the current period and the reference period is then recorded as the temperature anomaly estimate for the current period.
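The anomaly calculation just described can be sketched in a few lines. The station values and reference-period mean below are hypothetical, purely for illustration; real products average over thousands of stations (or grid cells) rather than a handful of numbers.

```python
# Minimal sketch of the anomaly calculation described above. The values are
# hypothetical and do not come from any real GHCN or GISS dataset.

def monthly_anomaly(current_temps, reference_mean):
    """Average the period's (possibly adjusted) temperatures and subtract
    the corresponding 30-year reference-period mean."""
    period_mean = sum(current_temps) / len(current_temps)
    return period_mean - reference_mean

# Hypothetical "estimate set" for one month (deg C), after adjustment/infilling
january_estimates = [14.2, 14.6, 13.9, 14.4]
reference_january_mean = 13.8  # hypothetical 30-year reference-period mean

anomaly = monthly_anomaly(january_estimates, reference_january_mean)
print(f"January anomaly: {anomaly:+.2f} deg C")
```

The point of working from the sketch is that the reported anomaly inherits whatever errors, “adjustments” and “infills” are present in the estimate set it is computed from.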

The “adjustments” made to the data prior to calculation of the anomaly for the period are the only “adjustments” made to the data. However, it is not uncommon for further adjustments to be made to the estimate sets over time, as discussed here and here. Again, it is certainly possible the estimate sets are accurate; and, that the subsequent anomaly calculations are also accurate at some point in the ongoing “adjustment” process. However, if that is the case, it would be very difficult to determine which of the “adjusted” values is accurate.

 

Tags: Adjusted Data, Estimates as Facts, Global Temperature, Temperature Record

Highlighted Article: Intimidating the “Deniers” to Enforce the “Consensus”

By: Marc Morano

This is a bonus chapter that was not included in Marc Morano's book - The Politically Incorrect Guide to Climate Change.

Intimidating the “Deniers” to Enforce the “Consensus”

 

Tags: Highlighted Article

Temperature “Adjustments” ad Infinitum

The temperature measurements taken to produce the global near-surface temperature anomaly record are “adjusted” for a variety of reasons. However, the measurements are not “adjusted” once, to achieve a hopefully more accurate value. The graph below, produced by climate4you.com, illustrates the “adjustment” history of the calculated temperature anomaly for two specific months in the past, over a period of 10 years, from 2008 to the present.

NCDC temperature adjustments

The National Climatic Data Center (NCDC) temperature anomaly value for the month of January 2000 was "adjusted" more than 40 times over the 10-year period from May 2008 to January 2018. While there were both positive and negative "adjustments" made to the calculated temperature anomaly, the net result was an increase of 0.07°C, or more than a 25% increase from the earliest anomaly value shown in the graph.

The NCDC temperature anomaly value for January 1915 was also "adjusted" more than 40 times over the 10-year period from May 2008 to January 2018. Again, there were both positive and negative "adjustments" made to the calculated anomaly. However, in this case, the net result was a decrease of 0.005°C, or only a 4% reduction from the earliest anomaly value shown. It is interesting to note that the calculated anomaly was reduced by as much as 0.065°C during the period, before being increased again in June 2015.
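The percentage figures above can be checked with simple arithmetic. The following sketch backs out the implied earliest anomaly values from the stated net adjustments and percentage changes; the inputs are the figures quoted in the text, read from the climate4you.com graph.

```python
# Back out the implied earliest anomaly values from the stated
# net adjustments and approximate percentage changes (values
# quoted in the text above).

# January 2000: net adjustment of +0.07 deg C, stated as a ~25% increase.
net_adjustment_2000 = 0.07
pct_increase_2000 = 0.25
implied_earliest_2000 = net_adjustment_2000 / pct_increase_2000  # ~0.28 deg C

# January 1915: net adjustment of -0.005 deg C, stated as a ~4% decrease.
net_adjustment_1915 = 0.005
pct_decrease_1915 = 0.04
implied_earliest_1915 = net_adjustment_1915 / pct_decrease_1915  # ~0.125 deg C

print(round(implied_earliest_2000, 3), round(implied_earliest_1915, 3))
```

This implies an earliest plotted anomaly of roughly 0.28°C for January 2000 and roughly 0.125°C for January 1915, which is consistent with the stated percentages.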

There is no obvious explanation for the apparent need to retrospectively “re-adjust” the temperature anomaly calculations this frequently, or to this extent. The “adjustments” are all in the second decimal place, which means that they are all made at a greater level of “precision” than the underlying measurements.

The graph below illustrates similar retrospective temperature anomaly adjustments made by NASA GISS.

In the January 2000 case, while there are both positive and negative "adjustments", the net result is again a positive adjustment of 0.07°C. However, in this case, that "adjustment" represents an increase of approximately 40% from the earliest anomaly value shown in the graph. It is important to note that NCDC "adjusts" the temperature measurements before providing them to NASA GISS, which then "re-adjusts" them. The GISS "re-adjustment", in this case, results in a first anomaly value 0.10°C lower than the anomaly value provided to GISS by NCDC; and, thus, a lower final anomaly value as well.

In the January 1910 case, again there are both positive and negative "adjustments", but the net result is a negative "adjustment" of 0.17°C. Note that both the January 2000 and the January 1910 values are "re-adjusted" less frequently by NASA GISS than by NCDC.

Both in the case of NCDC and NASA GISS, the net effect of these "re-adjustments" is to increase the temperature anomaly change from the earlier month in 1915 or 1910 to the anomaly in January 2000. In the case of NCDC, the anomaly increase is 0.08°C. In the case of GISS, the anomaly increase is 0.24°C, or three times the anomaly increase calculated by NCDC. It is also important to note that these changes only reflect the period from 2008 to 2018. There is no corresponding record of "adjustments" made prior to 2008 available from climate4you.com. However, there is little reason to believe that the "adjustments" shown in the graph above were the first "adjustments" made to these anomalies, or that they will be the last.
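The net effect on the 1910/1915-to-2000 anomaly difference follows directly from the individual net adjustments quoted earlier: an upward adjustment to the recent month combined with a downward adjustment to the early month widens the apparent warming between them. A quick check, using the figures stated in the text:

```python
# Net effect of the re-adjustments on the early-month -> Jan 2000
# anomaly difference, from the net adjustments stated in the text.

# NCDC: Jan 2000 adjusted up 0.07 deg C; Jan 1915 adjusted down 0.005 deg C.
ncdc_increase = 0.07 - (-0.005)   # ~0.075 deg C (rounds to ~0.08 deg C)

# GISS: Jan 2000 adjusted up 0.07 deg C; Jan 1910 adjusted down 0.17 deg C.
giss_increase = 0.07 - (-0.17)    # 0.24 deg C

print(round(ncdc_increase, 3), round(giss_increase, 2))
```

The GISS figure is three times the NCDC figure, as noted above, because the downward "re-adjustment" of the early month is much larger in the GISS record.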

It is possible that one of the anomaly values shown in each graph line is accurate, but it is certainly not possible that all of the values are accurate; and, it is not certain that any of the values are accurate.

 

Tags: Temperature Record, Global Temperature, National Climatic Data Center (NCDC), NASA

Highlighted Article: Circular Reasoning In Climate Change Research

By: Jamal Munshi

 

Circular Reasoning In Climate Change Research

 

ABSTRACT: A literature review shows that the circular reasoning fallacy is common in climate change research. It is facilitated by confirmation bias and by activism such that the prior conviction of researchers is subsumed into the methodology. Example research papers on the impact of fossil fuel emissions on tropical cyclones, on sea level rise, and on the carbon cycle demonstrate that the conclusions drawn by researchers about their anthropogenic cause derive from circular reasoning. The validity of the anthropogenic nature of global warming and climate change and that of the effectiveness of proposed measures for climate action may therefore be questioned solely on this basis.

Circular Reasoning In Climate Change Research

 

Tags: Highlighted Article
Search Older Blog Posts