
In the Wake of the News

Reanalysis Accuracy

The World Meteorological Organization (WMO) recently announced that July 2019 was the warmest July on record. This assessment was based on “adjusted” data provided by NOAA. The WMO attributed the record to climate change driven by anthropogenic CO2 emissions and suggested that it was consistent with what is to be expected from climate change.

Dr. Roy Spencer responded to the WMO announcement, suggesting that July 2019 was not the warmest July ever, but probably the fourth warmest. Spencer stated that these announcements were based on “adjusted” data which shared three major problems: the Urban Heat Island (UHI) effect; inconsistent sea surface temperature measurement approaches; and, incomplete geographic coverage.

Spencer suggested that Global Reanalysis temperature sets would be a more appropriate basis upon which to base pronouncements regarding global temperature. The global reanalyses are only available over the most recent 40 years, rather than the 100+ year records available from the near-surface thermometer records, since they rely in part on the satellite temperature record.

However, Spencer later compared three global reanalysis temperature sets and determined that the ERA5 reanalysis used by the WMO produces a global temperature estimate approximately 0.2°C higher than the other two reanalysis products. While a difference of 0.2°C might seem minimal, it is a difference between temperature anomaly products; since the estimated total global annual average temperature anomaly over the period ranges from 0.38 to 0.94°C, the difference amounts to approximately 20-50%.

The ERA5 reanalysis also portrays a rate of increase of global average temperature 0.04-0.05°C per decade more rapid than the other two reanalysis products and 0.05°C more rapid than the UAH satellite data record. Again, while 0.05°C per decade might seem minimal, the difference is between temperature anomaly reanalysis products and represents a difference of approximately 38% compared with both the CFSv2 reanalysis product produced by the NOAA National Center for Environmental Prediction and the satellite record produced by UAH.
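As a quick sanity check, a few lines of Python reproduce the percentage figures quoted in the two paragraphs above. The ~0.13°C/decade baseline trend used for the 38% comparison is my assumption (roughly the UAH trend over the satellite era), not a figure stated in the text:

```python
# Hedged arithmetic check of the quoted percentage comparisons.
low_anomaly, high_anomaly = 0.38, 0.94   # °C, range of product anomaly estimates
diff = 0.2                               # °C, ERA5 offset vs the other reanalyses

pct_low = diff / high_anomaly * 100      # ~21%
pct_high = diff / low_anomaly * 100      # ~53%
print(f"ERA5 offset is roughly {pct_low:.0f}%-{pct_high:.0f}% of the anomaly estimates")

trend_diff = 0.05                        # °C/decade, ERA5 vs CFSv2 and UAH
assumed_baseline_trend = 0.13            # °C/decade -- an assumption, not from the text
print(f"trend difference: ~{trend_diff / assumed_baseline_trend * 100:.0f}%")
```

Both results land on the article's "approximately 20-50%" and "approximately 38%" figures, which suggests those are the implied baselines.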

These differences between temperature reanalysis products are obviously very significant. Perhaps more significant is the fact that we do not know if any of these reanalysis products is accurate. They represent merely best estimates of the temperature anomaly by each of their producers, based on the agglomeration of measurements provided by numerous sources.

In a recent news release, NASA GISS claimed that their temperature anomaly product, GISSTEMP, was accurate to 0.05°C in recent decades, essentially the period over which the reanalysis products have been available. This would suggest that GISSTEMP is approximately twice as accurate as the average of the temperature reanalysis products, though that cannot be proven.

It is interesting to compare the temperature anomaly estimates from the various producers over the same time period as the temperature reanalyses, or 1979 to present.

    NASA GISSTEMP    0.94°C
    NOAA NCDC        0.95°C
    HadCRUT 4        0.71°C
    RSS MSU          0.71°C
    UAH              0.38°C

The first three listed anomalies are based on near-surface temperature measurements. Collectively, they average 0.87 +0.08/-0.16°C. This is a large and very significant range. The final two listed anomalies are satellite-based measurements. They average 0.54 +/- 0.16°C, which is also a large and very significant range. Obviously, not all of these global temperature anomaly estimates can be correct; and, it is not certain that any of them is correct.
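The averages and spreads quoted above follow directly from the five listed values; a short script reproduces them:

```python
# Reproduces the averages and spreads quoted for the five anomaly estimates.
surface = {"NASA GISSTEMP": 0.94, "NOAA NCDC": 0.95, "HadCRUT 4": 0.71}  # °C
satellite = {"RSS MSU": 0.71, "UAH": 0.38}                                # °C

def spread(values):
    """Return (mean, max deviation above the mean, max deviation below it)."""
    mean = sum(values) / len(values)
    return mean, max(values) - mean, mean - min(values)

m, plus, minus = spread(list(surface.values()))
print(f"surface:   {m:.3f} +{plus:.3f}/-{minus:.3f} °C")  # rounds to 0.87 +0.08/-0.16

m, plus, minus = spread(list(satellite.values()))
print(f"satellite: {m:.3f} +/- {plus:.3f} °C")            # rounds to 0.54 +/- 0.16
```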

Such is the status of the “settled science” of global annual temperature anomaly measurement.

 

Tags: Global Temperature, Temperature Record

Highlighted Article: Propagation of Error and the Reliability of Global Air Temperature Projections, Mark II.

 

From: Watts Up With That?

By: Pat Frank

Date: September 7, 2019

 

Propagation of Error and the Reliability of Global Air Temperature Projections, Mark II.

 

... "Climate modelers are evidently not trained in the scientific method. They are not trained to be scientists. They are not scientists. They are apparently not trained to evaluate the physical or predictive reliability of their own models. They do not manifest the attention to physical reasoning demanded by good scientific practice. In my prior experience they are actively hostile to any demonstration of that diagnosis.

In their hands, climate modeling has become a kind of subjectivist narrative, in the manner of the critical theory pseudo-scholarship that has so disfigured the academic Humanities and Sociology Departments, and that has actively promoted so much social strife. Call it Critical Global Warming Theory. Subjectivist narratives assume what should be proved (CO2 emissions equate directly to sensible heat), their assumptions have the weight of evidence (CO2 and temperature, see?), and every study is confirmatory (it’s worse than we thought)." ...

 

Propagation of Error and the Reliability of Global Air Temperature Projections, Mark II.

 

Tags: Highlighted Article

Sea Surface Temperature (SST) Anomaly Accuracy

NASA GISS recently issued a news release regarding the accuracy of their GISTEMP near-surface temperature anomaly product which is discussed here (GISSTEMP Accuracy). This raised questions about the accuracy of the sea surface temperature anomaly products.

NOAA NCDC focused attention on the sea surface temperature anomaly in the Summer of 2015 with the accelerated release of Karl et al. 2015 (ERSST.v4), intended to debunk the idea that there had been a global warming “pause” prior to the climate conference in Paris. This revision to NCDC’s ERSST increased the calculated SST anomaly of approximately 0.6°C (0.92°F) by approximately 0.08°C (0.10°F), or approximately 13%. NCDC has since released ERSST.v5, which reduced the calculated SST anomaly by 0.10°C (0.18°F) below ERSST.v4 and 0.02°C (0.03°F) below ERSST.v3. However, in the process, NCDC increased the rate of increase of SST by a factor of 2.5.
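The ~13% figure above follows directly from the two quoted values:

```python
# Checks the ~13% figure quoted for the ERSST.v4 revision.
base_anomaly = 0.6    # °C, approximate pre-revision SST anomaly
revision = 0.08       # °C, increase introduced by ERSST.v4
print(f"revision is ~{revision / base_anomaly * 100:.0f}% of the anomaly")
```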

Sea Surface Temperature Trend

NCDC currently calculates the SST anomaly as 0.8°C. HadSST3 currently calculates the SST anomaly at approximately 0.55°C, while UAH shows the anomaly as 0.45°C. This is a very broad range of values for temperature anomalies which are reported to two decimal place accuracy. The temperatures measured by the Argo buoys are 0.15-0.45°C lower than the calculated global averages; the differences are smallest relative to the UAH values and greatest relative to the NCDC values. Both ERSST and HadSST use a combination of temperature measurements collected by ships, using various measuring methods, and temperatures measured by the Argo buoys. UAH uses satellite measurements. (Source)

NASA GISS stated that their analysis demonstrated that “the resulting trends are more robust than what can be accounted for by any uncertainty in the data or methods”. However, there is no real question that the globe is experiencing a warming trend. Rather, the questions revolve around the actual magnitude of the warming, the historical rate of warming and the potential future rate and magnitude of the warming. While NASA GISS currently estimates the global temperature anomaly as 0.90°C, HadCRUT estimates the anomaly as 0.70°C and UAH calculates the anomaly as 0.38°C. This is a substantial range of values for anomalies reported to two decimal place accuracy.

The graph above from NCDC displays significant uncertainty regarding the magnitude and rate of warming for the period shown. The comparisons with HadSST and UAH display even greater uncertainty regarding the magnitude of the warming of the world’s oceans.

The issues with the land surface temperatures confirm the desirability of establishing a global land surface climate fiducial reference measurements network. The issues with the sea surface temperatures suggest the desirability of establishing a global sea surface temperature fiducial reference measurements network as well. Global temperature measurement has been fraught with inaccurate data, data “adjustments” and data “infilling” for far too long. There is no excuse for such casual treatment of documentation of what some describe as an existential threat to humanity.

However, as serious as the issues with temperature measurement appear to be, the issues with the models used to project potential future temperatures are both far more serious and far more vital.

 

Tags: Global Temperature, Temperature Record, Sea Surface Temperature (SST)

GISSTEMP Accuracy

The NASA Goddard Institute for Space Studies recently issued a news release regarding new studies of the accuracy of the GISTEMP global temperature anomaly product. Based on the results of these studies, NASA GISS currently estimates that its global annual average near-surface temperature anomalies “are likely accurate to within 0.09 degrees Fahrenheit (0.05 degrees Celsius) in recent decades, and 0.27 degrees Fahrenheit (0.15 degrees C) at the beginning of the nearly 140-year record”.

This news release is interesting on several levels; and, it appears to raise more questions than it answers. The release identifies the estimated global average temperature increase since 1880 at 2°F, ignoring any “significant” decimal places. The estimated two “significant” decimal place accuracy of 0.09°F over recent decades, conventionally presented as 2 +/- 0.09°F, is approximately +/- 5%. The estimated two “significant” decimal place accuracy of 0.27°F at the beginning of the temperature anomaly record, conventionally presented as 0 +/- 0.27°F, represents an undefined percentage error.
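The two accuracy figures can be restated numerically; this sketch simply reproduces the +/- 5% arithmetic and illustrates why no percentage error can be computed when the reference anomaly is zero:

```python
# Reproduces the "+/- 5%" figure and the undefined percentage at the record's start.
total_rise_f = 2.0     # °F, estimated rise since 1880 (as stated in the release)
recent_acc_f = 0.09    # °F, stated recent-decades accuracy
print(f"recent accuracy: +/- {recent_acc_f / total_rise_f * 100:.1f}%")  # ~5%

start_anomaly_f = 0.0  # °F, anomaly at the start of the record is zero by definition
try:
    print(0.27 / start_anomaly_f * 100)
except ZeroDivisionError:
    print("percentage error undefined at the start of the record")
```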

Estimating the accuracy of an estimated temperature anomaly record requires comparison to an established factual temperature anomaly record, in this case a temperature anomaly record of at least two decimal place accuracy. Unfortunately no such established factual temperature anomaly record exists. NASA GISS does not state the temperature anomaly record used for its GISTEMP accuracy estimates in the news release.

NASA GISS does reference a comparison to the surface temperature record provided by the AIRS instrument aboard the NASA Aqua satellite. However, the AIRS instrument measures surface temperature, rather than the air temperature approximately 2 meters above the surface. Therefore, while AIRS might be useful for confirming trends, it is unsuited to verifying measurements. The AIRS comparison is only applicable to the period since 2003, when the AIRS instrument was placed into orbit.

NASA GISS did not mention whether it had compared the GISTEMP temperature record for the contiguous United States with the temperature record provided by the US Climate Reference Network over the period for which USCRN data are available. While this is neither a long nor a global record, it is a highly accurate, unadjusted near-surface temperature record which could have been used to “ground truth” the GISTEMP estimate for the contiguous United States for the same period.

It should be noted that NASA GISS produces GISTEMP using “adjusted” temperatures provided by NOAA; and, that NASA GISS has “readjusted” these “adjusted” temperatures numerous times. The graph below shows plots of the GISTEMP global annual temperature anomaly presented in three different years over the past 35 years. Note that the anomaly estimates have been reduced in the early years and increased in the later years of the temperature record. The cumulative effect of these “readjustments” increased the reported global temperature anomaly over the period by approximately 0.4°C (~0.7°F). These “readjustments” represent approximately one third of the total reported anomaly estimate over the period.

 

GISTEMP Global Annual Temperature Anomaly

 

The issues with GISTEMP point again to the desirability of establishing a global land surface climate fiducial reference measurements network.

 

Tags: Temperature Record, Global Temperature

Urban Heat Island Effect 2

“An urban heat island (UHI) is a metropolitan area which is significantly warmer than its surroundings. According to the EPA, many U.S. cities have air temperatures up to 10°F (5.6°C) warmer than the surrounding natural land cover. This temperature difference usually is larger at night than during the day and larger in winter than in summer, and is most apparent when winds are weak. The main causes are changes in the land surface by urban development along with waste heat generated by energy use. As population centers grow, they tend to change greater areas of land which then undergo a corresponding increase in average temperature.”  Source

Recently there has been renewed interest in the Urban Heat Island Effect, particularly how it affects the temperature measurements used to calculate the global average near-surface temperature. Bob Tisdale performed a series of analyses of “adjusted” data obtained from Berkeley Earth. These analyses demonstrated that for many of the most populous nations of the globe and for the globe as a whole, the average minimum daily temperature is rising approximately twice as rapidly as the average daily maximum temperature. Much of this difference is due to thermal mass of various types located adjacent to the measuring stations. In the most populous nations of the globe, this mass is frequently associated with urban structures, including buildings, roadways and sidewalks. Heat rejected by buildings, vehicles and other energy consuming equipment adds to the temperature difference.

Analysis of data from China suggests that daily average minimum temperatures there are increasing approximately four times as rapidly, relative to daily average maximum temperatures, as predicted. The researchers estimate that approximately 50% of the warming reported for China over the past 80 years is the result of UHI bias. It is reasonable to assume that such UHI bias affects the data collected in other nations, though perhaps not to that extent.

The magnitude of the UHI effect has significant technical policy implications. Accurately measuring global climate change requires that the temperature measurements not be affected by UHI, since UHI is a very localized phenomenon affecting a very small percentage of global land area, while scientific interest is in changes which are global in scope. UHI is a local phenomenon, superimposed upon a global phenomenon. Temperatures in the UHI are obviously of interest to the occupants of the UHI but are not indicative of the temperatures in the massive areas unaffected by the UHI effect.

The magnitude of the UHI effect also has significant social policy implications. Environmentalists are promoting social policies which would cause relocation of suburban and rural populations to cities. However, as noted above, growth of urban areas leads to increased temperatures in the UHI, in addition to temperature changes in the surrounding areas. These effects might be offset to some degree by urban design, more efficient buildings, more reflective structural surfaces, elimination of internal combustion engine vehicles, etc. However, it is unlikely that some level of additional temperature increase in the UHI could be avoided. It is also unlikely that the increases in temperatures in the UHI would be offset by temperature decreases outside the UHI.

It seems logical that the technical and technical policy issues discussed above should be understood and dealt with before significant social policy changes are implemented, to assure that the social policy changes do not produce adverse results.

 

Tags: Urban Heat Island

Highlighted Article: Sustainability: Ideology versus Reality

From: MasterResource

By: Paul Driessen

Date: August 26-28, 2019

 

 

Sustainability: Ideology versus Reality

 

Part I: Biofuels and Solar

Part II: Wind Turbines

Part III: The Big Picture

 

"They could have had a global teleconference to save millions of dollars and millions of gallons of aviation and vehicle fuel. They could have set a good example and avoided massive carbon dioxide emissions. They could have been more honest, ethical and sustainable … and less hypocritical.

But instead, some 20,000 activists, bureaucrats and politicians have flown to Salt Lake City, Utah, from around the world for yet another climate-related conference, this one the 68th United Nations Civil Society Conference (August 26–28, 2019).

The globetrotters are staying at fancy hotels and eating fine food–and debating how the rest of humanity must travel, live, work, drive, farm, eat, and use (or not use) energy. The buzz word is sustainability to save the planet from resource depletion and climate cataclysms.

Conference organizers could have invited ..."

 

Sustainability: Ideology versus Reality

 

Part I: Biofuels and Solar

Part II: Wind Turbines

Part III: The Big Picture

 

Tags: Highlighted Article

USA Climate Priorities 2

The previous commentary on USA Climate Priorities discussed priorities regarding climate science, specifically regarding: accurate temperature measurement; establishment of specific values for climate sensitivity and feedbacks; and, verifying a climate model. Addressing these priorities is essential to understanding the potential future challenges which might be presented by climate change.

This commentary focuses on priorities regarding energy production. The extent to which US and global energy production must move toward zero CO2 emissions is a function of the results of the climate science research discussed above and in the previous commentary. Results confirming low sensitivity and minimal or negative feedbacks would suggest a modest progression toward increased renewable energy production. Results which confirmed high sensitivity and positive feedbacks would suggest a more aggressive progression toward zero emissions technologies for energy production.

There is growing recognition that a sole focus on wind, solar and battery storage to replace the current global energy infrastructure would represent a high cost, low reliability approach which would ultimately prove unacceptable or unachievable, absent some major scientific breakthroughs.

There are three reliable and dispatchable zero emissions technologies employed in the current global energy economy: nuclear, hydroelectric and geothermal steam. There are three additional potentially reliable and dispatchable technologies which have been identified but remain to be developed and implemented: wave energy, ocean thermal energy conversion and dry hot rock geothermal.

Several US thought leaders, including Bill Gates and James Hansen, are convinced that the dramatic reductions in global CO2 emissions envisioned by environmental and climate activists are unachievable without a significant increase in nuclear energy production. Nuclear technology is proven and is capable of significant expansion to meet current and future needs, as are hydroelectric and geothermal generation, though to a lesser extent.

Research priorities for nuclear generation include: inherently safe reactors; modular reactors; reactors capable of using a higher percentage of the energy available from the nuclear materials; and, reactors capable of being fueled with the spent fuel currently stored at nuclear generating facilities. Some research has already been conducted in each of these areas, but these efforts could easily be expanded and accelerated.

Hydroelectric development is currently underway in several countries, including China and India. Attempts to expand US hydroelectric capacity have met with fierce resistance from environmental activists, many of whom argue for removal of existing hydro facilities. Some environmental organizations do not even include hydro in their lists of renewable technologies.

Geothermal steam generation is currently limited to areas where there are currently steam vents. Dry hot rock geothermal could be far more widely available, since dry hot rock could be accessed globally. However, limited experiments in Europe have triggered earthquakes, resulting in suspension of the research efforts.

Wave energy and ocean thermal energy conversion are also in their infancy but have very significant generation potential if successfully developed and deployed.

Major energy technology research efforts to advance these technologies could result in lower cost, reliable energy supplies; and, in major new industries to build, manage and maintain them. These research efforts could offer the potential to provide reliable electricity to developing and not-yet-developing countries as an alternative to expanded fossil fuel generation.

 

Tags: Renewable Energy, Nuclear Power

Highlighted Article: The Case for a Coercive Green New Deal?

From: American Institute for Economic Research

By: Richard M. Ebeling

Date: August 6, 2019

 

The Case for a Coercive Green New Deal?

 

"Social and economic crises, real and imagined, often seem to bring out the most wrongheaded thinking in matters of government policy. Following the 2008 financial crisis and with the fear of “global warming,” there has been a revival in the case for “democratic” socialism. But now its proponents are “out of the closet” with a clear cut and explicit call for forcefully imposed, authoritarian central planning of the world.

John Feffer is affiliated with the Washington, D.C.–based Institute for Policy Studies, a “progressive” think tank that has never seen a government command or control, regulation or redistribution that they seemingly have not liked – as long as it reflects their version of preferred social engineering compared to anyone else’s, of course. He has ..."

 

The Case for a Coercive Green New Deal?

 

Tags: Highlighted Article

USA Climate Priorities

The UNFCCC, the consensed climate science community and the environmental community assert that the USA should assume the mantle of climate change leadership and prioritize CO2 emissions reductions and financial transfers to the UN Green Climate Fund. Numerous US politicians and self-appointed “thought leaders” assert that climate change is a crisis, an “existential threat” to life on the earth. Numerous US cities, most recently New York City, have declared “climate emergencies”, intended to encourage the federal government and industry to take actions to eliminate or reverse climate change. These suggested priorities and assessments are fundamentally misguided. There is no crisis or emergency. The climate change which has apparently occurred has had a net positive impact. There is no certainty that it will not continue to be net positive.

There remain significant issues with our understanding of climate science which suggest that precipitate action is both unnecessary and unwise. There also remain significant limitations with renewable energy technology which suggest that it is unready to be relied upon to replace fossil fuels in the global energy economy. Turning US and global climate change action into a new “Manhattan Project” or the “Moral Equivalent Of War” would further increase the resulting economic cost and disruption.

The US and other nations have the intellectual and financial resources to address and resolve many of the significant issues in climate science, if they are willing to refocus their efforts on the basic science and assure that all of the data, computer code and analytical tools used in the research programs are freely available to other researchers in the various fields of study. This step is essential to facilitate reproducibility testing and to minimize unnecessary duplication of effort.

The first essential step in the process is the deployment of reliable and accurate temperature measuring stations in sufficient numbers and with sufficient global coverage, both land and sea, to assure that the changes in global temperatures can be determined without “adjustment”. The combination of these measuring stations and the existing satellite network would permit both improved measurement accuracy and more comprehensive coverage.

The second essential step in the process is the establishment of definitive values for the sensitivity of the climate to the “greenhouse gases” and for the feedbacks in the atmosphere caused by water vapor and clouds. These values are currently expressed as a range of estimated values, though recent research suggests that the actual climate sensitivity is below the lower end of the range of values currently in use.

The third essential step is improvement of the comprehensiveness of the climate models and their accuracy in modeling the real climate. This step would permit progress toward verifying one climate model capable of hindcasting historical climate without “tweaking”.

Government and private research funding sources must act to assure that their research contractors provide open access to their data and methods. Researchers who fail to provide open access should be excluded from further funding, since the accuracy of their research processes and reported results cannot be relied upon with confidence. There are sufficient resources available to fund and conduct this important climate research, but there are not sufficient funds to continue to waste funding on questionable or irreproducible research, or on “political science” intended to scare the populace.

 

Tags: Climate Models, Climate Sensitivity, Global Temperature, Greenhouse Gas

Highlighted Article: July 2019 Was Not the Warmest on Record

From: www.drroyspencer.com

By: Roy W. Spencer, Ph. D.

Date: August 2, 2019

 

July 2019 Was Not the Warmest on Record

 

July 2019 was probably the 4th warmest of the last 41 years. Global “reanalysis” datasets need to start being used for monitoring of global surface temperatures. [NOTE: It turns out that the WMO, which announced July 2019 as a near-record, relies upon the ERA5 reanalysis which apparently departs substantially from the CFSv2 reanalysis, making my proposed reliance on only reanalysis data for surface temperature monitoring also subject to considerable uncertainty].

 

July 2019 Was Not the Warmest on Record

 

Tags: Highlighted Article

Peer Review Anew

The members of the consensed climate science community are very quick to assert that the results of their research have been peer reviewed. However, the peers who have done the reviews are typically other members of the consensed climate science community. Some journals even permit the authors to select the peers who will perform the reviews. Dr. Patrick Michaels has described the peer review process as it is currently performed as “pal review”. We have discussed some of these issues previously (here).

Members of the consensed climate science community and their connections in scientific publishing have also conspired to prevent publication of research results produced by skeptical scientists, with scientists both refusing to serve as peer reviewers and recommending rejection of papers for publication. Further discussion of this and related issues is available here.

Dr. William Happer, a National Security Council science adviser, has recommended the establishment of a President’s Commission on Climate Security composed of climate scientists to conduct a critical review of the federal government reports and research programs related to the potential impacts of climate change on national security. The initial focus of the commission would be on DoD studies related to national security and on the Fourth National Climate Assessment. The commission would also review the science underlying these reports.

Members of the consensed climate science community and their allies have been very critical of the proposed commission, some even referring to it as “Stalinist”. However, the intended mission of the commission is peer review of government-funded climate science in all its aspects: solicitation; award; conduct; supervision; peer review; and, publication. It is a due diligence review, triggered in part by continued warnings of climate catastrophes which have not occurred.

The Commission has not been established and the climate scientists who might participate have not been selected. However, several scientists have been identified as potential participants, including Dr. Judith Curry, Dr. John Christy and Dr. Richard Lindzen. Other potential participants include Steve McIntyre and Ross McKitrick, who have been critical of the statistical techniques used to analyze research results, including the Mann “Hockey Stick”.

The proposed commission would likely not include any members of the consensed climate science community, since they have conducted the research in question and analyzed and reported its results or have participated in peer review of the research. Therefore, their input is already in the record. However, it is likely that they would be called to meet with the commission if there are questions or concerns about their research and their analysis of the results.

The commission’s efforts would likely lead to questions regarding the provenance of climate data, the accuracy of the climate models, the uncertainties regarding climate sensitivity, forcings and feedbacks. These questions are the subject of skeptical research and analysis, but receive little attention in the scientific literature or the media.

The establishment of such a presidential commission seems little different from a corporate selection of an outside auditor to review its accounting and reporting procedures or employment of an outside consultant to review corporate structure and future business plans.

 

Tags: Peer Review

Barely Measurable?

Several recent media articles have suggested that even if the US adopted the Green New Deal, or some variant thereof, which reduced or eliminated US CO2 emissions, the impact on global warming would be “barely measurable”. These suggestions betray a fundamental misunderstanding of the status of climate science. In reality, the impact would be unmeasurable and barely calculable.

Global annual CO2 emissions are not measurable, since most of the sources of CO2 emissions, both natural and anthropogenic, are not instrumented. Estimated annual anthropogenic CO2 emissions are calculated based on estimated annual fossil fuel consumption.

Global annual CO2 removal from the atmosphere by the global oceans and growing trees and plants is also estimated, based on changes in ocean temperatures and on estimated plant mass and uptake.

Future global CO2 emissions can only be projected with questionable accuracy, so it would not be possible to measure the impact of even measured reductions in any nation’s emissions on the uncertain estimates of future global emissions, assuming that any nation’s emissions could actually be measured.

The impact of increased atmospheric CO2 concentrations on global temperatures can only be estimated, based on estimates of climate sensitivity, forcings and feedbacks input into unverified climate models. It is not currently possible to separately measure the impacts of natural and anthropogenic changes on global temperatures. Global average temperatures have changed, both positively and negatively, prior to and subsequent to significant anthropogenic CO2 emissions; and, the causes of these changes are not clearly understood. However, it is clearly unreasonable to assume that these natural variations ceased when humans began emitting significant quantities of CO2 into the atmosphere.

Similarly, any impacts of increased atmospheric CO2 concentrations on other aspects of climate, such as droughts, floods, tornadoes and hurricanes, cannot be measured, though climate scientists have begun performing computer model-based attribution studies to estimate such impacts. However, these climate models are unverified and the factors entered into the models are estimates, rather than measured quantities. The frequency of occurrence, duration and severity of these natural events can be measured, but the data suggest that there is no clear anthropogenic signal in any aspect of any of the events. Such a signal might exist, but it is far exceeded by the range of historical natural variation in weather and climate.

Finally, the Social Cost of Carbon, which attempts to take into account all of the suspected negative impacts of increased atmospheric CO2 concentrations, is not based on measurements, but again on estimates of the potential adverse impacts produced by entering estimates of climate sensitivity, forcings and feedbacks into unverified climate models. No similar effort has been made to analyze the social benefits of carbon, though it is becoming progressively more obvious that such benefits exist. The greening of the globe documented by NASA satellites is attributed largely to the increase in atmospheric CO2 concentrations, though again the percentage attributable to increased CO2 is an estimate and not a measurement.

Climate science actually measures only atmospheric CO2 concentration, near-surface land and ocean temperatures and sea level; and, the temperature and sea level measurements are of limited accuracy. Climate science also counts weather events and measures their duration and intensity, but can only estimate the impact of increased atmospheric CO2 concentrations on these events.

 

Tags: CO2 Emissions, Estimates as Facts, Climate Models