
In the Wake of the News

Highlighted Article: Obama Carbon Colonialism and Climate Corruption Continue


From: Townhall

By: Paul Driessen and David Wojick



  1. PART 1 - August 11, 2018
  2. PART 2 - August 13, 2018
  3. PART 3 - August 14, 2018


It’s obscene enough when the Multilateral anti-Development Banks do it. But Trump agencies?!?

In a prime example of Deep State revanchism, despite the profound change in administrations, the US Agency for International Development is still funding and advancing anti-energy Obama-era climate change dogmas and policies for developing countries. USAID handles tens of billions of dollars a year, roughly half of all US foreign aid, so this climate alarmism puts literally millions of lives at risk.

USAID calls its “flagship” program “low emissions development.” Emissions of course means plant-fertilizing, life-giving carbon dioxide – but the term is intended to suggest dangerous climate changing pollution. The effect, if not the intent, is to deprive poor countries of the enormous life-enhancing benefits of abundant, affordable electricity and fossil fuels, which created the health and wealth Americans enjoy.




Tags: Highlighted Article

Trump EPA Accomplishments & Challenges

The resignation of EPA Administrator Scott Pruitt and the appointment of Acting Administrator Andrew Wheeler present an opportunity to review what has been accomplished at EPA and what challenges remain to be addressed.




  • “Secret Science”

Administrator Pruitt terminated the use of “secret science” by the EPA in the regulatory process. This decision raised concerns about the protection of information regarding individuals involved in the research projects. However, it is clear that the data, methods, analytical techniques and models used in the research can be made public to facilitate reproduction and validation without disclosing personal data of the participants.

EPA Ends The Use Of “Secret Science” In Crafting Regulations – Hot Air


  •  “Sue & Settle”       

EPA has terminated the use of the practice of “sue & settle” under which EPA assisted environmental activist groups in suing EPA, then settled the suits by adopting the regulations or practices advocated by the environmental groups.

Sue and Settle: Regulating Behind Closed Doors –US Chamber of Commerce


  • Paris Accords

President Trump notified the UN of the intent of the US to withdraw from the Paris Accords because the Accords uniquely disadvantaged the US and would ultimately be ineffective.

U.S. submits formal notice of withdrawal from Paris climate pact - Reuters


  •  “Green Climate Fund”

The US notified the UN that it would cease contributing to the UN Green Climate Fund. The GCF was intended to provide annual funding of ~$100 billion by 2020 to assist developing and not-yet-developing countries to reduce CO2 emissions. The Fund was slated to grow to $400-425 billion per year by 2030. The US contribution was to be ~25% of the total funding.

U.S. withdrawal from the Paris Agreement: Reasons, impacts, and China's response - ScienceDirect


  • EPA Advisory Panel

Administrator Pruitt dissolved the EPA Advisory Panel, which was composed of scientists, many of whom were also funded by EPA grants, in an apparent conflict of interest.

U.S. Government Disbands Climate-Science Advisory Committee – Scientific American





  • Clean Power Plan

EPA has stated an interest in repealing the Clean Power Plan, which effectively prevents the construction of new coal generating stations and requires the closure of many existing plants. The emissions levels required under the CPP would require installation and operation of carbon capture and storage technology which is not currently commercially available and appears to significantly reduce plant efficiency.

Electric Utility Generating Units: Repealing the Clean Power Plan: Proposal - EPA


  • Endangerment Finding   

Administrator Pruitt had expressed interest in re-evaluating and potentially reversing the 2009 Endangerment Finding regarding CO2. The finding was based on IPCC modeled outputs, which have now been demonstrated to be inaccurate.

Trashing EPA's endangerment finding would be tough – E&E News


  • UNFCCC (United Nations Framework Convention on Climate Change)

The US is required under current US law to exit the UNFCCC, since that body has recognized the Palestinian Authority as a “state level” participant. This action has not yet occurred.

Why the U.S. Should Clexit and Pexit—Exit UNFCCC and Paris Climate Treaty – Cornwall Alliance


  • Discontinue funding international climate efforts

The US continues to fund UN climate change activities through the UNFCCC. It appears that this funding will be terminated in the 2019 federal budget.

Trump Budget Would Cripple U.S. International Climate Change Work – Inside Climate News


  • Tiger Teams           

EPA should work with NOAA and NASA to launch tiger teams to conduct a detailed review of the climate change activities of those agencies. One focus of these teams would be the continual changes in the historical climate record by these agencies.

What is a Tiger Team Approach? - Trextel


  • Red Team - Blue Team

Administrator Pruitt advocated an open climate change debate using a Red Team – Blue Team approach. This open debate has not occurred and there is significant confusion regarding Administration support for such an effort.

Red team-blue team exercise will expose the junk science that filled Obama's EPA – The Hill


The Administration has accomplished much in its first 18 months, particularly at EPA under Administrator Pruitt. However, much remains to be done at EPA under the direction of Acting Administrator Wheeler.


“It ain’t over till it’s over.” - Yogi Berra, American philosopher


Tags: Clean Power Plan, Donald Trump, EPA, EPA Endangerment Finding, Red Team Blue Team Debate, Paris Agreement, United Nations

How Do We Scare Them About Climate Change Now?


What are the committed political science community, the consensed climate science community and the complicit media science community to do if the tens of millions of dollars spent on the creation of climate model based “scary scenarios” is not enough to scare the voting public into demanding drastic measures to save the planet from the projected “climategeddon”?


Commit additional millions to fund additional studies to produce additional, even scarier scenarios to be broadcast with even greater feigned certainty of future devastation. What else?

The following is a mere sampling of the most recent “even scarier scenarios” intended to free us from our malaise and spur us into demanding action.

The consensed climate science community and their funders and cheerleaders apparently do not understand, or simply choose to ignore, that the voting public in the US has reached climate crisis saturation. Previous disaster predictions have failed to materialize, no matter how often and how hysterically the “Chicken Littles” have announced that “the sky is falling”.

Meanwhile, weather persists. Winter is cold, with snow. Summer is hot, with thunder storms. Floods and droughts occur. Hurricanes, tropical storms and tornadoes continue to form. Some climate scientists claim that anthropogenic climate change makes all of these weather events more frequent, more intense, more damaging and more life threatening. However, the data do not support such claims.

Of course, the creation of the “even scarier scenarios” has not stopped the creation of silly “scary scenarios”, such as those listed below.


Meanwhile, what we should be doing now remains undone. Climate data are still suspect, as the result of installation issues, “adjustment” and “infilling”. Climate models remain unverified and continue to “run hot”. Climate sensitivity is still undetermined. Cloud forcings are still uncertain. The differences between surface-based and satellite-based sea level rise measurements remain unresolved, as do the differences between near-surface and satellite temperature measurements.

These unresolved scientific issues are far more important than the “scary scenarios” that are built upon them.

The current situation is reminiscent of the approach demanded by the Queen of Hearts in the trial scene from Alice’s Adventures in Wonderland, by Lewis Carroll:

                                    “Sentence first – verdict afterward”

It seems hardly scientific and equally silly to demand “verdict first – evidence afterward”, as appears to be the case with current climate science, at least from the perspective of the political science community and their cheerleaders in the media.

However, when all else fails, there always remains the old trial lawyers’ approach:

“When the facts are on your side, pound the facts. When the law is on your side, pound the law. When neither is on your side, pound the table.”


Tags: Climate Science, Climate Change Debate, Climate Change Myths

Temperature Measurement Bias

The evolution of ambient temperature measurement technology has exposed several fundamental biases in the measurements taken to monitor changes in the climate. These biases must be quantified and compensated for to maintain the integrity of the historical temperature measurement record.

The US Climate Reference Network (CRN) is the most advanced network of ambient temperature measurement stations currently in use. The CRN stations were sited remotely to eliminate or minimize the potential impact of their surroundings on the representativeness of the ambient temperature measurements. Each station includes three high precision resistance temperature devices (RTDs) installed in fan aspirated radiation shield enclosures to assure constant air flow over the sensors, regardless of ambient wind conditions. The photo below shows a typical CRN location.


CRN Weather Station
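The triple-sensor arrangement lends itself to a simple consistency check. The sketch below is an illustration only, not NOAA's actual CRN aggregation algorithm: report the median of the three RTD readings, and flag the station when the inter-sensor spread suggests drift or failure in one sensor (the 0.3ºC tolerance is an assumed value, chosen only for the example).

```python
def aggregate_triple_rtd(r1, r2, r3, tolerance=0.3):
    """Combine three collocated RTD readings (degrees C) into one value.

    Illustrative scheme only, not NOAA's actual CRN algorithm:
    report the median of the three sensors, and flag the station
    for maintenance when the spread between the highest and lowest
    readings exceeds `tolerance`, which may indicate drift or
    failure in one sensor.
    """
    readings = sorted([r1, r2, r3])
    spread = readings[-1] - readings[0]
    return readings[1], spread <= tolerance  # (value, ok_flag)
```

With three sensors, a single drifting instrument neither corrupts the reported median nor goes unnoticed, which is the point of the redundant design.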


Measurements taken by the CRN stations were compared with measurements taken with the Maximum Minimum Temperature Systems (MMTS) collocated with the CRN stations. These tests demonstrated significant measurement biases with the MMTS, particularly under conditions of high solar radiation and minimum ambient wind speed. These biases are believed to result from the fact that the MMTS relies on natural convection to induce air flow over the sensors. The MMTS are also subject to additional bias as the reflectivity of the enclosure degrades due to weathering and ultraviolet degradation. The photo below shows a typical MMTS.


MMTS Thermometer Enclosure


The MMTS came into common use for climate temperature measurement in the 1980s, replacing the Cotton Region Shelter (CRS) or Stevenson Screen with liquid-in-glass thermometers, which had been the measurement station of choice for most of the 20th century. The photo below shows a typical CRS.


CRS Thermometer Enclosure


The major disadvantages of the CRS include the requirement for a person to manually read the thermometers accurately at the proper time and the weather-driven deterioration of the reflectivity of the enclosure coating. The CRS-produced biases are believed to result from its reliance on natural convection to induce air flow over the sensors and from the heat storage capacity of the wooden box.

Measurements taken by CRN stations located relatively close to MMTS and CRS measurement stations are relied upon to verify the procedures used to correct for biases present in the MMTS and CRS stations. However, correcting known or suspected biased readings is less desirable than taking accurate readings in the first instance. This has recently resulted in calls for the development of a global land surface climate fiducial reference measurements network.

There has been a somewhat similar evolution in sea surface measurement approaches. Originally, sea surface temperature measurements were taken by lowering buckets of various construction from the decks of ships, allowing the buckets to fill with water, raising the buckets to the deck, immersing thermometers into the buckets to measure water temperature and then removing the thermometers from the water to read them and record the readings. This method was and is fraught with potential errors. More recently, many ships are equipped with temperature measuring devices located in the engine cooling water inlets. However, these readings are biased by the temperature conditions surrounding the cooling water inlets. Also, the water depth at which the samples are taken is a function of how heavily the ship is loaded, since the cooling water inlets must be located sufficiently below the Plimsoll line that they are beneath the water line when the ship is lightly loaded. Therefore, under normal operating conditions, the water temperature measured is not the surface temperature, but rather the temperature several feet below the surface, disturbed by the ship’s bow wake.

Deployment of drifting and moored buoys provides more accurate and reliable temperature measurement at the surface. These buoys have exposed the warm bias in the shipborne measurements. The more recent deployment of the Argo buoys has also facilitated measurement of sea temperatures at depth, providing valuable information regarding heat storage in the oceans.

The principal bias of the US CRN and the buoy-based measurements is accuracy. These stations are also used to check the accuracy of the satellite-based measurements, whose principal bias is comprehensive geographical coverage. These are highly desirable and valuable biases.


Tags: Temperature Record, US Climate Reference Network (CRN), Bias


Atmospheric physics establishes that the impact of incremental additions of CO2 to the earth’s atmosphere approaches zero asymptotically, as shown in the graph below.

Heating Effect of CO2

This means that any increase in the global average surface temperature resulting from increased atmospheric CO2 must ultimately result in the global average surface temperature asymptotically approaching a limiting temperature, all other things being held equal, which of course they are not.
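The asymptotic behavior follows from the approximately logarithmic dependence of CO2 radiative forcing on concentration. A minimal sketch using the widely cited simplified expression ΔF = 5.35 ln(C/C0) shows each successive concentration increment contributing less forcing than the last:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified radiative forcing (W/m2) from CO2 relative to a
    pre-industrial baseline, using the common logarithmic
    approximation dF = 5.35 * ln(C/C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Each successive 100 ppm increment adds less forcing than the last:
increments = [co2_forcing(c + 100) - co2_forcing(c)
              for c in (280, 380, 480, 580)]
```

Because each fixed increment of concentration yields a smaller increment of forcing, the incremental warming effect tails off toward zero, which is the asymptote described above.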

The general shape of the curves produced by the CMIP5 ensemble of climate models should therefore show an asymptotic approach to some temperature. That is clearly not the case, as shown in the graph below.


90 CMIP5 Climate Models vs. Observations


However, as shown in the graph, both the HadCRUT 4 near-surface temperature anomaly product and the UAH lower tropospheric temperature record do appear to be asymptotically approaching some limiting temperature, though the period shown in the graph is shorter than the 30-year period considered to represent climate. Since this graph was created in 2014, the annual observations have recorded a significant temperature anomaly spike resulting from a super El Niño and have now begun declining toward the paths they have followed since about 1998. The magnitude of the spike is attenuated by the 5-year means shown in the graph. The “x”s added to the graph are approximations of the 5-year means for both HadCRUT 4 and UAH. The super El Niño is not reflected in the model projections.

The graph below shows the future absolute temperatures projected by the ensemble of climate models, depending on the Representative Concentration Pathway (RCP) used for the analysis. Under RCP 2.6, the models show an asymptotic approach to approximately 15ºC (an anomaly of approximately 1.5ºC) around 2050, then a slight decline by 2100. Under RCP 4.5, the absolute temperature asymptotically approaches approximately 16ºC in 2100 (an anomaly of approximately 2.5ºC). Under RCP 8.5, the models do not begin to show an asymptotic approach by 2100, and the anomaly at that time reaches approximately 5ºC. RCP 8.5, fortunately, is showing itself to be unrealistic.


HadCRUT3v, CRUTEM3v, CMIP3 ensemble, CMIP5 ensemble


Absolute temperatures from climate model historical realizations and future scenarios. Black line is the HadCRUT3v blended land and ocean temperature dataset and red line is CRUTEM3v land-only temperatures [Brohan et al., 2006]. Blue lines are three historical realizations, while orange, green and brown are future RCP-scenario realizations with the MPI-ESM-LR model, and light gray lines are the first historical realization from each model found in the CMIP3 dataset [Meehl et al., 2007] and dark gray lines the corresponding CMIP5 historical realizations [Taylor et al., 2012]. Some model realizations were started later than 1850. The estimated Last Glacial Maximum temperature range of 4–7 K below present is from Intergovernmental Panel on Climate Change [2007].


Overview of representative concentration pathways (RCPs)ª


  • RCP 8.5: Rising radiative forcing pathway leading to 8.5 W/m2 (~1370 ppm CO2 eq) by 2100. (Riahi et al. 2007) — MESSAGE

  • RCP 6: Stabilization without overshoot pathway to 6 W/m2 (~850 ppm CO2 eq) at stabilization after 2100. (Fujino et al. 2006; Hijioka et al. 2008) — AIM

  • RCP 4.5: Stabilization without overshoot pathway to 4.5 W/m2 (~650 ppm CO2 eq) at stabilization after 2100. (Clarke et al. 2007; Smith and Wigley 2006; Wise et al. 2009) — GCAM

  • RCP 2.6: Peak in radiative forcing at ~3 W/m2 (~490 ppm CO2 eq) before 2100 and then decline (the selected pathway declines to 2.6 W/m2 by 2100). (Van Vuuren et al. 2007a; van Vuuren et al. 2006) — IMAGE


ª Approximate radiative forcing levels were defined as ±5% of the stated level in W/m2 relative to pre-industrial levels. Radiative forcing values include the net effect of all anthropogenic GHGs and other forcing agents.


The growing difference between the model projections and the observations has highlighted the need to improve the climate models, including a reevaluation of the climate sensitivity estimates upon which the model projections are based. This should be a high priority for climate research in the future.


Tags: Climate Models, Temperature Record

Highlighted Article: Climate Change, Fossil Fuels, and Human Well Being

From: Competitive Enterprise Institute

By: Marlo Lewis

Climate Change, Fossil Fuels, and Human Well Being

"Climate campaigners demand ever-greater government control over energy markets, resources, and infrastructure. Many believe the best thing governments can do with fossil energy is “keep it in the ground.” They claim fossil-fueled civilization is “unsustainable” and headed for a climate catastrophe. Are they correct?"


Climate Change, Fossil Fuels, and Human Well Being


Tags: Highlighted Article

The Truth, the Whole Truth and… Climate Change

Thousands of near-surface temperature data points are collected globally each day. These data are freely available to each of the global near-surface temperature anomaly producers: NOAA, NASA GISS, Hadley Centre/UEA and the Japanese Meteorological Agency. Each of these anomaly producers selects from among the data, “adjusts” the data, homogenizes the data and, in the case of NASA GISS, “infills” missing data. Each agency produces a monthly global temperature anomaly calculation based on these data.

The global temperature anomaly calculations differ among the agencies because of differences in the reference period, the data selected for inclusion, the “adjustment” process and possible “infilling” of missing data. The anomalies are reported to two-decimal-place “precision”, typically with an uncertainty band of +/- 0.10ºC. However, according to a study by Dr. Patrick Frank of the Stanford Synchrotron Radiation Lightsource/SLAC at Stanford University, the reported uncertainty bands “have not properly addressed measurement noise and have never addressed the uncontrolled environmental variables that impact sensor field resolution”.
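The role of the reference period can be seen in a minimal anomaly calculation; the baseline values below are toy numbers for illustration, not any agency's actual data:

```python
def anomaly(value, reference_mean):
    """Temperature anomaly: observed value minus the mean over a
    chosen reference (base) period."""
    return value - reference_mean

# The same absolute temperature yields different anomalies under
# different base periods, one reason the agency products differ:
t = 14.8                 # toy absolute temperature, degrees C
a1 = anomaly(t, 14.0)    # under a cooler baseline (toy value)
a2 = anomaly(t, 14.3)    # under a warmer baseline (toy value)
```

Differences in data selection, "adjustment" and "infilling" then compound this baseline effect.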

The uncontrolled environmental variables which affect sensors in field locations in the United States were surveyed and reported upon by the Surface Stations Project. Dr. Frank’s analysis focuses on the effects of measurement noise. Dr. Frank concludes that, for the common MMTS sensor, the total noise plus resolution lower-limit 1σ measurement uncertainty in an annual temperature anomaly referenced to a 30-year mean is ±0.46ºC for a well sited and maintained installation. He further estimates that, given issues with actual field installations in the Global Historical Climatology Network, “a globally complete assessment of current air temperature sensor field resolution seems likely to reveal a measurement uncertainty exceeding ±0.46ºC by at least a factor of 2.”

The significance of these uncertainties is illustrated in the following graph, in which the black line is the NASA GISS reported historical temperature anomaly, as reported in 2010; and, the gray area represents the measurement uncertainty of +/-0.46ºC about the reported historical temperature anomaly. “The trend in averaged global surface air temperature from 1880 through 2000 is statistically indistinguishable from zero (0)º Celsius at the 1σ level when this lower limit uncertainty is included, and likewise indistinguishable at the 2σ level through 2009.”

Assuming Dr. Frank is correct that actual field measurement uncertainty is at least 2 × 0.46ºC, the gray area would be twice as wide as shown in the graph at the 1σ level, rendering the trend in averaged global near-surface air temperature indistinguishable from zero (0)º Celsius through 2017.


Temperature Anomaly
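The indistinguishability argument reduces to a simple band test: an anomaly whose uncertainty band includes zero cannot be distinguished from no change. The sketch below uses illustrative anomaly values, not the official record:

```python
def within_noise(anomaly_c, sigma_c=0.46, n_sigma=1):
    """True if an anomaly's n-sigma uncertainty band includes zero,
    i.e. the value is statistically indistinguishable from no change.
    Default sigma_c is the +/-0.46 C lower-limit uncertainty discussed
    in the text."""
    return abs(anomaly_c) <= n_sigma * sigma_c

# Illustrative anomaly values (degrees C):
within_noise(0.30)                  # inside the +/-0.46 C band
within_noise(0.60)                  # outside the 1-sigma band
within_noise(0.60, sigma_c=0.92)    # inside the doubled field uncertainty
```

Doubling the assumed uncertainty, as Dr. Frank suggests for actual field installations, widens the band and sweeps correspondingly larger anomalies into the indistinguishable range.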

Dr. Frank concludes as follows:

“Future noise uncertainty in monthly means would greatly diminish if the siting of surface stations is improved and the sensor noise variances become known, monitored, and empirically verified as stationary. The persistent uncertainty due to the effect of uncontrolled microclimatic variables on temperature sensor resolution has, until now, never been included in published assessments of global average surface air temperature. Average measurement noise and the lower limit of systematic sensor errors combined to yield a representative lower limit uncertainty of ±0.46ºC in a 30-year mean annual temperature anomaly. In view of the problematic siting record of USHCN sensors, a globally complete assessment of current air temperature sensor field resolution seems likely to reveal a measurement uncertainty exceeding ±0.46ºC by at least a factor of 2. The ±0.46ºC lower limit of uncertainty shows that between 1880 and 2000, the trend in averaged global surface air temperature anomalies is statistically indistinguishable from 0ºC at the 1σ level. One cannot, therefore, avoid the conclusion that it is presently impossible to quantify the warming trend in global climate since 1880.


Finally, the relatively large uncertainty attending the global surface instrumental record means that the centennial temperature trend is not a precision target for validation tests of climate models. Likewise, the current surface instrumental record cannot credibly be used to train or renormalize any physically valid proxy reconstruction of paleo-temperature with sufficient precision to resolve any temperature difference less than at least 1ºC, to 95% confidence. It is thus impossible to know whether the rate of warming during the 20th century was climatologically unprecedented, or to know the differential magnitude of any air temperature warmer or cooler than the present, within ±1ºC, for any year prior to the satellite era. Therefore previous suggestions, that the rate or magnitude of present climate warming is recently or millennially unprecedented, must be vacated.”


These conclusions paint a very different picture of the state of our understanding of global temperature change than the anomaly changes reported to two decimal place “precision” by the producers of the global near-surface temperature anomaly products. They strongly support the arguments for development of a global land surface climate fiducial reference measurements network.


Tags: Global Historical Climate Network, Global Temperature, Temperature Record

Highlighted Article: Opening Up the Climate Policy Envelope

From: Issues in Science and Technology

By: Roger Pielke Jr.

Opening Up the Climate Policy Envelope

Fudged assumptions about the future are hampering efforts to
deal with climate change in the present. It’s time to get real.


Tags: Highlighted Article

Replication and Climate Science

One of the potential approaches to dealing with the Irreproducibility Crisis of Modern Science is replication of the experiment. Replication of an experiment is far more expensive than merely attempting to reproduce its results by reanalyzing the existing data with the original methods and reaching conclusions based upon them.

Climate science does not offer the opportunity to replicate the “experiment” which is the ongoing, chaotic climate. The fundamental data required to analyze the earth’s weather and ultimately its climate are collected “on-the-fly”, using a variety of different instruments to measure specific aspects of the current conditions. The scientist has no control over the ongoing “experiment”; and, thus, has no ability to replicate the conditions at any prior time to permit replication of data collection or collection of additional data which might improve the analysis. Once this minute, hour, day, week, month or year has passed, it is over, never to be repeated.

Climate science can improve its ability to analyze the ongoing “experiment” by expanding instrument coverage, utilizing more accurate instruments, deploying multiple sensors at each sensor location to monitor for instrument drift and failure, collecting data more frequently to increase the granularity of the data, etc. The US Climate Reference Network is an example of the deployment of more accurate, multiple sensor monitoring sites with more frequent data collection. The application of satellite-based technology to the measurement of temperatures and sea level is an example of expanding instrument coverage.

However, data which is not collected because there is no instrument deployed, or because an instrument has failed, cannot be replicated. It is forever unavailable. Similarly, data which is inaccurate because of changes in the site metadata or instrument drift is forever inaccurate. Climate science routinely “adjusts” data known or believed to be inaccurate, creating estimates of what the data might have been, had it been collected from properly selected, calibrated, sited, installed and maintained instruments. Some climate science also “infills” missing data, in instances where no sensor has been installed or an installed sensor has failed, again creating estimates of what the data might have been, had it existed. Even if “adjustments” and “infilling” produce accurate estimates, they cannot produce data.
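As a toy illustration of why an "infilled" value is an estimate rather than a measurement, consider filling a missing reading from the mean of neighboring stations. This is a deliberately simplified scheme, not the method any agency actually uses; real homogenization and infilling procedures (pairwise comparison, spatial interpolation) are far more elaborate:

```python
def infill(neighbor_readings):
    """Estimate a missing station reading (degrees C) as the mean of
    readings from neighboring stations.

    Deliberately simplified illustration: however plausible the
    estimate, the true, unmeasured value remains unknowable."""
    if not neighbor_readings:
        raise ValueError("no neighbor readings to infill from")
    return sum(neighbor_readings) / len(neighbor_readings)

estimate = infill([14.2, 15.1, 14.7])  # an estimate, not a datum
```

The estimate may even be unbiased on average, but it can never be verified against the reading that was never taken, which is the distinction the paragraph above draws.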

The satellite era has introduced yet another challenge to climate science, which bears aspects of both reproducibility and replication. Satellites are being used to measure atmospheric temperature, sea surface temperature and sea level. The two primary groups analyzing atmospheric temperature, the University of Alabama in Huntsville (UAH) and Remote Sensing Systems (RSS), have access to the same data from the same satellites, yet their analyses produce differing results. Climate science is currently unable to resolve differences between near-surface and satellite-based temperature measurements. Climate science is also currently unable to resolve the differences between the sea level rise measurements made by surface-based instruments and the contemporaneous measurements made by satellite-based instruments.

The one aspect of climate science which is capable of both reproducibility and replication is exercising of the ensemble of climate models. Climate scientists can analyze the outputs of individual climate model runs and arrive at reproducible conclusions. Climate scientists can also replicate individual model runs using the same inputs and replicate the conclusions. However, that is a trivial result.

Climate science has produced an ensemble of climate models. However, these models, provided the same inputs, will not produce the same results. Therefore, it is not possible that more than one of these models actually models the real climate; and, it is possible that none of the models actually models the real climate. Also, any individual model, provided the range of values of climate sensitivity, forcings and feedbacks, will not produce the same results. Therefore, it is not possible that all of the values of climate sensitivity, forcings and feedbacks are accurate; and, it is possible that none of the values within the ranges are accurate.

The current ensemble of climate models has not been in use for the 30-year period which defines climate according to the World Meteorological Organization, so it is not yet possible to determine whether any combination of model and model inputs is accurate or has any predictive ability. The best-known climate model which has been in use for a full 30-year climate period is the model by Dr. James E. Hansen, former Director of NASA GISS, which accompanied his 1988 presentation to Congress. That model projected future temperatures significantly higher than the temperatures observed over the ensuing 30 years.

In the shorter term, it has become obvious that the current ensemble of climate models is “running hot”, creating scenarios of potential future warming which are inconsistent with the observations made since the models were run. It does not reflect well upon climate scientists, climate science, or the organizations funding climate studies that these unverified models and their uncertain inputs are being used to produce “scary scenarios” of potential future climate catastrophes. Arguably, the creation of these “scary scenarios” is an exercise in climate research as a self-fulfilling prophesy.

             “In the last forty years governments have become interested in universities’ finding academic support for what they are proposing or have in place. We are in an era of ‘policy-based evidence’. We are also in an era of a particular political correctness, where it is very difficult indeed to get funds for research if the purpose of the research seems antithetical to current government policy. ‘Curiosity-directed research’ now comes with some serious barriers.” Don Aitkin, former Chairman of Australia’s National Capital Authority and former Vice-Chancellor and President of the University of Canberra


Tags: Climate Science, Climate Models, Estimates as Facts, Adjusted Data

Highlighted Article: Paris Lives! “Deep Decarbonization” at DOE

From: MasterResource

By: Mark Krebs

Paris Lives! “Deep Decarbonization” at DOE


"Sir Isaac Newton’s laws of motion include: Every object will remain at rest or in uniform motion in a straight line unless compelled to change its state by the action of an external force. This article examines that force within the “swamp” of climate change policies in DOE.

"Despite President Trump’s announcement that the U.S. would withdraw for the Paris Agreement, the basis of that agreement–“deep decarbonization” through “beneficial electrification”–is proceeding virtually unabated. This is because deep decarbonization serves the purposes of the electric utility industry and their environmentalist allies, e.g., the Natural Resources Defense Council (NRDC)."


Paris Lives! “Deep Decarbonization” at DOE

Tags: Highlighted Article

Some Irreverent Independent Thoughts on Independence Day 2018

The United States was on the “bleeding edge” of the end of colonialism in 1776. The US is the most successful of the ex-colonies, but has not attempted to use that success to become a colonial power. The US voluntarily participated in several wars in opposition to foreign nations seeking to take control of other nations by force. The US also played a major role in the breakup of the Soviet Union, which had “colonialized” Eastern Europe and had aspirations of “colonializing” the rest of the world. The citizens of the US have paid a heavy price to secure their own freedom and to assist the citizens of other nations to gain and secure their freedom.

Colonialism has begun to rear its ugly head again, though it now calls itself globalism and seeks global governance. The driving force for globalization and global governance is the United Nations, which was founded to assist nations in resolving conflicts peacefully. However, the UN has aggressively pursued “mission creep” and, at times, “mission leap”, as is common for unelected bureaucracies seeking greater influence and power.

The current “cause célèbre” for the UN is climate change. Its primary vehicle in pursuit of this cause is the UN Framework Convention on Climate Change (UNFCCC). Its primary tool is the Intergovernmental Panel on Climate Change (IPCC), which studies the science regarding anthropogenic influences on global climate and prepares a periodic Assessment Report and accompanying Summary for Policymakers. Their efforts have led to the development and promulgation of a political/scientific “consensus” that emissions of CO2, CH4 and other “greenhouse gases” are the sole, or at least the principal, cause of recent global warming; that this anthropogenic warming and other potential climate changes caused or aggravated by the warming would inevitably lead to a global climate catastrophe if not abated; and, that urgent actions must be taken to reduce the emissions of these gases to avert the potential catastrophe.

The UN membership has developed a series of accords intended to unite the member nations in this effort, beginning with the Rio Accords, followed by the Kyoto Accords and most recently by the Paris Accords. The UN Secretariat and the UNFCCC have sought to cast these accords as binding treaties, but US administrations have chosen not to submit these accords to the US Senate for ratification, because they were certain that the Senate would not ratify them. The UN Secretariat and the UNFCCC remain convinced that the most recent Paris Accords must become binding on all parties, not only regarding commitments to reduce emissions, but also regarding technology transfers and funding commitments by the developed nations.

The UN Green Climate Fund was originally to begin transferring $100 billion per year from the developed nations to the developing and the not-yet-developing nations by 2020, of which approximately 25% was to be provided by the US. The UNFCCC envisions that commitment growing to $425 billion per year, again with the US providing approximately 25%. The UN Secretariat envisions that administration of compliance with the emissions reductions commitments of the parties and of the collection and distribution of the Green Climate Fund would require a level of global governance by the UN Secretariat. This global governance would obviously require surrender of a degree of national independence on the part of the participating nations, particularly those transferring technology and providing funding.

While the previous US administration appeared more than willing to surrender this degree of sovereignty to the UN, the current US administration appears totally unwilling to do so. US withdrawal from the Paris Accords and defunding of the UNFCCC and the Green Climate Fund are a symbolic declaration of the US intention to remain a sovereign nation. It is also a refusal to participate in the surrender of sovereignty by other nations to the UN bureaucracy. Regrettably, the Administration has not yet terminated its participation in the UNFCCC, as required by current US law as a result of the UNFCCC’s recognition of the Palestinian Authority as a “state level” participant. This step is essential to communicate to the UN that the US will function as a sovereign state and will participate in UN activities only to the extent that they are in compliance with US law.

US sovereignty and freedom were too hard won to be squandered on globalism.


Tags: Paris Agreement, Global Governance, Green Climate Fund, United Nations

Highlighted Article: William Niskanen on Climate Change

  • From: MasterResource
  • By: Robert Bradley Jr.

William Niskanen on Climate Change


A six-part series on the climate views of the late William Niskanen, taken from his Fall 1997 symposium essay, “Too Much, Too Soon: Is a Global Warming Treaty a Rush to Judgment?” as well as his 2008 postscript. Previous posts are:


William Niskanen on Climate Change


Tags: Highlighted Article

How shall I call thee? “catastrophic anthropogenic climate change skeptic”

The consensed climate science community has been searching for the most effective epithet to use to refer to those who question their consensus regarding the human causation of climate change and the climate catastrophe they believe will result. However, until now, none of the epithets they have elected to use have had the desired effect of humiliating and isolating their intended victims.

The most accurate term for describing those who question the consensus is skeptics. Some are skeptical that CO2 and other “greenhouse gases” are the cause of the recent climate warming. Others are skeptical that those gases are a significant cause of the current warming. Yet others are skeptical that the recent warming would lead to a climate catastrophe. Their skepticism is reasonable, since there is no observational evidence to support the assertions of which they are skeptical.

There is clear paleoclimatic evidence that global climate changed, both warming and cooling, prior to the existence of instrumental measurements of temperature and other aspects of climate; and, prior to the period in which human emissions of CO2 and other “greenhouse gases” were sufficient to have any measurable influence on global climate. There is also clear instrumental evidence that global climate has changed, both warming and cooling, since the advent of instrumental records. There is, however, no observational evidence which permits measurement of the fraction of these changes which is the result of human activity, whether emissions or land use changes.

There is also clear evidence in the instrumental temperature record of the persistence of natural changes in climate, particularly temperature. The rapid increases and decreases in global average temperature anomalies resulting from El Niño and La Niña events are perhaps the strongest evidence of the persistence of natural variation. The effects of longer time scale variations, such as the reversals of the Pacific Decadal Oscillation and the Atlantic Multidecadal Oscillation, are also becoming more obvious as they recur during the instrumental measurement period.

Arguably, the least accurate term used to describe skeptics is “climate denier”, since virtually none of the skeptics actually deny that the earth has a climate. The term “climate change denier” is almost as inaccurate, since very few of those skeptical of the climate change consensus actually deny that earth’s climate has changed in the past and continues to change in the present. However, the consensed climate science community apparently cannot be bothered to use the expression “catastrophic anthropogenic climate change skeptic”, which is probably the most accurate and comprehensive description of their intended victims.

Recently, climatologist Katharine Hayhoe has suggested that the consensed climate science community switch to the term “climate dismissives”. “I think that’s the perfect term,” Hayhoe said, “because a dismissive person will dismiss any evidence, any arguments with which they’re presented, because dismissing the reality of climate change and the necessity for action is such a core part of their identity that it’s like asking them to almost cut off an arm. That’s how profound the change would be for them to change their minds about climate change.” She believes that the term “skeptic” suggests a scientific willingness to learn and accept, while the term “denier” is almost toxic.

Hayhoe apparently believes that the science is “settled”, that the evidence for human causation is compelling and that the catastrophe is inevitable without drastic action. She appears to be “dismissive” of alternative opinions and the observations which support them. In light of the recent recognition of the poor quality of the climate data and the inaccuracy of the climate models, one wonders if Hayhoe is in “denial”.


Tags: Climate Skeptics, Climate Change Debate, Silencing the Skeptics