
In the Wake of the News

Additional Perspective - Temperature Anomaly Record

It is generally acknowledged that human influence on global climate was minimal prior to approximately 1950. Therefore, virtually all climate change prior to 1950 is considered to have been the result of natural climate variation or a climate response to natural causes. However, the consensed climate science community asserts that climate changes since 1950 are significantly / largely / predominantly / exclusively the result of human activity.

Prior to the period of the instrumental temperature record, our knowledge of global temperature changes relies on general reconstructions, based on analysis of ice cores, ocean sediments and an increasing number of additional sources. These reconstructions indicate that global temperatures have varied continuously, but over a relatively narrow range, for the past several thousand years.

The period of the global instrumental temperature record is generally agreed to begin in 1880, though the Central England Temperature (CET) record dates back to 1659. The graph below displays the global annual average temperature anomaly product prepared by the British Met Office.

Global Average Temperature

The black diagonal line on the graph illustrates the slope of the change in the global annual temperature anomaly over the entire period of the global instrumental temperature record. The orange diagonal line on the graph illustrates the slope of the change in the global annual average temperature anomaly over the period since 1950, when human influence on global average temperature is generally considered to have begun. The slope of the black line is approximately 0.07°C per decade. The slope of the orange line is approximately 0.14°C per decade. Note that, according to the Hadley Centre, 2016 is the warmest year in the instrumental temperature record; and, that 2016 is also an El Nino year.
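
For readers who want to check such slopes themselves, here is a minimal sketch of the calculation: an ordinary least-squares fit over an annual anomaly series, converted to degrees per decade. The anomaly series below is synthetic, generated only to make the example self-contained; the actual Met Office series would be substituted in practice.

```python
# Minimal sketch: estimating decadal trends from an annual anomaly series.
# The series below is synthetic (illustrative only), not the HadCRUT data.
import numpy as np

years = np.arange(1880, 2017)
rng = np.random.default_rng(0)
anomalies = 0.007 * (years - 1880) - 0.4 + rng.normal(0.0, 0.1, years.size)

def decadal_trend(yrs, anoms):
    """Ordinary least-squares slope, converted from per year to per decade."""
    return 10.0 * np.polyfit(yrs, anoms, 1)[0]

print(f"Full record: {decadal_trend(years, anomalies):.2f} C/decade")
since_1950 = years >= 1950
print(f"Since 1950:  {decadal_trend(years[since_1950], anomalies[since_1950]):.2f} C/decade")
```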

It is not possible to separate the natural variation from any human influence in this temperature anomaly record. It is obvious that the rate of global annual temperature increase over the period since 1950 is approximately double the rate of increase over the entire period of the instrumental temperature record. However, since the increase in atmospheric CO2 concentrations over the period has been relatively constant and the use of annual anomalies renders the seasonal variation in atmospheric CO2 concentrations moot, there is obviously significant natural variation in global annual temperature over the period.

The natural fluctuations in the global temperature anomaly are easier to visualize over shorter time periods, using the monthly global temperature anomaly estimate produced by the Hadley Centre.

HADCRUT4 Temperature Anomalies

The slope of the red diagonal line on the graph, drawn from the first to the last monthly global temperature anomaly estimate, is approximately 0.16°C per decade. Again, it is not possible to separate the natural variation from any human influence in this temperature anomaly record. However, it is obvious that there is significant natural variation in the temperature anomalies. This is particularly obvious in the cases of the 1997/1998 and 2015/2016 El Nino events, both of which demonstrate natural, event-driven anomaly changes of 0.5–0.6°C.

The natural variations in the global temperature anomalies are also obvious in the satellite temperature anomaly records, as illustrated by the UAH global lower atmosphere temperature anomaly record (below). The diagonal black line on the graph connects the first and last temperature anomaly estimates in the satellite record. The slope of the line is approximately 0.16°C per decade. As in the previous examples, it is not possible to separate the natural variation from any human influence in this temperature anomaly record. However, the natural variation caused by the 1997/1998 and 2015/2016 El Nino events is very clearly visible.

UAH Satellite-Based Temperature

While it might be possible to attribute a relatively smooth, gradual increase in the global temperature anomaly to the relatively smooth, gradual increase in atmospheric CO2 concentrations, it is clearly not rational to attribute both the rapid increases and the rapid decreases in the anomalies around El Nino events to the gradual increase in atmospheric CO2 concentrations.

The illustrations above of the instrumental and satellite temperature anomaly records are all plotted with a “Y” axis range of 2°C or less. This truncation of the “Y” axis makes the relatively small changes in the temperature anomaly estimates far easier to see. However, the truncated “Y” axis also distorts the significance of the changes in the temperature anomalies. The graph below displays the global annual temperature anomaly over most of the instrumental temperature record, using essentially the same temperature anomaly records as those in the truncated illustrations above. The “Y” axis in this graph ranges from -10°F to +110°F (-23°C to +43°C), a representative annual temperature range for the mid-latitudes. While the increase in global annual temperature is still visible in the orange line in this graph on close inspection, its relatively minimal significance is far more clearly displayed.
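
The visual effect of the axis choice is easy to reproduce. The sketch below plots the same synthetic series twice, once on a 2°C axis and once on the -23°C to +43°C mid-latitude range; the ~14°C global mean used to convert anomalies to absolute temperatures is an assumption for illustration.

```python
# Sketch: the same series on a truncated axis and on a mid-latitude axis.
import numpy as np
import matplotlib.pyplot as plt

years = np.arange(1880, 2017)
rng = np.random.default_rng(1)
anomalies = 0.007 * (years - 1880) - 0.4 + rng.normal(0.0, 0.1, years.size)
absolute = 14.0 + anomalies  # assumed ~14 C global mean, for illustration

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4), sharex=True)
ax1.plot(years, absolute)
ax1.set_ylim(13.0, 15.0)   # truncated 2 C span: changes look dramatic
ax1.set_title("Truncated Y axis (2 C span)")
ax2.plot(years, absolute)
ax2.set_ylim(-23.0, 43.0)  # representative mid-latitude annual range
ax2.set_title("Mid-latitude Y axis (-23 C to +43 C)")
plt.tight_layout()
plt.show()
```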

Wichita, Kansas is a typical mid-latitude city located close to the geographic center of the contiguous United States. The average temperature plotted by the orange line in the above graph is very close to the current average temperature of 57°F reported by the US National Weather Service for Wichita. The red shaded area in the graph above illustrates the typical daily average temperature range during the month of July in Wichita. The blue shaded area in the graph illustrates the typical daily average temperature range during the month of January in Wichita.

The reported increase in global annual temperature over the period of the instrumental temperature record is: approximately 8% of the daily average temperature variation experienced in Wichita throughout the year; approximately 3% of the seasonal average temperature variation; and, approximately 1.2% of the range of maximum and minimum temperature records.
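
Those percentages can be reproduced with simple arithmetic. The Wichita spans below are assumed round numbers chosen for illustration (a ~20°F typical daily swing, a ~50°F July-to-January difference in average temperature, and a ~136°F record range), not official NWS values.

```python
# Back-of-envelope check of the percentages quoted above (Fahrenheit).
warming_F = 1.6           # ~0.9 C reported rise over the instrumental record
spans_F = {
    "daily variation":    20.0,   # assumed typical day/night swing
    "seasonal variation": 50.0,   # assumed July vs. January average difference
    "record range":      136.0,   # assumed record high minus record low
}
for label, span in spans_F.items():
    print(f"{label}: {100.0 * warming_F / span:.1f}%")
```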

The IPCC AR5 report expresses 95% confidence that more than half of the reported increase in global average temperature is the result of human influences, including gaseous and particulate emissions, land use changes and temperature “adjustments”. It is not currently possible to measure the relative effects of natural variation and human influences on global average temperature; and, it is not currently possible to measure the relative effects of the multiple potential human influences. Regardless, it is difficult to see imminent global catastrophe in the data we currently have available.

Tags: Temperature Record

Climate Science “The time has come, …”

Climate science is important. Climate science is a mess. Climate science must be fixed. The new Environmental Protection Agency (EPA) Administrator and the soon-to-be-appointed new National Oceanic and Atmospheric Administration (NOAA) and National Aeronautics and Space Administration (NASA) Administrators have an obligation, either to make the science worth the money being spent on it, or to reduce the money being spent on it to a level commensurate with its apparent value.

The Climategate scandal in 2009 and 2011 began the very public questioning of climate scientists and the conduct of climate science. Climategate exposed Frantic Researchers Adjusting Unsuitable Data, attempting to control peer review, attempting to control or discredit peer-reviewed journals, attempting to prevent certain scientists and their research from being published and attempting to ruin the careers of certain scientists. The recent revelations about the conduct of NOAA researchers in the preparation and publication of ‘Possible Artifacts of Data Biases in the Recent Global Surface Warming Hiatus’ by Karl et al., Science 4 June 2015; and, the refusal of the NOAA Administrator to provide materials to a US House committee have rekindled the issue.

The new EPA, NOAA and NASA Administrators should immediately initiate a thorough, joint investigation of the acquisition, analysis and application of climate data by their agencies. This investigation should then expand to include those activities by other nations, the Intergovernmental Panel on Climate Change (IPCC) and the United Nations Framework Convention on Climate Change (UNFCCC). The initial focus of the investigation should be on data quality and data integrity. This should be followed by a focus on climate modeling and the application of climate models, particularly the need to verify climate models before using them as vehicles for producing questionable future scenarios.       

All the agencies using US and global near-surface temperature data to track global warming “adjust” the data because they either KNOW, or have strong reason to BELIEVE, that the data are flawed. The new NOAA and NASA Administrators should immediately question why the agencies have chosen to continue adjusting flawed data, rather than taking steps to ensure that the data they collect for analysis are accurate. Instrument calibration schedules and instrument enclosure and site maintenance schedules should be reviewed and amended as necessary to ensure data quality. Data archiving and storage procedures should also be reviewed and amended as required to ensure data integrity and accessibility. The new Administrators should establish firm policies regarding data and information access, with the express intent of eliminating the need for FOIA requests to obtain access.

These investigations and reviews should not be conducted by agency personnel, but rather by “tiger teams” of outside experts in all of the technical fields of concern, with support from agency personnel. Agency personnel should be notified that refusal to cooperate honestly, fully and in a timely manner with the tiger teams would be considered a notice of resignation from the agency; and, processed accordingly.

There is general recognition that the data available from satellites are more comprehensive than the data available from surface and near-surface sources. However, there is continuing disagreement regarding the relationship between satellite data and surface and near-surface data. The Administrators should redirect funds currently spent on studies applying unverified models to produce future scenarios of questionable value. These funds should be used instead to fund studies to resolve the differences between the various data sources, since such studies would likely result in an enhanced understanding of the atmosphere and the processes occurring within it.

Tags: Bad Science, Climate Science, 2016 election, Temperature Record, Climate Models

Oxymoron Alert (Carbon Taxes)

The Climate Leadership Council has just released a new study “The Conservative Case for Carbon Dividends” and presented it to the Trump Administration for its consideration. The study identifies Four Pillars of a Carbon Dividends Plan:

1. A GRADUALLY INCREASING CARBON TAX
2. CARBON DIVIDENDS FOR ALL AMERICANS
3. BORDER CARBON ADJUSTMENTS
4. SIGNIFICANT REGULATORY ROLLBACK

Interestingly, the study focuses on the carbon dividend in its title, rather than on the carbon tax. I would argue that a carbon tax is hardly “conservative”. I would also argue that it is not a “market mechanism”, though it would rely on market mechanisms to adapt to the market distortions caused by the tax. The magnitude of the market distortion which would be caused by a carbon tax of ~$40 per ton of CO2 emissions is estimated to be ~$500 per US social security cardholder. It would manifest in the economy as an increase in the cost of every good and service: a tax-driven, cost-push inflation of prices.
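
The order of magnitude of that figure can be checked with round numbers. Both inputs below are assumptions for illustration, not values taken from the study; the gross result lands in the same general range as the ~$500 estimate.

```python
# Rough order-of-magnitude check on the per-person distortion of a $40/ton tax.
us_co2_tons = 5.1e9    # assumed ~5.1 billion metric tons of US CO2 per year
tax_per_ton = 40.0     # the study's approximate starting tax rate
population = 320e6     # assumed ~320 million social security cardholders

revenue = us_co2_tons * tax_per_ton
print(f"Gross annual revenue: ${revenue / 1e9:.0f} billion")
print(f"Per cardholder:       ${revenue / population:.0f} per year")
```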

The study begins with the premise that global CO2 emissions are undesirable and must be reduced; that is, it assumes that continued CO2 emissions would lead to catastrophic anthropogenic global warming (CAGW). The study also begins with the premise, shared by many economists, that a carbon tax is the most efficient approach to reducing carbon emissions. The study suggests that the recommended carbon tax would be a Pigouvian tax; that is, a tax intended to discourage activities which lead to negative externalities. However, this suggestion ignores any positive externalities associated with CO2 emissions, though positive externalities exist and might well dominate at present. It is more likely that the recommended tax is simply a “sin” tax, particularly since it is intended to increase over time until the identified “sin” (CO2 emissions) is eliminated.

A carbon tax at a given level does not reliably produce a CO2 emissions reduction of a specific magnitude or percentage. This is particularly true in the short term, as the market response to the tax is affected both by the availability of economical alternative technologies and by the remaining economically useful life of existing equipment in service. The carbon tax does have the flexibility to be increased progressively, as required, to drive the desired emissions reduction. Clearly, the ultimate goal of the tax is to drive CO2 emissions in the US to zero, which would essentially require a transition to an all-electric energy economy, with all electricity provided by nuclear and / or renewables. It is unclear how high the carbon tax would have to be to achieve that objective. It is clear the tax would penalize the current transition from coal to natural gas as the predominant fuel for electric power generation, which has been largely responsible for the reductions in US CO2 emissions over the past several years. It might also constrain that transition in the short term, by discouraging investment in the additional natural gas pipeline and storage capacity required to supply the growing natural gas generation base.

The carbon dividend is a thinly disguised form of income redistribution, since it would provide a uniform quarterly dividend to all US social security cardholders, regardless of the amount of carbon tax, if any, directly or indirectly paid by that social security cardholder. The carbon dividend would likely be popular, particularly among the ~70% of social security cardholders expected to receive a dividend greater than their tax expense. The carbon tax would be imposed upstream of the consumer and would probably appear to be a price increase imposed by suppliers and service providers, rather than as a flow-through tax. Therefore, many dividend recipients would blame suppliers and service providers for the price increases, though they would thank the federal government for the dividend.

The border carbon adjustments process would be extremely complex and costly. There would likely be massive disagreements over the “carbon content” of manufactured goods and the impacts of any carbon emissions reduction efforts on the part of the government of the country in which the imported goods were manufactured. The numbers of products and the number of countries from which they are imported, combined with technology changes over time, would make this a long-term, ongoing process.

The “significant regulatory rollback” would fly in the face of the regulatory agencies’ imperative to survive and grow. Certainly the tax would be preferable to the current and growing command-and-control “rat’s nest” which is environmental regulation. However, in this instance, neither the regulatory morass nor the tax appears to be necessary, absent near-religious belief in the scenarios produced by unverified climate models.

In my early youth, I believed in Santa Claus, the Easter Bunny and the Tooth Fairy. In my advancing maturity, I find it impossible to believe in a revenue neutral tax. The history of “cap and trade” (cap and tax) legislation in the US strongly suggests that the Congress is incapable of producing such a tax.

 


Tags: Carbon Tax

Highlighted Article: At What Cost? Examining The Social Cost Of Carbon

Here is an excellent article we would like to highlight from Cato Institute's Patrick J. Michaels.

At What Cost? Examining The Social Cost Of Carbon

"My testimony concerns the selective science that underlies the existing federal determination of the Social Cost of Carbon and how a more inclusive and considered process would have resulted in a lower value for the social cost of carbon."

Tags: Highlighted Article

More Anomalous Anomalies

The three primary producers of global near-surface temperature anomalies, NASA GISS (Goddard Institute for Space Studies), NOAA NCEI (National Centers for Environmental Information) and HadCRUT, all begin the process with access to the same data. They then select a subset of the data, “adjust” the data to what they believe the data should have been; and, calculate the current “adjusted” anomalies from the previous “adjusted” anomalies. NASA GISS alone “infills” temperature estimates where no data exist. Each producer then independently prepares their global temperature anomaly product.

The calculated anomaly in any given month relates directly to a global average temperature for that month. The difference between the calculated anomalies in any pair of months is thus the same as the difference between the calculated global average temperatures for those months. However, the differences reported by the three primary producers of global average temperature anomaly products from month to month, or year to year, are rarely the same; and, the changes are not always even in the same direction, warming or cooling.

The global average temperature anomaly differences reported by the three primary producers of global near-surface temperature anomalies for November and December 2016 are an interesting case in point. NASA GISS reported a decrease of 0.12°C for the period. NOAA NCEI reported an increase of 0.04°C. HadCRUT reported an increase of 0.07°C. Each provider estimates a confidence range of +/-0.10°C for their reported anomalies (NASA GISS: -0.22°C / -0.12°C / -0.02°C; NOAA NCEI: -0.06°C / +0.04°C / +0.14°C; HadCRUT: -0.03°C / +0.07°C / +0.17°C). Therefore, the confidence ranges overlap, suggesting that the differences among the anomaly estimates are not statistically significant. However, it is clear that the global average near-surface temperature did not both increase and decrease from November to December.
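
The overlap claim is easy to verify mechanically. The sketch below rebuilds the three confidence intervals from the reported month-to-month changes and tests for a common overlap region.

```python
# Sketch: do the three reported November-to-December changes overlap,
# given the +/-0.10 C confidence range each producer claims?
changes = {"NASA GISS": -0.12, "NOAA NCEI": +0.04, "HadCRUT": +0.07}
half_width = 0.10

intervals = {name: (c - half_width, c + half_width) for name, c in changes.items()}
for name, (lo, hi) in intervals.items():
    print(f"{name}: {lo:+.2f} C to {hi:+.2f} C")

# A common overlap exists if the largest lower bound does not exceed
# the smallest upper bound.
lo_max = max(lo for lo, _ in intervals.values())
hi_min = min(hi for _, hi in intervals.values())
print("Common overlap" if lo_max <= hi_min else "No common overlap",
      f"({lo_max:+.2f} C to {hi_min:+.2f} C)")
```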

The second decimal place in each of these reported anomalies is not the result of temperature measurement accuracy, but rather of numerical averaging of less accurate “adjusted” temperature estimates resulting from data “adjustment”. The Law of Large Numbers is often invoked to justify expressing calculated averages to greater precision than the precision of the numbers being averaged; this is appropriate only if the errors in the individual numbers are random. However, the nature of the factors which cause individual temperature measurements to be inaccurate, and thus require “adjustment”, suggests that the resulting errors are not random. Certainly, the “adjustments” made to the data are not random. Therefore, it is highly unlikely that reporting global average near-surface temperatures to greater precision than the underlying “adjusted” temperature data is appropriate.
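
The distinction between random and non-random error is easy to simulate. In the sketch below (all numbers illustrative), averaging 2,000 readings shrinks purely random errors to roughly 0.01°C, but a shared 0.2°C systematic bias passes straight through to the average, no matter how many readings are averaged.

```python
# Simulation sketch: averaging reduces random error, but not a shared bias.
import numpy as np

rng = np.random.default_rng(42)
n_readings = 2000
true_temp = 15.0

random_err = rng.normal(0.0, 0.5, n_readings)  # random errors, std 0.5 C

mean_random = np.mean(true_temp + random_err)
mean_biased = np.mean(true_temp + random_err + 0.2)  # plus a 0.2 C common bias

print(f"random errors only:  mean off by {abs(mean_random - true_temp):.3f} C")
print(f"plus a common bias:  mean off by {abs(mean_biased - true_temp):.3f} C")
```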

Tags: Temperature Record, Global Temperature

Hottest Year Time Again

The near-surface global temperature anomaly data for 2016 have been collected, selected, infilled, “adjusted” and analyzed. The results are in; and, again, they are anomalous. NASA GISS (Goddard Institute for Space Studies) and NOAA NCEI (National Centers for Environmental Information) both report the average anomaly for 2016 as 0.99°C. This represents an increase of 0.13°C for the NASA GISS anomaly, compared with 2015; but, an increase of 0.09°C for the NOAA NCEI anomaly, compared with 2015. Both NASA GISS and NOAA NCEI place the confidence limits on their reported anomaly at +/-0.10°C, or approximately the same magnitude as the reported year to year global average anomaly change. Both NASA GISS and NOAA NCEI estimate that the influence of the 2015/2016 El Nino contributed 0.12°C to the increase in the reported anomaly for 2016: 0.01°C less than the global average anomaly increase reported by NASA GISS and 0.03°C more than the global average anomaly increase reported by NOAA NCEI. That is, essentially all of the 2016 global average temperature anomaly increase reported by both agencies was the result of the influence of the 2015/2016 El Nino, which was very similar in magnitude to the 1997/1998 El Nino; these are the two strongest El Ninos recorded in the instrumental temperature record. HadCRUT reported an average anomaly of 0.774°C, an increase of 0.14°C from the 2015 average anomaly. HadCRUT estimated similar confidence limits and a similar El Nino contribution.
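
The residual warming left over after the estimated El Nino contribution follows directly from the figures quoted above, as this short check illustrates.

```python
# Check: 2016 anomaly increase minus the estimated El Nino contribution.
reported_rise = {"NASA GISS": 0.13, "NOAA NCEI": 0.09}  # vs. 2015, in C
el_nino = 0.12                                          # estimated contribution

for agency, rise in reported_rise.items():
    print(f"{agency}: residual {rise - el_nino:+.2f} C after the El Nino")
```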

All of the near-surface temperature anomaly products reported dramatic drops in their anomalies, beginning in the spring of 2016, though these drops were from record high monthly peaks driven by the El Nino. The NASA GISS anomaly dropped from a high of 1.36°C to a December 2016 level of 0.81°C, a change of -0.55°C. The NOAA NCEI anomaly dropped from a high of 1.22°C to 0.79°C, a change of -0.43°C. The HadCRUT4 anomaly dropped from a high of 1.08°C to 0.59°C, a change of -0.49°C. This is a variance of 0.12°C between the near-surface temperature anomaly products, approximately equal to the magnitude of the reported 2016 anomaly increases, the estimated impact of the 2015/2016 El Nino and half the confidence range claimed for the reported anomalies.

University of Alabama Huntsville (UAH) and Remote Sensing Systems (RSS) both reported that 2016 was 0.02°C warmer than 1998, which both sources still report as the previous warmest year in the satellite temperature record. Dr. Roy Spencer of UAH stated that the increase in the reported temperature anomaly between 1998 and 2016 would have had to be ~0.10°C to be statistically significant. The UAH anomaly dropped from a 2016 high of 0.83°C to a December 2016 level of 0.24°C, a change of -0.59°C. The RSS anomaly dropped from a 2016 high of 1.0°C to a December level of 0.23°C, a change of -0.77°C. Both the UAH and RSS anomalies show the dramatic impact of the 2015/2016 El Nino. Both anomalies suggest that the “Pause” has returned, since they show no statistically significant warming since at least 1998.

The question now is whether there will be a La Nina in 2017; and, if so, the extent to which it will further reduce the post El Nino anomalies.

Tags: Warmest, Global Temperature, Temperature Record

Highlighted Article: Climate Models for the Layman

Here is an excellent paper on climate models by Dr. Judith Curry and The Global Warming Policy Foundation (GWPF).

Climate Models for the Layman

"This report attempts to describe the debate surrounding GCMs to an educated but nontechnical audience."

Tags: Highlighted Article

Opening the Kimono

Steve Goreham, the author of the book Climatism! Science, Common Sense, and the 21st Century's Hottest Topic, coined the term Climatism, which he defines as “the belief that man-made greenhouse gas emissions are destroying the Earth's climate”. Arguably, the definition should also include the assertion “that man-made greenhouse gas emissions are destroying the Earth's climate”, even absent belief in the assertion.

Ari Halperin manages the blog Climatism. (https://climatism.wordpress.com/) Early in 2016, Halperin authored a guest essay on Watts Up With That (http://wattsupwiththat.com) entitled Who unleashed Climatism (https://wattsupwiththat.com/2016/01/17/who-unleashed-climatism/) in which he discusses the origins of climate alarmism at length. He concludes that: “Climatism is a foreign assault on America. The aggressor is not another nation-state, but an alliance of UN agencies and environmental NGOs.”

Leo Goldstein (http://defyccc.com/) has recently posted two guest essays on Watts Up With That: https://wattsupwiththat.com/2016/12/23/the-command-control-center-of-climate-alarmism/; and, https://wattsupwiththat.com/2017/01/05/is-climate-alarmism-governance-at-war-with-the-usa/. Goldstein has coined two new terms in these essays: Climate Alarmism Governance (CAG); and, Climintern (Climatist International). Goldstein traces the history of CAG from the founding of the Climate Action Network (CAN) in 1989. CAN now has 1100+ members globally. CAN, the UN and numerous foundations provide CAG; and, are referred to collectively as the Climintern.

The Climintern refers to the analyses provided by Halperin and Goldstein as conspiracy theories. Others might refer to them as conspiracy exposés or histories. The documentation they provide in these essays clearly establishes the nature and scope of CAG and the influence of the Climintern. Their analyses are well worth reading. They provide a historical record of how climatism has proceeded from its earliest days to today; and, how it works to influence global and national climate policy through the UNFCCC and the UN IPCC.

“If it looks like a duck, swims like a duck, and quacks like a duck, then it probably is a duck.” (HT: James Whitcomb Riley)

Tags: Climate Skeptics

Another One Bites the Dust – Judith Curry Resigns

Dr. Judith Curry has resigned her position at Georgia Tech, in frustration over “the politics and propaganda that beset climate science”. Dr. Curry explained,

“the deeper reasons have to do with my growing disenchantment with universities, the academic field of climate science and scientists… I no longer know what to say to students and postdocs regarding how to navigate the CRAZINESS in the field of climate science. Research and other professional activities are professionally rewarded only if they are channeled in certain directions approved by a politicized academic establishment — funding, ease of getting your papers published, getting hired in prestigious positions, appointments to prestigious committees and boards, professional recognition, etc.”
“How young scientists are to navigate all this is beyond me, and it often becomes a battle of scientific integrity versus career suicide.” (https://judithcurry.com/2017/01/03/jc-in-transition/)

Dr. Curry’s concerns regarding university climate science education do not bode well for the future of climate science over the next 30-40 years.

Dr. Curry follows Dr. Roger Pielke, Jr., who did not resign his faculty position at the University of Colorado, but has redirected his research efforts away from climate science. (http://www.wsj.com/articles/my-unhappy-life-as-a-climate-heretic-1480723518)

Other climate scientists, including Dr. David Legates (formerly Delaware State Climatologist) and Dr. Noelle Metting (formerly US DOE) were terminated for failing to adhere to the political climate change narrative. (http://www.delawareonline.com/story/news/local/2015/02/26/university-delaware-professor-caught-climate-changecontroversy/24047281/)

(http://freebeacon.com/politics/congress-obama-admin-fired-top-scientist-advance-climate-change-plans/)

Dr. Wei-Hock Soon and Dr. Sallie Baliunas, both of the Harvard-Smithsonian Center for Astrophysics, have been under attack since 2003 for their work on the solar contribution to climate change. (Dr. Baliunas has since retired.)

The Climategate e-mails released in 2009 and 2011 exposed efforts on the part of members of the consensed climate science community to destroy the careers of other climate scientists, including Dr. Chris de Freitas, Dr. Christopher Landsea and Dr. Patrick Michaels. Fortunately, these efforts were unsuccessful.

The life of a non-consensed climate scientist is hardly a bed of roses.

Tags: Climate Consensus

Trump’s Corruption Mandate

Donald Trump’s astonishing election victory was in part a backlash against increasingly corrupt American politics.

Transparency International publishes an annual Corruption Perceptions Index, ranking all nations from most to least clean in their political conduct. The United States entered the twenty-first century by falling out of the top ten. Scandinavian nations such as Finland, Denmark, and Sweden along with Commonwealth nations such as New Zealand, Canada, and the United Kingdom dominated the top spots, while the USA was ranked fourteenth.

Since then the USA has declined further in the Index's rankings.

Both corruption and the perception of corruption increased during the tenures of Bill and Hillary Clinton, George W. Bush, and Barack Obama. Examples included the use of the IRS to bully political enemies, government bailout funds going to politically-connected crony businesses, the use of high office to enrich one’s private foundation, and presidents and their appointees to regulatory bodies using their discretionary power indiscriminately.

Given this, one understands the joke that the wildly popular television show House of Cards is really a documentary.

Yet there is a danger in that joke. While corruption occurs in all governments, there is a huge difference between cultures in which corruption is normalized and implicitly tolerated and those in which corruption is condemned, vigilantly monitored, and forced to go underground.

So Trump’s astonishing election victory may be a healthy reaction against the increasing corruption -- and therefore even more astonishing because his own character seems imbued with significant elements of personal and business-political corruption -- and because the combination of his presidency with his personal financial holdings is fraught with conflicts of interest (as this Wall Street Journal graphic shows). Yet since conflict-of-interest rules apply differently to the president and vice-president, according to 18 U.S. Code § 208, it is unclear how many conflicts will actually be avoided.

Of course some Trump supporters argue that it takes a beast to fight a beast, but what we really need is a political culture that does not lend itself to amoral animal metaphors.

Political leadership is a human endeavor, and effective human leadership in the free and open democratic republic we aspire to be requires both integrity and the widespread perception of integrity. We are a rich country economically, so we can recover from billions of dollars of loss. But the erosion of character among our leadership is much more expensive. It encourages cynicism among the citizenry. It imposes demoralization and disengagement costs upon them. It discourages the morally principled from seeking political office. And it attracts the even-more-corrupt to the corridors of power. No democratic republic can survive that downward cycle for long.

So while I did not vote for Trump, I am encouraged that his administration is following up on at least one of his campaign promises: for example, a five-year ban on lobbying for all transition and administration officials. We can debate the morality and likely effectiveness of that particular anti-corruption policy, but as a post-election statement of intent its seriousness is evident and positive.

Nations always have a choice. A century ago Argentina was among the top ten most prosperous and clean nations in the world, but it has declined precipitously and is now relatively much poorer and ranked in the bottom half of nations for bribery and related corruptions. South Africa was only moderately corrupt a generation ago and has also declined sadly.

Yet some countries have cleaned up their corruption impressively. Botswana improved dramatically in one generation, as did Chile -- both overcoming the widespread stereotype of irredeemable business-as-usual-corruption in African and Latin American politics.

A banana-republic destiny is avoidable for the USA. President Trump’s character -- with its odd mix of obviousness and unpredictability -- will be decisive, as will the vigilance of the rest of us and our commitment to putting the animals back in their cages and cleaning up their messes.

Tags: Corruption, Politics, Donald Trump

“Mann Overboard”

Much has been written this year about the 2015/2016 El Nino and about the apparent record global temperature anomalies. Professor Michael Mann of Penn State University was quick to provide his opinion that the El Nino contributed only about 0.1°C, or about 15%, to the 2015/2016 global average temperature anomaly increase. Others provided estimates ranging from 0.07°C to 0.2°C. The balance of the temperature anomaly increases was attributed to the continuing increase in atmospheric CO2 concentrations as the result of human fossil fuel combustion.

However, the 2015/2016 El Nino is now over; and, global temperature anomalies have dropped sharply: by approximately 0.4°C overall; and, by approximately 1.0°C over land only. The sea surface temperature anomalies are expected to decrease further, although more slowly, especially if a significant La Nina develops in 2017. The equatorial Pacific is in a weak La Nina condition at present, but La Nina conditions appear to be weakening.

Regardless, Mann and others who minimized the potential contribution of the 2015/2016 El Nino to the rapid global temperature anomaly increases in those years are now faced with explaining the large, rapid decreases in the global average anomalies following the end of the El Nino. It would be difficult enough to explain rapid anomaly increases in association with slow increases in atmospheric CO2 concentrations; but, even more difficult to explain rapid anomaly decreases in association with slow increases in atmospheric CO2 concentrations.

Tags: Temperature Record, Global Temperature

Political protest in a "post-fact era"

“Everyone is entitled to his own opinion, but not to his own facts” (Senator Daniel Patrick Moynihan)

 

A protester was shot at the University of Washington during a clash between rival factions — one faction physically blocking an audience from hearing a speech, the other faction seeking to hear a rabble-rousing orator.

The orator was Milo Yiannopoulos, a leading spokesman for the alt-right movement, a revitalized and muscularized version of nationalist and populist politics long submerged in American political life.

Outside the auditorium, blocs of red-wearing Trump supporters and black-wearing anarchists and others faced each other (unconsciously updating Stendhal's novel The Red and the Black). The man who was shot was apparently a peacemaker, placing himself in the middle of the verbally abusing, pushing-and-shoving factions.

The victim's positioning was unfortunate, as there is little "middle" left in our polarized political times.

And it is symbolic that the shooting took place at a university, because it was precisely at universities where the battle for civility has been lost.

A generation ago in universities we had vigorous debates about truth, justice, freedom, and equality. The governing premise was that through argument rational people could fine-tune their grasp of the facts and test the logic of their theories. The process would often be contentious. Yet with professors and students committed to a baseline civility, it would be cognitively progressive.

But the leading professors of the new era — Michel Foucault, Jacques Derrida, and Richard Rorty among them — undercut that entire process. Facts, they argued, are merely subjective constructs and masks for hidden power agendas. Over the next generation the words "truth," "justice," "freedom," and "equality" began to appear exclusively in ironic scare quotes.

"Everything," declared post-modern professor Fredric Jameson, "is political." And absent facts, argued post-modernist Frank Lentricchia, the professor's task is transformed from truth-seeker to political activist: in the classroom he should "exercise power for the purpose of social change."

We live in the resulting postmodern intellectual culture, with an entire generation (mis-)educated to see politics not as a cooperative quest to solve economic problems and protect human rights — but as a ceaseless clash of adversarial groups each committed to its own subjectivist values. Feminist groups versus racial groups versus wealth groups versus ethnic groups versus sexuality groups versus an open-ended number of increasingly hostile and Balkanized subdivisions.

Thus we have a generation populated with biologically mature people who lack the psychological maturity to handle debate and occasional political loss — at the same time convinced of the absolute subjective necessity of asserting their goals in a hostile, victimizing social reality.

As reasonable discussion declined in universities, physical tactics quickly replaced it. Arguments about principles were replaced with routine ad hominem attacks. Letters of invitation to guest lecturers prompted threats of violence. The heckling of speakers turned to shouting them down. Picketing protests became intentional obstruction.

And now we get the inevitable backlash as other, rival factions learn the new rules and steel themselves for engagement.

Yiannopoulos himself is a product of post-modern culture, as it was he who exultingly coined the phrase "post-fact era" to describe how politics now works. He is proving himself to be an effective player of that brand of political activism.

Yet the governing ethic of our political culture is not a lost cause, as large swathes of the American populace are still committed to the core democratic-republican civic virtues of intellectually honest debate, free speech, tolerance — and of being both a good loser and a good winner. A fractious election brought out many of the worst among us. But journalistic headlines aside, our choice is not only between the tactics of post-modern political correctness and those of alt-right populism. Our leading intellectuals, especially those within universities who are nurturing the next generation of leaders, must also teach the genuinely liberal-education alternative.

Tags: Free Speech, College Students, Protests

“Just the facts, ma’am.”

“Politics is a battle of ideas; in the course of a healthy debate, we’ll prioritize different goals, and the different means of reaching them. But without some common baseline of facts; without a willingness to admit new information, and concede that your opponent is making a fair point, and that science and reason matter, we’ll keep talking past each other, making common ground and compromise impossible.”

– President Obama (Jan. 10, 2017)

Simple Definition of fact

  • something that truly exists or happens : something that has actual existence
  • a true piece of information

Source: Merriam-Webster's Learner's Dictionary

 

“Just the facts, ma’am.” #1

Some simple words are often used imprecisely. In discussions related to climate science, the simple word “fact” is a case in point. For example, a temperature measurement taken by an observer from a particular instrument, in a particular enclosure, at a particular location and at a particular time is frequently referred to as a “fact”. However, it is only a “fact”, as defined above, in those particular circumstances. It is not necessarily, and not even likely, “a true piece of information” in the broader sense, since it is affected by those circumstances.

Temperature measurements taken “near-surface” are “selected” for inclusion in the temperature record; and, are then “adjusted” to account for the particular instrument, enclosure, location and time of observation. These “adjusted” measurements are not “something that truly exists or happens”, but rather an estimate of something that might “truly exist or happen”.

An “ideal” near-surface temperature measurement site is identified as follows in the Climate Reference Network (CRN) Site Information Handbook:

Class 1 – Flat and horizontal ground surrounded by a clear surface with a slope below 1/3 (<19°). Grass/low vegetation ground cover <10 centimeters high. Sensors located at least 100 meters from artificial heating or reflecting surfaces, such as buildings, concrete surfaces, and parking lots. Far from large bodies of water, except if it is representative of the area, and then located at least 100 meters away. No shading when the sun elevation >3 degrees.

Such a site is estimated to be able to produce a near-surface temperature measurement with an error of less than 1°C, assuming proper instrument selection and calibration, proper enclosure and timely reading. Such a measurement is a “fact”, subject to those limitations.

Climate science deals with these errors of “fact” regarding near-surface temperature measurements by using temperature anomalies: the differences between temperature measurements taken at a particular site and a baseline average for that same site. These anomalies are “facts” only if there have been no changes in any of the circumstances which affect the measurements; and, they cease to be facts if the measurements are “adjusted”, rendering them merely estimates.
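
The logic of anomalies can be made concrete with a few lines of code. In the sketch below (illustrative numbers only), a fixed site bias added to every reading cancels out of the anomalies; the cancellation fails as soon as the bias changes partway through the record, which is exactly the situation “adjustments” attempt to repair.

```python
# Sketch: anomalies cancel a *fixed* site bias, but not a *changing* one.
import numpy as np

readings = np.array([14.8, 15.1, 15.4, 15.9, 16.2])  # illustrative annual means
baseline = readings[:3].mean()                        # assumed baseline period

print(readings - baseline)             # anomalies from an unbiased site

biased = readings + 1.5                # constant 1.5 C site bias
print(biased - biased[:3].mean())      # identical anomalies: bias cancels

shifted = readings.copy()
shifted[3:] += 1.5                     # bias appears mid-record instead
print(shifted - shifted[:3].mean())    # anomalies no longer match
```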

 

 

“Just the facts, ma’am.” #2

Above I discussed the limitations on “facts”; and, the difference between facts and estimates related to individual temperature measurements, whether analyzed as discrete temperatures or temperature anomalies.

Once near-surface temperature measurements have been recorded, selected and “adjusted”, the next step in the process is to combine these selected, “adjusted” temperature estimates, expressed as anomalies from previous selected, “adjusted” temperature estimates, into an estimated global average near-surface temperature anomaly. While it might be argued that errors in the recorded temperature measurements are random, it cannot be argued that the selection of the temperature measurements to be included in the global average calculation, or the “adjustments” made to these temperature measurements, are random. There could be no rational explanation for making random “adjustments” to measurements.

The estimated global average surface temperature anomaly is reported to two decimal place “precision”; and, used to calculate decadal rates of temperature increase to three decimal place “precision”. This level of “precision” is highly questionable, bordering on ridiculous, considering the inaccuracy of the underlying temperature measurements. The underlying temperature measurements are estimated to be in absolute error by an average of more than 2°C in the US, where they have been surveyed and their siting compared to the US CRN1 siting requirements. The expected inaccuracy of the remaining global temperature measuring stations is assumed to be similar, though they have not been surveyed and their siting compared to the US CRN1 siting requirements.

Finally, the estimated “adjusted” global average temperature is reported to two decimal place “precision”. This estimate is reported as a “fact”, though the particular circumstances under which the estimate might have been a “fact” are ignored.

 

 

“Just the facts, ma’am.” #3

“Just the facts, ma’am.” (1 & 2) discussed “facts” in the context of individual near-surface temperature measurements and global average temperature anomaly calculations. The final step in the climate change analysis process is the creation of future climate change scenarios using climate models.

There are numerous climate models, none of which have been verified. Therefore, it cannot be said that there is a climate model which is a “fact”, in the sense that it accurately models “something that truly exists or happens”, rather than hypothesizes something which might happen if the model were accurate.

The climate models are run using a range of inputs for climate sensitivity and climate forcings, because there are no specific, verified values for the various sensitivities and forcings. Therefore, not only are the climate models not “facts” (“something that truly exists or happens”), the inputs which feed the models are not “facts” either, in the sense that they are “a true piece of information”. It is not even a “fact” that the actual climate sensitivity or actual climate forcings are within the ranges of the values used as inputs to the models.
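
The dependence of the scenarios on these unverified inputs can be illustrated with the simplest possible calculation. The sketch below computes equilibrium warming for one assumed CO2 doubling across an often-quoted sensitivity range; every number in it is an illustrative assumption, not the output of any actual climate model.

```python
# Toy sketch: equilibrium warming dT = S * F / F_2x for a range of assumed
# climate sensitivities S. All values are illustrative assumptions.
F_2X = 3.7       # W/m^2 per CO2 doubling (a commonly cited value)
forcing = 3.7    # assume exactly one doubling of CO2

for sensitivity in (1.5, 3.0, 4.5):  # C per doubling, an often-quoted range
    dT = sensitivity * forcing / F_2X
    print(f"S = {sensitivity} C/doubling -> dT = {dT:.1f} C")
```

Even this trivial calculation spans a factor of three in projected warming purely from the choice of sensitivity; a full climate model compounds many such input choices.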

Therefore, the modeled scenarios of future climate change (temperature change) are not “facts”, or arguably even based on facts. Rather, they are estimates, based on estimates, of the potential change in current estimates over some future period.

Based on the “facts”, as discussed in these commentaries, there is only a tenuous basis for concern about catastrophic anthropogenic climate change.

That’s “Just the facts, ma’am.” (HT: Sgt. Joe Friday, LAPD)

Tags: Temperature Record, Estimates as Facts, Global Temperature, Adjusted Data

Energy Efficiency as a Climate Change Reform Strategy: Are we just throwing money at the problem?

(For an introduction to this E3 blog series click here)

Put to one side for a moment whether we need to do anything about climate change.  Assume it is real and we need to do “something.”  There are a wide variety of “somethings” that we can do.  Indeed, right now we are in the “throw spaghetti on the wall and see what sticks” phase.

But let’s face it, we have many more problems than climate change (even assuming it is a real problem).  There is, if you will, strong competition for scarce public resources to solve problems.  I would think it hardly controversial to state that we should spend public tax dollars in the most cost-effective way possible.  Bjorn Lomborg makes the point that we don’t want to just feel good, we want to DO good!

For example, if there are two competing proposals to reduce a certain amount of greenhouse gases, all other things being equal, the one that does it cheapest should be chosen.  Similarly, if there are two competing proposals that will save lives, we should choose the one that saves lives for lower costs.  What if one proposal is to save a life a century from now and one to save it today?  More difficult, what if one is to buy mosquito netting for developing countries to mitigate malaria and another to slow the increase in global temperature in 50 years?  These all involve difficult trade-offs on how to use scarce resources.

Nearly every discussion of remedies for climate change discusses the enormous opportunity for energy efficiency.  Based on engineering models, rather optimistic claims are made for the potential for investments in energy efficiency to cure a variety of what ails us, often called a win-win-win situation.  We would use less energy.  Our total energy bills would be reduced.  We would emit fewer greenhouse gases.  We would need to build fewer electric power plants.  And best of all, the return on investment would rival Bernie Madoff’s, and it would be tax free.

To be fair, the US does have an energy efficiency problem.  If energy prices (either gasoline or electricity) are distorted, then by definition we are not using energy efficiently.  This has possible environmental, energy security, and economic implications.

Both liberal and conservative energy analysts agree that the way that electricity is priced in the US leaves lots of room for improvement.  Broadly, we set prices that are too low in the peak period and too high in the off-peak period.  Additionally, many states set electricity rates in a way that gives electric utilities incentives to build new power plants rather than to invest in electric efficiency technologies.

The key disagreement between market oriented analysts and many liberals is the technique that should be adopted to correct these problems.  Market analysts promote the use of competition and market forces and market-based regulatory approaches to achieve better pricing signals for consumers.  Once prices are “efficient,” then let the consumer choose how to make the myriad trade-offs as to how to spend their money.  Many liberals promote a much more command-and-control regulatory approach to rectify the distortions created by bad pricing.  In essence, they don’t believe the consumer will make the “right” choices and thus adopt policies that force correct choices.

In 2015, a dramatic study was released by several professors from the University of Chicago and the University of California at Berkeley.  The study found that the engineering model that is most often used to project the costs and benefits of energy efficiency technologies was seriously flawed. It found that the model’s projections seriously overestimated the energy savings that would result from investing in a given technology. This is important because public monies are often used to fund investments in energy efficiency. If the study is correct, many projects that are funded do not meet the standard of being cost beneficial.

The study caused quite a stir in the energy policy community. It threatened to slaughter one of the sacred cows of progressives. But the study is significant because it is the first of its kind to comprehensively compare projected energy savings to actual, after-the-fact savings. If the study is correct, it severely undercuts one of the main arguments that is often used to justify significant public investment in energy efficiency.

Some who are climate skeptics will no doubt tout the study as evidence for a variety of propositions, e.g., wasteful government programs, the unreliability of engineering models, the triumph of good intentions over good policy.  But even for those genuinely concerned with climate change, if the study is correct and it turns out we have a real climate change problem, energy efficiency strategies will fail to address the intended problem.  This means we are not really addressing climate change.  We are just throwing scarce public resources at the problem.

Tags: Efficiency Standards

Priorities

One of the principal concerns raised regarding climate change is its potential effects on agriculture. There is continuing discussion that the potential combination of increased temperatures with drought or increased rainfall might result in reduced crop yields or crop failure in some or all of the traditional crop production regions. There is also continuing discussion of the perceived need to reduce meat consumption, so that grazing land could be converted to food production.

There is little discussion of the likelihood that production of these food crops would move to areas which have been too cold or had too short growing seasons in the past. There is also little recognition of the contribution of plant genetics to increased plant tolerance and yield.

However, in the face of all of this concern about food production, at least in the United States, the number one cash crop in ten US states is marijuana, as shown in the table below. Marijuana is among the top five cash crops in a total of 39 of the 50 states; and, among the top ten cash crops in all but two states.

This is not to suggest that marijuana is a major crop in any of these states by volume or weight, or that it is crowding out production of other crops for human or animal consumption, including export. Rather, it is to suggest that a very high value has been placed on a crop which has no food value (even when baked into brownies), in the face of vocal concern about the adequacy of future food production.

Current efforts to legalize the consumption of marijuana for other than medicinal purposes will likely increase the demand for the product, increasing the productive acreage dedicated to its production, though it might also result in corresponding reductions in its commercial value. One has to question whether this agricultural product should have such a high priority.

 

Marijuana Rank as Cash Crop, by State

Alabama 1 Louisiana 6 Ohio 4
Alaska NA Maine 1 Oklahoma 3
Arizona 3 Maryland 5 Oregon 4
Arkansas 4 Massachusetts 2 Pennsylvania 5
California 1 Michigan 5 Rhode Island 1
Colorado 4 Minnesota 6 South Carolina 3
Connecticut 1 Mississippi 3 South Dakota 9
Delaware 3 Missouri 4 Tennessee 1
Florida 2 Montana 4 Texas 6
Georgia 3 Nebraska 9 Utah 2
Hawaii 1 Nevada 2 Vermont 2
Idaho 5 New Hampshire 2 Virginia 1
Illinois 4 New Jersey 7 Washington 5
Indiana 3 New Mexico 2 West Virginia 1
Iowa 4 New York 2 Wisconsin 6
Kansas 6 North Carolina 5 Wyoming 8
Kentucky 1 North Dakota ?    

 

Source: NORML (USDA data)

Tags: Agriculture