
In the Wake of the News

Green New Deal - Deadweight Loss

The Green New Deal would be the most extensive exercise in “Broken Window Economics” in the history of the globe. The “Green New Deal” would cause massive deadweight losses in virtually every sector of the US economy, while producing no measurable impact on global climate.

“Deadweight loss can be stated as the loss of total welfare or the social surplus due to reasons like taxes or subsidies, price ceilings or floors, externalities and monopoly pricing. It is the excess burden created due to loss of benefit to the participants in trade which are individuals as consumers, producers or the government.” (The Economic Times)

The following is a list of key industries affected by the climate change aspects of the “Green New Deal”. However, all US economic activity would be affected to some extent, so the total effects are likely significantly understated.

 

 

Obviously, the major financial impact is the result of the plan to leave the substantial US energy resources “in the ground”, as numerous environmental groups have previously advocated.

 

The impact of the plan to render air travel obsolete, combined with the intent to halt production of the fuels required by the airline and air freight industries, would extend well beyond the US. Air service between the US and other countries would also be affected, since refueling would not be available in the US. International air operations would require that aircraft either carry sufficient fuel for a round trip or add a refueling stop in a country which still permitted the production and sale of aviation fuels.

The impact of terminating fossil-fueled electric generation, and not only replacing the existing generating capacity but also adding sufficient capacity to power the replacements for direct-fired residential, commercial and industrial energy end uses, would be profound. The electric energy required to meet all current electric end use consumption plus the direct end uses currently served by oil and natural gas and their derivatives would be approximately three times the quantity of energy currently provided by existing electric generators. Also, the intermittent nature of current renewable energy systems would require that the installed renewable generating capacity per unit of electric energy consumed be approximately three times the fossil-fueled generating capacity it replaced; and, that the renewable capacity be supported by long-duration, transmission-level storage capacity.

Current US electricity production is approximately 4 trillion kWh per year, of which approximately 2.5 trillion kWh is produced using fossil fuels and 0.8 trillion kWh using nuclear generators. Increasing US electricity production to approximately 12 trillion kWh per year would require increasing wind and solar electricity production from its current level of approximately 0.3 trillion kWh per year by a factor of approximately 40.
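The scaling above can be checked with simple arithmetic. A sketch, using only the approximate figures quoted in the text:

```python
# Back-of-the-envelope check of the production figures quoted above.
# All values are the approximate numbers from the text, in trillion kWh/year.
current_production = 4.0      # total current US electricity production
required_production = 12.0    # production needed after electrifying direct end uses
wind_solar_now = 0.3          # current wind + solar production

growth_factor = required_production / wind_solar_now
print(f"Wind and solar would have to grow by a factor of ~{growth_factor:.0f}")
```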

Current US electric generating capacity is approximately 1,000 GW, of which approximately 750 GW is fossil-fueled. Replacing this dispatchable fossil-fueled generating capacity with a mix of intermittent clean and renewable generation with an average availability of approximately 25-30% would require the installation of approximately 2,000 GW of new generating capacity, plus the storage capacity necessary to assure reliable service. US EIA has estimated the cost of installing new solar photovoltaic generating capacity at approximately $3,700 per kW and new wind generating capacity at approximately $1,900 per kW. Assuming an average of $2,800 per kW of clean and renewable generating capacity, the installed cost of 2,000 GW of new capacity would be approximately $5.6 trillion.
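The installed-cost estimate follows directly. A sketch; the per-kW costs are the approximate EIA figures quoted above, averaged:

```python
# Rough installed-cost estimate for the replacement capacity described above.
new_capacity_gw = 2000   # intermittent capacity replacing ~750 GW dispatchable
cost_per_kw = 2800       # $/kW, average of solar (~$3,700) and wind (~$1,900)

total_cost_usd = new_capacity_gw * 1_000_000 * cost_per_kw  # convert GW to kW
print(f"~${total_cost_usd / 1e12:.1f} trillion, excluding storage")
```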

 

A trillion here, a trillion there and pretty soon you’re talking about real money.

(with apologies to the late US Senator Everett McKinley Dirksen, R-IL)

 

Tags: Green New Deal

Not-So-Green New Deal

The “Green New Deal” (GND) is the current fascination of the most liberal / progressive / socialist elements of our society and its elected representatives, as well as several declared candidates for President in 2020. The rallying point of the Democrat GND was replacing our existing national energy infrastructure with “clean, renewable, and zero-emission energy sources” by “dramatically expanding and upgrading existing renewable power sources.” However, the Democrat leadership has apparently replaced the original specific deadline for ending US fossil fuel use with a stated intent to achieve net-zero carbon emissions. While this statement of intent sounds somewhat less extreme, it could only be achieved with high-percentage carbon capture and storage combined with active removal of CO2 from the environment. Neither technology is currently commercially viable or economically attractive.

The partial removal of the extreme green “rind” from the Democrat GND “watermelon” exposes the nature of its red interior, as described by its originators, Green Party US.

  1. The Economic Bill of Rights
  2. A Green Transition
  3. Real Financial Reform
  4. A Functioning Democracy

Interestingly, the Green Transition envisioned by the GND developed by the Green Party US is not the rallying point of the plan, as it was in the original Democrat version championed by Bernie Sanders and Alexandria Ocasio-Cortez.

Senator Edward Markey and Representative Alexandria Ocasio-Cortez have introduced a Sense of Congress Resolution outlining the major goals of the GND as follows.

A: Achieve net-zero “greenhouse gas” emissions

B: Create millions of new, high-wage jobs

C: Invest in US infrastructure and industry

D: Secure for all people of the US:

            clean air and water;

            climate and community resiliency;

            healthy food;

            access to nature; and,

            a sustainable environment.

E: Promote justice and equity

These goals are to be accomplished through a ten-year national mobilization. This accelerated schedule virtually ensures that the process would be more expensive than necessary, since much of the technology required to achieve net-zero emissions has yet to be commercially demonstrated. The costs of achieving the remaining goals are a function of the detailed descriptions of the specific objectives to be achieved.

Achieving net-zero emissions would impact every aspect of energy production and use in the US economy.

All electric generation consuming coal, natural gas, propane, biomass and municipal solid waste would either require carbon capture and storage or replacement with clean and/or renewable sources. There is significant dispute in the environmental community regarding the retention or expansion of nuclear and hydroelectric generation.

All direct use of fossil fuels in residential and commercial markets would be eliminated, unless new technology facilitating carbon capture and storage at that scale could be developed and commercialized. Otherwise, all residential and commercial space heating, water heating, cooking and laundry drying appliances would have to be replaced by electric appliances; and, incremental clean and renewable electric generating capacity would have to be built to supply their requirements.

All industrial food preparation and heat processing equipment would have to be equipped with carbon capture and storage capability or replaced with electric equipment; and, incremental clean and renewable electric generating capacity would have to be built to supply their requirements.

All transportation equipment, including personal vehicles, trucks, buses and trains would have to be replaced with electric vehicles; and, incremental clean and renewable electric generating capacity would have to be built to supply their requirements.

Offsets would have to be provided for the “greenhouse gas” emissions from processes such as steelmaking and cement production, for which electric process alternatives are not available.

Current electric power generation of all types provides 38% of US energy consumption, as shown in the chart below. The GND would require the replacement of all or most of the remaining 62% of US energy consumption with clean and renewable energy from incremental sources.

 U.S. primary energy consumption source and sector, 2017

Essentially, the Green New Deal would be the most extensive exercise in “Broken Window Economics” in the history of the globe. I have estimated the investments required to achieve such a transition in the energy economy at $30 trillion, not including the investment in the replacement of equipment and systems which had reached the end of their useful lives. Achieving this transition on the accelerated schedule contemplated by the GND could significantly increase that investment requirement. The lost value of used and useful equipment and systems abandoned as a result of this transition would be very difficult to estimate, but would be enormous, especially in the rapid transition envisioned in the Green New Deal.

 

Tags: Green New Deal

Wichita Revisited

Wichita, Kansas is located very close to the geographic center of the contiguous United States. The annual average temperature in Wichita is 57°F, which is also the current global average near-surface temperature. Wichita has been used to provide perspective on climate change on this site previously (here), (here) and (here).

The graph below shows the average and record high and low temperatures for Wichita on a monthly basis. Note that the average diurnal temperature range is from 20 – 23°F throughout the year; and, that the average high temperature is 21°F lower than the high temperature record, while the record low temperature is 37°F lower than the average low temperature.

 

Wichita Temperature Averages

 

The chart below lists the all-time temperature records for Wichita and the dates on which the records occurred. Note that only the Highest Monthly Average, Highest Annual Average and Lowest Annual Average occurred in the post-1950 period, when increasing atmospheric CO2 concentrations are thought to influence global climate. Note also that the Record Warmest High and the Record Warmest Low both occurred in 1936, during the Dust Bowl years in the US.

 

All Time Record Temperatures

 

The chart below summarizes annual average temperatures for Wichita, showing the 22°F average difference between average daily high and low temperatures and the 57°F Average Daily Mean temperature.

 

All Annual Temperatures

 

The graph below originated on the Powerline blog and has been modified here with the addition of the red and blue bands representing the average diurnal temperature ranges for the peak summer month (July, red) and peak winter month (January, blue) in Wichita. This allows comparison of the global average annual temperature change over the period from 1880 through 2015 (~1.6°F) with the average diurnal and peak seasonal temperature changes in Wichita. Note that the chart temperature range is from -10°F to +110°F, slightly narrower than the -22°F to +114°F record temperature range for Wichita.

 

Wichita Average Annual Global

 

Bob Tisdale has begun a series of posts (here), (here) and (here) entitled “…it is the change in temperature compared to what we’ve been used to that matters.” The graph below, from this series of posts, compares the rate of change of global annual land plus ocean temperatures with the rate of change of the highest annual maximum near-surface temperature and the lowest annual minimum near-surface temperature for the entire globe. Note that the rate of increase of highest annual maximum temperature is approximately 40% of the rate of increase of the lowest annual minimum temperature; that is, the lowest annual minimum temperature is increasing 2.5 times as rapidly as highest annual maximum temperature. That suggests that, of the approximately 1.6°F global annual average near-surface temperature increase, only 0.6°F represents an increase in maximum summer temperatures, while the remaining 1.0°F represents an increase in the minimum winter temperatures. That seems an unlikely scenario for the “fireball earth” envisioned by the consensed climate science community.

 

Temperature Anomalies

 

The graph below, also from this series of posts, compares the rate of change of global annual land plus ocean temperatures with the rate of change of the highest annual maximum near-surface temperature and the lowest annual minimum near-surface temperature in the contiguous US. Note again that the rate of increase of highest annual maximum temperature is approximately 40% of the rate of increase of the lowest annual minimum temperature; that is, the lowest annual minimum temperature is increasing 2.5 times as rapidly as highest annual maximum temperature. Note also that the rates of change of temperature maxima and minima are both approximately 30% lower in the US than the global rates. However, the average difference between the highest maximum and the lowest minimum in the contiguous US is approximately 70% greater than the global average.

 

Temperature Anomalies

 

The two Tisdale graphs above also illustrate the point made in the graph of Wichita temperatures, namely that the warming which has occurred over the past 100+ years is relatively modest compared to the total range of temperatures experienced over the same period and to the range of diurnal and seasonal temperatures. The Tisdale graphs also show that the warming over the period is of lower magnitude than the annual changes in both maximum and minimum temperatures, to which the respective populations have been adapting successfully.

We are becoming far more aware of what is happening in our climate but are still challenged to understand why those changes are happening. That should be the focus of climate research.

 

Tags: Global Temperature, Temperature Record

Highlighted Article: Reassessing the RCPs

 

 

Reassessing the RCPs

 

"A response to: “Is RCP8.5 an impossible scenario?”. This post demonstrates that RCP8.5 is so highly improbable that it should be dismissed from consideration, and thereby draws into question the validity of RCP8.5-based assertions such as those made in the Fourth National Climate Assessment from the U.S. Global Change Research Program."

 

Reassessing the RCPs

 

Tags: Highlighted Article

“Ideal” Climate Perspective

The “ideal” climate apparently centers about a global annual average temperature of approximately 57°F, the global annual average temperature of the climatological reference period most commonly used in climate science. The annual average temperatures of the individual nations of the globe range from 22°F in Canada to 83°F in Burkina Faso. The annual average maximum and minimum temperatures tend to lie within +/- 5-10°F of the annual average, while the annual maximum and minimum temperature range tends to be 5-10 times as large.

With that range of conditions as background, we are told that the current global annual average temperature anomaly of ~1.6°F should be cause for great concern; and, that beyond twice that anomaly lies impending catastrophe. The expressions of concern would suggest that the global maximum average temperature is increasing and that the higher temperatures would cause crop failures and increased deaths from heat-related conditions.

What those expressions of concern fail to mention is that the global annual average minimum temperatures are also rising, typically at approximately twice the rate of increase of the global annual average maximum temperatures. Since the global annual average temperature is the mean of the global annual maximum and minimum temperatures, this means that the 1.6°F global annual temperature anomaly consists of an increase in the maximum temperature of ~0.6°F and an increase in the minimum temperature of ~1.0°F.

The graph below from a post by Bob Tisdale illustrates this situation for the contiguous United States for the period 1900 – 2012. Note that the graph displays the land plus ocean surface temperature trends; and, that the surface only trends would show greater range and variation.

 

Annual Global Land & Ocean Surface Temperature Anomalies

 

The rate of increase of the maximum temperature is approximately 70% of the rate of increase of the mean temperature, while the rate of increase of the minimum temperature is approximately 150% of the rate of increase of the mean temperature. This means that, in the US, climate change is manifesting as slightly warmer summers and warmer winters and as slightly warmer days and warmer nights.

The post linked above displays similar graphs for nine other countries: China, India, Indonesia, Brazil, Pakistan, Nigeria, Bangladesh, Russia and Mexico. The US and these countries contain approximately 60% of the population of the globe. In all these countries, with the exception of Mexico, the rate of increase of the minimum temperature is higher than the rate of increase of the maximum temperature. In China, the rate of increase of the minimum temperature is approximately 30 times the rate of increase of the maximum temperature, the largest ratio for the 10 countries. In Mexico, the rate of increase of the maximum temperature is approximately 30% greater than the rate of increase of the minimum temperature.

The average difference during the climate reference period (1981-2010) between the highest maximum temperature and the lowest minimum temperature for these ten countries ranges from 50°F to 124°F. Against this background, an increase of 0.6°F in the average maximum temperature and an increase in the average minimum temperature of 1°F do not seem particularly significant.

 

Tags: Global Temperature

Highlighted Article: Marian Tupy: “Celebrate the Industrial Revolution and What Fueled It”

 

Marian Tupy: “Celebrate the Industrial Revolution and What Fueled It”

 

“The Industrial Revolution did not cause hunger, poverty and child labor. Those were always with us. The Industrial Revolution helped to eliminate them.”

----------------------------------

"Remove cheap energy and most aspects of modern life, from car manufacturing and cheap flights to microwaves and hospital incubators, become a luxury, rather than a mundane, everyday occurrence and expectation."

 

Marian Tupy: “Celebrate the Industrial Revolution and What Fueled It”

 

Tags: Highlighted Article

Standards of Evidence

The Paris Accords call for the developed nations to provide $100 billion per year by 2020 to fund climate change “adaptation and mitigation” programs in the developing nations. The Accords also call for the developed nations to provide ~$400 billion per year to compensate developing nations for “loss and damage” resulting from climate change. The Accords call for this funding from the developed nations on the basis that they have caused / contributed to the climate change which has occurred over the past several decades and thus bear responsibility for compensating the “victims” of this climate change for its effects on their nations.

The development and disbursement of a funding stream of approximately one half trillion dollars per year, in the absence of documented needs for “adaptation and mitigation” and documented “loss and damage”, raises serious fiduciary responsibility issues.

  • What criteria are used to determine that the situation to be considered for funding is the result of climate change, specifically anthropogenic climate change, and not the result of severe weather or other causes?
  • What criteria determine that a situation requires “adaptation” or “mitigation”?
  • Who determines the appropriate “adaptation and mitigation” approaches?
  • Who evaluates the “loss and damage” and the extent to which it is the result of climate change, specifically anthropogenic climate change, rather than severe weather or other causes?
  • Who assures that the funds provided to compensate for “loss and damage” are used in a way that eliminates / minimizes the likelihood of future “loss or damage”?
  • Who controls the disbursement of funds and assures that the funds are used for the intended purpose?

There is no question that severe weather events must be adapted to and the risks of severe weather damage mitigated. There is also no question that severe weather events cause loss and damage. However, the funding to be provided under the Paris Accords through the Green Climate Fund is intended to deal specifically with adaptation and mitigation issues and with loss and damage resulting from climate change, specifically anthropogenic climate change.

Climate change is the result of natural variation and other causes, likely including human activities which emit “greenhouse” gases to the atmosphere and which alter the albedo of the globe. It is not currently possible to determine the extent of human contribution to climate change; and, it is clearly demonstrable that climate change was occurring prior to the mid-twentieth century when the influence of human activity on climate is thought to have begun to any significant degree.

Scientists have begun to develop attribution studies in an attempt to establish the extent of the impact of human activity on severe weather events and climate change. However, these attribution studies rely on unverified climate models and estimated climate sensitivities and feedbacks. Therefore, their outputs hardly constitute evidence of some percentage of anthropogenic influence on any particular severe weather event.

Should the funding called for under the Paris Accords ever be made available, it would be essential to assure that it did not disappear down ratholes in numerous kleptocracies rather than accomplish its stated purpose. The underlying purpose of these fund transfers, the de-development of the developed nations, would be served regardless.

 

“The problem with socialism is that eventually you run out of other people’s money.” (Lady Margaret Thatcher)

 

Tags: Paris Agreement

“Ideal” Climate

The earth does not have a climate, except as an “average” of thousands of local climates. Each of those local climates is changing, though not always in the same way or at the same pace. The “average” climate is changing, as measured by numerous temperature measuring stations and reported as global average temperature anomalies. These anomalies are calculated deviations from the conditions measured over a 30+ year reference period. There is no explicit recognition of the climate of this reference period as the “ideal” average global climate. There is, however, the implication that this reference period, or some other period, was “ideal” and that the recently calculated anomalies represent a departure from that “ideal”.

The thousands of local climates on the earth vary tremendously. The highest temperature ever recorded on earth (134°F) occurred in Furnace Creek, CA in July, 1913. The lowest temperature ever recorded on earth (-128.5°F) occurred in Vostok, Antarctica in July, 1983. Neither of those record temperatures would likely be considered a characteristic of an ideal climate. The current estimated global average surface temperature is 58.6°F, well above the mean of the global temperature extremes. The current global annual average temperature anomaly is ~1.6°F, suggesting that 57°F was the annual average temperature during the reference period.

National average annual temperatures range from Canada at 22°F to Burkina Faso at 83°F. The US average temperature is 47°F. US cities with an average annual temperature approximately equal to the global annual average temperature of 57°F during the reference period include: Albuquerque, NM; Louisville, KY; St. Louis, MO; and, Wichita, KS. European cities with similar average annual temperatures include Sochi, Russia and Istanbul, Turkey.

Miami, FL has an average annual temperature of 77.2°F, with an average high temperature of 84.3°F and an average low temperature of 70.1°F; and, record temperatures of 98°F and 30°F. Miami’s annual average temperature is 18.7°F warmer than the global average. Barrow, AK has an average annual temperature of 11.8°F, with an average high temperature of 17.3°F and an average low temperature of 6.3°F; and, record temperatures of 79°F and -56°F. Miami’s average high and low temperatures vary by 14.2°F, while the record temperatures vary by 68°F, nearly 5 times the average variation. Barrow’s average high and low temperatures vary by only 11°F, while the record temperatures vary by 135°F, more than 12 times the average variation. Barrow’s annual average is 46.7°F below the global average.

Somewhere, between the record global temperatures, within the range of national average annual temperatures and climates, lies the “ideal” global annual temperature and the “ideal” global annual climate. There are numerous nations for which an increase in annual average temperatures would probably represent a move toward an “ideal” climate, while there are other nations for which it would represent a move away from an “ideal” climate. In each case, that perception would be based on the characteristics of the “ideal” climate. These perceptions vary considerably, but seem to favor warmer climates with limited seasonal variation.

 

Tags: Global Temperature

Highlighted Article: A condensed version of a paper entitled: “Violating Nyquist: Another Source of Significant Error in the Instrumental Temperature Record”

 

 

A condensed version of a paper entitled: “Violating Nyquist: Another Source of Significant Error in the Instrumental Temperature Record”

 

"The 169-year long instrumental temperature record is built upon 2 measurements taken daily at each monitoring station, specifically the maximum temperature (Tmax) and the minimum temperature (Tmin). These daily readings are then averaged to calculate the daily mean temperature as Tmean = (Tmax+Tmin)/2. Tmax and Tmin measurements are also used to calculate monthly and yearly mean temperatures. These mean temperatures are then used to determine warming or cooling trends. This “historical method” of using daily measured Tmax and Tmin values for mean and trend calculations is still used today. However, air temperature is a signal and measurement of signals must comply with the mathematical laws of signal processing. The Nyquist-Shannon Sampling Theorem tells us that we must sample a signal at a rate that is at least 2x the highest frequency component of the signal. This is called the Nyquist Rate. Sampling at a rate less than this introduces aliasing error into our measurement. The slower our sample rate is compared to Nyquist, the greater the error will be in our mean temperature and trend calculations. The Nyquist Sampling Theorem is essential science to every field of technology in use today. Digital audio, digital video, industrial process control, medical instrumentation, flight control systems, digital communications, etc., all rely on the essential math and physics of ..."
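The aliasing error the excerpt describes can be illustrated with a toy calculation. This is a sketch using a hypothetical, asymmetric diurnal signal, not data from the paper:

```python
import math

# Compare the "historical method" daily mean, (Tmax + Tmin) / 2, with the
# true mean of a synthetic diurnal temperature signal. The signal is a
# fundamental plus a second harmonic, so it is asymmetric about its midpoint.
def temperature(hour):
    # Hypothetical diurnal cycle in degrees F (illustrative only).
    return 57 + 10 * math.sin(2 * math.pi * hour / 24) \
              + 3 * math.cos(4 * math.pi * hour / 24)

samples = [temperature(h / 4) for h in range(96)]   # 15-minute sampling
true_mean = sum(samples) / len(samples)             # well-sampled mean
min_max_mean = (max(samples) + min(samples)) / 2    # historical method

print(f"true mean:    {true_mean:.2f}")
print(f"(max+min)/2:  {min_max_mean:.2f}")
# The two estimates differ by roughly 3 degrees for this signal; that
# discrepancy is the kind of error the paper attributes to sampling
# below the Nyquist rate.
```

The historical method only recovers the true mean when the diurnal cycle is symmetric; real temperature signals generally are not.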

 

A condensed version of a paper entitled: “Violating Nyquist: Another Source of Significant Error in the Instrumental Temperature Record”

 

Tags: Highlighted Article

A CERES Possibility

NASA launched the first of the Clouds and the Earth’s Radiant Energy System (CERES) satellites in 1997. CERES is managed by NASA’s Langley Research Center. The CERES surface datasets include a dataset of the upwelling longwave radiation from the surface. This dataset can be converted to surface temperature using the Stefan-Boltzmann equation if the surface emissivity is known. Since the emissivities of common surfaces vary but are equal to or greater than 0.94, only minimal error is introduced into actual surface temperatures by using an emissivity of 1.0; and, no error affects relative temperatures or temperature trends.
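The Stefan-Boltzmann conversion described above can be sketched as follows; the flux value is an illustrative global-average figure, not a CERES data point:

```python
# Invert the Stefan-Boltzmann relation F = eps * sigma * T^4 to recover
# surface temperature (kelvin) from upwelling longwave flux (W/m^2).
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def surface_temp_k(upwelling_lw_wm2, emissivity=1.0):
    return (upwelling_lw_wm2 / (emissivity * SIGMA)) ** 0.25

flux = 390.0  # W/m^2, a typical global-average surface value (assumed)
t_unit = surface_temp_k(flux)                    # with eps = 1.0
t_094 = surface_temp_k(flux, emissivity=0.94)    # with eps = 0.94

# Assuming eps = 1.0 when the true emissivity is 0.94 understates the
# absolute temperature by ~1.5%, but the bias largely cancels in
# anomalies and trends, as noted in the text.
print(f"eps=1.0: {t_unit:.1f} K, eps=0.94: {t_094:.1f} K")
```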

This suggests an opportunity to employ the CERES satellite surface dataset in combination with a relatively small number of highly accurate near-surface temperature measuring stations distributed around the globe to provide a far more comprehensive near-surface temperature dataset which would not require data “adjustment”. A recent paper by Willis Eschenbach suggests that the CERES surface temperature trends match well with the Reynolds sea surface temperature and the UAH MSU (University of Alabama Huntsville / microwave sounding units) lower tropospheric temperature trends, though they are lower than the Berkeley Earth and HadCRUT surface temperature trends by a factor of ~1.5.

Employing the CERES satellite surface dataset in combination with the US Climate Reference Network (CRN) would avoid use of the “adjusted” near-surface temperature records produced by Berkeley Earth, HadCRUT, NOAA and NASA GISS. Installation of a limited number of near-surface temperature measuring stations like the US CRN stations throughout the globe would facilitate “ground-truthing” of the CERES land surface data. Similarly, the drifting buoys and Argo floats would be used to “ground-truth” the CERES sea surface temperature data.

This approach would represent a major change in the measurement and reporting of global surface and near-surface temperatures; and, a break in the instrumental temperature record. Therefore, it would be essential that the deviations between the CERES temperatures and the current near-surface and sea surface temperature records be resolved. It appears that much of this deviation in the near-surface temperature records is the result of Urban Heat Island (UHI) effects on the existing near-surface temperature measuring stations and the repeated “adjustments” to the near-surface temperature records by their producers. Similarly, it appears that much of the deviation in the sea surface temperature record is the result of the continuing use of temperature measurements made in a variety of ways by surface ships.

The existing US CRN measuring stations could be used to establish accurate absolute surface temperature data points which could then be used to correct the emissivities used in the surface temperature conversion; and, to establish the extent of UHI effects on the existing near-surface temperature records and the accuracy of the “adjustment” protocols used to prepare the near-surface temperature data for inclusion in the global near-surface temperature anomaly products. The Argo floats and drifting buoys could perform the same roles for the sea surface temperature measurements.

The greatest advantage of this approach to using the CERES data is the complete global coverage provided by these satellites, including measurements of the Arctic and Antarctic surfaces.

 

Tags: Temperature Record, Global Temperature, Satellites, Adjusted Data

Highlighted Article - U.S. Media Bans Scientific Dissent – Claim Wildfires, Floods, Droughts, Hurricanes Are Human-Controlled

 

 

U.S. Media Bans Scientific Dissent – Claim Wildfires, Floods, Droughts, Hurricanes Are Human-Controlled

 

"NBC News’ Chuck Todd recently asserted that we humans can control the climate and the frequency or intensity of extreme weather events (hurricanes, floods, droughts) and disasters (wildfires) with our CO2 emissions. He has declared the science is “settled” on this point and therefore no “denier” is allowed on his Meet the Press program. But Todd and the members of his panel have recited claims that are contravened by observational evidence and scientific publications."

 

U.S. Media Bans Scientific Dissent – Claim Wildfires, Floods, Droughts, Hurricanes Are Human-Controlled

 

Tags: Highlighted Article

More Climate Issues 2019

Climate Priorities 2019 focused on issues related to measurement and modeling of the earth’s climate: temperature measurement, climate sensitivity, forcings and feedbacks, and, model verification. These issues are a priority for 2019, but they have been a priority for decades without resolution. However, they are hardly the only unresolved or poorly understood issues in climate science.

The oceans cover approximately 70% of the surface of the planet. Several natural ocean phenomena are known to have significant effects on weather and climate, but are not well understood. On a global scale, the Global Ocean Conveyor Belt (GOCB) circulates water among the world’s oceans. While scientists are aware that the GOCB exists, there is less certainty about why it exists and about how it influences climate globally. Perhaps the best-known component of the GOCB is the Gulf Stream, one of the streams which compose the Atlantic Meridional Overturning Circulation. Numerous other regional currents also have weather and climate effects in the various ocean basins.

Global Ocean Conveyor Belt

Source: NASA/JPL

The GOCB interacts with several regional phenomena which affect both weather and climate, including:

  • the El Niño Southern Oscillation (ENSO);
  • the Pacific Decadal Oscillation (PDO); and,
  • the Atlantic Multidecadal Oscillation (AMO).

These regional phenomena occur over very different time frames. The El Niño Southern Oscillation consists of two relatively short-lived phenomena (1-2 years in duration). The warm phases are referred to as El Niños, while the cool phases are referred to as La Niñas. The timing of ENSO events is not highly predictable, and their magnitudes vary considerably, as do their effects on climate.

UAH Satellite-Based Temperature

Source: UAH

The graph of global lower atmosphere temperatures above illustrates the climate effects of ENSO events, among other things, highlighted by the two Super El Niños of 1997-1998 and 2016-2017.

The factors which trigger ENSO events and determine their intensity are not well understood, but the events have significant weather effects and, arguably, significant climate effects as well.

The PDO is a longer-term phenomenon, with an estimated period of 20-30 years for both its warm phase and its cool phase. The impacts of the phases and the phase change on weather and climate are not well understood, nor is their interaction with ENSO events in the Pacific Basin.

The AMO is an even longer-term phenomenon, with an estimated period of 60-80 years for both its warm phase and its cool phase. The impacts of the phases and the phase change are even less well understood than for the PDO, because the period of the phases is so long that perhaps only a single phase change has been captured by instrumental measurement.

Understanding the factors which cause these phenomena, the effects of the individual phenomena and their interactions is a major scientific challenge. Once these phenomena are well understood, they can be modeled and included in the global climate models, so that their effects can be included in the modeled scenarios. Until that is the case, their absence represents a significant limitation to the accuracy, and thus the utility, of the climate models.

Tags: Ocean Currents and Circulation, Climate Models

Climate Priorities 2019

There continue to be four fundamental climate science research priorities:

  • accurate and comprehensive temperature measurements;
  • accurate climate sensitivity determination;
  • accurate feedback magnitude determination; and,
  • a verified, accurate and comprehensive climate model.

The political process struggles to advance in the absence of these fundamentals, spending essential resources on political advocacy efforts rather than on addressing these scientific priorities.

The first research priority, accurate and comprehensive temperature measurements, applies to both near-surface land and sea surface temperature measurements. Accurate temperature measurement facilities exist for both applications, but they are neither comprehensively deployed nor exclusively relied upon.

The US Climate Reference Network (CRN) provides accurate near-surface temperature measurements; and, its use of three instruments assures continuous measurement while permitting detection of instrument failure or drift. The measurement sites are located away from infrastructure which could cause Urban Heat Island effects on the measurements. Deployment of similar measuring stations globally would provide comprehensive near-surface temperature data which would not require “adjustment”.
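The value of the three-instrument design can be illustrated with a simple median-vote check; the 0.3°C agreement threshold below is an invented example, not the CRN's actual tolerance:

```python
# Illustrative sketch of triple-redundant sensing: report the median of three
# thermometers and flag any instrument that strays from it (possible failure
# or drift). The 0.3 C threshold is an invented example value.

def median_of_three(a, b, c):
    """Middle value of three readings."""
    return sorted((a, b, c))[1]

def check_triplet(readings, threshold_c=0.3):
    """Return (reported_temp, indexes of instruments flagged as suspect)."""
    med = median_of_three(*readings)
    suspect = [i for i, r in enumerate(readings) if abs(r - med) > threshold_c]
    return med, suspect

# Two instruments agree; the third has drifted high and is flagged,
# while the reported value remains unaffected.
temp, flagged = check_triplet((15.1, 15.2, 16.4))
```

The median is robust to a single bad instrument, which is why a triplet (rather than a pair) allows both continuous measurement and fault detection.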

The drifting buoys and the Argo floats provide accurate sea surface temperature measurements, though an array of three measuring instruments would provide the same ability to detect instrument failure or drift available with the US CRN. The total number of floats and buoys deployed, and their global distribution, are currently inadequate to produce a comprehensive picture of global sea surface temperature.

Establishing an accurate relationship between the land near-surface and the sea surface temperature measurements and the more comprehensive satellite temperature measurements might minimize the number of additional measuring stations required to provide the necessary data.
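One simple form such a relationship could take is a linear calibration fitted over the period and locations where the surface and satellite records overlap, then applied where only satellite coverage exists. The data points below are invented for illustration:

```python
# Illustrative sketch: fit a linear calibration between satellite-derived and
# station temperatures over an overlap period, then apply it where only
# satellite coverage exists. All data values here are invented.

def fit_linear(x, y):
    """Ordinary least-squares fit y ~ a + b*x for paired measurements."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Invented overlap data: satellite-derived vs. station temperatures (C).
satellite = [10.2, 12.1, 14.0, 15.9, 18.1]
station   = [10.0, 12.0, 14.0, 16.0, 18.0]
a, b = fit_linear(satellite, station)

# Apply the calibration to a satellite reading with no co-located station:
calibrated = a + b * 20.0
```

In practice such a calibration would need to be fitted per region and season, but the principle is the same: the sparse accurate stations anchor the comprehensive satellite coverage.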

The second research priority would facilitate replacing the current estimated range of sensitivities used to drive the climate models with a single accurate and verifiable climate sensitivity. The IPCC currently uses an equilibrium climate sensitivity range of 1.5 – 4.5°C per doubling of atmospheric CO2 concentration. However, recent research by several scientists suggests the equilibrium climate sensitivity (ECS) is more likely between 0.5 and 2.0°C. These lower climate sensitivities would result in significantly smaller increases in global temperatures as the result of a doubling of atmospheric CO2 concentration. However, these lower climate sensitivities are still estimates over a relatively wide range, rather than a definitive number, leaving significant uncertainty regarding potential future temperature increases.
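Because equilibrium warming scales roughly logarithmically with CO2 concentration, the significance of the sensitivity range can be sketched directly, treating ECS as the warming per doubling (the concentration values below are illustrative):

```python
# Minimal sketch: equilibrium warming implied by a CO2 change, under the
# standard assumption that warming is logarithmic in concentration and that
# ECS is the warming per doubling.
from math import log2

def equilibrium_warming(ecs_per_doubling_c, c_final_ppm, c_initial_ppm):
    """Equilibrium temperature change (C) for a CO2 concentration change."""
    return ecs_per_doubling_c * log2(c_final_ppm / c_initial_ppm)

# For a full doubling (e.g., 280 ppm to 560 ppm), log2(ratio) = 1, so the
# projected warming equals the ECS itself; the two cited ranges translate
# directly into very different warming projections.
ipcc_range  = [equilibrium_warming(s, 560.0, 280.0) for s in (1.5, 4.5)]
lower_range = [equilibrium_warming(s, 560.0, 280.0) for s in (0.5, 2.0)]
```

The factor-of-three difference between the upper ends of the two ranges is the crux of the dispute: the same emissions pathway implies very different temperature outcomes depending on which sensitivity is correct.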

The third research priority would resolve the dispute between the consensed climate science community, which generally argues that feedback is net positive, and researchers analyzing satellite data, who argue that feedbacks are net negative. This is a very significant difference which affects projections of potential future temperature increases.

The fourth research priority would determine whether it is possible to accurately model the climate changes which have already occurred and been documented over the past 30-year climate period. This would require: initializing the model(s) with conditions 30 years previous; using accurate climate sensitivity and climate feedback measurements in the model runs; and, producing an accurate modeled replication of the climate changes in the intervening 30-year period.
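The verification test described above amounts to scoring a 30-year hindcast against the observed record. A hypothetical sketch of such a scoring step follows; the anomaly series and the tolerance are invented for illustration:

```python
# Hypothetical sketch of hindcast verification: compare a model run
# initialized 30 years back against the observed anomaly record and accept
# the model only if its error stays within a tolerance. All values invented.

def rmse(modeled, observed):
    """Root-mean-square error between two equal-length anomaly series (C)."""
    assert len(modeled) == len(observed)
    n = len(modeled)
    return (sum((m - o) ** 2 for m, o in zip(modeled, observed)) / n) ** 0.5

def hindcast_verified(modeled, observed, tolerance_c=0.1):
    """Treat the hindcast as verified if its RMSE stays within tolerance."""
    return rmse(modeled, observed) <= tolerance_c

observed = [0.00, 0.05, 0.12, 0.18, 0.26, 0.31]  # invented anomalies, C
modeled  = [0.02, 0.06, 0.10, 0.20, 0.24, 0.33]  # invented model output, C
verified = hindcast_verified(modeled, observed)
```

Note that even a passing score of this kind demonstrates only that the model reproduced one realized climate period, not that it has skill for future periods.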

The successful achievement of the first three research priorities listed above would make it possible to pursue the fourth priority. However, it is possible that the known unknowns and the unknown unknowns remaining in climate science might make accurate modeling unachievable. Also, even if accurate modeling of the most recent climate period is achieved, there is still no assurance that the successful model has any predictive ability over the longer term.

Tags: Climate Models, Climate Sensitivity, Temperature Record

Pyramid or Scheme

A pyramid is an inherently stable structure built upon a broad base which tapers toward a point at its top. A pyramid scheme is an inherently unstable structure built upon a point and expanding to a broad top.

Pyramid Scheme

Perhaps the most famous pyramids are those in Egypt, which are a testament to the stability and durability of the structural form. Perhaps the most famous pyramid scheme is the scheme created by Charles Ponzi and later emulated by Bernard Madoff, a testament to the instability of the inverted pyramid. The more common pyramid schemes are the numerous multi-level marketing schemes, which “grow like Topsy” and then rapidly topple like a “House of Cards”.

The catastrophic anthropogenic global warming (CAGW) or catastrophic anthropogenic climate change (CACC) science is in many ways similar to a pyramid scheme. It is based on a small body of evidence which is used to support a growing collection of estimates used to concoct an even more rapidly growing collection of projections of future catastrophes. These projected future catastrophes are based on the outputs of unverified climate models built upon hindcast comparisons to “adjusted” temperature measurements and fed with uncertain climate sensitivities, forcings and feedbacks.

The structural pyramid converges to a point at its top upon completion. The pyramid scheme diverges from a point at its base to a broad but undefined top prior to its collapse. The weakness of the base of CAGW / CACC science is illustrated by the numerous failed short-term predictions based on that science. The predictions of an ice-free Arctic and of the end of snow are classic examples.

The consensed climate science community has shifted its efforts toward longer term projections which would likely not be falsifiable in their lifetimes. These longer term projections include more frequent and more intense tropical cyclones, tornadoes, droughts and floods, more extreme temperatures, melting glaciers and Arctic and Antarctic ice caps, coastal and island inundation, crop failures and even the end of human civilization.

These long-term extreme projections become the basis for demands to end human fossil fuel use, eliminate animal husbandry, institute global governance with massive redistribution of wealth and income and control and reduce global population. The actions would require investments of tens of trillions of dollars combined with massive personal and economic turmoil.

It is amazing to contemplate that these extreme projections and planned massive dislocations are built upon a very narrow knowledge base combined with an ever-expanding series of estimates of climate sensitivity, climate feedbacks and Representative Concentration Pathways, fed into an ensemble of unverified climate models, most of which have been effectively falsified by the actual climate observations over the most recent climate period.

It is not too late in the political process to begin efforts to expand the scientific knowledge base upon which the CAGW / CACC meme is based, while reducing the magnitude of the overhanging mass of estimates, unverified climate models and hypothetical “scary scenarios” and replacing them with information more reliably supported by the expanded scientific knowledge base.

 

Tags: Climate Models
Search Older Blog Posts