
In the Wake of the News

Green New Deal and Buildings

The principal energy efficiency thrust of the “Green New Deal” is “upgrading all existing buildings in the United States and building new buildings to achieve maximal energy efficiency, water efficiency, safety, affordability, comfort, and durability, including through electrification.”

Building Green analyzed “The Challenge of Existing Homes: Retrofitting for Dramatic Energy Savings” several years ago. The residential sector includes approximately 125 million dwelling units, including approximately 85 million single-family detached structures, and accounts for approximately 20% of US energy consumption.

Intelligent choices regarding building envelope characteristics, windows and doors, and appliances and equipment can make large differences in building energy consumption, typically at relatively modest cost. Retrofitting existing buildings of any type is far more challenging and expensive.

Building retrofits are an exercise in “Broken Window Economics”, since functional components such as windows and doors, appliances and equipment are removed and replaced with more efficient equipment before the ends of their useful lives. Improving the insulation in existing structures is also frequently very difficult and expensive. Adding attic insulation and installing insulation over crawl spaces are relatively straightforward, as are adding insulation to uninsulated walls and replacing caulking and weather stripping. However, adding insulation to walls which are already insulated, though not optimally, can be both very expensive and relatively ineffective. Adding insulation to existing building slabs is typically prohibitively expensive.

The intent of the GND is to accomplish what Building Green refers to as a major energy retrofit, which they estimated would incur an average cost of approximately $50,000 per dwelling unit. Accomplishing a major energy retrofit of 125 million dwelling units in the 10-year time horizon suggested by the GND would cost approximately $6 trillion, assuming the availability of sufficient building materials and sufficient skilled labor to complete the retrofits. The number of dwellings to be retrofitted each year would be an order of magnitude greater than the number of such retrofits being performed each year today.

There are approximately 5.6 million commercial buildings in the US, containing approximately 87 billion square feet of floor space. These buildings consume approximately 18% of all US energy. The average commercial building is approximately 7 times the area of the average residential dwelling.  Applying this ratio to the major energy retrofit cost for the average dwelling suggests that accomplishing a major energy retrofit for these buildings would cost approximately $2 trillion, again assuming the availability of sufficient materials and skilled labor to accomplish the retrofits.
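
The arithmetic behind the residential and commercial estimates above is simple scaling. Below is a minimal sanity-check sketch in Python, using only the figures cited in the text (the $50,000 per-unit cost is Building Green’s estimate; the 7:1 area ratio is applied to it directly).

```python
# Sanity check of the retrofit cost estimates above, using only the
# figures cited in the text.

DWELLING_UNITS = 125e6         # US dwelling units
COST_PER_DWELLING = 50_000     # Building Green "major energy retrofit" estimate ($)
COMMERCIAL_BUILDINGS = 5.6e6   # US commercial buildings
AREA_RATIO = 7                 # avg commercial building area / avg dwelling area

residential = DWELLING_UNITS * COST_PER_DWELLING
commercial = COMMERCIAL_BUILDINGS * AREA_RATIO * COST_PER_DWELLING

print(f"Residential retrofits: ${residential / 1e12:.2f} trillion")     # ~$6.25 trillion
print(f"Commercial retrofits:  ${commercial / 1e12:.2f} trillion")      # ~$1.96 trillion
print(f"Dwellings per year over 10 years: {DWELLING_UNITS / 10:,.0f}")  # 12.5 million
```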

Retrofits of industrial facilities could only be analyzed on an industry-by-industry basis, since the structures involved vary so greatly, as does the process equipment in use at these facilities. Such an analysis is beyond the scope of this commentary. However, US industry accounts for approximately 33% of total US energy consumption, so major energy retrofits would be essential to achieving the energy consumption reduction goals of the GND. The costs could easily approach $10 trillion, assuming that the required alternative energy use technologies and equipment were even available for application.

 

Tags: Green New Deal

Green New Deal - High Speed Rail #2

A previous commentary discussed: the requirements for a high speed rail system; an approach to designing such a system; and, the US High Speed Rail Plan (HSRP) system design developed by the US High Speed Rail Association. The GND approach to high speed rail focuses on eliminating the need for air travel in the US, while the HSRP assumes that trips over approximately 1,000 – 1,500 miles would be made by air. This assumption has significant impacts on the HSRP design shown in the map below.

High Speed Rail Map

The HSRP design includes no distance-optimized cross-country routes, though it does include distance-optimized north/south routes along the East and West coasts. The HSRP, including the lower speed feeder routes, has an estimated total cost of $3.5 trillion.

The addition of the following distance-optimized cross-country routes could displace the HSRP’s reliance on air travel for longer trips.

  • New York City – Chicago – Seattle                       790 + 2,110 miles
  • New York City – Chicago – San Francisco                        2,135 miles
  • New York City – Chicago – Los Angeles                          2,018 miles

(The 790-mile New York City – Chicago leg is shared; the San Francisco and Los Angeles figures are the Chicago-to-coast legs, which is how the three routes total approximately 7,000 new miles.)

These routes are shown in the map below as black dotted lines. These distance optimized routes would add a total of approximately 7,000 miles to the HSRP developed by the US High Speed Rail Association, at an estimated additional cost of approximately $1 trillion, for a total system cost of approximately $4.5 trillion.

High Speed Rail Map

Chicago O’Hare International Airport currently serves approximately 80 million passengers per year, or approximately 220,000 passengers per day, roughly evenly divided between arrivals and departures. If we assume that approximately half of these passengers have Chicago as their point of origin or destination, then approximately 50,000 passengers would arrive and another 50,000 would depart Chicago each day on trains taking them to or from their points of origin or destination.

Typical high-speed trains can be assumed to have a passenger loading of 1,000. Therefore, approximately 50 trains would arrive at and depart from the Chicago rail terminal each day, in addition to the through, non-stop trains to and from New York, Seattle, San Francisco and Los Angeles. Numerous other trains would arrive and depart Chicago to and from other points of origin and destination.

Annual passenger trips between New York City and Los Angeles total approximately 2.8 million. In addition, annual passenger trips between New York City and Chicago total approximately 2.3 million; and, trips between Chicago and Los Angeles total approximately 1.6 million. Therefore, the rail lines between New York City and Chicago would move approximately 5.1 million passengers per year, or approximately 14,000 passengers per day (14 trains per day) for that 3.5-hour trip. The rail lines between Chicago and Los Angeles would move 4.4 million passengers per year, or approximately 12,000 passengers per day (12 trains per day) for that 10-hour trip.

Therefore, there would be one train moving non-stop on the rail lines between New York City and Chicago in each direction at any given time; and, there would be two trains moving non-stop on the rail lines between Chicago and Los Angeles in each direction at any given time. There would be additional trains operating on these rail lines serving intermediate stops on each route.
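
These counts follow directly from the passenger volumes; a short sketch, assuming 1,000 passengers per train and departures spread evenly over 24 hours (simplifications, not stated design parameters):

```python
# Rough check of the non-stop train counts above.

PASSENGERS_PER_TRAIN = 1_000   # assumed typical high-speed train loading

def trains_in_transit(annual_passengers, trip_hours):
    """Trains underway per direction at any instant, assuming
    departures spread evenly over 24 hours."""
    daily_trains = annual_passengers / 365 / PASSENGERS_PER_TRAIN  # both directions
    departures_per_hour = (daily_trains / 2) / 24                  # one direction
    return departures_per_hour * trip_hours

# New York City - Chicago: ~5.1M passengers/year, ~3.5-hour trip
print(f"{trains_in_transit(5.1e6, 3.5):.1f} trains per direction")  # ~1.0
# Chicago - Los Angeles: ~4.4M passengers/year, ~10-hour trip
print(f"{trains_in_transit(4.4e6, 10):.1f} trains per direction")   # ~2.5
```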

 

Tags: Green New Deal

Green New Deal - High Speed Rail

     The GND Line is a mighty good road

     The GND Line is the road to ride

     If you want to ride you gotta ride it like you find it

     Get your ticket at the station on the GND Line

     (Apologies to Lonnie Donegan / Rock Island Line)

 

The “Green New Deal” currently being promoted by socialist Senators and Congresspersons and several Democrat 2020 presidential candidates exists today only as a conceptual framework. A portion of this conceptual framework deals with proposed efforts intended to reduce or eliminate “greenhouse gas” emissions. One of the most challenging of these proposed efforts involves rendering air travel unnecessary, primarily through development of a high-speed rail network throughout the US.

The defining characteristics of a high-speed rail system for the US might be as follows:

  • new dedicated dual track right-of-way
  • dedicated station infrastructure
  • system design for 220 mph operation
  • no grade level crossings
  • continuous intrusion barrier
  • full electronic real-time monitoring
  • electric powered drive systems

These characteristics are common to high-speed rail systems operating in other countries.

The first step in designing such a rail system would be determining the cities to be connected by the system. Since displacing air travel is the objective, the rail system would have to connect the locations of the nation’s most heavily used airports. The connected cities would thus include: New York; Charlotte; Atlanta; Orlando; Miami; Dallas; Houston; Los Angeles; Las Vegas; San Francisco; Seattle; Denver; and, Chicago. Washington, DC would also be included, since this would be a federally funded project. Each of these airports serves more than 40 million passengers per year, or more than 100,000 passengers per day. The route system would then be expanded to include additional cities with significant populations.

The US High Speed Rail Association has developed a US High Speed Rail Plan and maps showing the proposed four-stage buildout of a US high speed rail system.

High Speed Rail Map

The US High Speed Rail Plan includes most US cities of 500,000+ population in the 220-mph high speed service. Cities of lower population are connected by 110-mph feeder lines which would bring passengers to the higher speed system.

The US High Speed Rail Plan (HSRP) envisions a 17,000-mile system which would be completed by 2030, essentially in line with the GND timetable. However, unlike the GND, the HSRP assumes that most trips in excess of approximately 1,000 miles would be made by air. A cross-country trip of approximately 3,000 miles, which would take approximately 6 hours by air, would require approximately 13.6 hours by non-stop high-speed rail.

Developing detailed estimates of the investments required to complete such a plan would be an extremely time consuming and costly process. However, the recently cancelled California high speed rail project provides some idea of the estimated cost of a joint Federal/State high speed rail project. The most recent estimate of the complete project cost for the California system is $77 billion for a 520-mile route, or approximately $150 million per mile. Thus, a first estimate of the cost of the 17,000-mile HSRP would be on the order of $2.5 trillion. A first estimate of the cost of approximately 10,000 miles of 110-mph rail lines would be on the order of $1 trillion.
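
A minimal sketch of that extrapolation, with the California project as the single reference point (a significant simplification, since terrain, land costs and urban approaches vary widely); the $100 million per mile for the 110-mph feeder lines is the figure implied by the $1 trillion / 10,000-mile estimate:

```python
# First-order HSRP cost estimate scaled from the California project.

CA_COST_USD = 77e9   # most recent California project cost estimate
CA_MILES = 520

cost_per_mile = CA_COST_USD / CA_MILES     # ~$148 million per mile
hsrp_220mph = 17_000 * cost_per_mile       # 220-mph network
feeders_110mph = 10_000 * 100e6            # assumed ~$100M per mile

print(f"Cost per mile:   ${cost_per_mile / 1e6:.0f} million")
print(f"220-mph network: ${hsrp_220mph / 1e12:.2f} trillion")     # ~$2.52 trillion
print(f"110-mph feeders: ${feeders_110mph / 1e12:.2f} trillion")  # ~$1.00 trillion
```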

The issue of international air travel is unaddressed in the GND conceptual framework.

 

Tags: Green New Deal

Green New Deal - Flights of Fancy

One of the most perplexing aspects of the “Green New Deal” (GND), and one of its most difficult technical and economic challenges, is the elimination of the need for air travel, since air travel without the use of fossil fuels appears to be beyond the ten-year time horizon of the climate plan. Just the replacement of all passenger vehicles, small and medium-size trucks and tractor-trailer rigs is a major challenge with currently available technology.

The plan for domestic travel would require the construction of a high-speed rail network using dedicated track to accommodate very high-speed trains operating at speeds up to 220 mph for long haul routes and high-speed trains operating at speeds up to 110 mph for shorter routes. This network would be constructed without grade level crossings, both to avoid the possibility of road traffic damage to the high-speed rails and to avoid the potential for collisions between the trains and road vehicles. The network would require long, sweeping, banked curves to allow the trains to maintain speed along the entire route.

The very high-speed trains would permit cross-country non-stop trips of approximately 15 hours, compared to approximately 5 hours for non-stop domestic air travel. Very high-speed trains operating with minimal stops might add 15 – 30 minutes per stop to that cross-country schedule, similar to the experience with domestic air travel. The need to change trains en route would likely add 1 – 2 hours per stop to the schedule, as is common with domestic air travel. Passenger mile data from the airline industry would likely be used to establish the number of transportation corridors, the number and location of transportation hubs and the number and location of cities served by both the very high-speed and the high-speed rail networks.

True high-speed rail service is unknown in the US. The closest approach is the Acela service offered by Amtrak in the Boston – New York – DC corridor, which is very limited in speed relative to the Shinkansen in Japan and other high-speed rail systems. The Acela service has proven to be unpopular and unprofitable, likely because there have been faster alternatives available at relatively similar prices. The unavailability of choice would dramatically change the equation.

The issue of international travel is a totally different matter, since high-speed rail travel between the US and either Europe or Asia is currently unavailable; and, is unlikely to be an option in the ten-year time horizon of the GND. Air travel from the US would not be possible because fossil-based aviation fuels would no longer be available. Air travel to the US would require planes capable of making the round trip without refueling, since fuel would not be available in the US, or the establishment of refueling stops in countries taking a less aggressive approach to emissions reductions.

China, which is taking no action on emissions reductions during the time horizon of the GND, could perhaps construct giant floating air terminals off the coasts of the US in international waters, in close proximity to the major coastal high-speed rail hubs. Aviation fuel could be provided to these air terminals by supertankers operating from ports in the major oil producing nations. Transportation from the air terminals to shore near the rail hubs could be provided by sail-powered ferries. This arrangement would add significantly to international air travel times to and from the US, which would make such travel far less desirable. The issue of continued fossil-fueled air travel in US air space by non-US airlines is not addressed in the GND at this time.

If this all sounds rather silly, that is because it is rather silly, like the remainder of the “Green New Deal”.

 

Tags: Green New Deal

Green New Deal - Deadweight Loss

The Green New Deal would be the most extensive exercise in “Broken Window Economics” in the history of the globe. The “Green New Deal” would cause massive deadweight losses in virtually every sector of the US economy, while producing no measurable impact on global climate.

“Deadweight loss can be stated as the loss of total welfare or the social surplus due to reasons like taxes or subsidies, price ceilings or floors, externalities and monopoly pricing. It is the excess burden created due to loss of benefit to the participants in trade which are individuals as consumers, producers or the government.” (The Economic Times)

The following is a list of key industries affected by the climate change aspects of the “Green New Deal”. However, all US economic activity would be affected to some extent, so the total effects are likely significantly understated.

 

 

Obviously, the major financial impact is the result of the plan to leave the substantial US energy resources “in the ground”, as numerous environmental groups have previously advocated.

 

The impact of the plan to render air travel obsolete, combined with the intent to halt the production of the fuels required by the airline and air freight industries, would extend well beyond the US. Air service between the US and other locations would also be impacted, since refueling would not be available in the US. International air operations would require that aircraft carry sufficient fuel for a round trip or add a refueling stop in some country which still permitted production and sale of aviation fuels.

The impact of terminating fossil-fueled electric generation, and of not only replacing the existing generating capacity but also adding sufficient capacity to power the replacements for direct-fired residential, commercial and industrial energy end uses, would be profound. The electric energy required to meet all current electric end use consumption plus the direct end uses currently served by oil and natural gas and their derivatives would be approximately three times the quantity of energy currently provided by existing electric generators. Also, the intermittent nature of current renewable energy systems would require that the installed renewable generating capacity be approximately three times the fossil-fueled generating capacity it replaced per unit of electric energy consumption; and, that the renewable generating capacity be supported by long duration, transmission-level storage capacity.

Current US electricity production is approximately 4 trillion kWh per year, of which approximately 2.5 trillion kWh is produced using fossil fuels and 0.8 trillion kWh using nuclear generators. Increasing US electricity production to approximately 12 trillion kWh per year would require increasing wind and solar electricity production from its current level of approximately 0.3 trillion kWh per year by a factor of approximately 40.

Current US electric generating capacity is approximately 1,000 GW, of which approximately 750 GW is fossil fueled. Replacing this dispatchable fossil-fueled generating capacity with a mix of intermittent clean and renewable generation with an average availability of approximately 25-30% would require the installation of approximately 2,000 GW of new generating capacity, plus the storage capacity necessary to assure reliable service. US EIA has estimated the cost of installing new solar photovoltaic generating capacity at approximately $3,700 per kW and new wind generating capacity at approximately $1,900 per kW. Assuming an average of $2,800 per kW of clean and renewable generating capacity, the installed cost of 2,000 GW of new capacity would be approximately $5.6 trillion.
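
A short sketch of the arithmetic above; the $2,800/kW figure is the simple average of the EIA solar and wind costs, as stated:

```python
# Sanity check of the electrification arithmetic above.

required_generation = 12e12   # kWh/year after electrifying direct fuel uses
wind_solar_today = 0.3e12     # kWh/year, current wind + solar production

print(f"Wind/solar scale-up: {required_generation / wind_solar_today:.0f}x")  # ~40x

new_capacity_gw = 2_000                # GW of new renewable capacity
avg_cost_per_kw = (3_700 + 1_900) / 2  # simple average of EIA solar PV and wind costs
total_cost = new_capacity_gw * 1e6 * avg_cost_per_kw   # convert GW to kW

print(f"Installed cost: ${total_cost / 1e12:.1f} trillion")  # ~$5.6 trillion
```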

 

A trillion here, a trillion there and pretty soon you’re talking about real money.

(with apologies to the late US Senator Everett McKinley Dirksen, R-IL)

 

Tags: Green New Deal

Not-So-Green New Deal

The “Green New Deal” (GND) is the current fascination of the most liberal / progressive / socialist elements of our society and its elected representatives, as well as several declared candidates for President in 2020. The rallying point of the Democrat GND was replacing our existing national energy infrastructure with “clean, renewable, and zero-emission energy sources” by “dramatically expanding and upgrading existing renewable power sources.” However, the Democrat leadership has apparently replaced the original specific deadline for ending US fossil fuel use with a stated intent to achieve net-zero carbon emissions. While this statement of intent sounds somewhat less extreme, it could only be achieved with high percentage carbon capture and storage combined with active removal of CO2 from the environment. Neither technology is currently commercially viable or economically attractive.

The partial removal of the extreme green “rind” from the Democrat GND “watermelon” exposes the nature of its red interior, as described by its originators, Green Party US.

  1. The Economic Bill of Rights
  2. A Green Transition
  3. Real Financial Reform
  4. A Functioning Democracy

Interestingly, the Green Transition envisioned by the GND developed by the Green Party US is not the rallying point of the plan, as it was in the original Democrat version championed by Bernie Sanders and Alexandria Ocasio-Cortez.

Senator Edward Markey and Representative Alexandria Ocasio-Cortez have introduced a Sense of Congress Resolution outlining the major goals of the GND as follows.

            A: Achieve net-zero “greenhouse gas” emissions

            B: Create millions of new, high-wage jobs

            C: Invest in US infrastructure and industry

            D: Secure for all people of the US:

                        clean air and water;

                        climate and community resiliency;

                        healthy food;

                        access to nature; and,

                        a sustainable environment.

            E: Promote justice and equity

These goals are to be accomplished through a ten-year national mobilization. This accelerated schedule virtually ensures that the process would be more expensive than necessary, since much of the technology required to achieve net-zero emissions has yet to be commercially demonstrated. The costs of achieving the remaining goals are a function of the detailed descriptions of the specific objectives to be achieved.

Achieving net-zero emissions would impact every aspect of energy production and use in the US economy.

All electric generation consuming coal, natural gas, propane, biomass and municipal solid waste would either require carbon capture and storage or replacement with clean and/or renewable sources. There is significant dispute in the environmental community regarding the retention or expansion of nuclear and hydroelectric generation.

All direct use of fossil fuels in residential and commercial markets would be eliminated, unless new technology facilitating carbon capture and storage at that scale could be developed and commercialized. Otherwise, all residential and commercial space heating, water heating, cooking and laundry drying appliances would have to be replaced by electric appliances; and, incremental clean and renewable electric generating capacity would have to be built to supply their requirements.

All industrial food preparation and heat processing equipment would have to be equipped with carbon capture and storage capability or replaced with electric equipment; and, incremental clean and renewable electric generating capacity would have to be built to supply their requirements.

All transportation equipment, including personal vehicles, trucks, buses and trains would have to be replaced with electric vehicles; and, incremental clean and renewable electric generating capacity would have to be built to supply their requirements.

Offsets would have to be provided for the “greenhouse gas” emissions from processes such as steelmaking and cement production, for which electric process alternatives are not available.

Current electric power generation of all types provides 38% of US energy consumption, as shown in the chart below. The GND would require the replacement of all or most of the remaining 62% of US energy consumption with clean and renewable energy from incremental sources.

U.S. primary energy consumption by source and sector, 2017

Essentially, the Green New Deal would be the most extensive exercise in “Broken Window Economics” in the history of the globe. I have estimated the investments required to achieve such a transition in the energy economy at $30 trillion, not including the investment in the replacement of equipment and systems which had reached the end of their useful lives. Achieving this transition on the accelerated schedule contemplated by the GND could significantly increase that investment requirement. The lost value of used and useful equipment and systems abandoned as a result of this transition would be very difficult to estimate, but would be enormous, especially in the rapid transition envisioned in the Green New Deal.

 

Tags: Green New Deal

Wichita Revisited

Wichita, Kansas is located very close to the geographic center of the contiguous United States. The annual average temperature in Wichita is 57°F, approximately equal to the global annual average near-surface temperature. Wichita has been used to provide perspective on climate change on this site previously (here), (here) and (here).

The graph below shows the average and record high and low temperatures for Wichita on a monthly basis. Note that the average diurnal temperature range is 20 – 23°F throughout the year; that the record high temperature is 21°F above the average high temperature; and, that the record low temperature is 37°F below the average low temperature.

Wichita Temperature Averages

The chart below lists the all-time temperature records for Wichita and the dates on which the records occurred. Note that only the Highest Monthly Average, Highest Annual Average and Lowest Annual Average occurred in the post-1950 period, when increasing atmospheric CO2 concentrations are thought to have begun influencing global climate. Note also that the Record Warmest High and the Record Warmest Low both occurred in 1936, during the Dust Bowl years in the US.

All Time Record Temperatures

The chart below summarizes annual average temperatures for Wichita, showing the 22°F average difference between average daily high and low temperatures and the 57°F Average Daily Mean temperature.

All Annual Temperatures

The graph below originated on the Powerline blog and has been modified here with the addition of the red and blue bands representing the average diurnal temperature ranges for the peak summer month (July, red) and peak winter month (January, blue) in Wichita. This allows comparison of the global average annual temperature change over the period from 1880 through 2015 (~1.6°F) with the average diurnal and peak seasonal temperature changes in Wichita. Note that the chart temperature range is from -10°F to +110°F, slightly narrower than the -22°F to +114°F record temperature range for Wichita.

Wichita Average Annual Global

Bob Tisdale has begun a series of posts (here), (here) and (here) entitled “…it is the change in temperature compared to what we’ve been used to that matters.” The graph below, from this series of posts, compares the rate of change of global annual land plus ocean temperatures with the rate of change of the highest annual maximum near-surface temperature and the lowest annual minimum near-surface temperature for the entire globe. Note that the rate of increase of highest annual maximum temperature is approximately 40% of the rate of increase of the lowest annual minimum temperature; that is, the lowest annual minimum temperature is increasing 2.5 times as rapidly as highest annual maximum temperature. That suggests that, of the approximately 1.6°F global annual average near-surface temperature increase, only 0.6°F represents an increase in maximum summer temperatures, while the remaining 1.0°F represents an increase in the minimum winter temperatures. That seems an unlikely scenario for the “fireball earth” envisioned by the consensed climate science community.

Temperature Anomalies

The graph below, also from this series of posts, compares the rate of change of global annual land plus ocean temperatures with the rate of change of the highest annual maximum near-surface temperature and the lowest annual minimum near-surface temperature in the contiguous US. Note again that the rate of increase of highest annual maximum temperature is approximately 40% of the rate of increase of the lowest annual minimum temperature; that is, the lowest annual minimum temperature is increasing 2.5 times as rapidly as highest annual maximum temperature. Note also that the rates of change of temperature maxima and minima are both approximately 30% lower in the US than the global rates. However, the average difference between the highest maximum and the lowest minimum in the contiguous US is approximately 70% greater than the global average.

Temperature Anomalies

The two Tisdale graphs above also illustrate the point made in the graph of Wichita temperatures, namely that the warming which has occurred over the past 100+ years is relatively modest compared to the total range of temperatures experienced over the same period and to the range of diurnal and seasonal temperatures. The Tisdale graphs also show that the warming over the period is of lower magnitude than the annual changes in both maximum and minimum temperatures, to which the respective populations have been adapting successfully.

We are becoming far more aware of what is happening in our climate but are still challenged to understand why those changes are happening. That should be the focus of climate research.

 

Tags: Global Temperature, Temperature Record

Highlighted Article: Reassessing the RCPs

 

 

Reassessing the RCPs

 

"A response to: “Is RCP8.5 an impossible scenario?”. This post demonstrates that RCP8.5 is so highly improbable that it should be dismissed from consideration, and thereby draws into question the validity of RCP8.5-based assertions such as those made in the Fourth National Climate Assessment from the U.S. Global Change Research Program."

 

Reassessing the RCPs

 

Tags: Highlighted Article

“Ideal” Climate Perspective

The “ideal” climate apparently centers about a global annual average temperature of approximately 57°F, the global annual average temperature of the climatological reference period most commonly used in climate science. The annual average temperatures of the individual nations of the globe range from 22°F in Canada to 83°F in Burkina Faso. The annual average maximum and minimum temperatures tend to lie within +/- 5-10°F of the annual average, while the annual maximum and minimum temperature range tends to be 5-10 times as large.

With that range of conditions as background, we are told that the current global annual average temperature anomaly of ~1.6°F should be cause for great concern; and, that beyond twice that anomaly lies impending catastrophe. The expressions of concern would suggest that the global maximum average temperature is increasing and that the higher temperatures would cause crop failures and increased deaths from heat-related conditions.

What those expressions of concern fail to mention is that the global annual average minimum temperatures are also rising, typically at approximately twice the rate of increase of the global annual average maximum temperatures. Since the global annual average temperature is the mean of the global annual maximum and minimum temperatures, this means that the 1.6°F global annual temperature anomaly consists of an increase in the maximum temperature of ~0.6°F and an increase in the minimum temperature of ~1.0°F.

The graph below from a post by Bob Tisdale illustrates this situation for the contiguous United States for the period 1900 – 2012. Note that the graph displays the land plus ocean surface temperature trends; and, that the surface only trends would show greater range and variation.

Annual Global Land & Ocean Surface Temperature Anomalies

The rate of increase of the maximum temperature is approximately 70% of the rate of increase of the mean temperature, while the rate of increase of the minimum temperature is approximately 150% of the rate of increase of the mean temperature. This means that, in the US, climate change is manifesting as slightly warmer summers and warmer winters and as slightly warmer days and warmer nights.

The post linked above displays similar graphs for nine other countries: China; India; Indonesia; Brazil; Pakistan; Nigeria; Bangladesh; Russia; and, Mexico. The US and these countries contain approximately 60% of the population of the globe. In all these countries, with the exception of Mexico, the rate of increase of the minimum temperature is higher than the rate of increase of the maximum temperature. In China, the rate of increase of the minimum temperature is approximately 30 times the rate of increase of the maximum temperature, the largest ratio for the 10 countries. In Mexico, the rate of increase of the maximum temperature is approximately 30% greater than the rate of increase of the minimum temperature.

The average difference during the climate reference period (1981-2010) between the highest maximum temperature and the lowest minimum temperature for these ten countries ranges from 50°F to 124°F. Against this background, an increase of 0.6°F in the average maximum temperature and an increase in the average minimum temperature of 1°F do not seem particularly significant.

 

Tags: Global Temperature

Highlighted Article: Marian Tupy: “Celebrate the Industrial Revolution and What Fueled It”

 

Marian Tupy: “Celebrate the Industrial Revolution and What Fueled It”

 

“The Industrial Revolution did not cause hunger, poverty and child labor. Those were always with us. The Industrial Revolution helped to eliminate them.”

----------------------------------

"Remove cheap energy and most aspects of modern life, from car manufacturing and cheap flights to microwaves and hospital incubators, become a luxury, rather than a mundane, everyday occurrence and expectation."

 

Marian Tupy: “Celebrate the Industrial Revolution and What Fueled It”

 

Tags: Highlighted Article

Standards of Evidence

The Paris Accords call for the developed nations to provide $100 billion per year by 2020 to fund climate change “adaptation and mitigation” programs in the developing nations. The Accords also call for the developed nations to provide ~$400 billion per year to compensate developing nations for “loss and damage” resulting from climate change. The Accords call for this funding from the developed nations on the basis that they have caused / contributed to the climate change which has occurred over the past several decades and thus bear responsibility for compensating the “victims” of this climate change for its effects on their nations.

The development and disbursement of a funding stream of approximately one half trillion dollars per year, in the absence of documented needs for “adaptation and mitigation” and documented “loss and damage”, raises serious fiduciary responsibility issues.

  • What criteria are used to determine that the situation to be considered for funding is the result of climate change, specifically anthropogenic climate change, and not the result of severe weather or other causes?
  • What criteria determine that a situation requires “adaptation” or “mitigation”?
  • Who determines the appropriate “adaptation and mitigation” approaches?
  • Who evaluates the “loss and damage” and the extent to which it is the result of climate change, specifically anthropogenic climate change, rather than severe weather or other causes?
  • Who assures that the funds provided to compensate for “loss and damage” are used in a way that eliminates / minimizes the likelihood of future “loss or damage”?
  • Who controls the disbursement of funds and assures that the funds are used for the intended purpose?

There is no question that severe weather events must be adapted to and the risks of severe weather damage mitigated. There is also no question that severe weather events cause loss and damage. However, the funding to be provided under the Paris Accords through the Green Climate Fund is intended to deal specifically with adaptation and mitigation issues and with loss and damage resulting from climate change, specifically anthropogenic climate change.

Climate change is the result of natural variation and other causes, likely including human activities which emit “greenhouse” gases to the atmosphere and which alter the albedo of the globe. It is not currently possible to determine the extent of human contribution to climate change; and, it is clearly demonstrable that climate change was occurring prior to the mid-twentieth century when the influence of human activity on climate is thought to have begun to any significant degree.

Scientists have begun to develop attribution studies in an attempt to establish the extent of the impact of human activity on severe weather events and climate change. However, these attribution studies rely on unverified climate models and estimated climate sensitivities and feedbacks. Therefore, their outputs hardly constitute evidence of some percentage of anthropogenic influence on any particular severe weather event.

Should the funding called for under the Paris Accords ever be made available, it would be essential to assure that it did not disappear down ratholes in numerous kleptocracies rather than accomplishing its stated purpose. The intended purpose of these fund transfers, the de-development of the developed nations, would occur regardless.

 

“The problem with socialism is that eventually you run out of other people’s money.”, Lady Margaret Thatcher

 

Tags: Paris Agreement

“Ideal” Climate

The earth does not have a climate, except as an “average” of thousands of local climates. Each of those local climates is changing, though not always in the same way or at the same pace. The “average” climate is changing, as measured by numerous temperature measuring stations and reported as global average temperature anomalies. These anomalies are calculated deviations from the conditions measured over a 30+ year reference period. There is no explicit recognition of the climate of this reference period as the “ideal” average global climate. There is, however, the implication that this reference period, or some other period, was “ideal” and that the recently calculated anomalies represent a departure from that “ideal”.

The thousands of local climates on the earth vary tremendously. The highest temperature ever recorded on earth (134°F) occurred in Furnace Creek, CA in July, 1913. The lowest temperature ever recorded on earth (-128.5°F) occurred in Vostok, Antarctica in July, 1983. Neither of those record temperatures would likely be considered a characteristic of an ideal climate. The current estimated global average surface temperature is 58.6°F, well above the mean of the global temperature extremes. The current global annual average temperature anomaly is ~1.6°F, suggesting that 57°F was the annual average temperature during the reference period.

National average annual temperatures range from Canada at 22°F to Burkina Faso at 83°F. The US average temperature is 47°F. US cities with an average annual temperature approximately equal to the global annual average temperature of 57°F during the reference period include: Albuquerque, NM; Louisville, KY; St. Louis, MO; and, Wichita, KS. European cities with similar average annual temperatures include Sochi, Russia and Istanbul, Turkey.

Miami, FL has an average annual temperature of 77.2°F, with an average high temperature of 84.3°F and an average low temperature of 70.1°F; and, record temperatures of 98°F and 30°F. Miami’s average high and low temperatures vary by 14.2°F, while its record temperatures vary by 68°F, nearly 5 times the average variation; and, its annual average temperature is 18.7°F warmer than the global average. Barrow, AK has an average annual temperature of 11.8°F, with an average high temperature of 17.3°F and an average low temperature of 6.3°F; and, record temperatures of 79°F and -56°F. Barrow’s average high and low temperatures vary by only 11°F, while its record temperatures vary by 135°F, more than 12 times the average variation; and, its annual average is 46.7°F below the global average.
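
A small sketch of the range comparison, using the Miami and Barrow figures cited above (all values in °F):

```python
# Average diurnal range vs. record range for the two cities above.

cities = {
    "Miami, FL":  {"avg_hi": 84.3, "avg_lo": 70.1, "rec_hi": 98, "rec_lo": 30},
    "Barrow, AK": {"avg_hi": 17.3, "avg_lo": 6.3,  "rec_hi": 79, "rec_lo": -56},
}

for name, t in cities.items():
    diurnal = t["avg_hi"] - t["avg_lo"]
    record = t["rec_hi"] - t["rec_lo"]
    print(f"{name}: average range {diurnal:.1f}F, record range {record}F, "
          f"ratio {record / diurnal:.1f}x")

# Miami, FL:  average range 14.2F, record range 68F,  ratio 4.8x
# Barrow, AK: average range 11.0F, record range 135F, ratio 12.3x
```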

Somewhere, between the record global temperatures, within the range of national average annual temperatures and climates, lies the “ideal” global annual temperature and the “ideal” global annual climate. There are numerous nations for which an increase in annual average temperatures would probably represent a move toward an “ideal” climate, while there are other nations for which it would represent a move away from an “ideal” climate. In each case, that perception would be based on the characteristics of the “ideal” climate. These perceptions vary considerably, but seem to favor warmer climates with limited seasonal variation.

 

Tags: Global Temperature

Highlighted Article: A condensed version of a paper entitled: “Violating Nyquist: Another Source of Significant Error in the Instrumental Temperature Record”

 

 

A condensed version of a paper entitled: “Violating Nyquist: Another Source of Significant Error in the Instrumental Temperature Record”

 

"The 169-year long instrumental temperature record is built upon 2 measurements taken daily at each monitoring station, specifically the maximum temperature (Tmax) and the minimum temperature (Tmin). These daily readings are then averaged to calculate the daily mean temperature as Tmean = (Tmax+Tmin)/2. Tmax and Tmin measurements are also used to calculate monthly and yearly mean temperatures. These mean temperatures are then used to determine warming or cooling trends. This “historical method” of using daily measured Tmax and Tmin values for mean and trend calculations is still used today. However, air temperature is a signal and measurement of signals must comply with the mathematical laws of signal processing. The Nyquist-Shannon Sampling Theorem tells us that we must sample a signal at a rate that is at least 2x the highest frequency component of the signal. This is called the Nyquist Rate. Sampling at a rate less than this introduces aliasing error into our measurement. The slower our sample rate is compared to Nyquist, the greater the error will be in our mean temperature and trend calculations. The Nyquist Sampling Theorem is essential science to every field of technology in use today. Digital audio, digital video, industrial process control, medical instrumentation, flight control systems, digital communications, etc., all rely on the essential math and physics of ..."

 

A condensed version of a paper entitled: “Violating Nyquist: Another Source of Significant Error in the Instrumental Temperature Record”

 

Tags: Highlighted Article

A CERES Possibility

NASA launched the first of the Clouds and the Earth’s Radiant Energy System (CERES) satellites in 1997. CERES is managed by NASA’s Langley Research Center. The CERES surface datasets include a dataset of the upwelling longwave radiation from the surface. This dataset can be converted to surface temperature using the Stefan-Boltzmann equation if the surface emissivity is known. Since the emissivities of common surfaces vary but are all equal to or greater than 0.94, only minimal error is introduced into the calculated absolute surface temperatures by using an emissivity of 1.0; and, no error affects relative temperatures or temperature trends.
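
A minimal sketch of that conversion, T = (F / (e·σ))^¼, with an illustrative flux value (not a CERES data point):

```python
# Convert upwelling longwave flux (W/m^2) to surface temperature (K)
# using the Stefan-Boltzmann law: F = e * sigma * T^4.

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def surface_temperature(flux, emissivity=1.0):
    return (flux / (emissivity * SIGMA)) ** 0.25

flux = 390.0   # illustrative upwelling LW flux, not a CERES value
print(f"{surface_temperature(flux):.1f} K")        # 288.0 K, emissivity 1.0
print(f"{surface_temperature(flux, 0.94):.1f} K")  # 292.5 K, emissivity 0.94
```

Because the emissivity enters only as a constant factor under the fourth root, assuming 1.0 shifts all derived temperatures by the same ~1.5%, which is why relative temperatures and trends are essentially unaffected.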

This suggests an opportunity to employ the CERES satellite surface dataset in combination with a relatively small number of highly accurate near-surface temperature measuring stations distributed around the globe to provide a far more comprehensive near-surface temperature dataset which would not require data “adjustment”. A recent paper by Willis Eschenbach suggests that the CERES surface temperature trends match well with the Reynolds sea surface temperature and the UAH MSU (University of Alabama Huntsville / microwave sounding units) lower tropospheric temperature trends, though they are lower than the Berkeley Earth and HadCRUT surface temperature trends by a factor of ~1.5.

Employing the CERES satellite surface dataset in combination with the US Climate Reference Network (CRN) would avoid use of the “adjusted” near-surface temperature records produced by Berkeley Earth, HadCRUT, NOAA and NASA GISS. Installation of a limited number of near-surface temperature measuring stations like the US CRN stations throughout the globe would facilitate “ground-truthing” of the CERES land surface data. Similarly, the drifting buoys and Argo floats would be used to “ground-truth” the CERES sea surface temperature data.

This approach would represent a major change in the measurement and reporting of global surface and near-surface temperatures; and, a break in the instrumental temperature record. Therefore, it would be essential that the deviations between the CERES temperatures and the current near-surface and sea surface temperature records be resolved. It appears that much of this deviation in the near-surface temperature records is the result of Urban Heat Island (UHI) effects on the existing near-surface temperature measuring stations and the repeated “adjustments” to the near-surface temperature records by their producers. Similarly, it appears that much of the deviation in the sea surface temperature record is the result of the continuing use of temperature measurements made in a variety of ways by surface ships.

The existing US CRN measuring stations could be used to establish accurate absolute surface temperature data points, which could then be used to correct the emissivities used in the surface temperature conversion; and, to establish the extent of UHI effects on the existing near-surface temperature records and the accuracy of the “adjustment” protocols used to prepare the near-surface temperature data for inclusion in the global near-surface temperature anomaly products. The Argo floats and drifting buoys could perform the same roles for the sea surface temperature measurements.

The greatest advantage of this approach to using the CERES data is the complete global coverage provided by these satellites, including measurements of the Arctic and Antarctic surfaces.

 

Tags: Temperature Record, Global Temperature, Satellites, Adjusted Data