
In the Wake of the News

The Gates Math Formula

Bill Gates has propounded what he asserts is the math formula which will solve climate change.

P * S * E * C = CO2

Where:
  • P is the population of the globe;
  • S is the services demanded by the population;
  • E is the energy required to provide those services; and,
  • C is the carbon released in producing that energy.
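
To make the multiplicative structure concrete, here is a minimal sketch (all numbers invented for illustration, not Gates’ figures). Because the terms multiply, emissions fall to zero only if at least one factor, realistically the carbon intensity C, falls to zero; gains in E and C must outpace growth in P times S just to hold emissions level.

```python
# Illustrative sketch of the multiplicative identity Gates uses:
# CO2 = P * S * E * C. All numbers below are invented for illustration.

def annual_co2(p, s, e, c):
    """Total annual CO2 as the product of the four factors."""
    return p * s * e * c

baseline = annual_co2(7.4e9, 1.0, 1.0, 1.0)   # arbitrary baseline units
grown    = annual_co2(9.0e9, 1.5, 1.0, 1.0)   # population and services grow
# Efficiency (E) and carbon intensity (C) must shrink faster than P * S
# grows just to hold emissions level; C must approach zero for CO2 to do so.
offset   = annual_co2(9.0e9, 1.5, 0.8, 0.68)

print(f"baseline:                  {baseline:.3g}")
print(f"after growth, no offsets:  {grown:.3g}")    # ~1.8x baseline
print(f"with E and C improvements: {offset:.3g}")   # roughly back to baseline
```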

Gates’ point is that global population and the population’s demand for services are growing faster than can be offset by increases in energy efficiency and transitions to lower carbon fuels. He is certainly correct in that assessment.

His message is that, if global annual CO2 emissions are to be reduced by 80% by 2050 and to zero by the end of the century, reliance on increased energy efficiency and a transition to lower carbon fuels will not be sufficient to achieve the desired result. Rather, there is the need for a massive increase in R&D funding in search of breakthrough technologies which could achieve the desired result. One such breakthrough would be a source which is always available. Advanced, modular nuclear generators could be one example. Another breakthrough might be low cost, high capacity, high charge rate and discharge rate energy storage systems. Such storage systems, combined with lower cost, higher efficiency solar and wind systems could broaden the potential of intermittent energy generators to provide reliable grid power.

Unstated in the Gates message is the realization that it is far more beneficial to invest limited capital in R&D on technologies which could be effective, rather than spending that capital attempting to commercialize technologies which are incapable of being effective. This is clearly not the approach currently being pursued by the globe’s governments.

Obviously, Gates’ position is based on the premises that: climate change is caused by human activities; climate change is undesirable; climate change can be eliminated; and, it is urgent that climate change be eliminated. Given these premises, Gates’ position makes eminently good sense; far more sense than the programs being pursued and proposed by the globe’s governments.

However, the first premise ignores the historical fact that the climate of the globe has been changing over the entire period of the earth’s history we have been able to study. The second premise ignores the benefits currently resulting from the increase in atmospheric CO2 concentrations, as documented by the greening of the planet observed by satellites and the expansion of global growing seasons and of tillable land to higher latitudes. The third premise relies on the belief that halting the increase in atmospheric CO2 concentrations would halt the recently observed warming, though that belief relies on models which have not been validated. Finally, the fourth premise relies on the sensitivities and feedbacks input to the climate models, which are unmeasured and currently unmeasurable.

Gates asserts that what he perceives as vast problems will require solutions based on vast ideas. Governments today are attempting to solve what they assert to be vast problems with half-vast ideas.


Hottest Year Evah

2015 – The Warmest Year in the Near-surface Instrumental Temperature Record

The US National Centers for Environmental Information (NCEI), the US National Aeronautics and Space Administration Goddard Institute for Space Studies (NASA GISS) and the UK Met Office Hadley Centre / University of East Anglia Climatic Research Unit (HadCRUT) have proclaimed 2015 to be the warmest year in the instrumental record of global near-surface temperatures. They reported that 2015 was 0.16 +/- 0.09°C (NCEI), 0.13 +/- 0.10°C (NASA GISS) and 0.18 +/- 0.10°C (HadCRUT) warmer than 2014.

Obviously, the increase in global average near-surface temperature from 2014 to 2015 could not be precisely 0.16°C and precisely 0.13°C and precisely 0.18°C, though it might fall within the range of those calculated figures (0.13 – 0.18°C). However, based on the confidence limits expressed by the producers of each of these global temperature anomaly products, the estimated global near-surface temperature difference between 2014 and 2015 would more likely fall within the range of 0.03°C (0.13°C – 0.10°C) – 0.28°C (0.18°C + 0.10°C), or 5 times the range of the calculated figures (0.05°C vs. 0.25°C).
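
The arithmetic behind those two ranges can be checked in a few lines, using the figures quoted above:

```python
# Reported 2014->2015 warming estimates and stated confidence limits (degC).
estimates = {"NCEI": (0.16, 0.09), "NASA GISS": (0.13, 0.10), "HadCRUT": (0.18, 0.10)}

centers = [v for v, _ in estimates.values()]
lows    = [v - u for v, u in estimates.values()]
highs   = [v + u for v, u in estimates.values()]

central_range = max(centers) - min(centers)   # 0.18 - 0.13 = 0.05
full_range    = max(highs) - min(lows)        # 0.28 - 0.03 = 0.25

print(f"span of central estimates:            {central_range:.2f} degC")
print(f"span including confidence limits:     {full_range:.2f} degC")
print(f"ratio:                                {full_range / central_range:.0f}x")  # 5x, as noted
```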

Assuming that the confidence limits on the temperature increases reported by the producers of these near-surface temperature anomaly products for 2014 were the same as the confidence limits reported for 2015, it is statistically possible that 2014 was actually warmer than 2015. However, the linked paper suggests that the global near-surface temperature anomaly calculations are based on temperature readings with an estimated +/- 0.2°C station error, which has been incorrectly assessed as random error; and, that there is also systematic error from uncontrolled variables. The author calculates a representative lower limit of uncertainty in the calculated temperature anomalies of +/- 0.46°C; and, based on this lower limit of uncertainty, the global near-surface anomaly trend is statistically indistinguishable from zero.

The Law of Large Numbers, relied upon by the global near-surface temperature anomaly producers to report global anomalies to greater precision than the underlying “adjusted” temperatures, requires that the errors in the underlying temperatures be random. Assessments of the errors introduced by the temperature measuring instruments, their enclosures, their siting and changes in the characteristics of their surroundings suggest strongly that the measurement errors are not random; and, that the “adjustments” made to the temperature readings are not random and do not make the errors in the resulting “adjusted” readings random either.
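
A toy simulation illustrates the point (error magnitudes are assumed for illustration only): averaging many readings drives a genuinely random error toward zero, but leaves a shared systematic bias untouched.

```python
import random

random.seed(42)
N = 5000             # number of station readings averaged
TRUE_ANOMALY = 0.10  # degC, arbitrary "true" value

# Case 1: purely random station error of +/-0.2 degC (uniform, zero mean).
random_err_mean = sum(random.uniform(-0.2, 0.2) for _ in range(N)) / N

# Case 2: the same random error plus a shared systematic bias
# (e.g., siting or enclosure effects that push all readings one way).
BIAS = 0.05
biased_mean = sum(TRUE_ANOMALY + BIAS + random.uniform(-0.2, 0.2)
                  for _ in range(N)) / N

print(f"random error after averaging {N} readings: {random_err_mean:+.4f} degC")
print(f"recovered anomaly with systematic bias:    {biased_mean:+.4f} degC "
      f"(true value {TRUE_ANOMALY:+.2f})")
# The random component averages toward zero; the 0.05 degC bias survives intact.
```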

Tags: Warmest, Hottest, Temperature Record

Highlighted Article: A Guide to Understanding Global Temperature Data

Dr. Roy Spencer just published this booklet.

A Guide to Understanding Global Temperature Data

This is a pretty basic, balanced view of the global temperature issue.

"Whether we use thermometers, weather balloons, or Earth-orbiting satellites, the measurements must be adjusted for known sources of error. This is difficult if not impossible to do accurately. As a result, different scientists come up with different global warming trends—or no warming trend at all."

Tags: Highlighted Article

The “Pause” Returns

The climate science community had been troubled by an extended “pause” in global warming prior to two events in the spring of 2015: the onset of a major El Nino in the NINO regions of the Pacific; and, the publication of Karl et al 2015 (“Possible Artifacts of Data Biases in the Recent Global Surface Warming Hiatus”), the “pause buster” reanalysis of global sea surface temperatures (ERSSTv4). As a result of one or both of those two events, the “Pause” paused, though it was frequently said to have ended.

However, the end of the 2015/2016 El Nino and the disappearance of the Pacific “Warm Blob” off the West coast of North America have restored the pause in the satellite anomaly products produced by UAH and RSS, to 23 years and 1 month and to 22 years and 8 months respectively, through May 2016. The pause has also been restored in the HadCRUT near-surface temperature anomaly product, to 11 years and 2 months; and, in the HadSST sea surface temperature anomaly product, to 19 years and 10 months, through April 2016. The pause has not yet been restored in the NASA GISS LOTI (land/ocean temperature index) temperature anomaly product through May 2016, nor in the NOAA NCEI combined anomaly product through April 2016. This is likely the result of the sea surface temperature revisions in NCEI’s ERSSTv4 sea surface temperature product, as well as near-surface data “adjustments”.
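
For readers unfamiliar with how such “pause” durations are derived: the common approach is to find the longest period, ending at the most recent month, over which the least-squares trend of the anomaly series is not positive. A minimal sketch, using an invented anomaly series rather than the actual UAH or RSS data:

```python
def pause_length(anomalies):
    """Longest run of months, ending at the latest month, over which the
    least-squares trend is non-positive. `anomalies` is a monthly series."""
    def slope(ys):
        n = len(ys)
        mx, my = (n - 1) / 2, sum(ys) / n
        num = sum((x - mx) * (y - my) for x, y in enumerate(ys))
        den = sum((x - mx) ** 2 for x in range(n))
        return num / den

    for start in range(len(anomalies) - 1):
        if slope(anomalies[start:]) <= 0:
            return len(anomalies) - start  # earliest qualifying start = longest span
    return 0

# Invented series: steady warming, then flat-to-cooling with noise.
series = [0.01 * i for i in range(60)] + [0.6, 0.55, 0.62, 0.5, 0.58, 0.45] * 20
months = pause_length(series)
print(f"pause: {months // 12} years, {months % 12} months")
```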

It appears likely that the pause will ultimately be restored in the NASA GISS and NOAA NCEI combined temperature anomaly products as both near-surface and sea surface temperatures continue to drop with the end of the 2015/2016 El Nino and the “Blob”; and, with the anticipated onset of the 2016/2017 La Nina, though the increased sea surface temperatures resulting from the Karl et al 2015 reanalysis will likely delay the restoration by several months in both combined anomaly products.

Tags: Pause, El Nino

Goodbye, El Nino

The major El Nino of 2015-2016 is over. Sea surface temperatures in the Nino regions have dropped to normal or below normal levels.

All of the temperature anomaly products are showing cooling from the peak of the 2015-2016 El Nino. The NASA GISS near-surface temperature anomaly product is the only near-surface temperature anomaly product currently available through May 2016. It has declined by 0.4°C from its peak. The UAH tropospheric temperature anomaly has dropped by 0.3°C from its peak, while the RSS tropospheric temperature anomaly has dropped by 0.5°C from its peak. The NCEI and HadCRUT temperature anomaly products have also cooled from their peaks through April 2016, though the drops are not as large since they do not include the May 2016 changes.

The Pacific “warm blob”, which had also been affecting global temperatures, has now disappeared, according to NASA. This should further reduce the sea surface temperature anomalies; and, thus, all of the global integrated temperature anomalies.

NOAA and the Australian Bureau of Meteorology have issued La Nina alerts for 2016-2017. Both organizations are anticipating a moderate to strong La Nina. However, there is no basis on which to project the magnitude and extent of the sea surface temperature cooling which will result.

The news releases from the producers of the near-surface temperature anomaly products tended to minimize the impact of the El Nino on their surface temperature anomalies. However, the end of the El Nino has already produced a very significant reduction in the NASA GISS anomaly products. There is no reason to expect that similar reductions will not appear in the NCEI and HadCRUT anomaly products when the May 2016 anomalies are announced. This should make it quite clear that the minimal attribution of the 2015 warming to the El Nino was political spin, rather than scientific assessment.

Tags: El Nino

COP 21 Agreement

The Obama Administration insisted that the Agreement reached at the conclusion of COP 21 in Paris, France, not be legally binding on the parties because of the near-absolute certainty that the US Senate would not ratify the Agreement as a treaty; and, because of the very limited likelihood that the Congress would appropriate the anticipated level of funding which would have been required by the UN Green Climate Fund.

The US has reportedly pledged $3 billion of the ~$10 billion currently pledged to the UN Green Climate Fund, with the stipulation that the US pledge not exceed 30% of the total funds pledged. The COP 21 Agreement calls for funding from a “floor” of $100 billion per year, beginning in 2020, apparently in perpetuity. The Group of 77 + China have stated that the $100 billion is insufficient and must be increased substantially.  Assuming that the US maintains its 30% stipulation, the US would be expected to pledge ~$30 billion per year, or more, to the Green Climate Fund.

The current US pledge of $3 billion is to be provided over a period of 4 years; and, it is not certain whether it is intended to be “new money”, or funds reallocated from other appropriations. However, meeting the expected US share of the $100 billion per year would require a new congressional appropriation, which is highly unlikely in the current Congress. That appropriation would have to be funded by a new tax, such as a carbon tax, which would have to be adjusted upward “Progressively”, as CO2 emissions declined, to maintain the pledged funding stream.
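
The funding arithmetic above is easy to verify (pledge figures as stated in the text; everything else follows from the 30% stipulation):

```python
us_pledge      = 3e9    # current US pledge, spread over 4 years
total_pledged  = 10e9   # approximate total pledged to date
floor_per_year = 100e9  # COP 21 "floor", beginning in 2020
us_share_cap   = 0.30   # US stipulation: no more than 30% of total funds

print(f"current US share of pledges: {us_pledge / total_pledged:.0%}")              # 30%
print(f"US pledge per year over 4 years: ${us_pledge / 4 / 1e9:.2f}B")              # $0.75B
print(f"expected US share of the floor: ${us_share_cap * floor_per_year / 1e9:.0f}B per year")  # $30B
```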

There is currently very little definition regarding the criteria to be used to determine allocations from the UN Green Climate Fund. Much of the funding would likely be distributed to known kleptocracies; and, predictably, very little of the funding would actually reach the citizens of those kleptocracies purportedly adversely impacted by climate change. Based on the UN’s history, in which the Iraq “Oil for Food” program degenerated into the Iraq “Oil for Palaces, Payloads and Payoffs” program, a high percentage of the $100 billion per year would likely be consumed by waste, fraud and abuse.

Tags: COP - Conference of Parties

US EPA Clean Power Plan

The primary weapon in the US Administration’s crusade to save the world from anthropogenic global warming (climate change) is the US EPA Clean Power Plan, which sets CO2 emissions limits for coal-fired electric generators that could be met only with the installation of carbon capture and sequestration (CCS) systems. The Plan requires CO2 emissions from power generation to be reduced by 32% from 2005 levels by 2030.

The Plan effectively requires the retirement of older coal-fired generators which are either technologically or economically unsuitable for the installation of carbon capture systems. The Plan also effectively discourages (prevents?) the construction of new coal-fired generators, since CCS is currently not demonstrated technology at commercial scale; and, the technology does not appear to be economically viable, even on new plants designed to accommodate CCS. The Plan also encourages expanded transition from economic dispatch of power generators (lowest cost first) to environmental dispatch (renewables and lowest emissions first), thus inexorably increasing wholesale power costs.

A coalition of 26 states and one coal company have filed suits against EPA in federal court, seeking to have the court overturn the EPA rule; and, to stay its execution until the legal challenges are resolved. The stay is viewed as essential, given EPA’s history of “slow walking” the appeals process, thus forcing those affected by the regulations to prepare to comply with the contested regulations, even as they challenge them in the courts. This is particularly important for electric generators, which require long lead times for planning, regulatory approval, equipment procurement and installation, construction and commissioning.

Members of the US Senate and House of Representatives have also filed challenges to the Plan under the Congressional Review Act, challenging EPA rules for both existing and new power plants. Congress considered the inclusion of CO2 under the Clean Air Act, but chose not to include CO2 as a criteria pollutant under the Act. Thus, certain members of Congress believe that EPA has effectively rewritten the Act in the issuance of the Clean Power Plan.

Tags: EPA, Clean Power Plan, Coal

The Many Costs of COP 21

“The time has come,” the Walrus said, “to talk of many things: of shoes and ships and sealing wax, of cabbages and kings.”

The pledges of GHG emissions reductions made by the developed country participants at COP 21 will impose a variety of costs on individuals and businesses in those countries. Premature replacement of existing coal-fired electric generating facilities will result in economic dead losses, both for the owners of the generation facilities and for the owners of the coal mines which have provided their fuel. The extent of these economic dead losses cannot be accurately determined until the specific generators to be abandoned and coal mines to be closed have been determined; and, that process is ongoing. The closure of these plants and mines will also result in job losses in both industries, as well as job losses in the transportation industries which moved the coal from the mines to the generators.

Closure of these existing facilities will require the construction of new facilities to replace their electricity output. New natural gas combined-cycle generators will be more efficient than the coal generators they replace; and, in the current market, will use less expensive fuel. In areas without available natural gas transmission and storage capacity, major investments will also be required to deliver and store the additional natural gas required to fuel the new generators.

New renewable generating facilities will be more expensive per megawatt hour generated than either the closed coal plants or the new natural gas plants; and, will require significant expansion of the electric transmission grid and the installation of massive quantities of grid storage, increasing electricity rates as has occurred in several countries in Europe, including England and Germany.

Taxpayers in the developed countries will also be required to provide the $100 billion per year pledged to the UN Green Climate Fund to support energy development and climate change mitigation and remediation in the developing countries. The developing countries, to the extent that they implement renewable generation, will find their development impeded by the higher costs of the renewable generation and the construction costs of the storage required to achieve acceptable reliability of service.

China, India and other developing nations have clearly expressed their unwillingness to impede their economic development in the interest of controlling the global climate. While these nations intend to increase their reliance on nuclear generation and renewables, they are also aggressively increasing their use of coal-fired generation. Once each of these countries reaches its peak CO2 emissions rate, at some undefined date in the future, their emissions will decline slowly over a period of approximately 40 years (expected plant life, absent life extension investments), assuming that old coal-fired plants being retired are not replaced by new coal plants. The COP 21 Agreement does not preclude such actions by these countries.

The COP 21 Agreement includes the intent to achieve net-zero global annual CO2 emissions by 2100. The global investments required to achieve this result are estimated to exceed $1 trillion per year. The Agreement also includes the desire to achieve net-zero emissions, at least among the developed nations, far sooner, with the hope of limiting global mean near-surface temperature increase to 1.5°C. The investments required to achieve that result would be far greater than $1 trillion per year, because of the shorter period over which the investments would be made and the state of development of the technologies required to achieve that result.

A recent analysis by the University of Alabama in Huntsville (Drs. John Christy and Roy Spencer) suggests that, at current warming rates, the global temperature anomaly would peak below 1.5°C regardless of the emissions reductions agreed to in the COP 21 Agreement. However, that analysis, like the continuing satellite temperature record, is likely to be ignored, since it does not fit the globalist narrative.
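
The extrapolation behind that kind of estimate is straightforward; a hedged sketch with assumed inputs, since the article does not give the rate Christy and Spencer used:

```python
# Assumed inputs, not figures from the analysis: the satellite-era warming
# rate is roughly 0.11-0.13 degC/decade in the UAH record; 0.12 is used here.
rate_per_decade = 0.12   # degC per decade (assumption)
anomaly_now     = 0.45   # degC above a pre-industrial baseline (assumption)
years_to_2100   = 2100 - 2016

projected_2100 = anomaly_now + rate_per_decade * years_to_2100 / 10
print(f"linear extrapolation to 2100: {projected_2100:.2f} degC")  # ~1.46 degC, just under 1.5
```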


Prosperity and Electricity In-Depth Article

The United States faces a prosperity crisis.  Increased prosperity will make it somewhat easier to resolve many of the problems that the next generation must face head-on.  Reviving prosperity in the United States poses some difficult challenges over the next decade.  Many articles explain the myriad issues that will need to be addressed to restore prosperity after 16 years of focus on other priorities, resulting in at best tepid regard for economic growth.  I believe one such subject is not fully appreciated for its potential to thwart prosperity if left on its projected course or to be a catalyst for prosperity if we include it as part of the dialogue for restoring prosperity: ELECTRICITY.

In energy policy discussions, oil receives a disproportionate amount of attention, especially as regards the threat to quality of life and prosperity.  I have thought long and hard as to why oil receives more attention than electricity and I can only conclude that oil issues are easy to understand and electricity issues are not.  (The ease of understanding oil has not, however, led to sound policy as is discussed in our earlier Commentary “In Praise of Global Oil Markets: Will the Idiocy End?”.)  In this Commentary I will lay out the case for paying more attention to electricity.

First, an important observation.  I have carefully perused the websites of prominent national think tanks and I have not found one such organization that dedicates a full time person to electricity industry structural issues. 

  • There are environmental people who will dabble in the overlap of electricity and environment.
  • There are energy people who will pay lip service every once in a while to an electric issue.
  • There are national security people who will worry about the electricity infrastructure’s exposure to terrorism.
  • There are regulatory reform people who every once in a while make a glancing blow at some electricity issue.

But I have not found any broad-based think tank organization with a single person who on a full time basis and with the proper credentials concentrates on electricity issues as their exclusive focus.  The organizations that have full time people addressing electricity issues are all trade associations or environmental organizations, dripping with self-interest in their analysis and recommendations, or relatively small specialized organizations like the Institute for Energy Research.  This concerns me.  Either I am woefully misguided about the threat (as demonstrated in this article) or there is something missing in the think tank community.  If the latter, then I believe a grave risk exists that we will be caught with our pants down when the electric system begins to freeze up.

Why are think tanks asleep at the switch?  Part of my theory is that electric utilities are part of the corporate donor base of many of these think tanks and are part of the coalition on taxes, labor, health, environment, etc.  I perceive (but I could be wrong) that there is some reluctance to dedicate resources to electricity policy because it would offend the donor base of electric utilities, which if done right almost certainly would.

A second preliminary point.  There is a rich history of reforming industries that have similar network characteristics to the electric industry.  Ironically, there is no consensus name in economics about these types of industries and that may be part of the problem. 

Since the mid-1970s, the US has massively and successfully restructured a series of industries that have several similar characteristics, though many would fail to see the commonality.  First, they are network or grid type industries, i.e., industries that move people, goods, or digits from point A to point B without fundamentally changing the physical properties of the “transported” item.  Second, the reforms transitioned the industry from heavy-handed governmental control to a more competitive, market based policy.  Third, all required preemption of some traditional State authorities that became anachronistic as the maturation of the industry increasingly implicated more interstate commerce.  The industries include airlines (1978), trucks (1978), railroads (1980), telecommunications (1982-84), natural gas (1985-92), cable (1984), internet (1986), GPS (1996), Microsoft Windows (2001), and oil pipelines (1994).  Interestingly, the Supreme Court adopted a similar analytical framework for the movie theatre industry in 1948.  Most recently, Google has become a target of European regulators on much the same theory as these other industries (using monopoly assets to advantage affiliated competitive assets of their company and disadvantage independent competitors).

For lack of a commonly understood term for these types of industries (and because “network” industries would be confused with computer networks), I have coined the term “plexus” industries.  The dictionary definition of plexus is “an intricate network or web-like formation.”  Plexus industries are the connective technology that allows a person, good, or digit to be moved from a lower value condition to a higher value condition but without changing its fundamental characteristics.  This is not the place to explore the technical arguments for plexus but a couple of examples will help understand the point.

The easiest plexus industry to understand is a gas pipeline.  Gas goes into a pipe in Texas and comes out in Boston.  The chemical properties of the molecule of gas are unchanged.  It has just been moved from Texas to Boston.  The process of such movement is well understood within the gas industry, but if you asked someone to compare that “process” to the Windows operating system they would look at you confused.  At a certain level of abstraction, pipelines and Windows perform the same function.  Windows has a point at which something is put into the system.  We call it an application, e.g., Chrome, Firefox, Adobe Acrobat, AOL, etc.  The application is then “delivered” to a user who then creates value by using the combination of the application and the operating system.  The Department of Justice used the same basic rationale in suing Microsoft for using its “monopoly” facility (the Windows operating system) to impede competition in products that need the monopoly facility to compete with other Microsoft products.  Microsoft still operates under a consent decree with the Department of Justice.  At a certain level of abstraction, this is identical to the reforms in the natural gas industry separating pipeline functions from commodity functions.

The key insight of all these largely successful policy reforms is that the “plexus” facility was recognized as having monopoly characteristics that would distort markets if left unchecked, while the goods being moved through the plexus facility were capable of being subjected to market principles even if the plexus facility was not.  Rather, the plexus facility had to be “regulated” in such a way as to promote efficient input and output markets.  This all sounds rather abstract, but the concepts apply to a wide array of goods and services, and a widely varying terminology is used by different plexus industries.

My point is that pro-market reform of fundamentally interstate plexus industries has been done before; we have a template for reform that has been enormously supportive of prosperity policies.

Now onto the main stage of how this applies to electricity, but starting with a bit of boring history.

Electricity was originally introduced to the US on a city by city basis, indeed sometimes neighborhood by neighborhood.  Thomas Edison’s first foray into electricity was the famous Pearl Street Station.  The Wiki entry gives a sense as to how local its introduction was:

Pearl Street Station was the first central power plant in the United States. It was located at 255-257 Pearl Street in Manhattan on a site measuring 50 by 100 feet (15 by 30 m), just south of Fulton Street and fired by coal. It began with one direct current generator, and it started generating electricity on September 4, 1882, serving an initial load of 400 lamps at 85 customers. By 1884, Pearl Street Station was serving 508 customers with 10,164 lamps. The station was built by the Edison Illuminating Company, which was headed by Thomas Edison.

Without recounting the tortured history of electricity regulation, we arrived at the current allocation of jurisdiction of regulatory authority by 1935.  The States had regulatory authority over generation, constructing transmission and distribution grids, sales to the consumer by the distribution company, billing, and metering -- in essence, the whole system from generation to consumption.  The feds had a very limited role in transmission and wholesale sales in interstate commerce.[1]  But in 1935, there was precious little interstate commerce in electricity, especially compared to today.

That simple framework has been under nearly constant assault since 1978.  Today, the electricity system[2] is a mess, bordering at times on chaos and calamity. 

Much like the AT&T phone monopoly came under scrutiny for the extent to which it had developed into a very broad monopoly, electric utilities were put under a microscope to examine the continuing vibrancy of the assumption that they should be permitted as a comprehensive monopoly.

The first crack in the dam came in 1978 when utilities were required by Federal law to purchase electricity from third parties who used certain technologies.  The Federal law required that the State commissions set the price at which the purchase would take place.  In 1992, Congress broadened who could sell electricity to the utility.  Over the next decade, the Federal Energy Regulatory Commission enacted rules to encourage even more competition in generation.  At the same time a number of states began to experiment with programs to allow customers to purchase electricity from competitive marketers.  But then the Enron scandal and the catastrophic California electricity crisis created a more hostile climate for competitive electricity and progress slowed.  Today the US finds itself in a situation where there is a confusing mixture of models of electricity competition and traditional regulation.    

Does the current situation create risk of failure of the system?  By what standard should we judge adequacy of the existing system?  When the basic jurisdictional allocation was solidified, electricity was virtually a luxury.  Many homes did not have electricity and we certainly didn’t have ubiquitous air conditioning, labor saving appliances, electric cars, and the magical world of digital technology.  I suggest that our test should be how well the US electric system serves the world we envision in 2050.  Luckily, the US is not the only country asking this question.  Indeed, we are laggards compared to how aggressively some other countries are addressing the electric system of the future.  Maybe not so surprisingly, China, for example, has been far more thoughtful and aggressive than the US in addressing electric system structural issues.  Australia has also grappled with some of the challenging issues of electric market reform.

So what is wrong with the US electric system today?

Climate Change:  Far and away, the most confounding variable in the electric system today is the issue of carbon emissions.  Let’s put to one side the degree to which carbon emissions are problematic and just take a look at the current situation.  The US Congress has steadfastly refused to pass comprehensive legislation regulating carbon.  Environmentalists have thus turned their attention to the States and the Federal executive branch.  Accordingly, there are literally hundreds of different approaches being taken to deal with carbon, mostly by discouraging coal and nuclear generation (in a variety of ways and forums) and encouraging renewable energy and efficiency (in a variety of ways and forums).  Whatever may be said of this approach, one thing is certain.  There is absolutely no reason to believe that this chaotic approach will achieve cost effective carbon emissions reductions.  Yet billions are being spent debating, analyzing, and executing these programs, many of which are cosmetic.      

Jurisdictional Allocation: As noted above, the allocation of jurisdiction over the electric industry is an historical accident that has not been formally reconsidered since its inception in 1935.  Rather, there has been a nipping at the heels over the years to address the absurdities that result from the current allocation of jurisdiction between the states and the feds.  In ALL the plexus industry reforms, one key, indeed essential, element was the reconsideration of the allocation of jurisdiction between the feds and the States.  Railroads, air travel, trucking, natural gas, and telecommunications all shifted the allocation of jurisdiction from the states to the feds as these industries became more entwined in interstate commerce.  Just as importantly, when the feds exercised control under all of these reallocations, they did so by adopting more competitive, market based policies.  Some will no doubt resist the feds taking a stronger role in the electric industry on historical, ideological, policy, economic, or constitutional grounds. But these arguments are weak given the historical success of the reforms to other plexus industries, the inherent impact of electricity on prosperity, and the increasing impact on interstate commerce.

Regulatory Chaos: While overlapping with the issue of jurisdictional allocation, it is necessary to analyze the regulatory chaos that exists when literally hundreds of governmental organizations have control over different pieces of the electric system.  FERC has done yeoman’s work in trying to work around this chaos with its establishment of Regional Transmission Organizations (RTOs), but there is substantial opinion that these organizations have become bureaucratic, expensive, political, and inefficient (though admittedly more efficient than the status quo ante).  Moreover, having each State make a utility by utility decision on each and every issue is expensive, confusing, and inefficient.  Indeed, even within a single State, policies can be very different depending on which utility service territory one lives in.  Not only is this costly, it has the added disadvantage of making the electric system more fragile than would be the case with more coherent integration.

Technological Innovation: There is a joke told at electric meetings.  If Thomas Edison came back he would recognize the electric industry.  Nothing’s changed!  LOL (not).  But it’s true that both digital technology and just plain old innovation have created a situation where there is a possibility that many new technologies exist that might (?) improve the operation of the electric system.  The reason I say “might” is that I take Hayek’s admonition about Fatal Conceit seriously.  I don’t pretend to know what new technologies make sense and which don’t.  But I also know that most State regulators know less than I do about which technologies make sense, and they are in the driver’s seat. 

One of the more general critiques of regulation is that it impedes innovation.  But that is true in spades in the electric industry, given the dispersed regulatory authority.    

Aging Infrastructure: The electric system is victim to the same malady as much of America’s infrastructure.  It is simply a fact that no matter which part of the industry you look at there are issues of the need for modernization of infrastructure.  Putting new infrastructure in place to satisfy increasing demand is relatively easy.  Investments will result in more customers and greater revenue.  But that is not what we are talking about.  The investment in replacing aging electric system infrastructure only results in higher costs but not necessarily increased demand for service.  Thus commissions and consumers can be quite beggarly when they are asked to support higher rates needed for modernization. Compounding this issue is the disaggregated nature of decisions to require renewable energy in the grid.  Arguably, dollars are being wasted on this endeavor that would do more good in promoting modernization.

Reliability: Some of the aforementioned drivers already hint at the issue of reliability.  Reliable electricity is essential for a prosperous Nation.  While reliability was always an important concern, digitalization makes reliability more imperative.  Food will not spoil if electricity goes out for an hour.  But if electricity ceases for even the blink of an eye it can cause damage to some electronic equipment.  Measuring the risk of reliability is difficult.  We have had two major reliability failures in the US, in 1965 and 2003.  Additionally, California suffered debilitating blackouts in 2001-02.  While it is stating the obvious, a catastrophic failure of the US electric system would result in catastrophic loss of property and, potentially, of lives.  Congress has formalized the issue of electric reliability in legislation, but it has not taken steps to put in place a regulatory model that will stand the test of 2050.

Reliability is threatened in ways too numerous and technical to list here, but several can be highlighted.  First, I doubt there is anyone who has not heard there is a “war on coal.”  The Environmental Protection Agency (EPA) has proposed regulations that would make it difficult bordering on the impossible to build new coal plants and would also force the closure of some existing plants.  Less familiar is the “war on nuclear.”  A new nuclear plant has not been built in the US since the late 1970s because of Three Mile Island, Chernobyl, and unanticipated cost overruns due to safety regulations.  Just as the US was on track for a small nuclear renaissance, an earthquake and tsunami hit the Fukushima nuclear plants in Japan, causing immediate devastation and long term harm.  Fukushima was undoubtedly a setback for nuclear power in the US.  Environmentalists a few years ago were very positive on natural gas to replace coal and nuclear in the generation mix as a short term strategy to transition to renewables.  But as it became clear in recent years that natural gas would be more plentiful (measured in centuries), environmentalists soured on natural gas and now anti-fracking has become part of the extremist mantra.  So where will the base of generation come from to supply our electric needs?  Some believe that renewables and policies encouraging less need for electricity will fill the gap.  But this is a pipedream (pun intended).  This leads to the second point about reliability.  Renewables are an intermittent source of electricity.  If the sun don’t shine and the wind don’t blow, you don’t have reliable renewable energy.  The technology for storage of electricity (to smooth out intermittency) is still not there, although it has been “10 years away” for the last 30 years.

Terrorism: Few targets would cause as much disruption as would a major terrorist attack on the electric system.  And yet few targets are as exposed as the electric system.  By definition, the electric system must be spread out all across the Nation and has been called the largest machine in the history of the world. 

Indeed, one has to wonder why there has not been a comprehensive attack on the US electric system.  Recently, there was an event that put a scare into those that worry about this type of attack.  In 2013, there was a small arms attack on an electric switching station in San Jose, California.  As of the writing of this Commentary, the FBI has not made any arrests. 

Utilities and State and Federal authorities are working behind the scenes to prepare for such an attack and, as with many terrorist attacks, we may never know what attacks were prevented by these actions.  Nonetheless, two points must be made.  First, protecting the electric system from attack is no doubt expensive. Unfortunately, these efforts must compete with many of the other priorities being placed (many indeed misplaced) on the electric system.  Second, our protections have to be successful 100% of the time, while the terrorist only has to be successful once.  The magnitude of the potential harm to the Nation is unimaginable yet must be understood and dealt with within an atrophied regulatory framework.

Electromagnetic Disruption: Saving the best for last, the end is near!!  It has become widely recognized in the esoteric world of electricity, and has started to spill over into popular culture, that an electromagnetic pulse could bring down all or part of the electric system.  Literally, we are talking end-of-the-world type catastrophe, with millions dying in months.  Such shocks can arise in three ways: a nuclear bomb detonated above the earth, a solar geomagnetic disturbance, or a ground-based weapon.  There is currently an active debate in the US Congress about how to deal with this vulnerability and FERC has begun to issue rules for electric utilities on developing contingency plans for solar threats.  As with all the other issues, this will cost money to address and there are competing priorities for dollars to be spent on the electric system.

Conclusions and Recommendation

Scared yet?  I hope so because I am. 

I usually rail against many left wing arguments as “alarmism.” I might be accused of alarmism in this Commentary.  The difference between “alarmism” and “alarm” is evidence and sound analysis.  My goal in this Commentary was to convince you that there is a sleeping giant of a threat to economic prosperity.  I honestly cannot think of a major failure of any other system in our economy that would wreak as much havoc, or cause as much harm to prosperity, as a major failure of the electric system.

Significantly, you are probably thinking of a “catastrophic” failure of the system as a unique, obvious, one-time event.  That might happen; a 9-11 type event.  But it will probably be more insidious than that.  Think of it like a deteriorating highway.  There is no major, obvious failure.  But every day there is a bit of damage to cars and trucks.  There is less efficient travel.  There is more political tension caused by consumer complaints and the need for more resources.  Some people, likely people with options and money, move away to avoid poor public services.  The tax base erodes and now everything is more difficult.  It is more like a cancer than a fractured skull.  That is an equally frightening scenario for electricity.  There is no singular measure of where our collective national electric system is on a spectrum from third-world to best in class.  Everyone will have a different opinion on how significant the threat of widespread interruption is.  I myself am not sure what the particular scenario is that will cause us to wake up and say “why didn’t we see this coming?”   

So, what to do?

There are two dimensions to fixing the electric system: a policy and a plan to implement the policy. 

The policy is actually pretty simple to identify but very difficult to implement. 

First, we need to embrace a policy of reliance on market competition for ALL services that are capable of sustaining competition.  This was done in other plexus industries and it worked out either reasonably well or spectacularly well.  Thus all generation should operate in competitive markets and, for reasons discussed below, should not be owned by either transmission or distribution companies.  But recognize this will affect a lot of economic interests, so there will be wailing and gnashing of teeth.

Second, transmission (not a function easily capable of competition) must be reconceptualized.  Today, hundreds of business entities own electric transmission facilities and some of those facilities are operated by Regional Transmission Organizations.  The reason that RTOs operate the facilities owned by many other business organizations is that there is an inherent conflict of interest when the same business organization owns generation (competitive), transmission (monopoly), distribution (monopoly), and retail services.  We can agree on the principle that a business organization should not be permitted to advantage its potentially competitive operations by abusing its monopoly power over certain facilities by linking the competitive good to the monopoly good.  In antitrust law this is called a tying arrangement.

An easy way to understand the problem is to envision an umpire in a Little League game.  His daughter is the pitcher for one of the teams.  Could you really blame the opposing coach for objecting to his being the umpire, no matter how solid his reputation for honesty?  Similarly, would it surprise you to find out that consumers believed they would get better service from the marketing affiliate of the utility than from an independent marketer?  Attempts to regulate this type of abuse are next to impossible, though FERC and many State commissions have tried.  Such regulation is burdensome, ineffective, and not trusted by potential independent competitors.  I once was hired to testify against a gas pipeline company that had abused its monopoly by advantaging its storage facilities in such a way as to drive an independent storage facility into bankruptcy.  The smoking gun was that the monopolist applied more favorable requirements to its affiliate than it did to independents.  It argued that this was reasonable because it made good business sense: many of the independents were poorly financed, unreliable, and untrustworthy, whereas its affiliate was none of these.

RTOs are not a natural business construct.  They were imposed because of the limitations caused by jurisdictional allocation and existing authorities.  In theory, a single company owning and operating transmission, and only transmission, over large regions, regulated by a Federal authority, would promote more efficiency and have incentives for technological innovation and more efficient decision-making.

Third, distribution companies should be regulated by States under a Federal policy that promotes competition.  Retail services should be competitive and not performed by the distribution company, again avoiding conflict of interest complications.  There should be strong encouragement for massive consolidation of distribution companies, since that would both dramatically simplify the regulation of such companies and make it easier for competitive independent marketers to do business across larger regions.

Fourth, all retail customers should be served by a competitive entity that is independent of the regulated entities.  I don’t have a clue what the aggregation model would be.  Would Google or Apple or Walmart end up as the aggregators of many of our electric services?  Maybe it will be Visa or MasterCard?  Maybe it will be tied to a bundle of cable, internet, telecommunications, gas, water, and other in-home services?  The beauty of the competitive market is that we don’t know what will develop by 2050.  The key is to put the right institutions in place with the right incentives and then let the market innovate.  When I was working on reforming natural gas policies, I could scarcely have imagined the role that natural gas and oil would play 30 years later.  I can guarantee you that none of us thought that we would cripple OPEC and create headaches for Russia.  It may sound naïve but it is nonetheless true.  Markets can be magnificent tools for progress if they are not distorted by policies that inhibit price signals to consumers.

So that’s the big picture.  I could write a book on the many tedious and technical economic, legal, regulatory, political changes that need to be made to prepare the US electric system for 2050.  But there are several such books already on the market and few get the attention they deserve. 

So how do we get there?  We need a game changer.

Frankly, I wish I had a bold, exciting, innovative recommendation on accomplishing a radical restructuring of the electric system that would cause you to sit back in your chair and say “WOW!”  But I don’t.  There are hundreds of reports, but they have not resulted in anything more than tinkering at the periphery, often for the benefit of special interests.

Maybe one of you reading this will email me such a recommendation.  But for now all I can come up with is the recommendation that we take this issue more seriously than we have done.  While we don’t need another Congressional hearing, or DOE Report, or Industry Association Sponsored Strategy, we do need to develop a compelling plan and build the consensus to execute it, driven by people who have the gravitas to make it a game changer.  Unless I missed it, such an effort is not underway and nowhere in sight.


[1] It is a bit different when it comes to the environment.  The feds have taken a stronger role in environment than in industrial organization issues.

[2] I struggle to find the right words to describe the whole of the problem.  If I said “electric industry” some would construe that to mean electric utilities, clearly too limited a concept.  If I use the term “electric policy” it might be perceived as being limited to the world of “wonks.”  So I use the term “electric system” to include the widest possible look at the challenge of delivering reasonably priced, reliable, environmentally responsible electricity.

 


Financial Mechanism of the Convention

The United Nations Framework Convention on Climate Change (UNFCCC) established the Green Climate Fund (GCF) at COP 16 in 2010, as an operating entity of the Financial Mechanism of the Convention. The original intent was for the developed countries to provide a fund of $100 billion for use by the developing nations in climate adaptation and remediation, of which only about $10 billion has actually been pledged.

COP 21 set a new “collective quantified goal” of $100 billion per year for GCF funding, beginning in 2020. However, the Group of 77 plus China argued that this base funding level must be substantially increased if it is to meet the requirements of the Group of 77 plus China to contribute to the goals of the COP 21 Agreement, as well as meet their adaptation and remediation needs.

It is interesting that the number 1 and number 3 GHG emitters are members of this group; and, that neither of these countries has submitted an Intended Nationally Determined Contribution (INDC) document which makes any commitment to emissions reductions.

The UNFCCC COP 21 Agreement expresses some degree of urgency with regard to this funding, despite the fact that there are no demonstrated adverse impacts of anthropogenic global warming (AGW); and, significant evidence of positive climate impacts on crop production and general vegetation growth.

The Board of the Green Climate Fund has determined that the pledged funds should be allocated approximately 50% to adaptation and 50% to remediation over time; and, that at least 50% of the adaptation funds should be devoted to meeting the needs of the most vulnerable nations.

There is currently no formula which determines the contributions of individual nations to the Green Climate Fund. Currently, relatively few nations have announced commitments.


Intended Nationally Determined Contributions (INDCs) Comparisons

Prior to COP 21, nations were asked to submit Intended Nationally Determined Contributions (INDCs) toward the COP objective of reducing global carbon emissions. The key aspects of the INDCs submitted by the four largest CO2 emitters are listed below.

1 – China (9680 Mt CO2)

  • Peak CO2 emissions by around 2030.
  • Lower CO2 emissions per unit of GDP by 60-65% from 2005 levels.
  • Increase non-fossil share of primary energy consumption to ~20%.
  • Increase forest stock volume by 4.5 billion cubic meters from 2005 levels.

2 – USA (5561 Mt CO2)

  • Reduce GHG emissions by 26-28% by 2025 from 2005 levels

3 – India (2597 Mt CO2)

  • Lower CO2 emissions per unit of GDP by 20-25% from 2005 levels by 2020.

4 – Russia (1595 Mt CO2)

  • “Limiting anthropogenic greenhouse gases in Russia to 70-75% of 1990 levels by the year 2030 might be a long-term indicator, subject to the maximum possible account of absorbing capacity of forests.”

The EU countries, taken collectively, would rank as the third largest emitter (~4500 Mt CO2). The EU countries have agreed to reduce annual emissions by 40% below 1990 levels by 2030.

The INDCs of the rest of the nations which submitted them can be found here.

There are several key points which must be made about these INDCs.

  • None of these INDCs are legally binding.
  • Global annual CO2 emissions would continue to increase, even if these INDCs are actually achieved.
  • Any nation can exit the agreement after 3 years, effective 1 year after notification.
  • The INDCs are not directly comparable in form or time frame.
  • The USA contribution is a percentage reduction from a historical emissions level by 2025.
  • The Russian contribution is also a percentage reduction from a historical emissions level by 2030, subject to a condition.
  • The Chinese contribution is actually negative “until around 2030”, though it commits to a reduction of “carbon intensity” from a historical intensity level (see the sketch after this list).
  • The Indian contribution is also negative, over an undefined time frame, though it commits to a reduction of “carbon intensity” by 2020.
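
A worked example of why an intensity pledge can be “negative” in absolute terms (the intensity cut is from the Chinese INDC quoted above; the 2005 emissions figure and GDP growth rate are assumptions for illustration):

```python
# China's pledge: cut CO2 per unit of GDP 60-65% below 2005 levels by 2030.
# If GDP grows fast enough, absolute emissions still rise substantially.

emissions_2005 = 5500   # Mt CO2, approximate 2005 figure (assumption)
gdp_growth     = 0.07   # 7%/yr average 2005-2030 (assumption)
years          = 2030 - 2005
intensity_cut  = 0.65   # most ambitious end of the pledged 60-65% range

gdp_ratio      = (1 + gdp_growth) ** years               # GDP multiple over 2005
emissions_2030 = emissions_2005 * gdp_ratio * (1 - intensity_cut)

print(f"GDP multiple over 2005:   {gdp_ratio:.1f}x")         # ~5.4x
print(f"implied 2030 emissions:   {emissions_2030:,.0f} Mt") # ~10,400 Mt, far above 2005
```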

In summary, it is not possible to determine when, or even if, global annual CO2 emissions would stabilize, no less begin to decline, under the Agreement reached at the conclusion of COP 21. It appears unlikely that stabilization will occur prior to 2030; and, even less likely that net zero CO2 emissions will be achieved by 2050.

It is clear that the current INDCs, even if met in their entirety, are insufficient to allow the earth to stay within the 2°C target established by the UNFCCC, no less achieve the 1.5°C sought by the nations which have declared themselves to be the most vulnerable. These nations are typically low lying coastal nations or island nations thought to be most susceptible to sea level rise. This is particularly interesting, since there has been no significant change in the rate of sea level rise over the past 145 years, despite the significant increase in global annual CO2 emissions over the past 65 years. There does not appear to be even a coincidental relationship between CO2 emissions and sea level rise, no less any causal relationship.


In Praise of Global Oil Markets: Will the Idiocy End? In-Depth Article

Two days in October 1973 (16th and 17th) marked the most important turning point for US energy policy.  That is when energy policy idiocy began and, despite all evidence to the contrary, continues to this day.  When will it end?

In October 1973, the Organization of Petroleum Exporting Countries, or OPEC, doubled the price of a barrel of oil and imposed an embargo on deliveries to several countries, including the US, for supporting Israel in the Yom Kippur War.

Since that embargo, literally every president, even the normally economically literate Ronald Reagan, has chanted the mantra of “energy independence.”  In pursuit of that silly goal, the US has embraced a number of bad policies.

All sound analysis of oil markets proceeds from an understanding of global oil markets and the oil resource base.  Because there has been such a profound misunderstanding of these basic concepts, there has been much mischief enacted into policy.

 

Understanding Global Oil Markets

Dr. William Nordhaus of Yale University proposed the best metaphor for the global oil market.  Think of a bathtub with many faucets and many drains.  It doesn’t matter where you put oil into the tub or take oil out of the tub; the only thing that matters is how much oil is in the tub at a given moment.[1]  More oil in the tub will lower world oil prices and vice versa. 

This simple bathtub metaphor has profound implications for how we think about energy policies relating to oil. The first significant implication is that “energy independence” is an economically nonsensical concept.

Today, as world oil prices hover at around $50 a barrel, Japan with no oil resources pays the same on the global market for a barrel of oil as does Great Britain with plentiful North Sea resources.  So the amount of oil a country imports (a measure of supposed “energy independence”) is irrelevant to the availability of oil and its price.  If magically the US discovered oil that it could produce for $10 a barrel in Iowa, what would the price of oil be in the US?  Absent some form of governmental price control, it would NOT be $10.  It would be whatever the global price of oil was at any given moment.  Iowa would gladly export oil to any country willing to pay more than $10 rather than sell it in the US for $10.  Thus the world price of oil, no matter where produced will achieve some price that balances supply and demand (or a price based on how much oil is in the bathtub).[2]
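
A toy version of the bathtub logic (all numbers invented) shows why a cheap domestic discovery changes the world price slightly, for everyone, rather than creating a lower domestic price:

```python
# Toy bathtub model: one world price clears total supply against total
# demand, regardless of where barrels enter or leave. Numbers are invented.

def clearing_price(total_supply_mbd, demand_at_50=95.0, elasticity=-0.05):
    """Crude constant-elasticity inversion: the price that clears supply."""
    return 50.0 * (total_supply_mbd / demand_at_50) ** (1 / elasticity)

world_supply = 95.0  # million barrels per day
print(f"world price: ${clearing_price(world_supply):.0f}/bbl")

# A hypothetical cheap new field in Iowa adds 1 mb/d. The world price falls
# a little, for every buyer everywhere, but Iowa oil still sells at the
# world price, not at its $10 production cost.
print(f"with the Iowa field: ${clearing_price(world_supply + 1.0):.0f}/bbl")
```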

So maybe “energy independence” arguments are not based on either the availability or price of oil.  Maybe there is some other reason we would want to produce all the oil we consume. 

As for being held hostage by hostile foreign producers, as we will see later in this commentary, they are actually more dependent on us than we are on them.  Their economies are built on an expectation of oil over $100 a barrel and they (Russia, Venezuela, and OPEC) are now scrambling.

The recent dramatic drop in world oil prices provides an excellent example of the fallacy of energy independence.  A major factor in that drop was the increase in supply in the United States made possible by new production technologies such as fracking, horizontal drilling, and 3D seismic.  The increase in US supply not only resulted in lower prices for oil in the US but was critical to a worldwide drop, nay collapse, in oil prices.  Oil prices on the global market have dropped based on a myriad of factors, but the reality is that the price of oil purchased in the United States will be set on a global market.  It makes absolutely no difference to our energy policy whether we buy more or less oil on the global market.  Certainly oil prices will have a profound impact on the economy at large, not only in the United States but in many other countries as well.  But we have no control over oil prices.

 

Understanding the Oil Resource Base

Sound energy policy requires an understanding of the oil resource base.  Let’s start with a profound and startling insight: The World Will Never Run Out of Oil

But wait, surely you must be mistaken, you say.  Oil is a finite resource and we are consuming it at extraordinarily high rates.  Surely the bathtub will run dry someday. 

Such a view betrays a fundamental misunderstanding of natural resource economics.  Whale oil used to be the staple for lighting and other purposes.  Did we run out of whales?  No.  Demand for whale oil increased and the supply of whales decreased with increased harvesting of whales for their oil.  Did we wait for the last whale to be captured before we started to consider substitutes?  No.  Innovators and entrepreneurs driven by the pursuit of profit explored new ways to satisfy consumer demands for the things that whale oil could provide.  In the 1850s, innovators on both sides of the Atlantic began to realize the potential for large scale use of crude oil in its refined state.  By the 1880s, major breakthroughs in electricity were being made worldwide, and Thomas Edison built the first central power plant on Pearl Street in 1882.

Thomas Malthus and, more recently, Malthusians such as the Club of Rome and Dr. John Holdren base their worldviews on doom-and-gloom projections of the future availability of resources.  That view is widely reflected in the media, and even common sense suggests that population growth and resource limitations must surely end in disaster for the earth.  Thankfully, many Malthusians have made specific predictions as to how long current resources would last unless draconian steps were taken to address the issues raised by their worldview.  Remarkably, these “doomsters,” as they are called, have never been right; their track record is disastrously wrong every time.  Yet these consistent failures have seemingly not diminished their reputations or the media's and the public's willingness to be persuaded that disaster is just around the corner.

Dr. Julian Simon spoke most persuasively in rebutting the doomsters.  His view was that human ingenuity is the “ultimate resource,” as he titled two of his books.  He stated that:

Our supplies of natural resources are not finite in any economic sense. Nor does past experience give reason to expect natural resources to become more scarce. Rather, if history is any guide, natural resources will progressively become less costly, hence less scarce, and will constitute a smaller proportion of our expenses in future years.

He specifically applied this view to energy:

Energy is the master resource, because energy enables us to convert one material into another. As natural scientists continue to learn more about the transformation of materials from one form to another with the aid of energy, energy will be even more important. . . .

For example, low energy costs would enable people to create enormous quantities of useful land. The cost of energy is the prime reason that water desalination now is too expensive for general use; reduction in energy cost would make water desalination feasible, and irrigated farming would follow in many areas that are now deserts. And if energy were much cheaper, it would be feasible to transport sweet water from areas of surplus to arid areas far away.

Another example: If energy costs were low enough, all kinds of raw materials could be mined from the sea.

But his deeper point was that the “ultimate resource is people—especially skilled, spirited, and hopeful young people endowed with liberty—who will exert their wills and imaginations for their own benefits, and so inevitably they will benefit the rest of us as well.”

So, the “amount of energy” is not the issue; human ingenuity, the ultimate resource, is the issue.  We will never run out of oil because as the resource base is depleted, oil will become more expensive.  As it becomes more expensive, craven, heartless, greedy entrepreneurs will find currently unknown but clever ways to find more oil, invent oil substitutes, or develop other technology that uses an increasingly expensive resource more efficiently. 

And the shock of it all is that no one person or government will have to develop a plan or a strategy to make sure that chaos is avoided.  Rather, order emerges through what Dr. Friedrich Hayek called “spontaneous order”: “the emergence of various kinds of social orders from a combination of self-interested individuals who are not intentionally trying to create order through planning.”  The information contained in prices drives producers, consumers, and innovators to make myriad and sometimes microscopic adjustments that over time ensure that we will have the supplies we need when we need them.  When dislocation does seemingly exist in a market, you can be virtually assured that it is because some government action or policy has prevented the clear dissemination of price signals.  Hence, it is crucial to closely examine government policy to ensure that impediments to clear price signals are either not imposed or are removed.  Again, Julian Simon pierces the balloon of ignorance:

Not understanding the process of a spontaneously-ordered economy goes hand-in-hand with not understanding the creation of resources and wealth. And when a person does not understand the creation of resources and wealth, the only intellectual alternative is to believe that increasing wealth must be at the cost of someone else. This belief that our good fortune must be an exploitation of others may be the taproot of false prophecy about doom that our evil ways must bring upon us.

But it is understandable that some are concerned with how much oil is in the ground.  Understanding this requires a foray into some technical distinctions regarding an energy resource buried deep in the earth.  Here, the distinction between reserves and resources is important.  In essence, reserves are oil that we are highly certain is in the ground and capable of being produced commercially.  Typically this means that wells have been drilled and testing has been done, so that the estimates of the amount of oil are highly reliable.  Even this category breaks down further into proved and unproved, with unproved further broken down into probable and possible, going from most certain to least certain.  Resources, on the other hand, are even less certain: we know there is oil, but we are guessing at how much is there and how much it will cost to produce.  Typically, banks will lend money only against reserves.
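To make the hierarchy concrete, here is a hypothetical sketch of those categories in code, ordered from most to least certain. The structure mirrors the paragraph above; the one-line “bankable” rule is the rule of thumb just mentioned, not an industry standard.

```python
# A hypothetical sketch of the reserve/resource hierarchy described above,
# ordered from most to least certain. The comments are illustrative.
from enum import Enum

class OilCategory(Enum):
    PROVED_RESERVE = "proved reserve"      # drilled, tested, highly certain
    PROBABLE_RESERVE = "probable reserve"  # unproved, but likely
    POSSIBLE_RESERVE = "possible reserve"  # unproved, less likely
    RESOURCE = "resource"                  # known to exist; size and cost a guess

def bankable(category):
    """Banks typically lend only against reserves -- in practice, mainly
    proved ones (an assumption for this sketch, not an industry rule)."""
    return category is OilCategory.PROVED_RESERVE

for category in OilCategory:
    print(f"{category.value:18} bankable: {bankable(category)}")
```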

This brings us to the controversial question of “peak oil.”  The concept of peak oil was pioneered by a geophysicist, Dr. M. King Hubbert.  Peak oil is the superficially logical notion that once we have produced half of the available oil resource, production is on a permanently downward trajectory.  Whatever value this concept may have to geologists, it has virtually NO importance to economists, as noted above.  Peak oil alarmism is often used by special interests to scare policymakers into embracing whatever stupid idea will feather the nest of the advocate.
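For the curious, Hubbert's model is easy to state: cumulative production follows a logistic (S-shaped) curve, so annual production peaks when half the resource has been produced. A minimal sketch, with entirely illustrative parameter values:

```python
# A minimal sketch of Hubbert's logistic model, assuming illustrative
# parameter values (not fitted to any real data).
import math

def hubbert_production(year, q_total, k, peak_year):
    """Annual production implied by a logistic cumulative-production curve.

    q_total:   ultimately recoverable resource, billion barrels (assumed)
    k:         steepness of the logistic curve (assumed)
    peak_year: year by which half of q_total has been produced
    """
    cumulative = q_total / (1.0 + math.exp(-k * (year - peak_year)))
    return k * cumulative * (1.0 - cumulative / q_total)

for year in (1950, 1970, 1990, 2010):
    print(year, round(hubbert_production(year, q_total=2000, k=0.05, peak_year=1970), 1))
# Output peaks in 1970 and declines symmetrically afterward -- unless, as
# the text argues, technology keeps raising q_total itself, which is
# exactly what fracking did.
```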

There are two refutations of peak oil: one based on experience, the other on economics.  There seemed to be a consensus that US production had peaked circa 1970 and that governments had to enact policies to ensure an orderly transition to some other fuel, usually renewables.  Many doubted the insight that peak oil supposedly provided even before the easy refutation supplied by the fracking/horizontal drilling/enhanced oil recovery technological innovations.  But certainly reality brought peak oil theory to a screeching halt with the dramatic increase in our ability to access the resource base made possible by technology.

So how much oil do we have?  I hope by now you realize this is almost a silly question to which the answer is “enough.”  The US Energy Information Administration has a somewhat technical explanation for this.  Julian Simon also has a more comprehensive answer to the question:  When Will We Run Out Of Oil?  Never!

But for the skeptics or those who can’t be comfortable without a number, here is the best information there is.

BP publishes a highly regarded statistical review of world energy every year.  According to BP's most recent review, “Total world proved oil reserves reached 1687.9 billion barrels at the end of 2013, sufficient to meet 53.3 years of global production.”
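As a quick sanity check, the production rate implied by BP's two numbers falls straight out of simple division:

```python
# Only the two figures quoted above are used; everything else is arithmetic.
reserves_barrels = 1687.9e9  # proved reserves at end of 2013, barrels
rp_ratio_years = 53.3        # BP's reserves-to-production ratio

annual_production = reserves_barrels / rp_ratio_years  # barrels per year
daily_production = annual_production / 365.0           # barrels per day
print(f"{annual_production / 1e9:.1f} billion bbl/yr")  # ~31.7
print(f"{daily_production / 1e6:.1f} million bbl/day")  # ~86.8
```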

Just remember that every prediction of when we would run out of oil, or of how much oil we have left, has ALWAYS turned out to be wrong.  We always find more plentiful, cheaper, and more environmentally benign ways to produce natural resources.

The recent boom in oil supplies and collapse of oil prices should teach us something about projections. We are constantly looking for new technologies to access energy supplies.  Let’s use an example of an energy resource you probably have never even heard of: methane hydrates.  The US DOE states:

While global estimates vary considerably, the energy content of methane occurring in hydrate form is immense, possibly exceeding the combined energy content of all other known fossil fuels. However, future production volumes are speculative because methane production from hydrate has not been documented beyond small-scale field experiments. 

There are three things important about the future of energy resources: technology, technology, and technology.

Just look at the role of technology in our current (March 2015) oil markets.  Currently, the most controversial issue regarding oil and natural gas is hydraulic fracturing, or “fracking.”  The development of this technology allows oil to be produced in quantities and from geologic formations that were historically thought to be impossible.  Essentially, a mixture of water, sand, and chemicals is injected into a deep well under intense pressure.  This pressure “breaks” up the rock formation and allows oil and gas to flow and become available for production.  Part of this technology advance is also the ability to drill at angles and horizontally to more efficiently drain a reservoir of oil.  And our insight on where to drill is informed by “4D seismic technology” that allows us to “see” the geophysical characteristics of formations like never before.

Yet another technological development is enhanced oil recovery.  Here is what the US DOE has to say about enhanced oil recovery:

Crude oil development and production in U.S. oil reservoirs can include up to three distinct phases: primary, secondary, and tertiary (or enhanced) recovery. During primary recovery, the natural pressure of the reservoir or gravity drive oil into the wellbore, combined with artificial lift techniques (such as pumps) which bring the oil to the surface. But only about 10 percent of a reservoir's original oil in place is typically produced during primary recovery. Secondary recovery techniques extend a field's productive life generally by injecting water or gas to displace oil and drive it to a production wellbore, resulting in the recovery of 20 to 40 percent of the original oil in place.

However, with much of the easy-to-produce oil already recovered from U.S. oil fields, producers have attempted several tertiary, or enhanced oil recovery (EOR), techniques that offer prospects for ultimately producing 30 to 60 percent, or more, of the reservoir's original oil in place.
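Applying the DOE percentages quoted above to a hypothetical reservoir makes the progression concrete (the 100-million-barrel figure is an assumption chosen for easy arithmetic; the percentages are DOE's):

```python
# Illustrative arithmetic only: a made-up reservoir, DOE's quoted ranges.
ooip = 100.0  # original oil in place, million barrels (assumed)

print(f"primary recovery (~10%):         {0.10 * ooip:.0f} million bbl")
print(f"through secondary (20-40%):      {0.20 * ooip:.0f}-{0.40 * ooip:.0f} million bbl")
print(f"ultimately, with EOR (30-60%+):  {0.30 * ooip:.0f}-{0.60 * ooip:.0f}+ million bbl")
```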

While no one questions the ability of fracking to make many more years of oil available, some have raised environmental concerns about whether the chemicals used in fracking will spoil the water supply and whether fracking will cause earthquakes.  New York, for example, has banned fracking because of these concerns.

The US Environmental Protection Agency (EPA) has undertaken a study of fracking that was to be completed in 2014 but has been delayed to 2016.  The preliminary results and statements of its top official indicate that there is “only an upside to hydraulic fracturing” (see http://cfpub.epa.gov/ncea/hfstudy/recordisplay.cfm?deid=244651).  Despite the fact that the Obama Administration has been very responsive to environmental concerns (e.g., the Keystone Pipeline and the War on Coal), environmentalists continue to attack fracking on environmental grounds.

The fracking issue is important for another reason.  It illustrates the folly of government directed research and development (R&D).  The Department of Energy (DOE) has spent literally billions on R&D since its founding in 1977.  Billions have gone to nuclear, renewables, efficiency, and coal research.  Fracking has NEVER been a major priority of DOE's research agenda, and very little has been spent by DOE on fracking research.  A similar case could be made regarding natural gas combined cycle turbines.  These turbines are today the backbone of the electric generation industry.  Yet DOE research support played literally no role in perfecting this technology for electric generation.  These two technologies alone—fracking (including horizontal drilling, 4D seismic, and enhanced oil recovery) and combined cycle turbines—are the two most important energy technological breakthroughs of the last three decades.  And DOE had virtually no role in their development.  The lesson is a cautionary one.  Government is not good at picking winners and losers among commercial technologies.  Additionally, government funding of R&D can have what is called a “crowding out” effect: the private sector will be reluctant to do research that competes with government for fear that it will not realize the full profits from its innovation and invention.

The simple fact that oil is a fungible commodity traded in a global market has seemingly eluded US policymakers since 1973.  The belief that importing oil from the Middle East exposes the US to jeopardy has been used by both political parties to spearhead distortions in energy markets that remain with us today.

If one makes the assumptions that oil is finite in the sense that we will run out of it and that importing oil threatens some aspect of America’s security (funding terrorism, national security due to supplies coming from hostile nations, balance of trade, etc.), then the following policies make some sense:

  • CAFE:  Government should mandate that car companies increase the average miles per gallon of cars sold in the US (otherwise known as Corporate Average Fuel Economy or CAFE) so that less fuel is needed to run cars.
  • SPR: Government should store reserves of crude oil and refined products in the ground in case of emergency shortages in supply (Strategic Petroleum Reserves or SPR).
  • Biofuel: Government should require that gasoline substitutes be produced from biological products such as corn-based ethanol as a way to increase the supply of fuel for cars.
  • Synthetic Fuels:  Government should establish very expensive processes for converting abundant fuel (coal) into gasoline to increase the supply (Synthetic Fuels Corporation). 
  • Export Bans: Government should mandate that all oil produced in the US be consumed in the US and not exported in order to ensure security of supply.

Now make the assumption that markets work and that supply and demand will be driven to equilibrium by price. 

  • CAFE:  Because there is an assurance of supply in the bathtub, there is only a need to worry about price.  Consumers will demand cars with the features that best serve their needs, among them the tradeoffs between price, safety, and fuel efficiency.  Thus CAFE is an unnecessary policy.
  • SPR: Because there is an assurance of supply in the bathtub, there is no need to worry about supply shortages.  Let the private sector decide how to diversify price risk by holding inventories.  The SPR has turned out to be a very expensive, taxpayer-funded government inventory of oil.
  • Biofuel: Ethanol is more expensive than gasoline and less efficient.  Since there is no risk of shortages of oil supply, this is wasteful.  Additionally, there is the unintended consequence that using corn to produce ethanol increases food prices, which is an added burden on the poor.  The use of corn (the dominant feedstock in the US) for production of ethanol, from field to gas pump, uses nearly as much energy as the ethanol produces as a gasoline additive (a rough net-energy sketch follows this list).  This net loss does not include the destruction of CO2-absorbing trees to increase the size of cornfields.
  • Synthetic Fuels:  Because there is an assurance of supply, there is no reason for the government to be in the synthetic fuels business.  Mercifully, the Reagan Administration put an end to this dream.
  • Export Bans:  In a world of guaranteed adequate supply, it makes no economic sense to ban oil exports.
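Here is the rough net-energy arithmetic behind the ethanol point above. Every figure is an assumption chosen to illustrate the claim; published estimates of these inputs vary widely:

```python
# A rough net-energy sketch for corn ethanol. All figures below are
# assumptions for the sake of the arithmetic, not measured values.
ethanol_energy_out = 76_000  # Btu per gallon of ethanol (lower heating value)
farming_energy_in = 35_000   # assumed Btu/gal: planting, fertilizer, harvest
refining_energy_in = 38_000  # assumed Btu/gal: fermentation, distillation

total_energy_in = farming_energy_in + refining_energy_in
print(f"energy returned per Btu invested: {ethanol_energy_out / total_energy_in:.2f}")
# A ratio near 1.0 is what "uses nearly as much energy as it produces" means.
```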

 

Implications and Conclusion

Most oil producing countries depend on oil revenues to meet the needs of their people.  In some sense they need us more than we need them.

Look at how the recent collapse in oil prices has scrambled global politics.  Many oil producing countries based their budgets on expectations of oil at more than $100 a barrel.  Today (March 2015), it is hovering under $50 a barrel.  Time Magazine’s January 22, 2015, edition had an excellent analysis of the ramifications of the radical change in world oil prices.

Somewhat serendipitously, it is countries hostile to US interests that are most severely affected by the drop in oil prices.  Oilprice.com identified the “Top 5 Countries at Risk,” listing Venezuela, Nigeria (Boko Haram), Iraq (ISIS), Iran (intensifying the impact of the embargo), and Russia (limiting Putin's use of energy as a weapon).  No one could have dreamt up a policy more harmful to our adversaries and more beneficial to our friends than sustained low oil prices.

One needs to be careful, however, about being completely gleeful about this situation.  The fracking revolution was built on higher oil prices, and low prices will mean much less exploration and development in the US.  That sector is already feeling the pain and will continue to contract as prices remain low.  But other sectors of the economy will undeniably benefit from lower oil prices, so enhanced economic growth will be the net result.

More ominously, many of the countries that suffer will become more desperate and thus potentially more dangerous.  They will also be able to deploy rhetoric that “blames” the US for low oil prices and thus justifies imposing harsh measures on their own populations.  Venezuela's President Maduro is already using the US as a punching bag, blaming it for the failure of his and the late President Chávez's socialist policies.

Even when we had to rely on imports for 60% of our oil consumption, our oil policies made no economic sense.  We will always have oil available in global markets simply because it is dispersed throughout the world.  With the advent of the production technology revolution, the unreality of these policies has become even more manifest.  But entrenched interests have grown up around all of these dysfunctional, market-distorting policies.

As President Reagan said, “A government bureau is the nearest thing to eternal life we’ll ever see on this earth.”  It is not too far an extension to conclude the same about a subsidy or a mandate.  The great news is that removing all these distortions will have a positive effect on prosperity, though there will certainly be some losers.  But we are running out of time: we must stop putting off the myriad actions needed to meet the challenges we face and to leave our children and grandchildren a better, energy-efficient world.



[1] Admittedly, this is a simplistic but useful metaphor.  Though we often talk about crude oil as if it were a completely homogeneous commodity, there are several grades of crude with slightly different characteristics.  So there are actually several global bathtubs, each holding a slightly different grade of crude.

[2] There are two components to the price of oil: the resource cost and the transportation cost.  The delivered price of oil to Japan might be higher than to Great Britain due to longer transportation distances, but the barrel of crude itself would be priced the same for both countries.

 

Tags:

Patrick Moore: Should We Celebrate CO2?

In October 2015, the Global Warming Policy Foundation invited Patrick Moore, one of the founders of Greenpeace, to deliver a lecture at its annual meeting. See what he has to say about CO2 in “SHOULD WE CELEBRATE CO2?”

Tags:

Welcome to The Right Insight

We offer fact-based, non-partisan and in-depth analysis of major political, economic and cultural issues confronting American society -- with emphasis on how these issues are affected by Federal and State governmental policy.

Tags: