
In the Wake of the News

A New Paradigm - ORIGINAL CONTENT

Electric grids have demonstrated the ability to adapt to some fraction of intermittent renewable generation, as long as there is sufficient dispatchable generation available to meet contemporaneous grid demand when the intermittent renewables are not producing power or are producing power at less than rated capacity. The dispatchable generating capacity is currently predominantly fossil fueled, since existing nuclear generating capacity is largely base loaded.

However, as grid demand grows as the result of population growth and a move to “all-electric everything”, and as the quantity of intermittent renewable generation increases, there will be a growing need for additional dispatchable generating capacity. At the same time, the parallel pressure to close both coal and natural gas generating facilities will lead to decreased, rather than increased, dispatchable fossil-fueled generating capacity in both absolute and percentage terms. This would lead to reduced grid reliability and resiliency, and probably to managed blackouts to avoid grid collapse.

The current industry paradigm is for the utility industry to accept connection to unsmoothed and non-dispatchable intermittent renewable generation and to accept all power produced by those generators on a priority basis. That paradigm is sustainable as long as dispatchable generation capacity exceeds intermittent renewable generating capacity. However, current federal climate change efforts to promote intermittent renewables and force closure of dispatchable fossil-fueled generation presage the end of that paradigm.

The North American Electric Reliability Corporation (NERC) is currently raising concerns about grid reliability and resilience. NERC should work with the Federal Energy Regulatory Commission (FERC) and the National Association of Regulatory Utility Commissioners to assure that the transition to a grid based on intermittent renewable generators continues to provide economical, reliable power. Two essential aspects of such a transition are dispatchable generation and economic dispatch.

The federal and state regulators should encourage and support a utility requirement that all new intermittent renewable generation sources connected to their grids include sufficient storage to render them dispatchable and sufficient excess generating capacity to recharge storage after use. The storage necessary to meet this requirement would depend on the maximum number of consecutive hours or days during which the generators were unable to operate because of low solar insolation or inadequate or excessive wind conditions and the frequency of such occurrences. The generators could be required to be dispatchable 85% of the year, which is the common dispatchability percentage for coal generating stations.
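
As a rough illustration of how such a storage requirement might be sized (a simplified sketch using invented hourly output data, not an actual utility planning method), the calculation below finds the longest consecutive run of low-output hours in a year and sizes storage to deliver rated output through that gap.

```python
# Illustrative sketch: sizing storage to make an intermittent generator "dispatchable".
# Assumptions (hypothetical, for illustration only): one year of hourly output for a
# 100 MW rated facility, and a requirement to deliver rated output through the
# longest consecutive low-output period.

import random

RATED_MW = 100          # rating plate capacity of the hypothetical facility
THRESHOLD = 0.10        # output below 10% of rating is treated as "unavailable"

# Stand-in for a measured hourly capacity-factor series (8,760 hours).
random.seed(1)
hourly_cf = [max(0.0, min(1.0, random.gauss(0.35, 0.25))) for _ in range(8760)]

# Find the longest consecutive run of hours below the usability threshold.
longest_gap = current_gap = 0
for cf in hourly_cf:
    if cf < THRESHOLD:
        current_gap += 1
        longest_gap = max(longest_gap, current_gap)
    else:
        current_gap = 0

# Storage sized to deliver rated output through the longest gap,
# before allowing for recharge losses or reserve margins.
storage_mwh = RATED_MW * longest_gap
print(f"Longest low-output gap: {longest_gap} hours")
print(f"Storage required to ride through it: {storage_mwh:,} MWh")
```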

The resulting dispatchable renewable generators would be capable of replacing conventional coal and natural gas generators, rather than merely displacing the output of conventional generation when the intermittent generators were operating. Their ownership and operating costs would be directly comparable to the ownership and operating costs of conventional generators, particularly if current federal and state incentives were terminated. A return to economic dispatch would maximize power supplied by the lowest cost generators, eliminating the current preferences for renewable generation and minimizing wholesale power costs.

The storage required to render intermittent renewable generators dispatchable is currently very expensive and is not capable of delivering stored power for the expected duration of renewable generation unavailability. This is a critical impediment.

 

Tags: Electric Power Generation, Electric Power Reliability, Electric Power Dispatchable

Misperception and amplification of climate risk - Highlighted Article

 

From: Climate Etc.

By: Judith Curry

Date: December 13, 2022

 

Misperception and amplification of climate risk

 

“Something frightening poses a perceived risk. Something dangerous poses a real risk.” – Swedish physician Hans Rosling et al.[i]


This post is a follow on to my recent post Victims of the faux climate ‘crisis’. Part I: Children.  The issue of psychological trauma of children is one that I am continuing to work on, to identify root causes and a way forward.

The theme of this particular post is how our perceptions of risk differ from the actual risk itself.  Understanding this difference provides insights to understanding these fears, as well as providing insights into how these differences are manipulated by propagandists.

Apart from the objective facts about a risk, the social sciences find that our interpretation of those facts is ultimately subjective.  Risk science makes a clear distinction between professional judgments about risk versus the public perception of risk. Risk perception is a person’s subjective judgement or appraisal of risk, which can involve social, cultural and psychological factors.

No matter how strongly we feel about our perceptions of risk, we often get risk wrong. People worry about some things more than the evidence warrants (e.g. nuclear radiation, genetically modified food), and less about other threats than the evidence warrants (e.g., obesity, using mobile phones while driving). This gap in risk perception produces social policies that protect us more from what we are afraid of than from what actually threatens us the most.  Understanding the psychology of risk perception is important for rationally managing the risks that arise when our subjective risk perception system gets things dangerously wrong. (continue reading)

 


 

Tags: Highlighted Article

Future Grid - ORIGINAL CONTENT

 

"The Navy is a master plan designed by geniuses for execution by idiots."    Herman Wouk, The Caine Mutiny

It currently appears that the Administration’s vision of the future US electric grid, supplied predominantly by intermittent renewables and supported by electricity storage, is a master fantasy designed by politicians for execution by geniuses with the unique talent of Rumpelstiltskin. There appears to be no plan to assure that the required number of geniuses will be available in time.

Wind and solar generation operate intermittently, and their output fluctuates continuously when they are operating. The grid is currently required to accept this intermittent, fluctuating output on a priority basis and to smooth the output and dispatch alternative sources of generation when the intermittent generator output declines or ceases as the result of time of day or weather conditions. This requirement imposes predictable but uncontrollable costs on the grid and on the conventional generation capacity which supplies the grid during periods of low/no intermittent generation.

As the grid expands in line with the Administration’s “all-electric everything” goal and the capacity of fossil-fueled conventional generation declines as the result of federal mandates and the unfavorable economics of reduced operating hours, there will be a growing need for increased electricity storage capacity and for “Dispatchable Emissions-Free Resources” (DEFR). Unfortunately, the long-duration storage which would be required to support the grid through multi-day renewable energy “droughts” is not currently available and the DEFR remain undefined.

One approach to long-duration storage is pumped hydroelectric storage, which requires paired reservoirs separated by significant elevation differences. There are approximately 23 GW of pumped storage capacity in the US. This compares with a total US generating capacity of approximately 1,200 GW, of which approximately 140 GW is wind and approximately 65 GW is solar. The move to “all-electric everything” over the next 28 years would require a rough tripling of US generating capacity, to approximately 3,600 GW, assuming no other demand growth.

The conventional generating capacity which would be replaced by renewable generation plus storage consists of coal (~85% availability), natural gas (~90% availability) and nuclear (~95% availability). The renewable generation would consist of wind (~35% capacity factor times ~85% availability) and solar (~25% capacity factor times ~90% availability). Therefore, the rating plate capacity of the renewable generation required to serve the same grid demand would be approximately four times the rating plate capacity of the conventional generation, or approximately 14,000 GW.
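
The arithmetic behind that multiplier can be laid out explicitly. The sketch below simply combines the approximate availabilities and capacity factors quoted above; the exact result depends on how the technologies are weighted, and the simple average below yields a somewhat lower multiplier than the rounded factor of four used above.

```python
# Rough effective-output comparison between conventional and renewable capacity,
# using the approximate availabilities and capacity factors cited above.

conventional_availability = {"coal": 0.85, "natural_gas": 0.90, "nuclear": 0.95}
renewable_effective = {
    "wind": 0.35 * 0.85,   # capacity factor x availability
    "solar": 0.25 * 0.90,
}

# Average effective output per GW of rating plate capacity for each group.
conv_avg = sum(conventional_availability.values()) / len(conventional_availability)
renew_avg = sum(renewable_effective.values()) / len(renewable_effective)

multiplier = conv_avg / renew_avg
target_conventional_gw = 3600          # "all-electric everything" grid, per the text
required_renewable_gw = target_conventional_gw * multiplier

print(f"Conventional effective output: {conv_avg:.2f} per GW of nameplate")
print(f"Renewable effective output:    {renew_avg:.2f} per GW of nameplate")
print(f"Nameplate multiplier:          ~{multiplier:.1f}x")
print(f"Renewable nameplate needed:    ~{required_renewable_gw:,.0f} GW")
```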

The storage capacity required to support renewable generators during periods when they are not available to generate is the capacity of the generator times the maximum number of consecutive hours over which the renewable generation might be unavailable. The current US electricity storage capacity of 23 GW would be capable of replacing only one third of the current solar generating capacity. Assuming that storage capacity is all 8-hour storage (8 hr * 23 GW = 184 GWh), that storage capacity is the equivalent of replacing current US solar generating capacity for approximately 3 hours.
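
The storage-hours comparison is a straightforward restatement of the figures above:

```python
# Storage-duration equivalence using the figures cited above.
storage_power_gw = 23        # approximate US pumped-storage capacity
storage_duration_h = 8       # assumed duration per the text
solar_nameplate_gw = 65      # approximate US solar capacity

storage_energy_gwh = storage_power_gw * storage_duration_h   # 184 GWh
hours_of_solar_replacement = storage_energy_gwh / solar_nameplate_gw

print(f"Stored energy: {storage_energy_gwh} GWh")
print(f"Equivalent to replacing {solar_nameplate_gw} GW of solar for "
      f"~{hours_of_solar_replacement:.1f} hours")
```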

The “all-electric everything” grid would require approximately 70 times current US renewable generating capacity and approximately 5,000 times current US electricity storage capacity.

 

Tags: Power Grid, Electric Power Generation, Electric Power Reliability, Energy Storage / Batteries

Level Playing Field - ORIGINAL CONTENT

Wind and solar generation are intermittent forms of renewable generation. Wind generation functions only when wind velocity is above a minimum threshold and below a maximum threshold; this intermittency is reasonably predictable over the short term. Solar functions only during the daytime, and then only when the sun shines. The nighttime unavailability of solar is totally predictable, and the sunshine intermittency is reasonably predictable over the short term.

However, these are not the only intermittency issues with wind and solar. There are also second-by-second volatility events which affect the output of wind and solar generators. The graphs below are taken from papers authored by Thunder Said Energy.

The graph below displays the short-term volatility of wind output from a 25 MW wind facility over a one-month period. During this month there were an average of 75 short-term volatility events per day, during which power output dropped by more than 10%, and by as much as 100%, for at least 1 second and fewer than 100,000 seconds (~28 hours). While every day and every month are unique, this monthly record displays a type of wind variability which is rarely discussed. This volatile wind facility output is fed to an electric grid which must continuously match supply and demand while maintaining a frequency of 60 cycles per second.

 

[Figure: 2,250 volatility events]

 

The next graph shows the variation of wind power output from the 25 MW wind facility over the course of a single day, during which maximum output was approximately 6.5 MW, or approximately 25% of rating plate capacity, and the average output was 2.3 MW, or approximately 10% of rating plate capacity.

 

[Figure: Average output, 27 August]

 

The next graph shows the output of the 25 MW wind facility over a period of a single day during which maximum output was approximately 2.3 MW and the average output was 0.1 MW.

 

[Figure: Average output, 10 August]

 

The graph below displays the short-term volatility of solar output over a period of 1 year. During this year there were an average of 96 volatility events per day, during which power output dropped by more than 10%, and by as much as 95%, for at least 1 second and fewer than approximately 13,000 seconds (~3.6 hours).

 

[Figure: 35,000 volatility events]

 

The final graph shows the variation in solar insolation on a single day. This is the type of intermittency which is generally discussed regarding solar energy. The times of day when measurable insolation becomes available and ceases to be available would change with latitude and with the seasons, as would the maximum daily insolation.

 

[Figure: Variation in solar insolation on a single day]

 

Currently, it is the responsibility of the grid operator to compensate for the volatility of wind and solar generation output. Typically, the grid operator is dealing with input to the grid from numerous wind facilities and/or solar fields, each of which is experiencing volatility to some degree. The volatility might be either synchronous or asynchronous at any given time. Smoothing this volatility imposes costs on the grid which result in increased electricity prices.

To level the playing field for the various sources of electric generation, the volatility of wind and solar output should be smoothed prior to output delivery to the grid. Smoothing could be accomplished with capacitors or batteries, or a combination of both. This will become increasingly important as the fraction of intermittent generation on the grid increases and the availability of dispatchable conventional generation to compensate for wind and solar volatility decreases. Of course, the maximum output of the wind and solar generation facilities would be reduced somewhat by the need to recharge capacitors or storage used to smooth the output volatility.
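
As a rough illustration of what such smoothing might look like (a toy model with invented numbers, not a representation of any actual facility or control scheme), the sketch below uses a small battery buffer to hold delivered output near a short rolling average of the raw, volatile generation.

```python
# Toy illustration of smoothing volatile renewable output with a small battery buffer.
# All numbers are invented for illustration; real smoothing uses dedicated power
# electronics and control systems.

import random

random.seed(2)

RATED_MW = 25.0
BATTERY_MWH = 5.0        # assumed small buffer used only for smoothing
STEP_H = 1 / 3600        # one-second time steps, expressed in hours

# Invented one-hour, second-by-second raw output with occasional sharp drops.
raw = []
level = 10.0
for _ in range(3600):
    level = max(0.0, min(RATED_MW, level + random.gauss(0.0, 0.3)))
    if random.random() < 0.002:          # occasional deep volatility event
        level *= random.uniform(0.0, 0.5)
    raw.append(level)

soc = BATTERY_MWH / 2     # start the buffer half full
window = []
delivered = []
for p in raw:
    window.append(p)
    if len(window) > 60:                 # 60-second rolling average as the target
        window.pop(0)
    target = sum(window) / len(window)
    # The battery covers the gap between target and raw output, within its limits.
    energy = (target - p) * STEP_H
    if energy > 0:
        energy = min(energy, soc)        # discharging, limited by state of charge
    else:
        energy = max(energy, soc - BATTERY_MWH)  # charging, limited by headroom
    soc -= energy
    delivered.append(p + energy / STEP_H)

print(f"Raw output swing:       {max(raw) - min(raw):.1f} MW")
print(f"Delivered output swing: {max(delivered) - min(delivered):.1f} MW")
```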

 

Tags:

Policy Implications Of The Energy Storage Conundrum - Highlighted Article

 

From: Manhattan Contrarian

By: Francis Menton

Date: December 13, 2022

 

Policy Implications Of The Energy Storage Conundrum


It occurs to me that before moving on from my obsession with energy storage and its manifest limitations, I should address the policy implications of this situation.  I apologize if these implications may seem terribly obvious to regular readers, or for that matter to people who have just thought about these issues for, say, five minutes.  Unfortunately, our powers-that-be don’t seem to have those five minutes to figure out the obvious, so we’ll just have to bash them over the head with it.

Here are the three most obvious policy implications that nobody in power seems to have figured out:

(1) More and more wind turbines and solar panels are essentially useless because they can never fully supply an electrical grid or provide energy security without full dispatchable backup.

Here in the U.S. the so-called “Inflation Reduction Act” of 2022 provides some hundreds of billions of dollars of subsidies and tax credits to build more wind turbines and solar panels.  Simultaneously, the Biden Administration, directed by a series of Executive Orders from the President, proceeds with an all-of-government effort to suppress the dispatchable backup known as fossil fuels.  Does somebody think this can actually work?  It can’t.  

And then there’s the December 6 press release from the UN’s International Energy Agency, touting how renewable energy sources (wind and solar) are being “turbocharged” to provide countries with “energy security.”  The headline is: “Renewable power’s growth is being turbocharged as countries seek to strengthen energy security.”   Excerpt: (continue reading)

 


 

Tags: Highlighted Article

Climate “Reparations” - ORIGINAL CONTENT

The UN COP27 concluded in mid-November with an agreement to work toward establishment of a funding mechanism to compensate developing countries for “loss and damage” resulting from the effects of anthropogenic climate change. This concept is problematic on several levels.

First, weather events such as floods, droughts, tropical cyclones, tornadoes and lightning-ignited wildfires have always caused “loss and damage”. The frequency, duration and severity of these events have varied over time. No nation or group of nations is in any way responsible for the occurrence of these weather events. However, nations affected by these weather events are, at least in part, responsible for the magnitude of the “loss and damage” caused by the events as the result of placing infrastructure and people in harm's way in flood plains and on seashores and by failing to build flood control dams and water storage reservoirs. This is the case in both developed and developing countries.

Second, climate has always changed over the entire historical period we have been able to study. It is not possible to measure the alleged effects of anthropogenic emissions on climate change. Both climate warming and cooling events occurred prior to the period in which humans began adding CO2 and other GHGs to the atmosphere; and, they have continued since. Had climate been unchanging prior to the advent of anthropogenic emissions, it might have been possible to attribute changes in climate to the anthropogenic emissions. However, that is not the case.

The “loss and damage” compensation issue raised at COP27 is based on the assumption that incremental climate change caused by CO2 and other GHG emissions from the developed nations has somehow contributed to the frequency and/or severity of these various weather events. However, observations do not support the assertions of increased frequency, duration or severity of adverse weather events. Data do support assertions of increased absolute financial costs of the “loss and damage” from these weather events resulting from increased infrastructure investment in areas subject to damage from the weather events, though there is no increase relative to GDP.

The assertions of increased “loss and damage” from anthropogenic climate change are based on the outputs of a class of climate models referred to as attribution models. These models are of relatively recent origin. They are unverified and unvalidated, as are the global climate models on which they are based. The current ensemble of global climate models project temperature anomaly increases, on average, twice as large as the observed temperature anomaly increases.

The attribution models attempt to identify the differences between the actual weather event as it occurred and what the event might have been like in the absence of anthropogenic climate change. For example, there was much discussion regarding the potential effects of anthropogenic climate change on what was expected to be an above average Atlantic hurricane season in 2022. The actual 2022 Atlantic hurricane season was far below normal, suggesting that our understanding of the effects of climate change on weather events is “not ready for prime time”.

Our understanding is certainly not sufficient to serve as the basis for massive transfer payments from the developed nations to the developing nations based on “responsibility” for computer estimated incremental “loss and damage”.

 

Tags: COP - Conference of Parties, Climate Change Economics

The Impossibility Of Bridging The "Last 10%" On The Way To "100% Clean Electricity" - Highlighted Article

 

From: Manhattan Contrarian

By: Francis Menton

Date: December 10, 2022

 

The Impossibility Of Bridging The "Last 10%" On The Way To "100% Clean Electricity"

 

As my last post reported, the Official Party Line from our government holds that we have this “100% Clean Electricity” thing about 90% solved.  As the government-funded NREL put it in their August 30, 2022 press release, “[a] growing body of research has demonstrated that cost-effective high-renewable power systems are possible.”  But then they admit that that statement does not cover what they call the "last 10% challenge” — providing for the worst seasonal droughts of sun and wind, that result in periods when there is no renewable power to meet around 10% of annual electricity demand.  That last 10%, says NREL, will require one or more “technologies that have not yet been deployed at scale.”  

But hey, we’ve got 90% of this renewable transition thing solved.  How hard could figuring out that last 10% really be?

And on that basis the government has embarked upon forcing the closure of large numbers of power plants that use fossil fuels like coal and natural gas, as well as on suppressing exploration for fossil fuels and other things like pipelines and refineries.  After all, if we’re transitioning at least 90% to renewables, we won’t need 90% of the fossil fuel infrastructure any more, will we?

Actually, that’s completely wrong.  Until the full solution to the so-called “last 10% challenge” is in place, we need 100% of our fossil fuel backup infrastructure to remain in place, fully maintained, and ready to step in when the wind and sun fail.  

Let’s take a brief look at what bridging the last piece of the renewable transition actually looks like.

NREL’s August 2022 Report titled “Examining Supply-Side Options to Achieve 100% Clean Electricity by 2035” lays out several scenarios for supposedly achieving that goal.  For all the scenarios, the most important piece is the same:  building and deploying lots more wind turbines and solar panels.  (The scenarios differ in the degree of deployment of other elements like transmission lines, battery storage, carbon capture technology, and additional nuclear.). As foreseen by NREL, by 2035, total electricity generation capacity in the U.S. has more than tripled, with the large majority of the additions being wind and solar.  There is substantial overbuilding of the wind and solar facilities, presumably to provide enough electricity on days of light wind or some clouds, while having large surpluses to discard on days of full wind and sun.  Some storage has been provided, but mostly “diurnal” (intra-day) and not seasonal. (continue reading)

 


 

Tags: Highlighted Article

2023 - The Year Ahead - ORIGINAL CONTENT

“Predictions are hard, especially about the future.”   Yogi Berra, American philosopher

Considering Yogi’s caution above, I will not make predictions regarding climate science or climate change policy in 2023. Rather, I will discuss what I believe should happen in 2023 to advance the state of climate science and climate change policy.

An important first step in advancing the state of climate science is improving the quality of the data being used to measure the impacts of climate change. This includes expanding the coverage of both near-surface land temperatures and sea surface temperatures, so that “infilling” of estimated temperatures is no longer necessary. It also includes improving the quality of the data, so that “adjustments” to the data are no longer necessary. The US Climate Reference Network and the Argo buoys provide suitable data quality, but they do not provide global coverage. Satellite temperature measurements provide near-global coverage in the troposphere but display differences with the near-surface land and sea surface temperatures which should be analyzed and resolved.

There is also a major data quality issue regarding the rate of sea level rise which should be addressed and resolved. The sea level rise measured by satellite is approximately twice the rate measured by tide gauges in geologically stable locations. The rate of sea level rise measured by the satellites has increased inexplicably with each new satellite placed into service, while the rate of rise measured by the tide gauges has not changed. The tide gauge record begins well before the presumed start of anthropogenic warming, so it records natural variation associated with the recovery from the Little Ice Age.

The projections of future anthropogenic warming rely on the outputs of numerous climate models, none of which have been verified and validated. All of the climate models over-project warming relative to both near-surface and satellite-based observations and their outputs diverge significantly into the future. The models have been “tuned” by hindcasting to the near-surface temperature anomaly records which have issues as described above. A single, validated and verified model would constitute a far more solid basis for climate policy formation than the current situation.

There remains major uncertainty regarding the sensitivity of the climate to a doubling of atmospheric CO2. IPCC AR6 identifies a range of 2.5 – 4 degrees C, with a likely value of 3 degrees C and a very low likelihood of a value lower than 1.5 degrees C. However, recent research not reflected in AR6 suggests that sensitivity might well be less than or equal to 1.5 degrees C. This uncertainty has a dramatic effect on the range of projected temperature futures and should be the focus of aggressive research to resolve it.
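
The effect of the sensitivity estimate on projected warming can be illustrated with the standard logarithmic approximation relating CO2 concentration to equilibrium warming; the 500 ppm concentration used below is an arbitrary illustrative value, not a forecast.

```python
# Equilibrium warming implied by different climate sensitivity values, using the
# standard logarithmic forcing approximation: dT = S * ln(C/C0) / ln(2).
# The 500 ppm figure is an arbitrary illustrative concentration, not a projection.

import math

C0 = 280.0     # approximate pre-industrial CO2 concentration, ppm
C = 500.0      # illustrative future concentration, ppm

for sensitivity in (1.5, 3.0, 4.0):   # degrees C per doubling of CO2
    warming = sensitivity * math.log(C / C0) / math.log(2)
    print(f"Sensitivity {sensitivity:.1f} C/doubling -> "
          f"~{warming:.1f} C equilibrium warming at {C:.0f} ppm")
```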

There also remains uncertainty regarding the magnitude of climate feedbacks and whether the feedbacks are net positive or negative. The assumption of positive net feedback increases the temperature increases projected by the climate models.

There also remains uncertainty regarding the Representative Concentration Pathway used to project future atmospheric CO2 concentrations. While it is not possible to predict the actual future pathway, there is growing agreement that RCP 8.5 is implausible and should not be assumed to represent a “business-as-usual” future scenario.

Each of the above issues is of far greater significance to the formation of rational climate policy than the production of “scary scenarios” based on RCP 8.5, which has been a persistent focus of recent climate research.

The UN and national governments should abandon the “climate crisis”, “existential threat”, “climate emergency” political narrative, which is not supported by the climate science, and refocus on getting the climate science right.

Politicians would do well to contemplate the following observation.    
 
“Politics is the art of looking for trouble, finding it everywhere, diagnosing it incorrectly and applying the wrong remedies.”   Groucho Marx

 

Tags: Preview of the New Year, Climate Science, Climate Policy, Climate Change Debate

Looking For The Official Party Line On Energy Storage - Highlighted Article

 

From: Manhattan Contrarian

By: Francis Menton

Date: December 8, 2022


Looking For The Official Party Line On Energy Storage


If you’ve read my energy storage report, or just the summaries of parts of it that have appeared on this blog, you have probably thought:  this stuff is kind of obvious.  Surely the powers that be must have thought of at least some of these issues, and there must be some kind of official position on the responses out there somewhere.

So I thought to look around for the closest thing I could find to the Official Party Line on how the U.S. is supposedly going to get to Net Zero emissions from the electricity sector by some early date.  The most authoritative thing I have found is a big Report out in August 2022 from something called the National Renewable Energy Laboratory, titled “Examining Supply-Side Options to Achieve 100% Clean Electricity by 2035.”  An accompanying press release with a date of August 30 has the headline “NREL Study Identifies the Opportunities and Challenges of Achieving the U.S. Transformational Goal of 100% Clean Electricity by 2035.”  

What is NREL?  The Report identifies it as a private lab “operated by Alliance for Sustainable Energy, LLC, for the U.S. Department of Energy under Contract.”  In other words, it’s an explicit advocacy group for “renewable” energy that gets infinite oodles of taxpayer money to put out advocacy pieces making it seem like the organization’s preferred schemes will work.

Make no mistake, this Report is a big piece of work.  The Report identifies some 5 “lead authors,” 6 “contributing authors,” and 56 editors, contributors, commenters and others.  Undoubtedly millions of your taxpayer dollars were spent producing the Report and the underlying models (which compares to the zero dollars and zero cents that the Manhattan Contrarian was paid for his energy storage report).  The end product is an excellent illustration of why central planning does not work and can never work.

So now that our President has supposedly committed the country to this “100% clean electricity” thing by 2035, surely these geniuses are going to tell us exactly how that is going to be done and how much it will cost.  Good luck finding that in here.  From the press release: (continue reading)



 

 

Tags: Highlighted Article

2022 Year in Review - ORIGINAL CONTENT

Climate science focuses on two fundamental issues: the current status of the changed climate; and, projections of future changes in the climate. The science is not “settled” regarding either of these issues.

The key aspects of interest regarding the current status of the changed climate are: atmospheric GHG concentration; global average near-surface temperature; tropospheric temperature; sea level; extreme weather events; and, ocean pH. Projections of future changes in the climate also focus on these issues and rely on climate models.

The Mauna Loa data on the atmospheric concentration of CO2 are broadly accepted and document an increase of roughly 50% in the atmospheric CO2 concentration since the pre-industrial period.

The global average near-surface temperature data and sea surface temperature data remain problematic as the result of non-uniform global coverage (particularly in the Southern hemisphere and over the global oceans), urban sprawl, aging and malfunctioning measuring stations, and large areas with no measuring stations. This continues to result in “adjusting” measurements and “infilling” estimates where no data are available. The US Climate Reference Network continues to demonstrate the ability to measure near-surface temperature accurately and reliably, but there appears to be little interest in establishing such a network on a global basis.

The tropospheric temperature data provide almost complete global coverage and indicate a slower rate of warming than the near-surface temperature anomaly products. The reasons for this discrepancy remain unexplained.

The sea level data are also problematic. The rate of sea level rise measured by satellite is approximately twice the rate of increase measured by geologically stable tide gauges. This discrepancy also remains unexplained.

There are no increasing trends in the frequency, severity or duration of extreme weather events, including floods, droughts, tropical cyclones or tornadoes, despite frequent political handwringing to the contrary.

Ocean pH has decreased very slightly but remains solidly basic at 8+.

There are numerous climate models used to project future global near-surface temperature change. All the models have projected temperature increases greater than observed, on average nearly twice as great. None of the models have been validated and verified. The models influence and are influenced by climate sensitivity, climate forcings and feedbacks, as well as by the Representative Concentration Pathway (RCP) selected to estimate future atmospheric CO2 increases.

The climate sensitivity to a doubling of atmospheric CO2 is unknown, but is estimated to range from 1.5 – 4.5 degrees C. Recent research suggests that sensitivity is near, or even below, the bottom of that range. The magnitudes of climate forcings are also estimated. The magnitude and direction of climate feedbacks are also unknown but estimated. Finally, the rate at which additional CO2 will accumulate in the atmosphere is unknown, though there are several Representative Concentration Pathways in current use. The combination of these uncertainties results in model projections of future temperatures which vary significantly and diverge rapidly into the future.

The most commonly used RCP is RCP8.5, which projects the greatest increase in atmospheric CO2 concentrations. RCP8.5 has frequently been referred to as the “business-as-usual” scenario, though there is growing recognition that it is implausible. It does, however, produce the scariest scenarios of future climate catastrophe.

Models are also being used in attempts to attribute some fraction of the frequency or severity of individual extreme weather events to climate change. However, these models are also unverified and unvalidated. Most recently, there has been a growing effort to provide “instant attribution” to take advantage of the news cycle immediately after the event.

 

Tags: Climate Science, Representative Concentration Pathway (RCP), Climate Models, Year in Review

My Energy Storage Report: Hydrogen As An Alternative To Batteries - Highlighted Article

 

From: Manhattan Contrarian

By: Francis Menton

Date: December 4, 2022

 

My Energy Storage Report: Hydrogen As An Alternative To Batteries

 

As mentioned in the last post, my new energy storage report, The Energy Storage Conundrum, mostly deals with issues that have previously been discussed on this blog; but the Report goes into considerable further detail on some of them.

One issue where the Report contains much additional detail is the issue of hydrogen as an alternative to batteries as the medium of energy storage.  For examples of previous discussion on this blog of hydrogen as the medium of storage to back up an electrical grid see, for example, “The Idiot’s Answer To Global Warming: Hydrogen” from August 12, 2021, and “Hydrogen Is Unlikely Ever To Be A Viable Solution To The Energy Storage Conundrum” from June 13, 2022.

At first blush, hydrogen may seem to offer the obvious solution to the most difficult issues of energy storage for backing up intermittent renewable generation.  In particular, the seasonal patterns of generation from wind and sun require a storage solution that can receive excess power production gradually for months in a row, and then discharge the stored energy over the course of as long as a year.  No existing battery technology can do anything like that, largely because most of the stored energy will simply dissipate if it is left in a battery for a year before being called upon.  But if you can make hydrogen from some source, you can store it somewhere for a year or even longer without significant loss.  Problem solved!

Well, there must be some problem with hydrogen, or otherwise people would already be using it extensively.  And indeed, the problems with hydrogen, while different from those of battery storage, are nevertheless equivalently huge.  Mostly, to produce large amounts of hydrogen without generating the very greenhouse gas emissions you are seeking to avoid, turns out to be enormously costly.  And then, once you have the hydrogen, distributing it and handling it are very challenging.

Unlike, say, oxygen or nitrogen, which are ubiquitous as free gases in the atmosphere, there is almost no free hydrogen available for the taking.  It is all bound up either in hydrocarbons (aka fossil fuels — coal, oil and natural gas), carbohydrates (aka plants and animals), or water.  To obtain free hydrogen, it must be separated from one or another of these substances by the input of energy.  The easiest and cheapest way to get free hydrogen is to separate it from the carbon in natural gas.  This is commonly done by a process called “steam reformation,” which leads to the carbon from the natural gas getting emitted into the atmosphere in the form of CO2.  In other words, obtaining hydrogen from natural gas by the inexpensive process of steam reformation offers no benefits in terms of carbon emissions over just burning the natural gas.  So, if you insist on getting free hydrogen without carbon emissions, you are going to have to get it from water by a process of electrolysis.  Hydrogen obtained from water by electrolysis is known by environmental cognoscenti as “green hydrogen,” because of the avoidance of carbon emissions.  Unfortunately, the electrolysis process requires a very large input of energy. (continue reading)

 


 

Tags: Highlighted Article

Utility Regrets (Large) - ORIGINAL CONTENT

The electric utility industry functions within the framework of federal, state and local legislation and regulation. The legislators and regulators are influenced by the renewable energy industry and by numerous environmental advocacy groups. In this environment, the utilities have been required to connect wind and solar generation to the grid and to accept all of the electricity generated by these intermittent renewables on a priority basis whenever it is available.

This intermittent renewable electricity displaces output from conventional electric generators owned and operated by the utilities and their wholesale suppliers to the extent that it is available. However, the conventional generators are still required to provide power during periods when the renewable sources are not generating. Therefore, the fixed costs of the conventional generation are largely unaffected, but the revenue from generation and the associated variable costs are reduced. The net result is an increase in the unit cost of the power produced by the conventional generators.
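
A simple illustration of that cost effect, using invented round numbers: spreading the same fixed costs over fewer generated megawatt-hours raises the unit cost of the conventional generator's output.

```python
# Illustrative only: how unit cost rises as renewable output displaces a
# conventional generator's sales while its fixed costs stay the same.
# All figures are invented round numbers.

fixed_cost_per_year = 100_000_000      # $/year (capital recovery, staffing, maintenance)
variable_cost_per_mwh = 30             # $/MWh (mostly fuel)
capacity_mw = 500

for capacity_factor in (0.70, 0.50, 0.30):
    mwh = capacity_mw * 8760 * capacity_factor
    unit_cost = (fixed_cost_per_year + variable_cost_per_mwh * mwh) / mwh
    print(f"Capacity factor {capacity_factor:.0%}: ~${unit_cost:,.0f}/MWh")
```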

Increasing renewable generation further decreases the cumulative output of the conventional generators, but does not reduce the conventional generation capacity required to satisfy grid demand when renewable generation is unavailable. In fact, increased electric demand from customer load growth and fuel switching would increase the capacity of conventional generation required to support the grid even as renewable generation capacity increased, though a portion of the increased conventional capacity requirement could be offset by the addition of electricity storage.

Utilities have agreed to accept the output of renewable generation as produced, without smoothing to eliminate the frequent fluctuations in renewable output and without storage capacity sufficient to render the renewable generators dispatchable. This approach reduces the apparent cost of the renewable electricity, but increases the cost and complexity of utility operations.

Utilities could have and likely should have fought to require renewable generators to provide smoothed and dispatchable power meeting the same requirements as their own and their wholesale suppliers’ generators. That approach would have reflected the full cost of renewable generation, which would have been several times the cost of “source of opportunity” generation.

This issue will become critical as renewable generation proliferates and conventional coal and natural gas generators are required to discontinue operation under federal mandates over the next 13 years. The grid support currently provided by the conventional generators would have to be provided by electricity storage, while the stability provided by the inertia of the large rotating turbine generators would have to be provided by power electronics.

Electric utilities’ earnings are based on an allowable rate of return on net physical plant in service. Electric utility physical plant is typically 70-80% generation. Displacement of utility coal and natural gas generation with third-party generation reduces utility earnings potential. This is currently causing utilities to seek to invest in renewable generation capacity, as well as to focus on the investment required to provide electricity storage to support the grid through periods of low/no renewable generation. These investment requirements will be substantially increased by the federal push for “all-electric everything”.

It appears highly unlikely that pursuing this path would result in the long-promised reduced energy costs.

 

Tags: Electric Power Reliability, Electric Power Generation, Renewable Energy

The Energy Storage Conundrum - Highlighted Article

 

From: The Global Warming Policy Foundation

By: Francis Menton

Date: December, 2022

 

THE ENERGY STORAGE CONUNDRUM


Introduction and Executive Summary

Advanced economies – including most of Europe, much of the United States, Canada, Australia, New Zealand, and others – have embarked upon a quest to ‘decarbonise’ their economies and achieve ‘Net Zero’ emissions of carbon dioxide and other greenhouse gases. The Net Zero plans turn almost entirely on building large numbers of wind turbines and solar panels to replace generation facilities that use fossil fuels (coal, oil and natural gas) to produce electricity. The idea is that, as enough wind turbines and solar panels are built, the former coal, oil, and gas-burning central stations can gradually be closed, leaving an emissions-free electricity system.

But wind and solar facilities provide only intermittent power, which must be fully backed up by something – fossil fuel generators, nuclear plants, batteries, or some other form of energy storage – so that customer demand can be matched at times of low wind and sun, thus keeping the grid from failing. The governments in question have then mostly or entirely ruled out fossil fuels and nuclear as the backup, leaving some form of storage as the main or only remaining option. They have then simply assumed that storage in some form will become available. Their consideration of how much storage will be needed, how it will work, and how much it will cost has been entirely inadequate.

Energy storage to back up a predominantly wind/solar generation system to achieve Net Zero is an enormous problem, and very likely an unsolvable one. At this time, there is no proven and costed energy storage solution that can take a wind/solar electricity generation system all the way to Net Zero emissions, or anything close to it. Governments are simply setting forth blindly, without any real idea of how or whether the system they mandate might ultimately work or how much it will cost. The truth is that, barring some sort of miracle, there is no possibility that any suitable storage technology will be feasible, let alone at affordable cost, in any timeframe relevant to the announced plans of the politicians, if ever.

This report seeks to shine a light on the critical aspects of the energy storage problem that governments have been willfully ignoring.

Section 1 shows that full backup is indispensable in an electricity grid powered mainly by intermittent generation. Without it, there would be frequent blackouts, if not grid collapse. It doesn’t matter if one builds wind and/or solar facilities with capacity of ten or one hundred or even one thousand times peak electricity usage. On a calm night, or during days or weeks of deep wind/sun drought, those facilities will produce nothing, or close to it, and only full backup of some sort – that is, backup sufficient to supply all of peak demand for as long as it takes – will keep the grid from failing. (continue reading)

 


 

Tags: Highlighted Article

Utility Regrets (Small) - ORIGINAL CONTENT

Electric utilities find themselves positioned between the renewable generation industries (wind and solar) and electricity end users. Their relationships with these groups are controlled or influenced by legislation and regulation at the federal, state and local levels. Utilities are coming to regret some decisions they have made in this highly politically charged environment as the percentage of renewable energy generation increases.

Several electric utilities agreed to serve residential and small commercial customers with on-site solar generation capacity using simple net metering, in which the customers’ electric meters run backwards when customer generation exceeds on-site energy demand. This was viewed as a trivial issue when on-site solar installations were less common, but has become a significant issue as on-site solar generation has proliferated.

Residential and small commercial electric rates typically consist of a fixed monthly service charge and a variable consumption charge. These charges are set in rate cases filed with state utility commissions. It is common for the fixed monthly service charge to recover only a fraction of the utilities’ fixed costs (25-50%). The remainder of the fixed costs are recovered in the variable portion of the rate, based on the quantity of electricity sold to each customer class during a “test year”.

Simple net metering allows the on-site generating customers to be compensated not only for the current wholesale cost of avoided incremental electricity generated or purchased by the utilities, but also for the portion of the utilities’ fixed costs included in the variable portion of the rate. This causes the utilities to under-recover their fixed costs until the next rate case, when that portion of the fixed costs could be reallocated, increasing the variable rate paid by all customers in the class.
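
The under-recovery can be illustrated with a simplified example (all rates and volumes below are invented): under simple net metering each exported kilowatt-hour is credited at the full retail rate, including the embedded fixed-cost share, rather than at the utility's avoided wholesale cost.

```python
# Illustrative comparison of simple net metering vs. avoided-cost compensation.
# All rates and volumes are invented round numbers.

retail_rate = 0.14        # $/kWh variable charge paid by customers
wholesale_cost = 0.05     # $/kWh the utility avoids when it takes exported power
fixed_cost_share = 0.06   # portion of the retail rate that recovers fixed costs

exported_kwh = 4000       # annual exports from a hypothetical rooftop system

credit_net_metering = exported_kwh * retail_rate
credit_avoided_cost = exported_kwh * wholesale_cost

# Fixed costs embedded in the retail credit that the utility fails to recover
# and must later reallocate to other customers in the class.
under_recovered = exported_kwh * fixed_cost_share

print(f"Credit under simple net metering: ${credit_net_metering:,.0f}")
print(f"Credit at avoided wholesale cost: ${credit_avoided_cost:,.0f}")
print(f"Fixed costs shifted to other customers: ${under_recovered:,.0f}")
```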

Simple net metering results in a subsidy from non-generating customers to on-site generating customers. Several electric utilities have approached their regulatory commissions to switch from simple net metering to an arrangement which compensates the on-site generating customers for only the utilities’ avoided wholesale cost of power. These efforts have been aggressively resisted by the solar energy industry and by on-site generation consumer groups and climate advocacy groups, because this compensation approach significantly reduces the on-site generation customers’ annual electricity cost savings.

Many customers’ solar purchase decisions were and are based on the assumption of continued simple net metering. Compensation at a reduced rate decreases the attractiveness of on-site solar for both existing and potential future solar generation customers. Existing on-site generation customers believe they are entitled to continue to benefit from the cost shifting to non-generating customers, since their purchase decisions were based on this compensation approach. Solar contractors see their future business volumes threatened by the reduced customer compensation per kilowatt-hour returned to the grid.

Several state utility commissions have attempted to take a Solomonic approach to resolving the issue, suggesting customer compensation somewhere between the wholesale and retail cost of electricity. However, such an approach only reduces, but does not eliminate, the cross-subsidy from non-generating to on-site generating customers. It remains unfair to the utilities and their non-generating customers.

 

Tags: Electric Power Generation, Electric Utilities