Resource Adequacy Modeling for a High Renewable Future

June 9, 2022

The electric industry is undergoing a significant transition, decarbonizing production while supporting the electrification of the US economy. Key concepts such as resource adequacy need to be re-evaluated in the context of that transition. Resource adequacy in power system planning refers to the state of having enough generation capacity online and available to meet customer demand almost all the time,1 inclusive of the capability to reduce customer load through demand response. Resource adequacy is a fundamental component of reliability that assures the continuous, undisrupted operation of the electric system in real-time.2 Planning for power system resource adequacy, once a relatively straightforward engineering calculation, is now characterized by growing complexity and uncertainty in both supply and demand. The demand side is changing with the growth of electric vehicles (EVs), heat pumps, new types of demand response resources, and rooftop solar and storage. On the supply side, power grids are evolving rapidly from systems served by dispatchable resources to systems reliant on variable energy resources (VERs) and duration-limited storage. In the face of increasing complexity, many of the tools power system planners have relied on are becoming obsolete.

In the new era of high penetration renewable grids, power system reliability models must include the impact of weather as a driver of energy generation. These models should also address correlations in production output between different VERs, storage state-of-charge limitations, and common mode failures such as the risk of freezing coal piles and natural gas equipment during adverse weather conditions. In addition to adapting to these changes, policy makers, regulators, and power system planners should also anticipate impacts to resource adequacy in the face of a changing climate, with more frequent extreme weather events and natural disasters. According to the California Energy Commission, “with climate change, extreme weather events that were previously considered low probability events must be accounted for in electric sector planning.”3 In other words, climate change increasingly means one cannot simply look to the past to understand the ongoing impacts of extreme weather events.

With the changes wrought by energy resource transition and a changing climate, the key to modern resource adequacy planning is capturing the influence of meteorology on power system operations, including the relationships between weather, load, renewable generation, and forced outages, as well as simulating how system elements perform in relation to each other. Power system planners may also benefit from leveraging climate models to capture projected climate change impacts such as increasing frequency of extreme weather events.

This paper briefly reviews resource adequacy planning concepts and recommends modeling tools and techniques needed to build the reliable and resilient power system of the future.

Traditional Resource Adequacy Planning

Electric utilities have used the resource planning process for decades to develop long-term, least-cost generation supply plans to serve expected customer demand. Resource adequacy planning ensures that a system has enough energy generation throughout the year to serve demand with an acceptably low chance of shortfalls. Resource adequacy is measured by the metrics described in Figure 1. Reliability metrics indicate the probability of a shortfall of generation to meet load (LOLP), the frequency of shortfalls (LOLE and LOLH), and the severity of the shortfalls (EUE and MW Short).
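
To make these metrics concrete, the following minimal sketch (Python with synthetic, hypothetical shortfall data) shows how each can be computed from hourly Monte Carlo output; the array shapes and numbers are assumptions for illustration only, not output of any particular tool.

```python
import numpy as np

# shortfall_mw[s, h]: unserved load (MW) in simulation s, hour h of one study year;
# zero means load was fully served. The synthetic data below is a placeholder.
rng = np.random.default_rng(0)
n_sims, n_hours = 1_000, 8_760
shortfall_mw = np.maximum(rng.normal(-80, 20, size=(n_sims, n_hours)), 0)

lolp_hourly = (shortfall_mw > 0).mean(axis=0)              # probability of shortfall, per hour
lolh = (shortfall_mw > 0).sum(axis=1).mean()               # expected loss-of-load hours per year
daily_short = (shortfall_mw.reshape(n_sims, 365, 24) > 0).any(axis=2)
lole = daily_short.sum(axis=1).mean()                      # expected loss-of-load days per year
eue = shortfall_mw.sum(axis=1).mean()                      # expected unserved energy (MWh per year)
mw_short = shortfall_mw.max(axis=1).mean()                 # average worst-hour depth of shortfall (MW)
```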

The industry has traditionally framed resource adequacy in terms of procuring enough resources (primarily generation) to meet the seasonal peak load forecast, plus some contingency reserves to address generation and transmission failures and/or derates in the system.4 This approach, and the metric used to define it, is called the “reserve margin.” Planners establish a reserve margin target based on load forecast uncertainty and the probability of generation outages. Required reserve margins vary by system and jurisdiction, but planners frequently target a reserve margin of 15 percent to 18 percent to maintain resource adequacy. Figure 2⁵ shows the standard conceptualization of a load duration curve, rank ordering the level of a power system’s load for each hour of the year from highest to lowest on an average or median basis in a typical weather year. The installed reserve margin is a margin of safety to cover higher-than-expected load and/or unexpected losses in generation capacity due to outages. As we discuss later, the peak load hour, whether on a hot summer day or a cold winter morning, is not necessarily the riskiest time for a renewable-heavy power system. Every hour of the year must now be studied carefully, given the uncertainty about whether renewable resources will be able to generate when needed.
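
For reference, the traditional reserve margin calculation itself is straightforward; a minimal sketch with purely hypothetical numbers:

```python
# Hypothetical figures for illustration: the planning reserve margin compares
# accredited installed capacity against the forecast seasonal peak load.
forecast_peak_mw = 10_000
installed_capacity_mw = 11_600

reserve_margin = (installed_capacity_mw - forecast_peak_mw) / forecast_peak_mw
print(f"Planning reserve margin: {reserve_margin:.1%}")   # 16.0%, within the 15-18% range
```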

Today, most resource planning analyses rely on the “one day in ten years” criterion, meaning that load does not exceed supply more than 24 hours in a 10-year period, or its equivalent metric of 2.4 loss of load hours (LOLH) per year. This analysis is performed at the “balancing authority” (BA) level. Traditionally, BAs were vertically integrated utilities with defined service territories. Modern independent system operators (ISOs) and regional transmission organizations (RTOs) have created larger BAs that comprise many utility service territories and create market or compliance-based rules to maintain sufficient system capacity. BAs typically conduct resource adequacy analysis based on their own load and resources. Resources are either located within the BA's geographic region or have firm transmission deliverability into the BA territory. In other words, it is not considered prudent planning to assume your neighbor can assist you in tight conditions when their systems are also likely to be stressed. In real-life operations, BAs can and do receive assistance from other BAs if they have capacity to spare.

The standard metrics shown in Figure 1 are generally reported as mean values of simulated power system outcomes over a range of potential future states, but planners also need to understand and plan for the worst-case outcomes and the associated probability of such outcomes. Figure 3 shows the mean and percentile values for loss of load hours for a power system over a three-year period.

In Figure 3, on average, the power system is resource adequate, remaining below the target of 2.4 hours per year. However, if the power system planner were more risk averse, she might want to bring a higher percentile line under the 2.4-hour target. She would need to add more firm capacity, adding to customer cost. The 95th percentile is the worst-case outcome, providing additional information on the upper bound risk of outages for a given portfolio. Only power systems with no recourse to import energy in a shortage, such as an island, would consider planning to the 95th percentile due to its high cost.
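
A minimal sketch of that percentile view, using hypothetical annual LOLH values drawn for each Monte Carlo simulation (the distribution is assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
# Annual loss-of-load hours from each simulation: most years are benign, a few are severe.
lolh_per_sim = rng.gamma(shape=0.5, scale=3.0, size=1_000)

print(f"Mean LOLH: {lolh_per_sim.mean():.2f} h/yr")
print(f"P95 LOLH:  {np.percentile(lolh_per_sim, 95):.2f} h/yr")
# A portfolio can meet the 2.4 h/yr target on average while its 95th percentile
# remains well above it, which is the risk trade-off discussed above.
```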

Resource adequacy planning is fundamentally concerned with low probability events, yet the common practice of planning for average outcomes is not sufficient and becomes increasingly risky with more uncertain supply, such as renewables. In the past, planners only needed to worry about unusually high loads or high forced outages. Now, they must worry about unusually high loads during periods of unusually low renewable output and limited storage duration. Adding supply uncertainty and, as we discuss later, more extreme weather compounds risks and thus requires a fundamental rethinking of planning for low probability, high impact tail events.

In the planning process, the annual values of LOLH/LOLE, EUE, etc. provide useful information for resource procurement decisions. Annual and seasonal outputs matter because they can guide planners in identifying resource needs. Resource planners also need to know the duration of the shortfall, i.e., whether the risk of shortfalls occurs over many consecutive hours or concentrates among a few hours. Figure 4 shows a “heatmap,” a color-coded visualization displaying the average risk of resource shortfalls by month (y-axis) and hour (x-axis). Understanding the risk of failing to meet load will assist planners in identifying the resources needed to maintain resource adequacy.
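
A heatmap like Figure 4 can be assembled directly from hourly simulation output; a minimal pandas sketch with placeholder shortfall flags (the data and frequency are assumptions):

```python
import numpy as np
import pandas as pd

idx = pd.date_range("2030-01-01", periods=8_760, freq="h")
rng = np.random.default_rng(2)
shortfall = pd.Series(rng.random(8_760) < 0.001, index=idx)   # placeholder shortfall flags

heatmap = (
    shortfall.groupby([shortfall.index.month, shortfall.index.hour])
    .mean()
    .unstack()   # rows = month, columns = hour of day, values = shortfall frequency
)
print(heatmap.round(4))
```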

The heat map, which shows a utility service territory in California with heavy solar penetration, demonstrates that risk is not equal across all hours. The highest risk occurs in the late evening (about 10:00 PM) in September after sunset. Before the increased reliance on solar generation, resource adequacy risk was concentrated in the early afternoon when load peaked. With high solar penetration, the change in net load (load less renewable generation) shifts the risk to the evening, a phenomenon known colloquially as the “head of the duck.” Rather than simply planning for “peak hour,” the balancing authority must now address a risk profile spanning from June through October and occurring primarily after the sun sets.

Problems with Traditional Resource Planning with a High Renewable System

With weather emerging as a fundamental driver of power system conditions, planning for resource adequacy with high renewables and storage becomes an exercise in quantifying and managing increasing uncertainty on both the supply and demand side of the equation. On the load side, building electrification, electric vehicle adoption, and expected growth in customer-sited solar and storage are likely to have pronounced effects on future electric consumption. Uncertain load growth and changing daily consumption patterns increase the challenge of making sure that future resources can serve load around the clock. Simply modeling future load based on past load with added noise does not characterize uncertainty from demand side changes.

Supply-side changes, from systems consisting mostly of dispatchable resources to systems composed mostly of resources with limited or no dispatchability, create the need to model supply probabilistically. Prior to the energy transition underway, most supply uncertainty came from forced outages of thermal resources. Because variable renewable energy now contributes most of the uncertainty in supply, power supply uncertainty cannot be accurately captured with legacy tools that use average annual profiles to represent renewable resources. To explain these emerging gaps in reliability modeling, we evaluate past approaches versus current needs against the modeling components shown in Table 1.

Updating Reliability Planning for a New Energy Paradigm

Figure 5 shows an approach that includes the core components of the new energy paradigm: meteorology, variable renewable energy generation, forced outages, and energy limited storage. Weather, primarily in the form of temperature, but potentially including insolation, humidity, wind speed, etc., drives simulations of renewable generation and customer load. Generation outage simulations can be modeled as random (the traditional approach) or as correlated with extreme heat or cold events. Once the simulations are in place, models can compute multiple future paths on an hour-by-hour basis to determine when load cannot be fully served with the available resources. For every hour of the model time horizon, there are independent simulations of load, renewables, and forced outages to determine if load shedding must occur. If a particular model contains 100 simulations and four show a lack of resources to serve load for a particular hour, the hour in question would have a loss of load probability of 0.04 (4/100).
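
A minimal sketch of that hourly adequacy test, with hypothetical simulated load, renewable output, and thermal availability for a single hour across 100 Monte Carlo draws:

```python
import numpy as np

rng = np.random.default_rng(3)
n_sims = 100
load_mw = rng.normal(9_000, 400, n_sims)                      # simulated load for the hour
renewables_mw = rng.normal(2_500, 700, n_sims).clip(min=0)    # simulated wind + solar output
outaged_units = rng.binomial(4, 0.05, n_sims)                 # random thermal unit outages
thermal_available_mw = 8_000 - outaged_units * 500

short = load_mw > renewables_mw + thermal_available_mw
lolp_this_hour = short.mean()   # e.g., 4 short simulations out of 100 gives 0.04
print(f"LOLP for this hour: {lolp_this_hour:.2f}")
```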

Energy storage presents a unique challenge in resource adequacy models. Unlike traditional resources, storage devices such as batteries, compressed air, or pumped hydro act as both load and generation depending on whether they are charging or discharging. Modern resource adequacy models need to simulate this behavior when determining the capability of energy storage to serve load during periods of resource scarcity. What state of charge should we expect for energy storage at times when the storage is truly needed? Are batteries likely to be fully charged at 6:00 PM on a weekday in August? What about grid charging versus closed systems where batteries must charge from a renewable resource? At the high end of renewable penetration, how much storage would be required to cover Dunkelflaute, the “dark doldrums” that occur in the winter when wind ceases to blow for several days? Questions surrounding the effective load-carrying capability of energy storage significantly increase the complexity of modeling resource adequacy.

Dispatch models may be a useful way to estimate the contribution of storage under the full range of future conditions.

Figure 6 provides an illustration of modeling the use of batteries in resource adequacy. The figure shows battery storage in blue, load in orange, and the available thermal generation in grey. When load exceeds thermal generation, the system is forced to rely on battery discharge for capacity. If the event lasts long enough to fully discharge the battery, the green line (generation minus load) will turn negative, indicating a load shed event.
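
A minimal sketch of the logic illustrated in Figure 6, using hypothetical hourly numbers: the battery discharges when load exceeds available thermal generation, and once its energy is exhausted the remaining deficit is shed.

```python
import numpy as np

load = np.array([7.0, 8.5, 9.5, 10.0, 9.8, 8.0])     # GW, consecutive hours
thermal = np.full(6, 9.0)                            # GW of available thermal generation
energy_gwh = 1.5                                     # stored energy remaining in the battery
power_limit_gw = 1.0                                 # maximum discharge rate

for hour, (l, g) in enumerate(zip(load, thermal)):
    deficit = max(l - g, 0.0)
    discharge = min(deficit, power_limit_gw, energy_gwh)   # battery covers what it can
    energy_gwh -= discharge
    shed = deficit - discharge                             # unmet load once the battery is empty
    print(f"hour {hour}: deficit {deficit:.1f} GW, battery {discharge:.1f} GW, shed {shed:.1f} GW")
```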

Although transmission capacity is not a traditional topic in resource adequacy analysis, this is changing due to increasing reliance on renewable resources located far away from load centers. Models may thus need to include representations of transmission lines that add congestion and derates that limit the ability of remote resources to transmit power to load centers, as depicted in Figure 7.

Figure 7 shows a model containing multiple load pockets separated by transmission corridors, which requires the ability to allocate capacity across multiple regions. The allocations should occur in a manner that minimizes unserved energy across the combined region.
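
A minimal sketch of that allocation logic for two zones joined by a limited corridor (all capacities, loads, and the corridor rating are hypothetical): surplus capacity flows toward the deficit zone up to the transfer limit, and whatever cannot be delivered shows up as unserved energy.

```python
def unserved_two_zone(load_a, gen_a, load_b, gen_b, transfer_limit):
    """Return total MW unserved for one hour across two zones linked by one corridor."""
    surplus_a = gen_a - load_a
    surplus_b = gen_b - load_b
    if surplus_a > 0 > surplus_b:
        flow = min(surplus_a, -surplus_b, transfer_limit)   # zone A exports to zone B
        surplus_b += flow
    elif surplus_b > 0 > surplus_a:
        flow = min(surplus_b, -surplus_a, transfer_limit)   # zone B exports to zone A
        surplus_a += flow
    return max(-surplus_a, 0) + max(-surplus_b, 0)

# Zone B is 700 MW short, but only 500 MW of zone A's surplus can reach it.
print(unserved_two_zone(load_a=4_000, gen_a=5_000,
                        load_b=3_000, gen_b=2_300,
                        transfer_limit=500))                # -> 200 MW unserved
```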

Importance of Simulations in Resource Adequacy Models

Resource adequacy models are typically framed as a probabilistic analysis, since the conditions that lead to a shortage of generation to meet load are associated with low-probability extreme events. Monte Carlo simulations are well-suited to solving complex probabilistic questions such as those posed in resource adequacy analysis, because they provide a calibrated distribution of outputs corresponding to the combination of inputs used. Such algorithms solve the model thousands of times, with each solution using random draws for the variables in the model to generate combinations of load, renewables, and forced outages, producing a wide range of outcomes from which the risk of capacity deficits can be determined.

Simulations of random variables fit Monte Carlo methods by creating multiple future time series of the random variables, while maintaining correlation across time within variables (if wind is high in hour 1, it will likely be high in hour 2) and correlations between the variables, such as the strong relationship between temperature and load. If wind tends to be higher in the spring and fall, the simulations will exhibit that trend. Monte Carlo applications differ dramatically between resource adequacy models, with some models using a sequential approach that solves the model in hourly steps whereas others use techniques that solve the models quickly without stepping through each hour. Accurate representation of energy storage in resource adequacy models requires sequential solution techniques to account for the time dependence of the storage state of charge.
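
A minimal sketch of a sequential, hour-by-hour simulation that preserves autocorrelation within a variable, here an AR(1)-style process for hourly wind capacity factor (the persistence and noise parameters are assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)
n_hours, mean_cf, phi, sigma = 168, 0.35, 0.95, 0.05   # one week; persistence and noise assumed
wind_cf = np.empty(n_hours)
wind_cf[0] = mean_cf
for t in range(1, n_hours):
    # High output in one hour makes high output in the next hour likely.
    innovation = rng.normal(0.0, sigma)
    wind_cf[t] = np.clip(mean_cf + phi * (wind_cf[t - 1] - mean_cf) + innovation, 0.0, 1.0)

# Stepping through hours in sequence also lets the storage state of charge carry over.
```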

The range of values covered in simulations captures future uncertainty in the simulated variables. Figure 8 shows 10 simulations of load derived from historical data. The load is highly variable and uncertain in the afternoon, driving a wider range of values in the simulations, while nighttime load is more certain, as indicated in the simulations.

Traditional models used average or typical time profiles of load and renewables while focusing on generator outages as the primary source of uncertainty, greatly underestimating the risk of load shedding. Consider the chart in Figure 9 showing wind generation for a 25 MW farm over the first week of July in 2017, 2018, and 2019. The average values from the three years produce a profile that does not capture the true volatility seen in each individual year.

When using the Monte Carlo approach with weather as a fundamental driver, individual simulations represent independent futures for weather, load, and renewables.

Realistic simulations maintain the statistical properties of the underlying resource and correlation between resources and load. For example, if historic data show no correlation between load and wind generation, the simulations should maintain this relationship unless a reasonable expectation exists for correlations to change in the future.

Figure 10 shows four potential correlation scenarios.

In the top left quadrant, two wind farms have high correlation in hourly production. The top right quadrant displays output for two wind farms that are located further apart with no correlation between them. Load and a wind farm are displayed in the bottom left quadrant for historical and simulated data. The lower right quadrant shows the correlation of two solar farms. In each case, the simulated data (green dots) maintain the correlated production across resources. Models that fail to replicate the proper correlation between wind, solar, and load can underestimate the risk of load shedding. For example, if wind resources are not correlated with load, models relying on average wind and average load profiles could mischaracterize this relationship and completely misrepresent the system net load, which must be served with dispatchable resources.
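
One common way to preserve cross-resource correlation in simulated draws is a Cholesky factorization of the target correlation matrix; a minimal sketch for two wind farms with an assumed correlation of 0.8:

```python
import numpy as np

rng = np.random.default_rng(5)
target_corr = np.array([[1.0, 0.8],
                        [0.8, 1.0]])
chol = np.linalg.cholesky(target_corr)
z = rng.standard_normal((8_760, 2)) @ chol.T    # correlated standard-normal hourly deviations

print(np.corrcoef(z[:, 0], z[:, 1])[0, 1])      # close to the target of 0.8
# These deviations would then be mapped onto each farm's production distribution.
```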

Tools and Techniques to Model Resource Adequacy in a High Renewables Grid

The following criteria help ensure that resource adequacy models can provide valid risk assessments in planning decisions:

1. Simulate random variables as weather dependent

2. Benchmark simulations against historic data

3. Model generator outages as weather driven

4. Scale simulations to match future expectations

5. Include climate effects in simulations

Simulation of Random Variables

Capturing system dynamics in a realistic manner requires the use of simulations for weather, load, renewables, and forced outages. Average profiles do not capture the fidelity of behavior observed in power systems. Random draws from predefined probability distributions also fall short for load and renewables, as they neither maintain temporal correlations nor adequately represent the real-world variables.

Benchmarking Simulations of Future States Against History

Benchmarking simulated key variables against historic data helps to assure validity. Benchmarking includes testing that simulations and historical data align in terms of statistical properties such as probability distributions, standard deviations, and correlations between variables. The model’s ability to represent realistic conditions depends on simulations resembling historical data as much as possible. Although this serves as a foundation for the analysis, the analyst may also choose to modify the simulation to account for climate change impacts in subsequent sensitivity analyses (as discussed later). Figure 11 compares percentiles from the historical distribution to percentiles from simulations in the left pane. The right pane compares standard deviations and coefficients of volatility between historical data and simulations. In both graphs, the summary statistics of the historical data align with the simulations, indicating that the models provide a good representation of the underlying variables.
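
A minimal sketch of that benchmarking step, comparing percentiles and dispersion between a historical series and its simulations (both series here are synthetic stand-ins):

```python
import numpy as np

rng = np.random.default_rng(6)
historical = rng.gamma(2.0, 50.0, 8_760)            # stand-in for observed hourly generation, MW
simulated = rng.gamma(2.0, 50.0, (100, 8_760))      # 100 simulations of the same variable

pcts = [5, 25, 50, 75, 95]
print("historical:", np.percentile(historical, pcts).round(1), "std:", round(historical.std(), 1))
print("simulated: ", np.percentile(simulated, pcts).round(1), "std:", round(simulated.std(), 1))
# Large gaps in these statistics indicate the simulations misrepresent the resource.
```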

Including Correlations in Generator Outages

Generator outages have traditionally been modeled as a random occurrence of equal probability all year long. The 2021 Winter Storm Uri demonstrated the impact of extreme weather events on entire classes of resources, for example, frozen equipment limiting fuel delivery and putting thousands of megawatts of natural gas capacity on outage simultaneously. This is known as a “common mode” failure, in which a common root cause can impact both the generation equipment itself and the “upstream” infrastructure, such as fuel extraction, processing, and delivery systems. Another common mode failure may occur during extreme heat events, which tend to reduce thermal generator heat rates, derate power lines, increase fire risk to power lines, and spike load all at the same time. Models that assume forced outages occur with equal probability across all hours, with probabilities for each generator modeled independently, will likely underestimate the propensity of outages to occur simultaneously. Figure 12 provides an example of generation simulations with weather-driven outages, where the period with minimum temperatures near zero degrees Fahrenheit corresponds with the lowest available generation.
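
A minimal sketch of a weather-driven outage assumption, in which the fleet-wide forced outage rate rises as temperatures drop below freezing; the shape and magnitude of the relationship are illustrative assumptions, not calibrated values:

```python
import numpy as np

def forced_outage_rate(temp_f, base_rate=0.05):
    """Assumed fleet forced outage rate as a function of daily minimum temperature (F)."""
    cold_stress = np.clip((32.0 - temp_f) / 40.0, 0.0, 1.0)   # 0 above freezing, 1 near -8 F
    return base_rate + 0.30 * cold_stress                      # up to 35% of the fleet unavailable

fleet_mw = 20_000
for temp in [40.0, 20.0, 0.0]:                                 # simulated daily minimum temperatures
    available_mw = fleet_mw * (1 - forced_outage_rate(temp))
    print(f"{temp:>5.0f} F -> expected available capacity {available_mw:,.0f} MW")
```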

Simulations Should Include Future Impacts from New Demand Side Technologies

System load is expected to grow annually based on projected population and economic growth, building electrification incentives, and electric vehicle adoption. Future drivers of load growth will evolve over time and should be explicitly included in the simulations. Modeling each load type individually and layering them will capture future changes in the load shapes across the day and seasons. Matching future expectations allows simulations to evolve further into the future based on projections and recent trends. Simulations should align with historical data in the near term, while adjusting to projections in the mid to long term.
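
A minimal sketch of layering load components, each with its own hypothetical daily shape and assumed growth rate; none of the shapes or growth figures below come from an actual forecast:

```python
import numpy as np

hours = np.arange(24)
base_load = 6_000 + 1_500 * np.sin((hours - 9) / 24 * 2 * np.pi)      # existing load shape, MW
ev_charging = np.where((hours >= 19) | (hours < 2), 800.0, 100.0)     # overnight EV charging block
heat_pumps = np.where((hours >= 5) & (hours <= 9), 600.0, 200.0)      # winter morning heating

growth = {"base": 1.02, "ev": 1.25, "hp": 1.15}   # assumed annual growth per component
years_ahead = 7
load_future = (base_load * growth["base"] ** years_ahead
               + ev_charging * growth["ev"] ** years_ahead
               + heat_pumps * growth["hp"] ** years_ahead)
print(load_future.round(0))   # layered hourly load shape several years out
```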

Planning Reliability for a Changing Climate

Climate change effects are a critical consideration in all infrastructure planning. When planning for an adequate system, climate impacts are especially important to consider. Although historic data provides a good indicator of the relationship between load and temperature, the expectation of more frequent extreme weather events in the future can and should be taken into account in a modeling construct. To account for this, weather simulations should cover a wider range of temperatures than the historical data they are based on. This can be accomplished by running sensitivity cases in which simulation model parameters such as average temperature and standard deviation are increased to produce a wide range of future weather outcomes. Figure 13 shows the results of load simulations, with the blue dots showing load over the range of temperatures observed historically and the brown dots showing a wider range of temperatures than history would suggest.
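
A minimal sketch of such a sensitivity case: shift the mean of the simulated temperatures and inflate their standard deviation relative to history (the 1.5 F shift and 20 percent inflation below are illustrative assumptions, not climate projections):

```python
import numpy as np

rng = np.random.default_rng(8)
historical_temps = rng.normal(75.0, 8.0, 8_760)     # stand-in for an observed temperature record

mean_shift_f, std_scale = 1.5, 1.2                  # assumed sensitivity-case parameters
anomalies = historical_temps - historical_temps.mean()
climate_case = historical_temps.mean() + mean_shift_f + std_scale * anomalies

print(round(historical_temps.max(), 1), round(climate_case.max(), 1))   # hotter extremes in the sensitivity case
```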

Extreme weather can deliver a double whammy: weather-correlated generation outages coupled with load spikes, especially in a future with more electric heating. The result from a modeling perspective is that including extreme weather increases LOLE/LOLH by an amount that should raise flags for resource planners. Figure 14 shows two LOLH results, one from a model that assumes a higher risk of extreme weather and another that assumes future weather is the same as past weather. The higher LOLH values are strictly a result of weather uncertainty from climate change.

Conclusion

The electric grid is transitioning quickly from a system of large, dispatchable generators to a system reliant on high levels of variable renewable energy, energy storage, and bi-directional flow. Against this backdrop, the analytical tools used for decision making regarding resource adequacy are more important than ever, and those tools need to evolve to meet the modern grid challenges outlined in this paper. Models based on realistic weather-driven simulations more accurately capture the risk of load shedding due to inadequate generation. Simulations derived from historical data ensure that models capture realistic load and generation patterns and the correlations among resources, while remaining adjustable to future climate conditions. Models that do not account for these factors may lead to decisions that underinvest in resources or invest in the wrong resources. Recent events in California and Texas indicate the importance of getting these projections right to keep the grid reliable.

To model resource adequacy in future power systems with high penetration of renewables, we recommend several enhancements in modeling tools and techniques. Modeling tools should simulate key structural variables and allow for validation of the simulations by benchmarking against the historical data used to create them. While maintaining statistical properties derived from historical data, simulations should also reflect future expectations of load growth along with changes in seasonal and daily load shapes. Generation forced-outage simulations should include the possibility of correlated outages from extreme weather. Finally, climate change will drive more extreme weather events affecting the power system, and this risk should be accounted for in the models, at least in the form of sensitivity cases or stress tests.

____________________________________________

1 A perfectly reliable power system would be nearly unobtainable and far too expensive; therefore, resource adequacy is about optimizing the trade-off between cost and power shortage risk.

2 NERC defines a reliable power system as having both “adequacy” (our topic) and “operating reliability.” Adequacy is the ability of the electric system to supply the aggregate electric power and energy requirements of the electricity consumers at all times, taking into account scheduled and reasonably expected unscheduled outages of system components. Operating reliability is the ability of the electric system to withstand sudden disturbances such as electric short circuits or unanticipated loss of system components. See https://www.nerc.com/docs/pc/Definition-of-ALR-approved-at-Dec-07-OC-PC-mtgs.pdf.

3 Draft CEC Preliminary 2022 Summer Supply Stack Analysis. California Energy Commission, available at https://www.energy.ca.gov/filebrowser/download/3655.

4 The actual term used by the North American Electric Reliability Corporation (NERC) is “reference margin level”; however, it is nearly universally called “reserve margin,” and we maintain that phrasing.

5 Pechman, C., Whither the FERC, National Regulatory Research Institute, January 2021, available at http://pubs.naruc.org/pub/46E267C1-155D-0A36-3108-22A019AB30F6.
