Production cost modeling simulates the operation of electric systems. It provides a lens into a highly uncertain future, allowing utilities to craft strategy and make critical decisions for their customers, shareholders, and stakeholders. The power and acuity of this lens will determine which resources are deemed the most economic to provide a reliable, lower-carbon supply portfolio. A weak lens limits the planner's examination of what is knowable, leaving them blinded by model-limited choice.
The dramatic decline in the cost of renewables and storage technologies and the societal push for decarbonization mean planners must model more complex and uncertain portfolio options. Renewables and their meteorologically determined fuel supply are creating new dynamics that highlight the need for more powerful modeling tools to capture the increasing variability in the power supply and the ensuing effect on market price volatility.
This article highlights the benefits of using a new class of resource planning models to plan for a decarbonized future. Utilities, regulators, independent system operators, and other industry stakeholders rely heavily on modeling to support decision making for the allocation of scarce capital resources, as well as to ensure that the right resources are available to maintain a high level of reliability and resilience. This paper argues that the older generations of models that remain widely in use today fail to capture the emerging dynamics of a power grid supplied primarily by renewable energy. For this reason, industry decision makers are unknowingly burdened by “model-limited choice,” which can lead to imprudent investments in assets liable to become functionally useless and ultimately disallowed.
This paper introduces new terminology to classify a model's ability to capture the new market dynamics: high-definition production cost models (HD PCMs) versus traditional production cost models (PCMs). HD PCMs use simulation to capture the stochastic nature of load and of electricity production from renewable energy sources, and they drill down to a 5-minute level of temporal and spatial (i.e., nodal) granularity to capture the flexibility requirements of renewable integration. Further, HD PCMs mimic real-world uncertainty by simulating imperfect foresight of future system conditions between the day-ahead forecast and the real-time dispatch. Traditional PCMs are highly simplified because they were developed when computing power was a significant limitation. Today, resource planners can take advantage of the rapid increase in computing power provided by distributed computing to upgrade their analytical platforms to HD PCMs that provide more robust analysis.
Although traditional planning processes are steeped in regulatory precedent, they are increasingly at risk of failing to meet the prudence standard by not exploring what is known and knowable. In particular, the economic construct of power planning needs to capture the fundamental physical dynamics of high-renewable systems and their impact on market prices and costs. Failure to do so will limit the planner's examination of what is knowable, blinding them with model-limited choice.1

Making the Right Resource Choice Requires Making the Right Model Choice
Model-limited choice in the resource planning context is the mischaracterization of resource value, leading to inadequate support for business decisions and increasing business and customer risk. This occurs when the traditional planning tools designed to model the electricity system based on the generation technologies of the past are used to evaluate a renewable-rich, customer-centric, low-carbon future. The concept of model-limited choice and the need to avoid it is certainly not new:
“Model-limited choice is a type of bounded rationality that occurs when models are used as the dominant mechanism for identifying the options under investigation. When the range of options is determined by the ability to model them, the structure of the model used, and the process of incorporating model results into the decision process, modeling becomes a limiting force on the choices available.”2
What is new, however, is that the power system has evolved to become more complex and variable, making model-limited choice even more of a risk. Legacy models increasingly miss the underlying dynamics of a changing electricity landscape and therefore fail to value emerging resources properly because of a legacy resource bias. Three model limitations commonly combine to undervalue flexible generation,3 such as battery storage, skewing resource valuation toward legacy resources.
Recognizing the need to encourage utilities to pursue a more granular modeling lens to capture the value of flexible generation and storage to integrate renewables, the National Association of Regulatory Utility Commissioners (NARUC) adopted a unanimous resolution in November 2018 stating,
Utilities should develop, if appropriate, new modelling tools and new planning frameworks that allow for a more complete evaluation of flexible resources, such as energy storage; . . . including sub-hourly services.5
The NARUC resolution specifically calls on resource planners to update their modeling frameworks to capture the value of flexible resources. Flexible resources can be supply-side resources such as storage, fast-start engines, and turbines, as well as demand-side resources such as demand response and other distributed energy resources (DERs).
Modelers use the term "deterministic" to refer to mathematical models whose outputs are fully determined by their inputs and calculated using fixed algorithms. F = ma and E = mc² are deterministic models: a given set of inputs (mass and acceleration in the first, mass in the second) can only produce a single estimate of the output (force and energy, respectively). Modern economic decision analysis should include stochastic analysis to meaningfully characterize uncertainty and risk, because power system behavior is inherently stochastic as a function of a chaotic natural process, weather.6
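To make the distinction concrete, the short Python sketch below (illustrative only, with assumed load parameters) contrasts a deterministic calculation, which maps one set of inputs to exactly one output, with a stochastic simulation, which maps assumed probability distributions to a distribution of outcomes that can be used to characterize risk.

```python
import numpy as np

# Deterministic: F = m * a -- one input set, one answer.
def force(mass_kg: float, accel_ms2: float) -> float:
    return mass_kg * accel_ms2

print(force(10.0, 9.81))  # always 98.1 N

# Stochastic: a hypothetical weather-driven load. The mean temperature,
# temperature sensitivity, and noise level below are illustrative assumptions.
rng = np.random.default_rng(seed=42)
n_draws = 1000
temperature = rng.normal(loc=30.0, scale=5.0, size=n_draws)          # deg C
load_mw = 900 + 25 * (temperature - 20) + rng.normal(0, 40, n_draws)

# Instead of a single number, the stochastic view yields a distribution.
print(f"mean load: {load_mw.mean():.0f} MW, "
      f"95th percentile: {np.percentile(load_mw, 95):.0f} MW")
```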
Weather across an interconnected, continental-sized power grid imparts heterogeneous effects on load, renewable generation, and thermal fleet performance. The confluence of these factors affects price formation in power markets. Accurately modeling power prices is important, because price is the construct through which we value resources and evaluate whether their physical and financial characteristics provide value to the system or are obsolete, meaning the resource should be retired.
Deterministic production cost modeling provides a single view of the future by using hourly weather-normalized load, average wind and solar production, and market price fluctuations with significantly less variability than actual observations. When renewables were a relatively small portion of supply and load was relatively insensitive to weather, the lack of variability in modeling was not a key concern. As more renewables have entered the market, however, the impact of weather on pricing and planning has increased, producing both explained and unexplained effects on power prices. The explained components reflect the causal relationship of weather to load and renewable output. The unexplained effects are a random component that cannot be fully explained by fundamental variables. Power prices thus have both explained components, driven by the fundamentals of supply and demand, and a significant unexplained random component referred to as a stochastic effect.
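As a rough illustration of the explained/unexplained split described above, the hedged sketch below fits a simple fundamental relationship between price and net load on synthetic data and treats the residual as the stochastic component; the coefficients and noise levels are assumptions, not market estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
hours = 24 * 30
net_load = rng.uniform(500, 1500, hours)                  # MW, hypothetical
price = 20 + 0.03 * net_load + rng.normal(0, 8, hours)    # $/MWh, synthetic

# Fit the fundamental (explained) relationship: price ~ a + b * net_load.
X = np.column_stack([np.ones(hours), net_load])
coef, *_ = np.linalg.lstsq(X, price, rcond=None)
explained = X @ coef
residual = price - explained                              # stochastic component

print(f"explained std: {explained.std():.1f} $/MWh, "
      f"residual std: {residual.std():.1f} $/MWh")
```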
If power price formation is a stochastic process, why do most utility decision makers still use deterministic representations of the power system to make financial decisions? In the past, computing limitations made stochastic modeling impractical; deterministic models were developed when computing power was significantly more costly. Today, computing capacity constraints have largely been resolved, enabling the use of more sophisticated modeling tools. Although hourly deterministic production cost modeling adequately informed regulatory and merchant decisions over the last three decades, the limitations of this approach have now been exposed as high renewable penetration makes weather a significant driver of the power supply. Given this change, it is becoming clear that the old ways of modeling will not support the future grid as we move toward decarbonization.
The impact of renewable generation on power price volatility demonstrates changing market dynamics, as shown in Figure 1. This chart plots a measure of price volatility (congestion persistence, or the summation of 5-minute real-time prices above $100 and below $0) against the penetration of renewable energy in CAISO, ERCOT, and SPP. Each market demonstrates the same general trend: the volatility of power prices increases roughly linearly with renewable penetration rates.
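The sketch below shows one way such a volatility measure could be computed from a 5-minute real-time price series; the thresholds follow the description above, but the exact definition used for Figure 1 (for example, whether negative prices enter as absolute values) is our assumption.

```python
import numpy as np

def congestion_persistence(prices_5min: np.ndarray,
                           high: float = 100.0,
                           low: float = 0.0) -> float:
    """Sum of |price| over 5-minute intervals above `high` or below `low`."""
    spikes = prices_5min[(prices_5min > high) | (prices_5min < low)]
    return float(np.abs(spikes).sum())

# Synthetic year of 5-minute prices (365 days * 24 h * 12 intervals) with
# occasional positive and negative spikes, for illustration only.
rng = np.random.default_rng(1)
base = rng.normal(35, 10, 105_120)
spikes = rng.choice([0.0, 400.0, -30.0], size=base.size, p=[0.995, 0.003, 0.002])
prices = base + spikes

print(f"volatility value: ${congestion_persistence(prices):,.0f}")
```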
In Figure 1, the solid-color lines show the value of volatility estimated by traditional production cost modeling at approximately an order of magnitude below observed market values. Compared with observed values, this represents a serious misstatement of real-world conditions in renewable-heavy markets. Renewable production is intermittent and variable, which manifests as volatility on seasonal, daily, hourly, and minute-to-minute timescales. Because power systems must keep load and generation in near-perfect balance at all times to maintain system frequency, some other resource must compensate to balance the renewables. Power markets reflect this requirement through the prices of ancillary services and real-time prices that reveal the actual supply-demand imbalance five minutes before the delivery interval. The growth in the volatility value on the y-axis shows the aggregate market value that can be captured by flexible balancing resources such as batteries. This value has grown alongside the growth of intermittent renewable resources, as reflected by the dotted lines. In other words, intermittent renewables create a need for flexible balancing resources. As renewables have been deployed in the western markets since 2015, increasing market price volatility has revealed a value-capture opportunity for flexible resources to provide fast balancing services.
Because economically efficient markets represent the marginal cost of production through price signals, resources can be built more efficiently and provide the greatest value when responding to a clear and transparent representation of marginal costs. For resource valuation, the modeling of markets needs to parallel the increases in price variability realized in actual market operations. Increasing volatility has important impacts on resource selection, because it signals the decreasing value of inflexible baseload resources while increasing the value of flexible generation that can profit from rapid price movements up and down.
Increased renewable production creates volatility in power prices through the four principal mechanisms shown in Table 1.
The addition of utility-scale renewables generally means energy will increasingly be generated at locations distant from population centers, potentially creating additional transmission congestion. Battery storage may be able to alleviate some of this congestion and offset the daily ramps that occur when the sun sets. The ability of storage to absorb significant price volatility depends on the value storage realizes from the market. For example, a 4-hour battery operating in CAISO in 2018, dispatching against SP-15 hub prices, earned 70 percent more in the real-time sub-hourly market than by exclusively following prices on a day-ahead hourly basis. Battery storage's ability to participate in both energy and ancillary service markets over the same period allows further revenue stacking, leading to a 160 percent improvement relative to a simple hourly one-cycle arbitrage strategy.7 A combustion turbine operating against the same prices would earn $35/kW-year but generally realize no difference between hourly and sub-hourly prices, because long start-up times and high start-up costs preclude flexible operation to capture value from 5- or 15-minute price spikes.8
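For readers who want to see the baseline being compared against, the sketch below implements a simple hourly one-cycle arbitrage for a hypothetical 1 MW / 4 MWh battery; the price shape, efficiency, and dispatch rule are illustrative assumptions and do not reproduce the CAISO results cited above.

```python
import numpy as np

def daily_one_cycle_arbitrage(hourly_prices: np.ndarray,
                              duration_h: int = 4,
                              efficiency: float = 0.85) -> float:
    """Gross margin ($ per MW of capacity) from one charge/discharge cycle per day."""
    margin = 0.0
    for day in hourly_prices.reshape(-1, 24):          # one row per day
        ranked = np.sort(day)
        buy = ranked[:duration_h].sum()                 # charge in the cheapest hours
        sell = ranked[-duration_h:].sum()               # discharge in the priciest hours
        margin += efficiency * sell - buy
    return margin

# Stylized year of hourly prices with an evening peak; the shape and noise are
# assumptions chosen only to make the mechanics visible.
rng = np.random.default_rng(7)
daily_shape = 30 + 25 * np.sin(np.arange(24) / 24 * 2 * np.pi - np.pi)
prices = np.tile(daily_shape, 365) + rng.normal(0, 5, 365 * 24)

print(f"hourly one-cycle arbitrage: ${daily_one_cycle_arbitrage(prices):,.0f}/MW-yr")
```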
To better understand the practical effect of model-limited choice on resource selection, the following analysis evaluates adding resources to a portfolio using both the traditional PCM approach and the HD PCM approach. Each model minimized the net present value of costs while maintaining the same level of reliability over a 20-year period. The traditional approach was run deterministically at an hourly time step with perfect foresight, whereas the HD PCM approach was run stochastically at hourly and 5-minute time steps with imperfect foresight. The analysis in Table 2 demonstrates conflicting results between the models, suggesting that the modeling lens matters in a material way for resource selection.
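The toy example below illustrates only the ranking mechanics at work in such a comparison: candidate resources are ranked by the net present value of their net cost, with and without a sub-hourly "flexibility" margin credited by a more granular model. All costs and margins are hypothetical and are not the values behind Table 2.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    fixed_cost: float        # $/kW-yr, assumed
    hourly_margin: float     # $/kW-yr credited by an hourly, deterministic model
    subhourly_uplift: float  # extra $/kW-yr revealed by a 5-minute, stochastic model

def npv_net_cost(c: Candidate, include_uplift: bool,
                 rate: float = 0.07, years: int = 20) -> float:
    """Net present value of (fixed cost - credited margin) over the study period."""
    margin = c.hourly_margin + (c.subhourly_uplift if include_uplift else 0.0)
    annual = c.fixed_cost - margin
    return sum(annual / (1 + rate) ** t for t in range(1, years + 1))

candidates = [  # every number here is hypothetical, for illustration only
    Candidate("Combined cycle", fixed_cost=120, hourly_margin=90, subhourly_uplift=2),
    Candidate("Frame CT",       fixed_cost=70,  hourly_margin=35, subhourly_uplift=5),
    Candidate("Battery (4 h)",  fixed_cost=110, hourly_margin=50, subhourly_uplift=45),
]

for label, uplift in [("traditional PCM view", False), ("HD PCM view", True)]:
    order = sorted(candidates, key=lambda c: npv_net_cost(c, uplift))
    print(label, "->", [c.name for c in order])
```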
The deterministic modeling valuations in rank order are shown in the left column of Table 2, whereas the right column shows how the rank order is nearly reversed with the more granular HD PCM that includes sub-hourly dynamics, stochastically simulated future states driven by variability in weather, and imperfect foresight. The implications of this disruptive change in ordering of least-cost resources can be summarized as “legacy modeling leads to legacy decisions.”
In the traditional PCM valuation, the Combined Cycle was the clear winner, and for the past several decades the Combined Cycle was commonly the go-to new energy resource. The Frame was also highly rated; again, until recently, Frames comprised most new peaking capacity. Aeroderivatives are smaller, more flexible turbines that are generally more expensive per kilowatt than the larger Frame units. Internal combustion engines are highly flexible (they can ramp to full load in less than five minutes and have no start-up or shutdown costs) but are small and have higher capital costs. The stand-alone battery and battery/solar hybrids are the most flexible resources (with "free" fuel for the hybrid) but have a higher capital cost per kilowatt than conventional generation.
As a result of the 13 percent decrease in the net present value of cost revealed by the HD PCM, the stand-alone battery moved up to first place. The more constrained battery/solar hybrid's cost declined 8 percent, moving it into second place. The highly flexible thermal ICEs and aeroderivatives declined 5 percent in cost, improving their rankings. Although the CC and the Frame were adequately captured by the traditional PCM, their relative inflexibility meant they did not exhibit the additional "flexibility premium" uncovered by the HD PCM valuation. This significant change in selected resources demonstrates that legacy models do a reasonable job of valuing traditional resources but a poor job of valuing the new types of technologies needed for renewable integration, because the traditional model systematically understates the value of flexible generation. Thus, with increasing renewable generation, traditional PCMs are systematically biased and inconsistent in valuing the economics of flexible generating resources.
Even though the differences in the numbers are not large, the importance of ranking in the decision-making process cannot be overstated. Cognitively, decision makers often take note of the rankings and immediately frame the top choice as the "best" or "preferred" option. A lack of understanding of or trust in newer technologies may lead utility decision makers to favor the legacy resource, since it is backed by the traditional valuation method. Getting the decision analysis right with HD PCM tools may help utility leaders understand the value of flexible resources in the new energy system paradigm.
The deterministic, hourly, simplified model has been used for a long time and enjoys the inertia of regulatory precedent. However, utility leaders and regulators are bound not by precedent but by the prudence standard. Meeting the prudence standard requires making decisions based on what is known and knowable, using all available information, the best tools, and best practices. With the recognition that more robust models lead to better decisions, we can more confidently chart the path toward the low-carbon, reliable, decentralized, secure, and resilient grid we need to succeed in the 21st century.
_________________________________________
1 Pechman, C. Regulating Power: The Economics of Electricity in the Information Age. Kluwer Academic Publishers, 1993.
2 Pechman, C. 1993.
3 Highly flexible generation has low or no start-up and shutdown costs, has no minimum run times, and can ramp quickly or, in the case of batteries, even instantaneously.
4 Net load = Load - Renewables
5 NARUC, “EL-4/ERE-1 Resolution on Modeling Energy Storage and Other Flexible Resources,” November 16, 2018.
6 Stochastic (probabilistic) models use simulation techniques to capture many possible outcomes of processes that include randomness and correlation between variables.
7 Author’s analysis using historic day-ahead and real-time market prices in an optimization engine with imperfect foresight adjustment.
8 Energy Information Administration, “Capital Cost and Performance Characteristic Estimates for Utility Scale Electric Power Generating Technologies,” February 2020.
Dr. Dorris, a pioneer of innovative energy modeling solutions for over 20 years, is the CEO and founder of Ascend Analytics, an industry-leading, 170+ person energy analytics software and consulting company. His work revolutionized the modeling of power markets to support real-time operations, portfolio management, and resource planning. Throughout his career, Dr. Dorris has performed independent market assessments to support over $50 billion in project financing and has advised multiple boards of directors. Industry leaders consistently turn to Dr. Dorris for expert testimony and executive presentations. He previously served as the lead expert witness for Merrill Lynch in the Enron proceedings, in addition to numerous regulatory proceedings. Prior to founding Ascend, Dr. Dorris served as CEO of e-Acumen, a 60-person energy analytics software firm.