Tag Archives: storage

Paradigm change: building out the grid with renewables requires a different perspective

Several observers have asserted that we will require baseload generation, probably nuclear, to decarbonize the power grid. Their claim is that renewable generation isn’t reliable enough and is too distant from load centers to power an electrified economy.

The problem is that this perspective relies on a conventional approach to understanding and planning for future power needs. That conventional approach generally planned to meet the highest peak loads of the year with a small margin and then used the excess capacity to produce the energy needed in the remaining hours. The premise rested on using consumable fuel to store energy for the hours when electricity is needed.

Renewables such as solar and wind present a different paradigm. They capture and convert energy to electricity as it becomes available; the next step is to store that energy using technologies such as batteries. That means the system needs to be built to meet energy requirements, not peak loads.

Hydropower-dominated systems have already been built in this manner. For half a century, the Pacific Northwest’s complex on the Columbia River and its tributaries had so much excess peak capacity that it could meet much of California’s summer demand. Meeting energy loads during drought years was the real challenge. The Columbia River system could store up to 40% of the annual runoff in its reservoirs to assure sufficient supply.

For solar and wind, we will build capacity that is a multiple of the annual peak load so that we can generate and store enough energy to serve the loads that occur when the sun isn’t shining and the wind isn’t blowing. For example, in a system relying on solar power, a typical demand load factor is 60%, i.e., the average load is 60% of the peak or maximum load. A typical solar photovoltaic capacity factor is 20%, i.e., it generates an average output that is 20% of its peak output. In this example system, the required solar capacity would be three times the peak demand in order to produce sufficient energy, including the energy that must be stored. The amount of storage output capacity would equal the peak demand (plus a small reserve margin) less the amount of expected renewable generation during the peak hour.
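
To make that arithmetic concrete, here is a minimal sketch in Python of the sizing logic described above. The peak load, reserve margin, and solar output during the peak hour are illustrative assumptions, not planning values.

```python
# Minimal sizing sketch for the example above.
# All inputs are illustrative assumptions, not planning values.

peak_load_mw = 10_000          # assumed system peak demand
load_factor = 0.60             # average load = 60% of peak
solar_capacity_factor = 0.20   # average solar output = 20% of nameplate
reserve_margin = 0.05          # small reserve on top of peak
solar_output_at_peak = 0.25    # assumed solar output during the peak hour (fraction of nameplate)

# Size solar to meet annual ENERGY, not peak demand:
# nameplate * capacity_factor = peak * load_factor
solar_nameplate_mw = peak_load_mw * load_factor / solar_capacity_factor
print(f"Solar nameplate: {solar_nameplate_mw:,.0f} MW "
      f"({solar_nameplate_mw / peak_load_mw:.1f}x peak demand)")

# Size storage OUTPUT to cover the peak hour net of expected solar:
storage_power_mw = (peak_load_mw * (1 + reserve_margin)
                    - solar_nameplate_mw * solar_output_at_peak)
print(f"Storage output needed: {storage_power_mw:,.0f} MW")
```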

As a result, comparing the total amount of generation capacity installed to the peak demand becomes irrelevant. Instead we first plan for total energy need and then size the storage output to meet the peak demand. (And that storage may be virtually free as it is embodied in our EVs.) This turns the conventional planning paradigm on its head.

Electric vehicles as the next smartphone

In 2006 a cell phone was a portable phone that could send text messages. It was convenient but not transformative. No one seriously thought about dropping their landlines.

And then the iPhone arrived. Almost overnight consumers began to use it like their computer. They emailed, took pictures and sent them to their friends, then searched the web, then played complex games and watched videos. Social media exploded and multiple means of communicating and sharing proliferated. Landlines (and cable) started to disappear, and personal computer sales slowed. (And as a funny side effect, the younger generation seemed to quit talking on the phone.) The cell phone went from a means of one-on-one communication to a multi-faceted electronic tool that has become our pocket computer.

The share of the U.S. population owning a smartphone has gone from 35% to 85% over the last decade. We could achieve similar penetration rates for electric vehicles (EVs) if we rethink and repackage how we market EVs so that they become our indispensable “energy management tool.” EVs can offer much more than conventional cars, and we need to facilitate and market these advantages to sell them much faster.

EV pickups with spectacular features are about to be offered. These EVs may be a game changer for a different reason than the one transportation policy advocates focus on–they offer households the opportunity for near complete energy independence. These pickups have enough storage capacity to power a house for several days and are designed to supply power to many other uses, not just driving. Combined with solar panels installed both at home and in business lots, the trucks can carry energy back and forth between locations. This has the added benefit of increasing reliability (local distribution outages are 15 times more likely than system-level ones) and resilience in the face of increasingly frequent extreme events.

This all can happen because cars are parked 90-95% of the time. That offers power-source reliability in the same range as conventional generation, and the dispersion created by a portfolio of smaller sources further enhances that availability. Another important fact is that the total power capacity of the autos on California’s roads is over 2,000 gigawatts. Compared to California’s peak load of about 63 gigawatts, that is more than 30 times the capacity we need. If we simply get to 20% penetration of EVs, of which half have grid-interconnection and control capability, we’ll have three times more capacity than we would need to meet our highest demands. There are other energy management issues, but solving them is feasible once we realize there will not be a real physical constraint.
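
As a rough check on those figures, here is a back-of-the-envelope calculation using the round numbers cited above; the penetration and control shares are the assumptions from the example.

```python
# Rough check of the aggregate EV capacity arithmetic above.
# Inputs are the round figures cited in the text.

fleet_power_gw = 2_000      # total power capacity of autos on California roads
ca_peak_load_gw = 63        # approximate California peak load

ev_penetration = 0.20       # assumed share of the fleet that is electric
controllable_share = 0.50   # assumed share of EVs with grid-interactive controls

available_gw = fleet_power_gw * ev_penetration * controllable_share
print(f"Fleet capacity vs. peak load: {fleet_power_gw / ca_peak_load_gw:.0f}x")
print(f"Grid-interactive EV capacity: {available_gw:.0f} GW "
      f"({available_gw / ca_peak_load_gw:.1f}x peak load)")
```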

Further, used EV batteries can serve as stationary storage, either in homes or at renewable generation sites to mitigate transmission investments, and EVs can transport energy from solar panels between work and home.

The difference between these EVs and the current models is akin to the difference between flip phones and smartphones. One is a single-function device; we use the other to manage our lives. The marketing of EVs should shift course to emphasize these added benefits that are not possible with a conventional vehicle. The barriers are not technological but regulatory and contractual (battery warranty terms and utility interconnection rules).

As part of this EV marketing focus, automakers should follow two strategies, both drawn from smartphones. The first is that EV pickups should be leased as a means of keeping model features current. Leasing facilitates rolling out industry standards quickly (like installing the latest Android update) and adding other, even more attractive features. It also allows for more environmentally friendly disposal of obsolete EVs: materials can be more easily recycled, and batteries no longer usable for driving (generally below 70% of original capacity) can be repurposed for stand-alone storage.

The second is to offer add-on services. Smartphone companies offer media streaming, data management, and all sorts of other features beyond simple communication. Automakers can offer demand management to lower, or even eliminate, utility bills, along with onboard appliance and space-conditioning management so a homeowner need not install a separate system that is not easily updated.

Advanced power system modeling need not mean more complex modeling

A recent article by E3 and Form Energy in Utility Dive calls for more granular temporal modeling of the electric power system to better capture the constraints of a fully renewable portfolio and the requirements for supporting technologies such as storage. The authors have identified the correct problem–most current models use a “typical week” of loads that is an average of historic conditions, along with probabilistic representations of unit availability. This approach fails to capture the “tail” conditions in which renewables and currently available storage are likely to be insufficient.

But the answer is not a full-blown hour-by-hour model of the entire year with many permutations of the possibilities. These system production simulation models already take too long to run a single scenario due to the complexity of this giant “transmission machine.” Adding the required uncertainty will cause these models to run “in real time,” as some modelers describe it.

Instead, a separate analysis should first identify the conditions under which renewables plus current-technology storage are unlikely to meet demand. These include droughts that limit hydropower, extreme weather, and extended weather patterns that limit renewable production. Those conditions can then be input into the current models to assess how the system responds.
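
One way to picture that screening step: scan an hourly history of load and renewable output for extended stretches where renewables plus an assumed storage budget fall short, and hand only those periods to the detailed production model. A minimal sketch, assuming hypothetical column names and an illustrative storage budget:

```python
import pandas as pd

def find_deficit_periods(df, storage_gwh, window_hours=48):
    """Flag hours ending extended periods where renewables plus a fixed
    storage energy budget cannot cover load. df needs hourly 'load_gw' and
    'renewables_gw' columns (hypothetical names); storage_gwh is the assumed
    energy budget the storage fleet can supply over the window."""
    deficit = (df["load_gw"] - df["renewables_gw"]).clip(lower=0)
    # Rolling energy shortfall over the window the storage must bridge
    rolling_shortfall_gwh = deficit.rolling(window_hours).sum()
    flagged = rolling_shortfall_gwh > storage_gwh
    return df.index[flagged]

# Usage sketch: hours = find_deficit_periods(hourly_history, storage_gwh=500)
# The flagged windows become the boundary conditions fed to the detailed model.
```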

The two important fixes, which have always been problems in these models, are to energy-limited resources and unit commitment algorithms. Both are complex problems, and these models have not done well at scheduling seasonal hydropower pondage storage or at deciding which units to commit to meet a high demand several days ahead. (These problems are also why relying solely on hourly bulk power pricing doesn’t give an accurate measure of the true market value of a resource.) But focusing on these two problems is much easier than trying to incorporate the full range of uncertainty for all 8,760 hours for at least a decade into the future.
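
For the energy-limited piece, the classic fix is some form of peak shaving: spread a fixed energy budget over the highest-load hours subject to the plant’s maximum output. A simplified sketch of that logic (real pondage scheduling adds ramp limits, minimum flows, and forecast uncertainty):

```python
def peak_shave(net_load_mw, energy_budget_mwh, max_output_mw):
    """Dispatch an energy-limited plant against the highest-load hours.
    Greedy peak shaving: allocate output to hours in descending load order
    until the energy budget is exhausted. Illustrative, not a real scheduler."""
    dispatch = [0.0] * len(net_load_mw)
    remaining = energy_budget_mwh
    for hour in sorted(range(len(net_load_mw)),
                       key=lambda h: net_load_mw[h], reverse=True):
        if remaining <= 0:
            break
        mw = min(max_output_mw, remaining)   # hourly steps, so MW == MWh per hour
        dispatch[hour] = mw
        remaining -= mw
    return dispatch

# Example: a 1,000 MW plant with 6,000 MWh of water spread over a daily load shape
# dispatch = peak_shave(hourly_net_load, energy_budget_mwh=6_000, max_output_mw=1_000)
```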

We should not confuse precision with accuracy. The current models can be quite precise on specific metrics such as unit efficiency at different load points, but they can be inaccurate because they don’t capture the effect of load and fuel price variations. We should not be trying to achieve spurious precision through more granular modeling–we should be focusing on accuracy in the narrow situations that matter.

Nuclear vs. storage: which is in our future?

Two articles with contrasting views of the future showed up in Utility Dive this week. The first was an opinion piece by an MIT professor referencing a study he coauthored that compares the costs of an electricity network in which renewables supply more than 40% of generation with the costs of relying on advanced nuclear power. However, the report’s analysis relied on two key assumptions:

  1. Current battery storage costs are about $300/kWh and will remain static into the future.
  2. Current nuclear technology costs about $76 per MWh and advanced nuclear technology can achieve costs of $50 per MWh.

The second article immediately refuted the first assumption in the MIT study. A report from BloombergNEF found that average battery storage prices fell to $156/kWh in 2019 and projected further decreases to $100/kWh by 2024.

The reason this price drop is so important is that, as the MIT study pointed out, renewables will produce excess power at certain times and underproduce during peak periods. MIT assumes that system operators will have to curtail renewable generation during low-load periods and run gas plants to fill in at the peaks. (MIT pointed to California curtailing about 190 GWh in April; however, that added only 0.1% to the CAISO’s total generation cost.) But if storage is that cheap, then along with inexpensive solar and wind, additional renewable capacity can be built to store power for the early evening peaks. This could free us from having to plan for system peak periods and allow us to focus largely on energy production.
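
To see why the price trajectory matters, here is a hedged back-of-the-envelope comparison that converts a battery pack price in $/kWh into a rough cost per MWh of shifted energy and sets it beside the nuclear figures above. The cycle life, round-trip efficiency, and paired solar cost are illustrative assumptions, not values from either study.

```python
def storage_cost_per_mwh(capital_per_kwh, cycles=3_650, round_trip_eff=0.88):
    """Very rough cost of shifting 1 MWh through a battery: capital cost
    spread over assumed lifetime throughput (ignores financing, degradation,
    and O&M -- illustrative assumptions only)."""
    throughput_kwh = cycles * round_trip_eff
    return capital_per_kwh / throughput_kwh * 1_000   # $/MWh shifted

solar_cost_per_mwh = 35   # assumed cost of cheap solar energy, $/MWh

for pack_price in (300, 156, 100):   # MIT assumption, BNEF 2019, BNEF 2024 projection
    shift = storage_cost_per_mwh(pack_price)
    print(f"${pack_price}/kWh pack -> ~${shift:.0f}/MWh to shift, "
          f"~${solar_cost_per_mwh + shift:.0f}/MWh delivered "
          f"vs. nuclear at $76-$100/MWh")
```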

MIT’s second assumption is not validated by recent experience. As I posted earlier, the soon-to-be-completed Vogtle nuclear plant will cost ratepayers in Georgia and South Carolina about $100 per MWh–more than 30% above the assumption used by MIT. PG&E withdrew its relicensing request for Diablo Canyon because the utility projected the cost to be $100 to $120 per MWh. Another recent study found that nuclear costs worldwide exceed $100/MWh and that it takes an average of a decade to finish a plant.

Another group at MIT issued an earlier report intended to revive interest in using nuclear power. I’m not sure why MIT is so focused on this issue while continuing to rely on data and projections that are clearly outdated or wrong, but it does have one of the leading departments in nuclear science and engineering. It’s sad to see such a prestigious institution allowing its economic self-interest to cloud its vision of the future.

What do you see in the future of relying on renewables? Is it economically feasible to build excess renewable capacity that can supply enough storage to run the system the rest of the day? How would the costs of this system compare to nuclear power at actual current costs? Will advanced nuclear power drop costs by 50%? Let us know your thoughts and add any useful references.

The Business Roundtable takes the wrong lesson from California’s energy costs


The California Business Roundtable authored an article in the San Francisco Chronicle claiming that we only need to look to California’s energy prices to see what would happen under the “Green New Deal” proposed by the Congressional Democrats.

That article has several errors and is misleading in other respects. First, California’s electricity rates are high because of the renewable contracts signed nearly a decade ago, when renewables were just evolving and much more expensive. California’s investment was part of the reason that solar and wind costs are now lower than those of existing coal plants (a new study shows 75% of coal plants are uneconomic) and competitive with natural gas. Batteries that extend renewable operations have nearly become cost-effective. The article also claims that reliability has “gone down” when in fact we still have a large reserve margin; the California Independent System Operator found a 23% reserve margin when the target is only 17%. We also have the ability to install batteries quickly to address any shortfall. PG&E is installing over 500 MW of batteries right now to replace a large natural gas plant.

For the rest of the U.S., consumers will benefit from these lower costs today. Californians have paid too much for their power to date, due to mismanagement by PG&E and the other utilities, but consumers elsewhere will be able to avoid these foibles.

(Graphic: BNEF)

Reverse auctions for storage gaining favor


Two recent reports highlight the benefits of using “reverse auctions”. In a reverse auction, the buyer specifies a quantity to be purchased, and sellers bid to provide a portion of that quantity.  An article in Utility Dive summarizes some of the experiences with renewable market auctions.  A separate report in the Review of Environmental Economics and Policy goes further to lay out five guidelines:

  1. Encourage a Large Number of Auction Participants
  2. Limit the Amount of Auctioned Capacity
  3. Leverage Policy Frameworks and Market Structures
  4. Earmark a Portion of Auctioned Capacity for Less-mature Technologies
  5. Balance Penalizing Delivery Failures and Fostering Competition
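
For readers unfamiliar with the mechanics, a reverse auction clears by accepting the cheapest offers until the buyer’s target quantity is filled. A minimal pay-as-bid clearing sketch with hypothetical offers (real designs layer on the guidelines above):

```python
def clear_reverse_auction(target_mw, offers):
    """offers: list of (bidder, price_per_mwh, quantity_mw).
    Accept lowest-priced offers until target_mw is procured (pay-as-bid).
    Illustrative only; real auctions add carve-outs and penalty terms."""
    accepted, remaining = [], target_mw
    for bidder, price, qty in sorted(offers, key=lambda o: o[1]):
        if remaining <= 0:
            break
        take = min(qty, remaining)
        accepted.append((bidder, price, take))
        remaining -= take
    return accepted

# Example: procure 500 MW of storage from hypothetical bidders
bids = [("A", 42.0, 200), ("B", 38.5, 150), ("C", 51.0, 300), ("D", 45.0, 250)]
print(clear_reverse_auction(500, bids))
```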

This policy prescription requires well-informed policy makers balancing different factors–not a task that is well suited to a state legislature. Such a coherent policy can be developed in two ways. The first is to let a state commission work through a proceeding to set an overall target and structure. But perhaps a more fruitful approach would be to let local utilities, such as California’s community choice aggregators (CCAs), set up individual auctions, perhaps even setting their own storage targets and then experimenting with different approaches.

California has repeatedly made errors by relying too heavily on centralized market structures that overcommit or mismatch resource acquisition. This arises because a mistake by a single central buyer is multiplied across all load, while a mistake by one buyer within a decentralized market is largely isolated to that buyer’s load. Without perfect foresight, and lacking mechanisms to appropriately share risk between buyers and sellers, we should be designing an electricity market that mitigates risks to consumers rather than trying to achieve a mythological “optimal” result.

Relying on short term changes diminishes the promise of energy storage


I posted this response on EDF’s blog about energy storage:

This post accepts too easily the conventional industry “wisdom” that the only valid price signals come from short-term responses and effects. In general, storage and demand response are likely to lead to increased renewables investment even if GHG emissions increase in the short run. This post hints at that possibility, but it doesn’t make the point explicitly. (The only exception might be increased viability of baseloaded coal plants in the East, but even there I think the lower cost of renewables is displacing retiring coal.)

We have two facts about the electric grid that undermine the validity of short-term electricity market functionality and pricing. First, regulatory imperatives to guarantee system reliability cause new capacity to be built before any evidence of capacity or energy shortages appears in the ISO balancing markets. Second, fossil-fueled generation is no longer the incremental new resource in much of the U.S. electricity grid. While the ISO energy markets still rely on fossil-fueled generation as the “marginal” bidder, these markets are in fact just transmission balancing markets and not the sources for meeting new incremental loads. Most of that incremental load is now being met by renewables with near-zero operating costs, and those resources do not directly set the short-term prices. Combined with the first shortcoming, this means the total short-term price is substantially below the true marginal cost of new resources.

Storage policy and pricing should be set using long-term values and emission changes based on expected resource additions, not on tomorrow’s energy imbalance market price.

MIT tries to resurrect nuclear power


I received a notice of a new MIT study entitled “The Future of Nuclear Energy in a Carbon-Constrained World,” which looks at the technological, regulatory and economic changes required to make nuclear power viable again. A summary states:

The findings are that new policy models and cost-cutting technologies would help nuclear play a vital role in climate solutions. Progress in reducing carbon emissions requires a broad range of actions to effectively leverage nuclear energy.

However, nothing in the summary reveals the paradigm-shattering innovation that will be required to make nuclear power competitive with a diverse fleet of renewables plus storage that could achieve the same goals. A solar plant plus storage built with today’s technology still costs less than a current-technology nuclear plant. That alternative fleet would also provide better reliability by diversifying generation across smaller plants and would avoid any radiation contamination risk.

The nuclear industry must clearly demonstrate that it can get past the many hurdles that led to the recent cancellation of two projects in the southeast U.S. Reviving nuclear power will require more than fantasies about what might be.

Looking for pumped storage in all the wrong places


LADWP is proposing to spend $3 billion on a pumped storage facility at Hoover Dam on the Colorado River. Yet LADWP has not been using its aging 1,247 MW Castaic pumped storage plant on the State Water Project extensively in pumping-recovery mode. Instead, LADWP runs it more like a standard hydropower plant, using pumping to supplement and extend peak power generation rather than to store excess daytime power. And the SWP’s 759 MW pumped storage plant at the Hyatt-Thermalito powerhouse at Lake Oroville has not been used effectively for decades.

The more prudent course would seem to be to focus on refurbishing and updating existing facilities, with variable-speed pumps for example, to deliver utility-scale storage that can capture excess renewable generation nearer large load centers. The State Water Contractors should be incented to upgrade these facilities through contracts with the state’s electric utilities. Unfortunately, no direct market mechanism exists to provide a true value for these resources so long as the California Public Utilities Commission and the California Independent System Operator avoid developing full pricing. As it stands, the current pricing scheme socializes and subsidizes a number of electricity services such as transmission, unit commitment decisions, and reliability services.

Fighting the last war: Study finds solar + storage uneconomic now  | from Utility Dive

“A Rochester Institute of Technology study says a customer must face high electricity bills and unfavorable net metering or feed-in policies for grid defection to work.”

Yet…this study used current battery costs (at $350/kWh), ignored probable cost decreases, and then made more restrictive assumptions about how such a system might work. It’s not clear whether “defection” meant complete self-sufficiency or just reducing the generation portion of the bill (which in California is about half of the electricity bill). Regardless, the study shows that grid defection is already cost-effective in Hawaii, confirming the RMI findings. Even so, RMI said it would take at least 10 years before such defection was cost-effective even in high-cost states like New York and California.

A more interesting study would look at the “break-even” cost thresholds at which solar panels and batteries become competitive with utility service. Planners and decision makers could then assess the likelihood of reaching those levels within a range of time periods.
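
Here is a hedged sketch of what that break-even exercise could look like: find the combined solar and battery cost at which a household’s annualized cost of self-supply matches its utility bill. Every input below is a hypothetical placeholder for the sensitivity ranges such a study would test.

```python
def annualized(capital, rate=0.06, years=20):
    """Capital recovery factor times capital cost (simple annuity)."""
    crf = rate / (1 - (1 + rate) ** -years)
    return capital * crf

def self_supply_cost(solar_kw, battery_kwh, solar_cost_per_kw, battery_cost_per_kwh):
    """Annualized cost of a hypothetical solar + battery system
    (battery assumed to be replaced on a shorter 12-year life)."""
    return (annualized(solar_kw * solar_cost_per_kw)
            + annualized(battery_kwh * battery_cost_per_kwh, years=12))

# Hypothetical household: 8 kW solar, 30 kWh battery, $2,400/yr utility bill
utility_bill = 2_400
for battery_price in (350, 200, 100):            # $/kWh scenarios
    cost = self_supply_cost(8, 30, solar_cost_per_kw=2_500,
                            battery_cost_per_kwh=battery_price)
    print(f"${battery_price}/kWh battery -> ~${cost:,.0f}/yr "
          f"vs. ${utility_bill:,}/yr from the utility")
```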

Source: A study throws cold water on residential solar-plus-storage economics | Utility Dive