In the LA Times – looking for alternative solutions to storm outages

I was interviewed by a Los Angeles Times reporter about the recent power outages in Northern California as a result of the wave of storms. Our power went out for 48 hours beginning New Year’s Eve and again for 12 hours the next weekend:

After three days without power during this latest storm series, Davis resident Richard McCann said he’s seriously considering implementing his own microgrid so he doesn’t have to rely on PG&E.

“I’ve been thinking about it,” he said. McCann, whose work focuses on power sector analysis, said his home lost power for about 48 hours beginning New Year’s Eve, then lost it again after Saturday for about 12 hours.

While the storms were severe across the state, McCann said Davis did not see unprecedented winds or flooding, adding to his concerns about the grid’s reliability.

He said he would like to see California’s utilities “distributing the system, so people can be more independent.”

“I think that’s probably a better solution rather than trying to build up stronger and stronger walls around a centralized grid,” McCann said.

Several others were quoted in the article offering microgrids as a solution to the ongoing challenge.

Widespread outages also occurred in Woodland and Stockton, even though the winds were not exceptionally strong compared to recent experience. Given the widespread outages two years ago and the three “blue sky” multi-hour outages we had in 2022 (none of which coincided with the September heat storm, when 5,000 Davis customers lost power), I’m doubtful that PG&E is ready for what’s coming with climate change.

PG&E instead proposes to invest up to $40 billion over the next eight years to protect service reliability for 4% of its customers by undergrounding wires in the foothills, which will raise our rates by as much as 70% by 2030! A cost-effective alternative that would cost 80% to 95% less is sitting before the Public Utilities Commission, but it is unlikely to be approved. Another opportunity to head off PG&E and redirect some of that money toward fixing our local grid comes up this summer under a new state law.

While the winds have been strong, they have not been in the 99th-percentile range of experience that should lead to multiple catastrophic outcomes in short order. And having two major events within a week, plus the outage in December 2020, shows that these are not statistically unusual events. We have experienced similarly fierce winds without such extended outages. Prior to 2020, Davis experienced only two extended outages in the previous two decades, in 1998 and 2007. Clearly the lack of maintenance on an aging system has caught up with PG&E. PG&E should reimagine its rural undergrounding program for mitigating wildfire risk to use microgrids instead. That would free up most of the billions it plans to spend on less than 4% of its customer base to instead harden its urban grid.

Per Capita: Climate needs more than just good will

I wrote this guest column in the Davis Enterprise about the City’s Climate Action and Adaptation Plan. (Thank you John Mott-Smith for extending the privilege.)

Dear Readers, the guest column below was written by Richard McCann, a Davis resident and expert on energy and climate action plans.

————

The city of Davis is considering the first update of its Climate Action and Adaptation Plan (CAAP) since 2010, with a new 2020-2040 plan. The city plans to update the CAAP every couple of years to reflect changing conditions, technologies, financing options, laws and regulations.

The plan does not and cannot achieve a total reduction in greenhouse gas emissions simply because we do not control all of the emission sources — almost three-quarters of our emissions are from vehicles that are largely regulated by state and federal laws. But it does lay out a means of putting a serious dent in the overall amount.

The CAAP offers a promising future and accepts that we have to protect ourselves as the climate worsens. Among the many benefits we can look forward to are avoiding volatile gas prices while driving cleaner, quieter cars; faster and more controllable cooking while eliminating toxic indoor air; and air conditioning and heating without having to make two investments while paying less.

To better adapt, we’ll have a greener landscape, filtered air for rental homes, and community shelter hubs powered by microgrids to ride out more frequent extreme weather.

We have already seen that adding solar panels raises the value of a house by as much as $4,000 per installed kilowatt (so a 5 kilowatt system adds $20,000). We can expect similar increases in home values with these new technologies due to the future savings, safety and convenience. 

Several state and federal laws and rules foretell what is coming. By 2045 California aims to be at zero net GHG emissions. That will require retiring all of the residential and commercial gas distribution lines. PG&E has already started a program to phase out its lines. A change in state rules will remove several large natural gas appliances, such as furnaces, from the market by 2030.

In addition, PG&E will no longer offer subsidies to developers to install gas lines to new homes starting next year. The U.S. Environmental Protection Agency appears poised to push further the use of electric appliances in areas with poor air quality such as the Sacramento Valley. (Renewable gas and hydrogen will be too expensive and there won’t be enough to go around.)

Without sales to new customers or for replaced furnaces, the cost of maintaining the gas system will rise substantially so switching to electricity for cooking and water heating will save even more money. The CAAP anticipates this transition by having residents begin switching earlier. 

In addition, the recently enacted federal Inflation Reduction Act directs between $400 billion and $800 billion toward funding these types of changes. The California Energy Commission’s budget for this year went from $1 billion to $10 billion to finance these transitions. The CAAP lays out a process for securing these financial sources for Davis and its residents.

That said, some have objected to the CAAP as being too draconian and infringing on personal choices. The fact is that we are now in the midst of a climate emergency — the City Council endorsed this concern with a declaration in 2019. We’re already behind schedule to head off the worst of the threatening impacts. 

We won’t be able to rely solely on voluntary actions to achieve the reductions we need. That the CAAP has to include these actions proves that people have not been acting on their own despite a decade of cajoling since the last CAAP. While we’ve been successful at encouraging voluntary compliance with easy tasks like recycling, we’ve used mandatory permitting requirements to gain compliance with various building standards including energy efficiency measures. (These are usually enforced at point-of-sale of a house.)

We have a choice of mandatory ordinances, incentives through taxes or fees, and subsidies from grants and funds — voluntary just won’t deliver what’s needed. We might be able to financially help those least able to afford changing stoves, heaters or cars, but those funds will be limited. The ability to raise taxes or fees is restricted due to various provisions in the state’s constitution. So we are left with mandatory measures, applied at the most opportune moments. 

Switching to electricity for cooking and water heating may involve some costs, some or most of which will be offset by lower energy costs (especially as gas rates go up). If you have an air conditioner, you’re likely already set up for a heat pump to replace your furnace — it’s a simple swap. Even so, you can avoid some costs by using a 120-volt induction cooktop instead of a 240-volt model, and by installing a circuit-sharing plug or breaker for large loads to avoid panel upgrades.

The CAAP will be fleshed out and will evolve for at least the next decade. Change is coming and is inevitable given the dire situation. But this change gives us opportunities to clean our environment and make our city more livable.

The fundamental truth of marginal and average costs

Opponents of increased distributed energy resources who advocate for centralized power distribution insist that marginal costs are substantially below retail rates – as little as 6 cents per kilowatt-hour. Yet average costs generally continue to rise. For example, a claim has been repeatedly asserted that the marginal cost of transmission in California is less than a penny a kilowatt-hour. Yet PG&E’s retail transmission rate component went from 1.469 cents per kWh in 2013 to 4.787 cents in 2022. (SDG&E’s transmission rate is now 7.248 cents!) For the average rate to have risen that much, the marginal cost must be higher than 4.8 cents (and likely much higher).

Average cost equals the sum of marginal costs across all units, divided by the number of units. Inversely, marginal cost equals the incremental change in total cost from adding a unit of demand or supply. The two concepts are interlinked, so one cannot speak of one without speaking of the other.

The chart at the top of this post shows the relationship of marginal and average costs. Most importantly, it is not mathematically possible to have rising average costs when marginal costs are below average costs. So, given that average transmission costs are rising, any assertion that the marginal cost of transmission is less than the average cost must be mathematically false.
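For readers who want the algebra, here is a minimal statement of that relationship (standard cost theory, in my notation rather than anything from the original post). With total cost C(q) at output q, average cost is AC(q) = C(q)/q and marginal cost is MC(q) = C′(q). Differentiating the average:

\[ \frac{d\,AC}{dq} = \frac{d}{dq}\left(\frac{C(q)}{q}\right) = \frac{q\,C'(q) - C(q)}{q^{2}} = \frac{MC(q) - AC(q)}{q} \]

Average cost rises if and only if MC(q) > AC(q). Applied to the figures above: a transmission rate that has climbed to 4.787 cents per kWh and is still rising implies a marginal cost above 4.787 cents.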

Don’t get too excited about the fusion breakthrough yet

The U.S. Department of Energy announced on December 13 that a net positive fusion reaction had been achieved at the Lawrence Livermore National Laboratory. While impressive, one aside near the end of the announcement raises another substantial barrier:

“(T)he fusion reaction creates neutrons that significantly stress equipment, and could potentially destroy that equipment.”

While the momentary burst produced about 150% of the energy input from the lasers, the lasers required about 150 times more energy than they delivered.
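To put rough numbers on this (the megajoule figures are from DOE’s public announcement, not from this post, so treat them as my gloss): the target released about 3.15 MJ from 2.05 MJ of laser energy, while the laser system drew roughly 300 MJ from the grid.

\[ Q_{target} = \frac{3.15\ \text{MJ}}{2.05\ \text{MJ}} \approx 1.5, \qquad Q_{facility} \approx \frac{3.15\ \text{MJ}}{300\ \text{MJ}} \approx 0.01 \]

So the facility as a whole returned only about 1% of the energy it consumed.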

The technology won’t be ready for use until at least 2060, which is a decade after the goal of achieving net zero carbon emissions. That means that we need to plan and progress without relying on this energy source.

David Mitchell in the LA Times: As drought drives prices higher, millions of Californians struggle to pay for water

M.Cubed Partner David Mitchell was interviewed about rising residential water rates in California for this October 24 article:

Across the state, water utility prices are escalating faster than other “big ticket” items such as college tuition or medical costs, according to David Mitchell, an economist specializing in water.

“Cost containment is going to become an important issue for the sector in the coming years” as climate change worsens drought and water scarcity, he said.

The price of water on the Nasdaq Veles California Water Index, which is used primarily for agriculture, hit $1,028.86 for an acre-foot on Oct. 20 — a roughly 40% increase since the start of the year. An acre-foot of water, or approximately 326,000 gallons, is enough to supply three Southern California households for a year.

Mitchell said there are short- and long-term factors contributing to rising water costs.

Long-term factors include the replacement of aging infrastructure, new treatment standards, and investments in insurance, projects and storage as hedges against drought.

In the short term, however, drought restrictions play a significant role. When water use drops, urban water utilities — which mostly have fixed costs — earn less revenue. They adjust their rates to recover that revenue, either during or after the drought.

“So it’s not right now a pretty picture,” Mitchell said.

David Mitchell’s practice areas include benefit-cost analysis, regional economic impact assessment, utility rate setting and financial planning, and natural resource valuation. Mr. Mitchell has in-depth knowledge of the water supply, water quality and environmental management challenges confronting natural resource management agencies.

How to effectively compensate labor facing technological innovation

Many social and economic changes generate net benefits, but often there are big losers. The advent of automobiles, for example, led to the demise of horse stables and carriage makers. Most of those workers were able to shift to other jobs, such as at car manufacturers, so everyone largely benefited. But that’s not necessarily true today. Automation is displacing manufacturing employment (much more so than imports [link]), and new well-paying jobs that are accessible to those displaced are not being created. How could we compensate these workers for their lost opportunities?

We protect owners of intellectual property rights such as patents and copyrights from intrusion into their markets with similar products and services. If someone wants to invent a new communication device that uses the existing cell phone network, they almost certainly have to pay Qualcomm a licensing fee for chips that rely in part on the underlying technology Qualcomm invented. This protects Qualcomm’s investors from outright appropriation of the economic value they have created. It also allows those investors to share in the economic gains generated by another company, so Qualcomm is much less likely to oppose such innovations.

We could create a similar labor property right that protects the current economic value of workers by giving them a share of the economic gains created by a new innovation. Amazon could pay a portion of its profits as a “licensing” fee to workers displaced by online ordering and deliveries, or Ford workers could receive payouts based on the added profits created by using robots for car assembly. These fees would better align the interests of existing labor with beneficial innovation rather than putting them in opposition.
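To make the idea concrete, here is a purely illustrative sketch of how such a displacement “licensing” fee might be computed. The function name, the 10% fee rate, and the worker counts are all hypothetical choices of mine, not anything proposed in this post:

# Illustrative only: a per-worker payout funded by a share of the
# incremental profit attributable to an automating innovation.
def displacement_fee(profit_with_innovation: float,
                     baseline_profit: float,
                     fee_rate: float,
                     displaced_workers: int) -> float:
    # License only the gain over the pre-innovation profit baseline.
    incremental_gain = max(profit_with_innovation - baseline_profit, 0.0)
    return fee_rate * incremental_gain / displaced_workers

# Hypothetical example: $500 million of added profit, a 10% fee,
# shared among 2,000 displaced workers -> $25,000 per worker per year.
print(displacement_fee(1_500e6, 1_000e6, 0.10, 2_000))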

This proposal also would address the problem of a growing income gap as greater returns accrue to investors. Displaced workers would share in growing wealth rather than being sidelined searching for a job that they are less likely to be qualified for. In addition, this would reduce the downward pressure on wages created by automation.

The program could be established as a form of universal basic income (UBI) to pay those who are not yet retired a share of society’s wealth creation. Certainly capitalism is greased by creative destruction, but there are many who do not want and are not prepared for the extreme risks that go with rapid innovation. We do not need to give risk-taking investors all of the economic gains to incentivize innovation, and we would likely lessen political opposition to such changes.

Do small modular reactors (SMR) hold real promise?

The economic analyses of the projected costs for small modular reactors (SMRs) appear to rely on two important assumptions: 1) that the plants will run at the capacity factors of current nuclear plants (i.e., 70%-90%+) and 2) that enough will be built quickly enough to gain from “learning by doing” at scale, as has occurred with solar, wind and battery technologies. The problem with these assumptions is that they require that SMRs crowd out other renewables with little impact on gas-fired generation.

Achieving low costs in nuclear power requires high capacity factors, that is, total electricity output relative to potential output. The Breakthrough Institute study, for example, assumes a capacity factor greater than 80% for SMRs. The problem is that the typical system load factor, that is, the average load divided by the peak load, ranges from 50% to 60%. A generation capacity factor of 80% means that the plant is producing 20% more electricity than the system needs, and that other generation sources such as solar and wind will be pushed aside by this amount on the grid. Because SMRs cannot ramp up and down to the same degree as load swings, not only daily but also seasonally, the system will still need load-following fossil-fuel plants or storage. It is just the flip side of filling in for the intermittency of renewables.

To truly operate within the generation system in a manner that directly displaces fossil fuels, an SMR will have to operate at a 60% capacity factor or less. Accommodating renewables will lower that capacity factor further. Decreasing the capacity factor from 80% to 60% will increase the cost of an SMR by a third. This would increase the projected cost in the Breakthrough Institute report for 2050 from $41 per megawatt-hour to $55 per megawatt-hour. Renewables with storage are already beating this cost in 2022, and we don’t need to wait 30 years.
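A minimal sketch of that arithmetic, assuming the plant’s costs are essentially all fixed, so cost per megawatt-hour scales inversely with the capacity factor:

# Spreading the same fixed annual costs over less output raises unit cost.
def cost_at_capacity_factor(cost_per_mwh: float,
                            cf_assumed: float,
                            cf_actual: float) -> float:
    return cost_per_mwh * (cf_assumed / cf_actual)

# Breakthrough Institute's 2050 estimate at its assumed 80% capacity
# factor, re-evaluated at the 60% factor a load-following role implies:
print(cost_at_capacity_factor(41.0, 0.80, 0.60))  # ~54.7, i.e., ~$55/MWh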

And the Breakthrough Institute study relies on questionable assumptions about learning by doing in the industry. First, it assumes that conventional nuclear will experience a 5% learning benefit (i.e., costs will drop 5% for each doubling of capacity). In fact, the industry shows a negative learning rate: costs per kilowatt have been rising as more capacity is built. It is not clear how the SMR industry will reverse this trend. Second, the learning-by-doing effect in this industry is likely to be on a per-plant basis rather than the per-megawatt or per-turbine basis that has held for solar and wind. The very small unit sizes of solar panels and wind turbines allow for off-site factory production with highly repetitive assembly, whereas SMRs will require substantial on-site fabrication that is site-specific. SMR learning rates are more likely to follow those for building construction than those for other new energy technologies.

Finally, the report does not discuss the risk of catastrophic accidents. The probability of a significant accident is about 1 per 3,700 reactor operating years. Widespread deployment of SMRs will vastly increase the annual risk because that probability is independent of plant size. Building 1,000 SMRs could raise the expected frequency to one such accident every four years.
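The frequency arithmetic behind that claim:

\[ 1{,}000\ \text{reactors} \times \frac{1\ \text{accident}}{3{,}700\ \text{reactor-years}} \approx 0.27\ \text{accidents per year} \]

or roughly one significant accident every four years.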

The Fukushima nuclear plant catastrophe is estimated to have cost $300 billion to $700 billion. The next one could cost in excess of $1 trillion. This risk adds a cost of $11 to $27 per megawatt-hour.

Adding these risk costs on top of the adjusted capacity factor, the cost range rises to $65 to $82 per megawatt-hour.

The real lessons from California’s 2000-01 electricity crisis and what they mean for today’s markets

The recent reliability crises for the electricity markets in California and Texas ask us to reconsider the supposed lessons from the most significant extended market crisis to date – the 2000-01 California electricity crisis. I wrote a paper two decades ago, The Perfect Mess, that described the circumstances leading up to the event. Two other common threads about supposed lessons have circulated, but I do not accept either as a true solution; they are really about sharing risk once this type of crisis ensues rather than about preventing similar market malfunctions. Instead, the real lesson is that load serving entities (LSEs) must be able to sign long-term agreements that are unaffected, directly or indirectly, by variations in daily and hourly markets, so as to eliminate incentives to manipulate those markets.

The first and most popular explanation among many economists is that consumers did not see the swings in the wholesale generation prices in the California Power Exchange (PX) and California Independent System Operator (CAISO) markets. In this rationale, if consumers had seen the large increases in costs, as much as 10-fold over the pre-crisis average, they would have reduced their usage enough to limit the gains from manipulating prices. Consumers, in this view, should have shouldered the risks in the markets, and their cumulative creditworthiness could have ridden out the extended event.

This view is not valid for several reasons. The first and most important is that the compensation to utilities for stranded asset investments was predicated on calculating the difference between a fixed retail rate and the utilities’ cost of service for transmission and distribution plus the wholesale cost of power in the PX and CAISO markets. Until May 2000, that difference was always positive and the utilities were well on the way to collecting their Competition Transition Charge (CTC) in full before the end of the transition period on March 31, 2002. The deal was that if the utilities were going to collect their stranded investments, then consumers’ rates would be protected for that period. The risk of stranded asset recovery was entirely the utilities’, and both the California Public Utilities Commission in its string of decisions and the State Legislature in Assembly Bill 1890 were very clear about this assignment.
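A stylized statement of that mechanism (my notation, summarizing the description above): in each period the utilities collected CTC headroom of

\[ \text{CTC margin per kWh} = r_{frozen} - \left( c_{T\&D} + p_{wholesale} \right) \]

where r_frozen is the fixed retail rate, c_T&D the transmission and distribution cost of service, and p_wholesale the PX/CAISO power cost. That margin stayed positive until May 2000; once wholesale prices spiked, it turned negative and the frozen rate became a utility loss rather than a recovery stream.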

The utilities had chosen to support this approach, linking asset value to ongoing short-term market valuations, over an upfront separation payment proposed by Commissioner Jesse Knight. The upfront payment would have enabled linking power cost variations to retail rates at the outset, but the utilities would have had to accept the risk of uncertain forecasts about true market values. Instead, the utilities wanted to transfer the valuation risk to ratepayers, and in return ratepayers capped their risk at the retail rates current as of 1996. Retail customers were to be protected from undue wholesale market risk, and the utilities took on that responsibility. The utilities walked into this deal willingly and as fully informed as any party.

As the transition period progressed, the utilities transferred their collected CTC revenues to their respective holding companies to be disbursed to shareholders instead of prudently retaining them as reserves until the end of the transition period. When the crisis erupted, the utilities quickly drained what cash they had left and had to go to the credit markets. In fact, if they had retained the CTC cash, they would not have had to go to the credit markets until January 2001, based on the accounts that I was tracking at the time, and PG&E would not have had a basis for declaring bankruptcy.

The CTC left the market wide open to manipulation, and it is unlikely that any simple changes in the PX or CAISO markets could have prevented this. I conducted an analysis for the CPUC in May 2000, as part of its review of Pacific Gas & Electric’s proposed divestiture of its hydro system, based on a method developed by Catherine Wolfram in 1997. The finding was that a firm owning as little as 1,500 MW (which included most merchant generators at the time) could profitably gain from price manipulation for at least 2,700 hours in a year. The only market-based solution was for LSEs, including the utilities, to sign longer-term power purchase agreements (PPAs) for a significant portion (but not 100%) of the generators’ portfolios. (Jim Sweeney briefly alludes to this solution before launching into his preferred linkage of retail rates and generation costs.)
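A stylized version of that market power test (my simplification, not the CPUC filing itself): a generator that owns capacity q selling at price p profits from withholding w megawatts whenever the resulting higher price p′ satisfies

\[ p'\,(q - w) > p\,q \quad \Longleftrightarrow \quad \frac{p'}{p} > \frac{q}{q - w} \]

In tight hours, even modest withholding moves the price more than proportionately, so a 1,500 MW portfolio could clear this hurdle repeatedly; the May 2000 analysis found at least 2,700 such hours in a year.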

Unfortunately, State Senator Steve Peace introduced a budget trailer bill in June 2000 (enacted as Public Utilities Code Section 355.1, since repealed) that forced the utilities to sign PPAs only through the PX, which the utilities viewed as too limiting, and no PPAs were consummated. The utilities remained fully exposed until the California Department of Water Resources took over procurement in January 2001.

The second problem was a combination of unavailable technology and billing systems. Customers did not yet have smart meters, and paper bills could lag as much as two months behind initial usage. There was no real way for customers to respond in near real time to high generation market prices (even assuming that they would have been paying attention to such an obscure market). And as we saw in Texas during Winter Storm Uri in 2021, the only available consumer response for too many was to freeze to death.

This proposed solution is really about shifting risk from utility shareholders to ratepayers, not a realistic market solution. But as discussed above, at the core of the restructuring deal was a sharing of risk between customers and shareholders – a deal that shareholders failed to keep when they transferred all of the cash out of their utility subsidiaries. If ratepayers are going to take on the entire risk (as keeps coming up), then either the authorized return should be set at the corporate bond debt rate or the utilities should simply be publicly owned.

The second explanation of why the market imploded was that decentralization created a lack of coordination in providing enough resources. In this view, the CDWR rescue in 2001 righted the ship, but the exodus of the community choice aggregators (CCAs) now threatens system integrity again. The CPUC’s preferred solution is to reconcentrate power procurement and management with the investor-owned utilities (IOUs), thus killing the remnants of restructuring and markets.

The problem is that the current construct of the Power Charge Indifference Adjustment (PCIA) exit fee similarly leaves the market open to potential manipulation. And we’ve seen how virtually unfettered procurement between 2001 and the emergence of the CCAs resulted in substantial excess costs.

The real lessons from the California energy crisis are twofold:

  • Any stranded asset recovery must be done as a single or fixed payment based on the market value of the assets at the moment of market formation. Any other method leaves market participants open to price manipulation. This lesson should be applied in the case of the exit fees paid by CCAs and customers using distributed energy resources. It is the only way to fairly allocate risks between customers and shareholders.
  • LSEs must be unencumbered in signing longer-term PPAs, but they also should be limited ahead of time in their ability to recover stranded costs so that they have significant incentives to procure resources prudently. California’s utilities still lack this incentive.

That California owns its water doesn’t mean that the state can just take it back without paying for it

The researchers at the UC Davis Center for Watershed Sciences wrote an insightful blog on “Considerations for Developing An Environmental Water Right in California.” However, one passage with troubling economic implications jumped out at me:

The potential for abuse is particularly troubling when the State is using public funds to buy water, which technically belongs to the people of the state and which the State can already regulate to achieve the same aims.

It’s not helpful to invoke the fiction that the state can somehow award water rights, on which entities make economic investments for private uses, and then turn around and claim that the state can take those rights back without any compensation. That’s a foolish perspective that will lead to mispricing and misallocation of water use. It is reasonable to assert that the state can claim a right of first refusal on transactions, or even that a rights holder can’t withhold sale of a water right to the state, but in either case the rights holder receives compensation. The state’s right can easily be interpreted as that of a landlord who has a long-term lease agreement with a tenant: the tenancy agreement can be terminated, but with compensation to that tenant.

Close Diablo Canyon? More distributed solar instead

More calls for keeping Diablo Canyon open have come out in the last month, along with a proposal to pair the plant with a desalination project that would deliver water to somewhere as yet unspecified. (And there has been pushback from opponents.) There are better solutions, as I have written about previously. Unfortunately, those who are now raising this issue missed the details and nuances of the debate in 2016 when the decision was made, and they are not well informed about Diablo’s situation.

One important fact is that it is not clear whether continued operation of Diablo is safe. Unit No. 1 has one of the most embrittled containment vessels in the U.S. that is at risk during a sudden shutdown event.

Another is that the decision would require overriding a State Water Resources Control Board ruling that required ending the use of once-through cooling with ocean water. That compliance cost is what led to the closure decision: 10 cents per kilowatt-hour at current operational levels, and in excess of 12 cents under more likely operations.

So what could the state do fairly quickly for 12 cents per kWh instead? Install distributed energy resources focused on commercial and community-scale solar. These projects cost between 6 and 9 cents per kWh and avoid transmission costs of about 4 cents per kWh. They also can be paired with electric vehicles to store electricity and fuel the replacement of gasoline cars. Microgrids can mitigate wildfire risk more cost-effectively than undergrounding, so we can save another $40 billion there too. Most importantly, they can be built in a matter of months, much more quickly than grid-scale projects.

As for the proposal to build a desalination plant, pairing one with Diablo would be both overkill and a logistical puzzle. The Carlsbad plant produces 56,000 acre-feet annually for the San Diego County Water Authority. The Central Coast where Diablo is located has a State Water Project allocation of 45,000 acre-feet, which is not even fully used now. The Carlsbad plant uses 35 MW, or 1.6% of Diablo’s output. A plant built to use all of Diablo’s output could produce 3.5 million acre-feet, but the State Water Project would need to be significantly modified to move the water either back to the Central Valley or beyond Santa Barbara to Ventura. All of that adds up to a large cost on top of what is already a costly source of water at $2,500 to $2,800 per acre-foot.
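A minimal sketch of the scaling arithmetic, using only the figures above:

# Scale Carlsbad's desalination output up to Diablo Canyon's full capacity.
carlsbad_af_per_year = 56_000   # acre-feet per year
carlsbad_mw = 35                # plant load in MW
share_of_diablo = 0.016         # 35 MW is about 1.6% of Diablo's output

diablo_mw = carlsbad_mw / share_of_diablo  # implied full Diablo output, ~2,200 MW
af_at_full_diablo = carlsbad_af_per_year / share_of_diablo

print(f"{diablo_mw:,.0f} MW supports {af_at_full_diablo/1e6:.1f} million acre-feet/year")
# ~3.5 million acre-feet per year, dwarfing the Central Coast's
# 45,000 acre-foot State Water Project allocation.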