Tag Archives: economics

David Mitchell in the LA Times: As drought drives prices higher, millions of Californians struggle to pay for water

M.Cubed Partner David Mitchell was interviewed for this October 24 article on rising residential water rates in California:

Across the state, water utility prices are escalating faster than other “big ticket” items such as college tuition or medical costs, according to David Mitchell, an economist specializing in water.

“Cost containment is going to become an important issue for the sector in the coming years” as climate change worsens drought and water scarcity, he said.

The price of water on the Nasdaq Veles California Water Index, which is used primarily for agriculture, hit $1,028.86 for an acre-foot on Oct. 20 — a roughly 40% increase since the start of the year. An acre-foot of water, or approximately 326,000 gallons, is enough to supply three Southern California households for a year.

Mitchell said there are short- and long-term factors contributing to rising water costs.

Long-term factors include the replacement of aging infrastructure, new treatment standards, and investments in insurance, projects and storage as hedges against drought.

In the short term, however, drought restrictions play a significant role. When water use drops, urban water utilities — which mostly have fixed costs — earn less revenue. They adjust their rates to recover that revenue, either during or after the drought.

“So it’s not right now a pretty picture,” Mitchell said.
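A stylized illustration of the revenue-recovery arithmetic Mitchell describes, with entirely hypothetical numbers (the 80% fixed-cost share and the 20% conservation cut are my assumptions, not figures from the article):

```python
# Stylized example of drought-driven rate increases for a utility whose
# costs are mostly fixed. All numbers here are hypothetical.

revenue_requirement = 100_000_000  # $/yr the utility must collect
fixed_share = 0.80                 # share of costs unaffected by sales volume
baseline_sales_af = 200_000        # acre-feet sold in a normal year

baseline_rate = revenue_requirement / baseline_sales_af  # $/AF

# A drought mandate cuts deliveries 20%; only variable costs shrink with sales.
cut = 0.20
drought_sales_af = baseline_sales_af * (1 - cut)
variable_savings = revenue_requirement * (1 - fixed_share) * cut
drought_rate = (revenue_requirement - variable_savings) / drought_sales_af

print(f"Baseline rate: ${baseline_rate:.2f}/AF")
print(f"Drought rate:  ${drought_rate:.2f}/AF "
      f"({drought_rate / baseline_rate - 1:.0%} increase)")
```

With these inputs, a 20% drop in sales forces a roughly 20% rate increase even though the utility collects no more money than before.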

David Mitchell’s practice areas include benefit-cost analysis, regional economic impact assessment, utility rate setting and financial planning, and natural resource valuation. Mr. Mitchell has in-depth knowledge of the water supply, water quality and environmental management challenges confronting natural resource management agencies.

Is the NASDAQ water futures market transparent enough?

Futures markets are settled either physically, with actual delivery of the contracted product, or in cash, based on the difference between the futures contract price and the actual market price at expiration. The NASDAQ Veles California Water Index futures market is a cash-settled market. In this case, the “actual” price is constructed by a consulting firm based on a survey of water transactions. Unfortunately, this method may not be fully reflective of true market prices and, as we found in the natural gas markets 20 years ago, such survey-based indexes can be easily manipulated.
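For readers unfamiliar with cash settlement, here is a minimal sketch of the payoff arithmetic. The contract size and prices are assumptions for illustration only, not the actual Veles contract specifications:

```python
# Minimal sketch of cash settlement against an index. The contract size
# and prices below are assumptions for illustration only.

contract_size_af = 10          # acre-feet per contract (assumed)
futures_price = 900.00         # $/AF agreed when the position was opened
index_at_expiry = 1028.86      # $/AF reported by the index at settlement

# No water changes hands; the long receives (or pays) the difference.
payoff_per_contract = (index_at_expiry - futures_price) * contract_size_af
print(f"Long position settles for ${payoff_per_contract:,.2f} per contract")
```

The point is that the settlement payment depends entirely on the reported index value, so anyone who can nudge a survey-based index moves real money without ever delivering water.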

Most commodity futures markets, such as those for crude oil or pork bellies, have a specific delivery point, such as Brent North Sea crude, West Texas Intermediate at Cushing, Oklahoma, or Chicago for some livestock products. There is also an agreed-upon set of standards for the commodities, such as quality and delivery conditions. The problem with the California Water Index is that these various attributes are opaque or even unknown.

Two decades ago I compiled the most extensive water transfer database in the state to date. I understand the difficulty of collecting this information and properly classifying it. The bottom line is that there is no simple way to clearly identify the “water transfer price” at any given time.

Water supplied for agricultural and urban water uses in California has many different attributes. First is where the water is delivered and how it is conveyed. While water pumped from the Delta gets the most attention, surface water comes from many other sources in the Sacramento and San Joaquin Valleys, as well as from the Colorado River. The cost to move this water varies greatly by location, ranging from gravity-fed deliveries to a 4,000-foot lift over the Tehachapis.

Second is the reliability and timing of availability. California has the most complex set of water rights in the U.S., and most watersheds are oversubscribed. Water under a senior right delivered during the summer is more valuable than water under a junior right delivered in the winter.

Third is the quality of the water. Urban districts will compete for higher quality sources, and certain agricultural users can use higher salinity sources than others.

A fourth dimension is that water transfers are signed for different periods and delivery conditions, as well as other terms that directly impact prices.

All of these factors lead to a spread in prices that is not well represented by a single price “index.” This becomes even more problematic when a single entity such as the Metropolitan Water District enters the market and purchases one type of water, which skews the “average.” Bart Thompson at Stanford has asked whether this index will sufficiently reflect local variations.

Finally, many of these transactions are private deals between public agencies that do not reveal key attributes of these transfers, particularly price, because there is no open-market reporting requirement. A subsequent study of the market by the Public Policy Institute of California required explicit cooperation from these agencies and months of research. Whether a “real time” index is feasible in this setting is a key question.

The index managers have not been transparent about how the index is constructed. The delivery points are not identified, nor are the sources. Whether transfers are segmented by water right and term is not stated, nor is whether certain short-term transfers, such as the State Water Project Turnback Pool, are included. Without this information, it is difficult to assess the veracity of the reported index, and equally difficult to forecast its direction.

The housing market has many of these same attributes, which is one reason why you can’t buy a house through a central auction house or from a dealer. There are just too many different dimensions to be considered. There is a housing futures market, but housing has one key difference from the water transfer market: the price and terms are publicly reported to a government agency (usually a county assessor). Companies such as CoreLogic collect and publish this data (which is distributed by Zillow and Redfin).

In 2000, natural gas prices into California were summarized in a price index reported by Natural Gas Intelligence. The index was based on a phone survey that did not require verification of actual terms. As part of the electricity crisis that broke out that summer, gas traders found that they could push gas prices for sales to electricity generators higher by simply misreporting those prices or by making multiple sequential deals that ratcheted up the price. The Federal Energy Regulatory Commission and the Commodity Futures Trading Commission were forced to step in and establish standards for price reporting.

The NASDAQ Veles index has many of the same attributes that the gas market had then, but perhaps with even fewer regulatory protections. It is not clear how a federal agency could compel public agencies, including the U.S. Bureau of Reclamation, to report and document prices. Oversight of transactions by water districts is widely dispersed and usually assigned to the local governing board.

Trying to introduce a useful mechanism to this market sounds like an attractive option, but the barriers that have impeded other market innovations may be too great.

ERCOT has the peak period scarcity price too high

The freeze and resulting rolling outages in Texas in February highlighted the unique structure of the power market there. Customers and businesses were left with huge bills that have little to do with actual generation expenses. This is a consequence of Texas’s attempt to fit an arcane interpretation of an economic principle under which generators should be able to recover their investments from sales in just a few hours of the year. The problem is that the basic accounting for those cash flows does not match the true value of the power in those hours.

The Electric Reliability Council of Texas (ERCOT) runs an unusual wholesale electricity market that supposedly relies solely on hourly energy prices to provide the incentives for new generation investment. In practice, however, ERCOT uses the same type of administratively set subsidy to create enough potential revenue to cover investment costs. Further, a closer examination reveals that this price adder is set too high relative to the actual consumer value of peak load power. All of this leads to the conclusion that relying solely on short-run hourly prices as a proxy for the market value that accrues to new entrants is a misplaced metric.

The ERCOT market first relies on side payments to cover commitment costs (which creates barriers to entry, but that’s a separate issue) and second, transfers consumer value through the Operating Reserve Demand Curve (ORDC), which uses a fixed value of lost load (VOLL) in an arbitrary manner to create “opportunity costs” (more on that definition at a later time) so the market can have sufficient scarcity rents. This second price adder is at the core of ERCOT’s incentive system: energy prices alone are insufficient to support new generation investment. Yet ERCOT has ignored basic economics and set this value too high relative to both the alternatives available to consumers and basic regional budget constraints.

I started with an estimate of the number of hours during which prices need the ORDC to be at the full VOLL of $9,000/MWh to recover the annual revenue requirement of a combustion turbine (CT) investment, based on the parameters we collected for the California Energy Commission. It turns out to be about 20 to 30 hours per year. Even if the cost in Texas is 30% less, this is still more than 15 hours annually, every single year on average. (That has not been happening in Texas to date.) Note that for other independent system operators (ISOs), such as the California ISO (CAISO), the price cap is $1,000 to $2,000/MWh.
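A back-of-the-envelope version of this calculation is sketched below. The fixed-cost and energy-cost inputs are placeholders of my own, not the CEC parameters cited above:

```python
# Back-of-the-envelope: hours per year at the $9,000/MWh cap needed for a
# combustion turbine to recover its fixed costs. The cost inputs are
# placeholders, not the CEC parameters cited in the text.

ct_fixed_cost = 200.0      # $/kW-yr annualized capital + fixed O&M (assumed)
price_cap = 9000.0         # $/MWh ERCOT ORDC maximum
energy_cost = 30.0         # $/MWh fuel and variable O&M (assumed)

net_margin = (price_cap - energy_cost) / 1000.0   # $/kW earned per hour at cap
hours_needed = ct_fixed_cost / net_margin

print(f"Hours per year at the cap to break even: {hours_needed:.0f}")
# With these inputs: ~22 hours, in the 20-30 hour range cited above.
# A 30% lower Texas cost still implies roughly 15+ hours every year.
```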

I then calculated the cost to a customer of instead using a home generator to meet load during those hours, assuming a generator life of 10 to 20 years. That cost should cap the VOLL for residential customers, since it is their opportunity cost. The average unit costs about $200/kW and an expensive one about $500/kW. The resulting cost ranges from $3 to $5 per kWh, or $3,000 to $5,000/MWh. (If storage becomes more prevalent, this cost will drop significantly.) And that’s for customers who care about periodic outages—most just ride out a distribution system outage of a few hours with no backup. (Of course, if I experienced 20 hours a year of outages, I would get a generator too.) This calculation ignores the added value of using the generator during other distribution system outages, such as those created by the hurricanes that hit Texas every few years. Counting those hours drives the cost down even further, making the $9,000/MWh ORDC adder appear even more distorted.
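The generator arithmetic can be sketched as follows. The capital costs come from the paragraph above; the lifetimes and annual run-hours are my own illustrative assumptions, and they drive the result:

```python
# Levelized cost of backup from a home generator, used only during
# scarcity hours. Capital costs are from the text; run-hour assumptions
# are mine and drive the result.

def levelized_cost_per_kwh(capex_per_kw, life_years, hours_per_year):
    """Capital cost spread over lifetime energy served, $/kWh.
    (Ignores fuel and maintenance, which would raise it somewhat.)"""
    return capex_per_kw / (life_years * hours_per_year)

for capex, life, hours in [(200, 10, 6), (500, 20, 5), (200, 20, 20)]:
    cost = levelized_cost_per_kwh(capex, life, hours)
    print(f"${capex}/kW over {life} yrs at {hours} h/yr -> ${cost:,.2f}/kWh")
```

The more hours the generator is actually used, whether for scarcity events or storm outages, the lower the levelized cost, and hence the lower the implied cap on VOLL.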

The second calculation I did was to look at the cost of an extended outage. I used the outages during Hurricane Harvey in 2017 as a benchmark event. Based on ERCOT and U.S. Energy Information Administration reports, it looks like 1.67 million customers were without power for 4.5 days. Using the Texas gross state product (GSP) of $1.9 trillion as reported by the St. Louis Federal Reserve Bank, I calculated the economic value lost over 4.5 days, assuming a 100% loss, at $1.5 billion. If we assume that the electricity outage is 100% responsible for that loss, the lost economic value is just under $5,000/MWh. This represents the budget constraint on willingness to pay to avoid an outage. In other words, the Texas economy can’t afford to pay $9,000/MWh.
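The arithmetic behind that estimate, with the average demand per customer as my own assumption (the one input not stated above):

```python
# Reconstructing the Harvey budget-constraint estimate. The average
# demand per customer is my assumption; other inputs are from the text.

customers = 1_670_000          # customers without power (ERCOT/EIA)
outage_days = 4.5
avg_demand_kw = 1.7            # assumed average load per customer

lost_gsp = 1.5e9               # estimated GSP lost over 4.5 days (from text)

lost_energy_mwh = customers * outage_days * 24 * avg_demand_kw / 1000.0
value_per_mwh = lost_gsp / lost_energy_mwh

print(f"Unserved energy: {lost_energy_mwh:,.0f} MWh")
print(f"Implied value of lost load: ${value_per_mwh:,.0f}/MWh")
# ~ $4,900/MWh with these inputs -- "just under $5,000/MWh" as stated.
```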

The recent set of rolling blackouts in Texas provides another opportunity to update this budget-constraint calculation under different circumstances, by determining the reduction in electricity sales and the decrease in gross state product over the period.

Using two independent methods, I come up with an upper bound of $5,000/MWh, and likely much less. One commentator pointed out that ERCOT would not be able to achieve a sufficient planning reserve level at this price, but that statement is based on the premises that short-run hourly prices reflect full market values and will deliver the “optimal” resource mix. Neither is true.

This type of hourly pricing overemphasizes peak load reliability value and undervalues other attributes such as sustainability and resilience. These prices do not reflect the full incremental cost of adding new resources that deliver additional benefits during non-peak periods, such as green energy, nor the true opportunity cost that is incurred when a generator interconnects rather than during later operations. Texas has overbuilt its fossil-fueled generation thanks to this paradigm. It needs an external market based on long-run incremental costs to achieve the necessary environmental goals.

Drawing too many conclusions about electric vehicles from an obsolete data set

The Energy Institute at Haas at the University of California, Berkeley published a study purporting to show that electric vehicles are driven only about one-third as much as the average standard car in California. I posted a response on the blog.

Catherine Wolfram writes, “But, we do not see any detectable changes in our results from 2014 to 2017, and some of the same factors were at play over this time period. This makes us think that newer data might not be dramatically different, but we don’t know.”

This study likely delivers a biased estimate of future EV use. Its timing reminds me of trying to analyze cell phone use in the mid-2000s. Now household landlines are largely obsolete, and we use phones even more than we did then. The analysis window was a period of dramatic change, more akin to the evolution of solar panels just before and after 2010, before panels were ubiquitous. We can see this evolution here for example. The Nissan Leaf’s range, for instance, increased 50% between the 2018 and 2021 models.

The primary reason this data set shows such low mileage is that the vast majority of the households in the survey almost certainly also had a standard ICE vehicle that they used for extended trips. There were few or no remote fast-charge stations during that period, and even Teslas had limited range in comparison. In addition, it’s almost certain that EV ownership was concentrated in urban households, which have comparatively low VMT. (Otherwise, why do studies show that these same neighborhoods have low GHG emissions on average?) Only about one-third of VMT is associated with commuting, another third with errands and tasks, and a third with travel. There were few if any electric SUVs, which would be more likely to be used for errands; EVs were smaller vehicles until recently.

As for co-purchased solar panels, earlier studies found that 40% or more of EV owners have them, and rooftop solar penetration has grown faster than EV adoption since those studies were done.

I’m also not sure that the paper has fully captured workplace and parking-structure charging. The logistical challenges of claiming LCFS credits could be substantial enough that employers and municipalities do not bother. This assumption requires a closer analysis of which entities are actually claiming these credits.

A necessary refinement is to compare this data set to the typical VMT for these types of households, and to compare mileage across model types. Smaller commuter models average less annual VMT, according to the California Energy Commission’s vehicle VMT data set derived from the DMV registration file and the Air Resources Board’s EMFAC model. The Energy Institute analysis arrives at the same findings as EV studies from the mid-1990s, when the technology was much less robust. That should be a flag that something is amiss in the results.
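That comparison could be set up along the following lines. The file names and column names are hypothetical placeholders, not the actual CEC or EMFAC data schemas:

```python
# Sketch of the suggested refinement: compare sampled EV mileage against
# a VMT baseline matched on household type and vehicle class. File names
# and column names are hypothetical, not the real CEC/EMFAC schemas.

import pandas as pd

ev = pd.read_csv("ev_sample.csv")            # household_type, veh_class, annual_vmt
baseline = pd.read_csv("vmt_baseline.csv")   # household_type, veh_class, baseline_vmt

merged = ev.merge(baseline, on=["household_type", "veh_class"], how="left")
merged["vmt_ratio"] = merged["annual_vmt"] / merged["baseline_vmt"]

# If EVs in urban commuter segments drive close to their matched baseline,
# the "one-third of average" result mostly reflects sample composition,
# not EV usage per se.
print(merged.groupby(["household_type", "veh_class"])["vmt_ratio"].mean())
```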

“What are public benefits of conveyance?” presented to the California Water Commission

Maven’s Notebook posted a summary of presentations to the California Water Commission by Richard McCann of M.Cubed, Steve Hatchett of Era Economics, and David Sunding of the Brattle Group. Many of my slides are included.

The Commission is developing a framework that might be used to identify how shares of conveyance costs might be funded by the state of California. The Commission previously awarded almost $3 billion in bond financing for a dozen projects under the Proposition 1 Water Storage Investment Program (WSIP). That process used a prescribed method, including a Technical Guide, that determined the public benefits eligible for state financing. M.Cubed supported the application by Irvine Ranch Water District and Rio Bravo-Rosedale Water Storage District for the Kern Fan water bank.

Davis, like many communities, needs a long-term vision

The Davis Vanguard published an article about the need to set out a vision for where the City of Davis wants to go if we are to have a coherent set of residential and commercial development decisions:

How do we continue to provide high quality of life for the residents of Davis, as the city on the one hand faces fiscal shortfalls and on the other hand continues to price the middle class and middle tier out of this community? A big problem that we have not addressed is the lack of any long term community vision. 

The article set out a series of questions focused on assumptions and solutions. But we should not start the conversation by choosing a growth rate and then picking a set of projects that fit that projection.

We need to start by asking a set of questions that derive from the article’s thesis:

  • What is the composition that we want for this community? What type of diversity? How do we accommodate students? What are the ranges of statewide population growth that we need to plan for?
  • To achieve that community composition, what is the range of target housing prices? Given the projected UCD enrollment targets (which are basically out of our control), how much additional housing is needed under different scenarios of additional on-campus housing?
  • What is the jobs mix that supports that community composition under different scenarios? What’s the jobs mix that minimizes commuting and associated GHG emissions?
  • What’s the mix of businesses, jobs and housing that moves the City toward fiscal stability in these scenarios?
  • Then, in the end, we arrive at a set of preferred growth rates appropriate to the scenarios we’ve constructed. We can then develop our general plan to accommodate these preferred scenarios.

My wife and I put forward one vision for Davis: to focus on sustainable food development as an economic engine. I’m sure there are other viable ideas. We need a forum that dives into these and formulates our economic plan, rather than just bumbling along as we seem to be doing now. This is only likely to get worse with the fundamental changes coming after the pandemic.

I’ll go further to say that one of the roots of this problem is the increasing opaqueness of City decision making. “Playing it safe” is the byword for City planning, just when that’s what is most likely to hurt us. That’s why we proposed a fix to the fundamental way decisions are made by the City.

There’s a long list of poor decisions created by this opaqueness, decisions that have cost the City tens of millions of dollars. The article points out symptoms of a much deeper problem that is impeding us from developing a long-term vision.

It may seem like so much “inside baseball” to focus on the nuts and bolts of process, but it’s that process that is at the root of the crisis, as boring as that may seem.


How to choose a water system model

The California Water & Environmental Modeling Forum (CWEMF) has proposed to update its water modeling protocol guidance, last issued in 2000. This modeling protocol applies to many other settings, including electricity production and planning (which I am familiar with). I led the review of electricity system simulation models for the California Energy Commission, and asked many of these questions then.

Questions that should be addressed in water system modeling include:

  • Models can be used either for short-term operational purposes or for long-term planning—models rarely can serve both masters. The model should be chosen according to whether its analytic focus is predicting a particular outcome with accuracy and/or precision (usually for short-term operations) or identifying resilience and sustainability.
  • There can be a trade-off between accuracy and precision. Focusing too heavily on precision in one aspect of a model is unlikely to improve the model’s overall accuracy, given the lack of precision elsewhere. In addition, increased precision increases processing time, slowing output and flexibility.
  • A model should be able to produce multiple outcomes quickly, as a “scenario generator,” for analyzing uncertainty, risk and vulnerability. The model should be tested for accuracy when relaxing key constraints that increase processing time. For example, in an electricity production model, relaxing the unit commitment algorithm increased processing speed twelvefold while losing only 7 percent in accuracy, mostly in extreme tail cases.
  • Water models should be able to use different water-condition sequences rather than relying only on historic traces; relying on a single historic trace lets a model operate as though the future is known with certainty. (A minimal sketch of such a scenario generator follows this list.)
  • Water management models should include the full set of opportunity costs for water supply, power generation, flood protection and groundwater pumping. This implies that some type of linkage should exist between these types of models.
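As an illustration of the scenario-generator point, here is a minimal sketch that drives a toy single-reservoir model with bootstrapped inflow sequences instead of the single historic trace. All numbers are hypothetical:

```python
# Minimal scenario generator: drive a toy single-reservoir model with
# bootstrapped annual inflow sequences instead of the single historic
# trace. All numbers are hypothetical.

import random

historic_inflows = [1.2, 0.8, 0.5, 1.5, 0.9, 0.4, 1.1, 1.3, 0.6, 1.0]  # MAF/yr

def simulate(inflows, capacity=2.0, demand=0.9, start=1.0):
    """Return the number of shortage years for one inflow sequence."""
    storage, shortages = start, 0
    for inflow in inflows:
        storage = min(storage + inflow, capacity)   # spill above capacity
        delivered = min(demand, storage)
        shortages += delivered < demand
        storage -= delivered
    return shortages

# Bootstrap many equally plausible 10-year sequences from the record.
random.seed(1)
results = [simulate(random.choices(historic_inflows, k=10)) for _ in range(1000)]
print(f"Historic trace: {simulate(historic_inflows)} shortage-years")
print(f"Bootstrapped: mean {sum(results)/len(results):.1f} shortage-years, "
      f"worst {max(results)}")
```

Even this toy example shows why a single historic trace can understate tail risk: resampled sequences can stack dry years in ways the historic record never did.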

Underlying economics of polarization

Matthew Kahn, USC economics professor, writes about a new book, Why We’re Polarized:

Rising polarization is taking place because there is now a fundamental disagreement across our society concerning who has the property rights to different resources.

While Kahn is correct that property rights are at the core of the dispute, he glosses over the real issue by going off to discuss game theory and bargaining. That real issue is how different groups in society gained those property rights, whether it’s entitlement to jobs, use of natural resources, or control of social mores. Many of these property rights were gained through coercion of some form, such as slavery, land grabs or paternalistic social structures. Resolving these disputes requires agreeing first on basic societal morality and ethics, and then turning to how to redistribute those rights, rather than plunging straight into bargaining.

Commentary on “The Road from Serfdom”

Danielle Allen writes eloquently in “The Road from Serfdom,” in the December issue of the Atlantic Monthly, about how too many Americans rightfully feel disenfranchised today and many of the reasons why they feel that way. Her description of how we got here is well worth the read. However, she misattributes the roles of economists (and lawyers) and the errors in their recent prognostications about how economic progress would unfold.

Allen blames much of our current economic woes on the rise of economists in policymaking. She describes how economists superseded lawyers in that role, implying that lawyers were somehow better connected to society. But the real transformation happened several decades earlier, when lawyers took over from the broader general citizenry. Just as she identifies how economists (of which I am one) are trained to think in one fashion, lawyers are similarly taught to think in another way, one that tends to focus on identifying constraints and relying on precedent. Lawyers are also taught that the available solutions require directives through laws and contentious conflict resolution. Contrary to Allen’s assertion, lawyers are rarely instructed in how actual institutions work; they usually learn that on the job. In fact, it is economists who developed institutional economics, which studies the role of such organizations. Economists arrived to propose solutions that could work through incentives, choice and negotiation. So we traded one set of technocrats for another. Perhaps we have not done well by either set, but we also should not ignore why we chose those professions to guide us.

The mistakes that economists made were not as simplistic as Allen describes. She claims that economists did not understand how disruption would impact specific communities, or what two decades of disruption would look like in those communities. As contrary examples, I wrote here about how climate change will impact communities, about how we need to compensate coal mining communities as part of our reductions in greenhouse gas emissions, and even about the shaky foundations of benefit-cost analysis. What economists did fail to foresee were two important transformations since the 1970s. (Economists made a similar mistake after the fall of the Berlin Wall, failing to acknowledge that markets need well-functioning institutions and laws to facilitate beneficial transactions.) The first was that the agglomeration of knowledge industries (technological and financial) would be so geographically concentrated and that these industries would accrue so much wealth. The second was that Americans would become so much less mobile, both geographically and socially. Many social and policy factors have led to these trends, but the stories are much more complex than Allen describes. No one could have foreseen these unprecedented changes that have shattered the lives of too many people left behind in communities emptied of economic purpose.

That said, identifying the rise of the ideologies of Nobel Prize winners Friedrich Hayek and Milton Friedman (who were economists) as a key source of our conundrum is accurate. Allen does not discuss the parallel rise of the fantasies of Ayn Rand, which fueled the mythologies of Hayek and Friedman. Rand’s work was a surprising vehicle for spreading those ideologies, particularly given how bad her writing was. We now have a core of elites who believe that they are somehow “self-made,” owing nothing to outside help and even having overcome the “parasites” of society. That will be a difficult self-image to overcome.

Using floods to replenish groundwater

Almond orchard flooding

M.Cubed produced four reports for Sustainable Conservation on using floodwaters to recharge aquifers in California’s Central Valley. The first is on expected costs. The next three are a set on the benefits, participation incentives and financing options for using floodwaters in wetter years to replenish groundwater aquifers. We found that costs would be around $100 per acre-foot, and that beneficiaries include not only local farmers, but also downstream communities with lower flood control costs, upstream water users with more reservoir space for storage instead of flood control, increased hydropower generation, and more streamside habitat. We discussed several different approaches to incentives based on our experience in a range of market-based regulatory settings and the water transfer market.

With the PPIC’s release of Water and the Future of the San Joaquin Valley, which forecasts a loss of 500,000 acres of agricultural production due to reduced groundwater pumping under the Sustainable Groundwater Management Act (SGMA), local solutions that mitigate groundwater restrictions should be moving to the fore.

Don Cameron at Terranova Ranch began doing this deliberately earlier this decade, and further study, working with Phil Bachand and UC Davis, has shown the effectiveness of this strategy and its lack of risk to crops. The Department of Water Resources has implemented the Flood-MAR program to explore this alternative further. The Flood-MAR white paper explores many of these issues, but its list of beneficiaries is incomplete, and the program appears not to have moved on yet to how to effectively implement these programs in integration with local SGMA plans. Our white papers could be useful starting points for that discussion.

(Image Source: Chico Enterprise-Record)