The California Water & Environmental Modeling Forum (CWEMF) has proposed updating its water modeling protocol guidance, last issued in 2000. Much of this protocol guidance applies in other settings as well, including electricity production and planning (with which I am familiar). I led the review of electricity system simulation models for the California Energy Commission, and asked many of these questions then.
Questions that should be addressed in water system modeling include:
Models can be used for either short-term operational or long-term planning purposes; models rarely can serve both masters. A model should be chosen based on whether its analytic focus is predicting a particular outcome with accuracy and/or precision (usually for short-term operations) or identifying resilience and sustainability (usually for long-term planning).
There can be a trade-off between accuracy and precision. Focusing excessively on precision in one aspect of a model is unlikely to improve the overall accuracy of the model, given the lack of precision elsewhere. In addition, increased precision increases processing time, thus slowing output and reducing flexibility.
A model should be able to produce multiple outcomes quickly as a “scenario generator” for analyzing uncertainty, risk and vulnerability. The model should be tested for accuracy when relaxing key constraints that increase processing time. For example, in an electricity production model, relaxing the unit commitment algorithm increased processing speed twelvefold while losing only 7 percent in accuracy, mostly in the extreme tail cases.
Water models should be able to use different water condition sequences rather than relying solely on historic traces. When driven only by a historic trace, a model may operate as though the future is known with certainty.
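One simple way to generate alternative water-condition sequences is to bootstrap the historical record, resampling water years with replacement so each run sees a different plausible future. This is only a minimal sketch of the idea; the flow values and the `synthetic_sequence` helper are hypothetical, and real studies would preserve year-to-year persistence (e.g., with block or Markov resampling):

```python
import random

# Hypothetical annual flows (thousand acre-feet) for a 10-year historical trace
historical_flows = [1200, 850, 1900, 640, 2100, 980, 1500, 720, 1750, 1100]

def synthetic_sequence(trace, years, seed=None):
    """Generate a synthetic water-year sequence by resampling the
    historical trace with replacement (a simple bootstrap)."""
    rng = random.Random(seed)
    return [rng.choice(trace) for _ in range(years)]

# Generate three alternative 20-year sequences for scenario analysis
scenarios = [synthetic_sequence(historical_flows, 20, seed=s) for s in range(3)]
for i, seq in enumerate(scenarios):
    print(f"Scenario {i}: mean flow = {sum(seq) / len(seq):.0f} TAF")
```

Feeding each synthetic sequence through the model, instead of the single historic trace, is what turns it into the “scenario generator” described above.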
Water management models should include the full set of opportunity costs for water supply, power generation, flood protection and groundwater pumping. This implies that some type of linkage should exist between these types of models.
Rising polarization is taking place because there is now a fundamental disagreement across our society concerning who has the property rights to different resources.
While Kahn is correct about property rights being at the core of the dispute, he glosses over the real issue by going off to discuss game theory and bargaining. That real issue is how different groups in society gained those property rights, whether it’s entitlement to jobs, use of natural resources, or control of social mores. Many of these property rights were gained through coercion of some form, such as slavery, land grabs or paternalistic social structures. Resolving these disputes requires agreeing first on basic societal morality and ethics, and then turning to how to redistribute those rights, rather than plunging straight into bargaining.
Danielle Allen writes eloquently in “The Road from Serfdom,” in the December issue of the Atlantic Monthly, about how too many Americans rightfully feel disenfranchised today and many of the reasons why they feel that way. Her description of how we got here is well worth the read. However, she misattributes the roles of economists (and lawyers) and mischaracterizes their recent prognostications on how economic progress would unfold.
Allen blames much of the current economic woes on the rise of economists in policymaking. She talks about how economists superseded lawyers in that role, implying that lawyers were somehow better connected to society. The real transformation happened several decades earlier, when lawyers took over from the broader general citizenry. Just as she identifies how economists (of which I am one) are trained to think in one fashion, lawyers are similarly taught to think in another way, one that tends to focus on identifying constraints and relying on precedent. Lawyers are also taught that the available solutions require directives through laws and contentious conflict resolution. Lawyers are rarely instructed in how actual institutions work, contrary to Allen’s assertion; they usually learn that through on-the-job training. In fact, it is economists who developed institutional economics, which studies the role of such organizations. Economists arrived to propose solutions that could work through incentives, choice and negotiated outcomes. So we traded one set of technocrats for another. Perhaps we have not done well by either set, but we also should not ignore why we chose those professions to guide us.
The mistakes that economists made were not as simplistic as Allen describes. She points to a claim that economists did not understand how disruption would impact specific communities and what two decades of disruption would look like in those communities. As contrary examples, I wrote here about how climate change will impact communities, about how we need to compensate coal mining communities as part of our reductions in greenhouse gas emissions, and even about the shaky foundations of benefit-cost analysis. Instead, economists failed to foresee two important transformations since the 1970s. (Economists made a similar mistake after the fall of the Berlin Wall, failing to acknowledge that markets need well-functioning institutions and laws to facilitate beneficial transactions.) The first was that agglomeration of knowledge industries (technological and financial) would be so geographically intensive and that these industries would accrue so much wealth. The second was that Americans would become so much less mobile, both geographically and socially. Many social and policy factors have led to these trends, but the stories are much more complex than Allen describes. No one could have foreseen these unprecedented changes that have shattered the lives of too many people who have remained behind in communities emptied of economic purpose.
That said, identifying the rise of the ideologies of Nobel Prize winners Friedrich Hayek and Milton Friedman (who were economists) as a key source of our conundrum is accurate. Allen does not discuss the parallel rise of the fantasies of Ayn Rand that fueled the mythologies of Hayek and Friedman. Rand’s work was a surprising path for spreading those ideologies, particularly given how bad her writing was. We now have a core of elites who believe that they somehow are “self made” with no outside help, even overcoming the “parasites” of society. That will be a difficult self-image to overcome.
M.Cubed produced four reports for Sustainable Conservation on using floodwaters to recharge aquifers in California’s Central Valley. The first is on expected costs. The next three are a set on the benefits, participation incentives and financing options for using floodwaters in wetter years to replenish groundwater aquifers. We found that costs would range around $100 per acre-foot, and that the benefits extend beyond local farmers: downstream communities gain lower flood control costs, upstream water users gain more reservoir space for storage instead of flood control, hydropower generation increases, and streamside habitat expands. We discussed several different approaches to incentives based on our experience in a range of market-based regulatory settings and the water transfer market.
Don Cameron at Terranova Ranch started doing this deliberately earlier this decade, and further study with Phil Bachand and UC Davis has shown the effectiveness of this strategy and its lack of risk to crops. The Department of Water Resources has implemented the Flood-MAR program to explore this alternative further. The Flood-MAR whitepaper explores many of these issues, but its list of beneficiaries is incomplete, and the program appears not to have yet moved on to how to effectively implement these programs in integration with the local SGMA plans. Our white papers could be useful starting points for that discussion.
Two recent reports highlight the benefits of using “reverse auctions”. In a reverse auction, the buyer specifies a quantity to be purchased, and sellers bid to provide a portion of that quantity. An article in Utility Dive summarizes some of the experiences with renewable market auctions. A separate report in the Review of Environmental Economics and Policy goes further to lay out five guidelines:
Encourage a Large Number of Auction Participants
Limit the Amount of Auctioned Capacity
Leverage Policy Frameworks and Market Structures
Earmark a Portion of Auctioned Capacity for Less-mature Technologies
Balance Penalizing Delivery Failures and Fostering Competition
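The clearing mechanics behind these guidelines are straightforward: the buyer names a quantity, sellers bid prices, and the cheapest offers are accepted until the target is filled. The sketch below illustrates that basic pay-as-bid clearing step; the `clear_reverse_auction` function and the bid figures are hypothetical, not drawn from either report:

```python
def clear_reverse_auction(target_mw, bids):
    """Clear a reverse auction: the buyer specifies a quantity (target_mw),
    sellers offer (price, quantity), and the cheapest offers are accepted
    until the target is met. Returns a list of (seller, awarded_mw, price)."""
    awards = []
    remaining = target_mw
    # Accept offers in merit order (lowest price first)
    for seller, price, qty in sorted(bids, key=lambda b: b[1]):
        if remaining <= 0:
            break
        take = min(qty, remaining)
        awards.append((seller, take, price))
        remaining -= take
    return awards

# Hypothetical bids: (seller, $/MWh, MW offered) for a 100 MW solicitation
bids = [("A", 42.0, 50), ("B", 38.5, 30), ("C", 45.0, 80), ("D", 40.0, 40)]
awards = clear_reverse_auction(100, bids)
# Awards: B (30 MW @ $38.5), D (40 MW @ $40.0), A (30 MW @ $42.0); C priced out
```

The guidelines above then operate on the inputs to this step: more participants deepen the bid stack, a limited auctioned quantity keeps marginal bids competitive, and earmarks effectively run a separate clearing for less-mature technologies.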
This policy prescription requires well-informed policymakers balancing different factors, a task not well suited to a state legislature. Such a coherent policy can be developed in two ways. The first is to let a state commission work through a proceeding to set an overall target and structure. But perhaps a more fruitful approach would be to let local utilities, such as California’s community choice aggregators (CCAs), set up individual auctions, perhaps even setting their own storage targets and then experimenting with different approaches.
California has repeatedly made errors by relying too heavily on centralized market structures that overcommit or mismatch resource acquisition. This arises because a mistake by a single central buyer is multiplied across all load, while a mistake by one buyer within a decentralized market is largely isolated to that buyer’s load. Without perfect foresight, and lacking mechanisms to appropriately share risk between buyers and sellers, we should be designing an electricity market that mitigates risks to consumers rather than trying to achieve a mythological “optimal” result.
The media and the public appear to have confused the Green Party’s platform calling for 100% renewable energy by 2030 with the goals in the Joint Resolution for a Green New Deal introduced by Senator Edward Markey (D-MA) and Representative Alexandria Ocasio-Cortez (D-NY). The Joint Resolution calls for a “10-year national mobilization,” but contains no deadlines other than zero greenhouse-gas emissions by 2050, more than 30 years from now. Given that we went from horses and buggies and wood stoves to widespread automobile use and electrification in 30 years at the beginning of the twentieth century, such a transformation doesn’t seem imposing.
A just-released study on the effects of the Berkeley, California soda tax of one cent per ounce found that soda consumption has fallen 52% over the last four years. That is a remarkable price elasticity. Assuming a 20-ounce bottle costs $1.99, a tax of 20 cents represents roughly a 10% price increase, which implies a price elasticity of about -5. In other words, for every 1% of price increase, demand falls 5%. The study relied on household surveys, which are not always reliable about consumption quantities, so it would be interesting to see actual sales data.
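The implied elasticity is easy to check with a back-of-the-envelope calculation, using only the figures quoted above (the $1.99 bottle price is an assumption, as in the post):

```python
# Back-of-the-envelope price elasticity implied by the Berkeley soda tax
price = 1.99                 # assumed pre-tax price of a 20-ounce bottle, $
tax = 0.01 * 20              # 1 cent per ounce on a 20-ounce bottle = $0.20
pct_price_change = tax / price       # roughly a 10% price increase
pct_quantity_change = -0.52          # 52% drop in consumption over four years

elasticity = pct_quantity_change / pct_price_change
print(f"Implied price elasticity: {elasticity:.1f}")  # prints -5.2
```

A simple point elasticity like this overstates the response if other factors (anti-soda campaigns, regional trends) also cut consumption over those four years, which is another reason sales data would be a useful check.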