
Obstacles to nuclear power, but how much do we really need it?

Jonathan Rauch writes in the Atlantic Monthly about the innovations in nuclear power technology that might overcome its troubled history. He correctly identifies the core of the problem for nuclear power, although it extends even further than he acknowledges. Recent revelations about the fragility of France’s once-vaunted nuclear fleet illustrate deeper management problems with the technology. Unfortunately, he is too dismissive of the safety issues, and even of the hazardous duties that recovery crews experienced at both Chernobyl and Fukushima; both of those accidents cost their nations hundreds of billions of dollars. As a result of these issues, nuclear power around the world now costs over 10 cents per kilowatt-hour. Grid-scale solar and wind power, in contrast, cost less than four cents, and even adding storage no more than doubles that cost. And this ignores the competition from small-scale distributed energy resources (DER) that could break the utility monopoly required to pay for nuclear power.

Yet Rauch’s biggest error is asserting, without sufficient evidence, that nuclear power is required to achieve greenhouse gas emission reductions. Numerous studies (including for California) show that we can get to a 90% emissions-free power grid, and beyond, with current technologies and no nuclear. We have two decades to figure out how to get to the last 10% or less, or to determine whether we even need to.

The problem with new nuclear technologies such as small modular reactors (SMRs) is that they must be built at wide scale, as a high proportion of the power supply, to achieve technological cost reductions of the type we have seen for solar and batteries. And to get a low enough cost per kilowatt-hour, those units must run constantly in baseload mode, which only exacerbates the variable-output problem of renewables instead of solving it. Running in a load-following mode instead would increase the cost per kilowatt-hour by 50%.

We should continue research in this technology because there may be a breakthrough that solves these dilemmas. But we should not plan on needing it to save our future. We have been disappointed too many times already by empty promises from this industry.

Paradigm change: building out the grid with renewables requires a different perspective

Several observers have asserted that we will require baseload generation, probably nuclear, to decarbonize the power grid. Their claim is that renewable generation isn’t reliable enough and is too distant from load centers to power an electrified economy.

The problem is that this perspective relies on a conventional approach to understanding and planning for future power needs. That conventional approach generally planned to meet the highest peak loads of the year with a small margin, and then used the excess capacity to produce the energy needed in the remaining hours. This premise was based on using consumable fuel to store energy for use in the hours when electricity was needed.

Renewables such as solar and wind present a different paradigm. Renewables capture and convert energy to electricity as it becomes available. The next step is to store that energy using technologies such as batteries. That means the system needs to be built to meet energy requirements, not peak loads.

Hydropower-dominated systems have already been built in this manner. The Pacific Northwest’s complex on the Columbia River and its tributaries had so much excess peak capacity for half a century that it could meet much of California’s summer demand. Meeting energy loads during drought years was the challenge. The Columbia River system could store up to 40% of the annual runoff in its reservoirs to assure sufficient supply.

For solar and wind, we will build capacity that is a multiple of the annual peak load so that we can generate enough energy to meet the loads that occur when the sun isn’t shining and the wind isn’t blowing. For example, in a system relying on solar power, the typical demand load factor is 60%, i.e., the average load is 60% of the peak or maximum load. A typical solar photovoltaic capacity factor is 20%, i.e., it generates an average output that is 20% of its peak output. In this example system, the required solar capacity would be three times the peak demand (60% divided by 20%) to produce sufficient stored electricity. The amount of storage capacity would equal the peak demand (plus a small reserve margin) less the amount of expected renewable generation during the peak hour.
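
Here is a minimal sketch of that sizing arithmetic. The 60% load factor and 20% capacity factor come from the example above; the peak demand, reserve margin, and solar output at peak are assumptions added purely for illustration.

```python
# Sizing a solar-dominated system for energy rather than peak load.
peak_demand_mw = 10_000       # assumed system peak (illustrative)
load_factor = 0.60            # average load / peak load (from the example)
solar_capacity_factor = 0.20  # average output / nameplate output (from the example)

# Annual energy the system must serve (MWh).
annual_energy_mwh = peak_demand_mw * load_factor * 8760

# Solar nameplate capacity needed to generate that much energy (MW).
solar_capacity_mw = annual_energy_mwh / (solar_capacity_factor * 8760)
print(solar_capacity_mw / peak_demand_mw)   # -> 3.0, i.e., three times peak demand

# Storage power rating: peak demand (plus a reserve margin) less the
# renewable output expected during the peak hour (both assumed here).
reserve_margin = 0.05
solar_output_at_peak_mw = 2_000             # assumed coincident solar output
storage_mw = peak_demand_mw * (1 + reserve_margin) - solar_output_at_peak_mw
print(storage_mw)                           # -> 8,500 MW of storage capacity
```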

As a result, comparing the total amount of generation capacity installed to the peak demand becomes irrelevant. Instead we first plan for total energy need and then size the storage output to meet the peak demand. (And that storage may be virtually free as it is embodied in our EVs.) This turns the conventional planning paradigm on its head.

Per Capita: Climate needs more than just good will

I wrote this guest column in the Davis Enterprise about the City’s Climate Action and Adaptation Plan. (Thank you John Mott-Smith for extending the privilege.)

Dear Readers, the guest column below was written by Richard McCann, a Davis resident and expert on energy and climate action plans.

————

The city of Davis is considering its first update of its Climate Action and Adaptation Plan since 2010 with a 2020-2040 Plan. The city plans to update the CAAP every couple of years to reflect changing conditions, technologies, financing options, laws and regulations.

The plan does not and cannot achieve a total reduction in greenhouse gas emissions, simply because we do not control all of the emission sources — almost three-quarters of our emissions are from vehicles that are largely regulated by state and federal laws. But it does lay out a means of putting a serious dent in the overall amount.

The CAAP offers a promising future and accepts that we have to protect ourselves as the climate worsens. Among the many benefits we can look forward to are avoiding volatile gas prices while driving cleaner, quieter cars; faster and more controllable cooking while eliminating toxic indoor air; and air conditioning and heating without having to make two investments while paying less.

To better adapt, we’ll have a greener landscape, filtered air for rental homes, and community shelter hubs powered by microgrids to ride out more frequent extreme weather.

We have already seen that adding solar panels raises the value of a house by as much as $4,000 per installed kilowatt (so a 5 kilowatt system adds $20,000). We can expect similar increases in home values with these new technologies due to the future savings, safety and convenience. 

Several state and federal laws and rules foretell what is coming. By 2045, California aims to be at zero net GHG emissions. That will require retiring all of the residential and commercial gas distribution lines. PG&E has already started a program to phase out its lines. A change in state rules will remove several large natural gas appliances, such as furnaces, from the market by 2030.

In addition, PG&E will no longer offer subsidies to developers to install gas lines to new homes starting next year. The U.S. Environmental Protection Agency appears poised to push further the use of electric appliances in areas with poor air quality such as the Sacramento Valley. (Renewable gas and hydrogen will be too expensive, and there won’t be enough to go around.)

Without sales to new customers or for replaced furnaces, the cost of maintaining the gas system will rise substantially, so switching to electricity for cooking and water heating will save even more money. The CAAP anticipates this transition by having residents begin switching earlier.

In addition, the recently enacted federal Inflation Reduction Act offers between $400 billion and $800 billion in funding for these types of changes. The California Energy Commission’s budget for this year went from $1 billion to $10 billion to finance these transitions. The CAAP lays out a process for acquiring these financial sources for Davis and its residents.

That said, some have objected to the CAAP as being too draconian and infringing on personal choices. The fact is that we are now in the midst of a climate emergency — the City Council endorsed this concern with a declaration in 2019. We’re already behind schedule to head off the worst of the threatening impacts. 

We won’t be able to rely solely on voluntary actions to achieve the reductions we need. That the CAAP has to include these actions proves that people have not been acting on their own despite a decade of cajoling since the last CAAP. While we’ve been successful at encouraging voluntary compliance with easy tasks like recycling, we’ve used mandatory permitting requirements to gain compliance with various building standards including energy efficiency measures. (These are usually enforced at point-of-sale of a house.)

We have a choice of mandatory ordinances, incentives through taxes or fees, and subsidies from grants and funds — voluntary just won’t deliver what’s needed. We might be able to financially help those least able to afford changing stoves, heaters or cars, but those funds will be limited. The ability to raise taxes or fees is restricted due to various provisions in the state’s constitution. So we are left with mandatory measures, applied at the most opportune moments. 

Switching to electricity for cooking and water heating may involve some costs, some or most of which will be offset by lower energy costs (especially as gas rates go up). If you have an air conditioner, you’re likely already set up for a heat pump to replace your furnace — it’s a simple swap. Even so, you can avoid some costs by using a 120-volt induction cooktop instead of 240 volts, and by installing a circuit-sharing plug or breaker for large loads to avoid panel upgrades.

The CAAP will be fleshed out and evolve for at least the next decade. Change is coming and will be inevitable given the dire situation. But this change gives us opportunities to clean our environment and make our city more livable.  

Do small modular reactors (SMR) hold real promise?

The economic analyses of the projected costs for small modular reactors (SMRs) appear to rely on two important assumptions: 1) that the plants will run at the capacity factors of current nuclear plants (i.e., 70%-90%+), and 2) that enough will be built quickly enough to gain from “learning by doing” at scale, as has occurred with solar, wind and battery technologies. The problem with these assumptions is that they require that SMRs crowd out other renewables with little impact on gas-fired generation.

Achieving low costs in nuclear power requires high capacity factors, that is, a high ratio of total electricity output to potential output. The Breakthrough Institute study, for example, assumes a capacity factor greater than 80% for SMRs. The problem is that the typical system load factor, that is, the average load divided by the peak load, ranges from 50% to 60%. A generation capacity factor of 80% means that the plant is producing 20% more electricity than the system needs. It also means that other generation sources such as solar and wind will be pushed aside by this amount on the grid. Because SMRs cannot ramp up and down to the same degree as load swings, not only daily but also seasonally, the system will still need load-following fossil-fuel plants or storage. It is just the flip side of filling in for the intermittency of renewables.

To truly operate within the generation system in a manner that directly displaces fossil fuels, an SMR will have to operate at a 60% capacity factor or less, and accommodating renewables will lower that capacity factor further. Decreasing the capacity factor from 80% to 60% will increase the cost of an SMR by a third. This would raise the projected cost in the Breakthrough Institute report for 2050 from $41 per megawatt-hour to $55 per megawatt-hour. Renewables with storage are already beating this cost in 2022, and we don’t need to wait 30 years.
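
The adjustment is straightforward: because a nuclear plant’s costs are almost entirely fixed, its levelized cost scales roughly inversely with its capacity factor. A minimal sketch using the figures above:

```python
# Levelized cost rises as a mostly-fixed-cost plant runs fewer hours.
base_cost_per_mwh = 41.0  # Breakthrough Institute 2050 projection at ~80% CF
base_cf = 0.80            # capacity factor behind that projection
load_following_cf = 0.60  # CF consistent with a typical system load factor

adjusted_cost = base_cost_per_mwh * base_cf / load_following_cf
print(round(adjusted_cost))  # -> 55 ($/MWh), about a one-third increase
```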

And the Breakthrough Institute study relies on questionable assumptions about learning by doing in the industry. First, it assumes that conventional nuclear will experience a 5% learning benefit (i.e., costs will drop 5% for each doubling of capacity). In fact, the industry shows a negative learning rate: costs per kilowatt have been rising as more capacity is built. It is not clear how the SMR industry will reverse this trait. Second, the learning-by-doing effect in this industry is likely to be on a per-plant basis rather than per megawatt or per turbine, as has been the case with solar panels and wind turbines. The very small unit size of solar panels and turbines allows for off-site factory production with highly repetitive assembly, whereas SMRs will require substantial on-site fabrication that will be site-specific. SMR learning rates are more likely to follow those for building construction than those for other new energy technologies.

Finally, the report does not discuss the risk of catastrophic accidents. The probability of a significant accident is about 1 per 3,700 reactor operating years. Widespread deployment of SMRs would vastly increase the annual risk because that probability is independent of plant size. Building 1,000 SMRs could increase the risk to such a level that these accidents would be happening once every four years.

The Fukushima nuclear plant catastrophe is estimated to have cost $300 billion to $700 billion, and the next one could cost in excess of $1 trillion. This risk adds a cost of $11 to $27 per megawatt-hour.
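
A sketch of this risk arithmetic, with one caveat: the post does not state the assumed output per reactor, so the per-reactor generation figures below are reverse-engineered assumptions chosen to reproduce the $11 to $27 range. They imply roughly gigawatt-scale annual output; smaller units would carry an even larger per-megawatt-hour adder.

```python
# Expected accident frequency for a large SMR fleet.
p_accident = 1 / 3700   # significant accidents per reactor-operating-year
fleet_size = 1000
print(round(1 / (fleet_size * p_accident), 1))  # -> 3.7 years between accidents

# Risk adder: expected accident cost spread over a reactor's annual output.
# Per-reactor output figures are illustrative assumptions (see caveat above).
low_adder  = 300e9 * p_accident / 7.4e6    # $300B accident, ~7.4 TWh/yr/reactor
high_adder = 1e12  * p_accident / 10.0e6   # $1T accident, ~10 TWh/yr/reactor
print(round(low_adder), round(high_adder))  # -> ~11 and ~27 $/MWh
```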

Adding these risk costs to the capacity-factor-adjusted cost, the range rises to $65 to $82 per megawatt-hour.

A reply: two different ways California can keep the lights on amid climate change

Mike O’Boyle from Energy Innovation wrote an article in the San Francisco Chronicle listing four ways other than building more natural gas plants to maintain reliability in the state. He summarizes a set of solutions for when the electricity grid can get 85% of its supply from renewable sources, presumably in the next decade. He lists four options specifically:

  • Offshore wind
  • Geothermal
  • Demand response and management
  • Out-of-state imports

The first three make sense, although the amount of geothermal resources is fairly limited relative to the state’s needs. The problem is the fourth one.

California already imports about a fifth of its electric energy. If we want other states to also electrify their homes and cars, we need to allow them to use their own in-state resources. Further, the cost of importing power through transmission lines is much higher than conventional analyses have assumed. California is going to have to meet as much of its demand internally as possible.

Instead, we should be pursuing two other options:

  • Dispersed microgrids with provisions for conveying output among several or many customers who can share the system without utility interaction. Distributed solar has already reduced the state’s demand by 12% to 20% since 2006. This will require that the state modify its laws regulating transactions among customers and act to protect the investments of those customers against utility interests.
  • Replacing natural gas in existing power plants with renewable biogas. A UC Riverside study shows a potential of 68 billion cubic feet, which is about 15% of current gas demand for electricity production. Instead of using this for home cooking, it can meet the limited peak-day demands of the electricity grid.

Both of these solutions can be implemented much more quickly than an expanded transmission grid and building new resources in other states. They just take political will.

What “Electrify Everything” has wrong about “reduce, reuse, recycle”

Saul Griffith has written a book that highlights the role of electrification in achieving greenhouse gas emission reductions, and I agree with his basic premise. But he misses important aspects of two points. First, the need to reduce, reuse and recycle goes well beyond just energy consumption. And second, we have the ability to meet most if not all of our energy needs with the lowest-impact renewable sources.

Reduce, reuse and recycle is not just about energy–it’s also about reducing consumption of natural resources such as minerals and biomass, as well as petroleum and methane used for plastics, and the pollution caused by that consumption. In many situations, energy savings are only a byproduct. Even so, the cheapest way to meet an energy need is almost always to first reduce its use; that’s what energy efficiency is about. So we don’t want to just tell consumers to continue along their merry way and simply switch to electricity. A quarter to a third of global GHG emissions are from resource consumption, not energy use.

In meeting our energy needs, we can largely rely on solar and wind supplemented with biofuels. Griffith asserts that the U.S. would need 2% of its land mass to supply the needed electricity, but his accounting makes three important errors. First, placing renewables on land doesn’t eliminate other uses of that land. Acreage devoted to wind in particular can also be used for different types of farming and even open space, whereas fossil-fuel and nuclear plants completely displace any other land use. (Turbine technology is also evolving to limit avian mortality, and even now it’s tall buildings and household cats that cause most bird deaths.) Second, most of the solar supply can be met on rooftops and over parking lots; these locations are cost-effective compared to grid-scale sources once we account for transmission costs. And third, our energy storage is literally driving down the road–in our new electric vehicles. A 100% EV fleet in California will have enough storage to meet 30 times the current peak load, and a car owner will be able to devote less than 5% of their battery capacity to meet their home energy needs. All of this means that the real footprint can be much less than 1%.

Nuclear power has never lived up to its promise and is expensive compared to other low-emission options. The direct cost of current-technology nuclear power is more than 12 cents a kilowatt-hour once transmission is added, while grid-scale renewables are less than half of that, and distributed energy resources are at least comparable, with almost no land-use footprint and the ability to provide better reliability and resilience. In addition, the potential for catastrophic events at nuclear plants adds another 1 to 3 cents per kilowatt-hour. Small modular reactors (SMRs) have been promoted as a game changer, but we have been waiting for two decades. Nuclear or green hydrogen may emerge as economically viable options, but we shouldn’t base our plans on that.

Guidelines For Better Net Metering; Protecting All Electricity Customers And The Climate

Authors Ahmad Faruqui, Richard McCann and Fereidoon Sioshansi[1] respond to Professor Severin Borenstein’s much-debated proposal to reform California’s net energy metering, which was first published as a blog and later in a Los Angeles Times op-ed.

Deciding if solar installation is suboptimal requires that the initial premises be specified correctly

A recent article, “Heterogeneous Solar Capacity Benefits, Appropriability, and the Costs of Suboptimal Siting,” in the Journal of the Association of Environmental and Resource Economists finds that distributed solar (e.g., rooftop solar) is not being installed in a manner that “optimally” mitigates air pollution damages from electricity generation across the U.S. Unfortunately, the paper is built on two premises that do not reflect the reality of available options and appropriate pricing signals.

First, the authors appear to rely on the premise that sufficient solar, grid-scale or distributed, can be installed cost-effectively across the U.S. While the paper includes geographic variation in generation per installed kilowatt of capacity, it says nothing about the similarly wide variation in costs per kilowatt-hour. The authors do not acknowledge that panels in the Pacific Northwest will cost twice as much as those in the Desert Southwest. The importance of this disparity is compounded by the underestimate of the social cost of carbon and the possible conflation of sulfur dioxide and particulate matter damages. The currently accepted social cost of GHG emissions developed by the U.S. Environmental Protection Agency (US EPA) ranges from $50 to $150 per tonne in 2030 (and recent studies have estimated that this is too low), compared to the outdated $41 per tonne used in the article. Most of the SO2 damages arise from creating PM, so there is likely double counting of these criteria pollutants. (The study also ignores the strong correlation between GHG and SO2 emissions, as coal is the biggest source of both.) The study further fails to account for the enormous transmission costs that would be incurred moving solar output from the Desert Southwest to the Northeast to mitigate the purported damages.

Second, the authors try to claim that rooftop solar has not relieved transmission congestion by looking at grid congestion prices. The problem is that this method is like looking at an empty barn and saying a horse never lived there. Congestion pricing is based on the current transmission capacity situation; it says nothing about the history of transmission congestion or about past efforts to anticipate and mitigate congestion. The study found that congestion prices were often negative or small in areas with substantial rooftop solar capacity. That doesn’t show that the solar capacity has little value–instead, it shows that the solar capacity effectively relieved the congestion, a completely opposite conclusion.

In contrast, the California Independent System Operator (CAISO) calculated in 2017 (contemporaneously with the article’s baseline) that at least $2.6 billion in transmission projects had been deferred. And given the utilities’ poor records on load forecasting, these savings have likely grown substantially. CAISO had anticipated and already relieved the congestion that the authors’ purported metric was searching for.

This disparity in economic results highlights the nature of investing in long-lived infrastructure that requires multiple years to build–one cannot wait for a shortfall to emerge before responding, because by then it is too late. Instead, one must anticipate those events and act even when it’s uncertain. This study is yet another example of the mistaken premise that short-run electricity market prices reflect long-run marginal costs; that premise should be set aside for policy analysis.

What “Don’t Look Up” really tells us

The movie Don’t Look Up has been getting “two thumbs up” from a certain political segment for speaking truth as they see it. An existential threat from a comet is used metaphorically to describe the resistance to accepting the import of climate change risk. After watching the film, I have a somewhat different takeaway that speaks a different truth to those viewers who found the message resonating most. Instead of blaming our political system, we should have a different takeaway that we can act on collectively.

Don’t Look Up reveals several errors and blind spots in how the scientific and activist communities communicate with the public and influence decision making. The first is a mistaken belief that the public is actually interested in scientific study beyond parlor-room tricks. The second is believing that people will act solely on shrill warnings from scientists acting as high priests. The third (which isn’t addressed in the film) is failing to fully acknowledge what people see that they may lose by responding to these calls for change. Instead, these communities should reconsider what they focus on and how they communicate.

The movie opens with the first error–the astronomers’ long-winded attempt to explain all of the analysis that went into their prediction. Most people don’t see how science has any direct influence on their lives–how is digging up dinosaurs or discovering the outer bounds of the universe relevant to everyday living? It’s a failure of our education system, but we can’t correct that in time to help now. Over the last several years the message on climate change has shifted to highlight the apparent effects on storms and heat waves, but someone living in Kansas doesn’t see how rising sea levels will affect them. A long explanation of the mechanics and methods just loses John Q. Public (although there is a small cadre that is fascinated), and they tune out. It’s hard to be disciplined with a simple message when you find the deeper complexity interesting, but that’s what it will take.

Shrill warnings have never been well received, no matter the call. We see that today with the resistance to measures to suppress the COVID-19 pandemic. James Hansen at NASA first raised the alarm about climate change in the 1980s but he was largely ignored due to his righteousness and arrogance in public. He made a serious error in stepping well outside of his expertise to assert particular solutions. The public has always looked to who they view as credible, regardless of their credentials, for guidance. Academics have too often assumed that they deserve this respect simply because they have “the” credential. That much of the public views science as mysterious with little more basis than religion does not help the cause. Instead, finding the right messengers is key to being successful.

Finally, and importantly overlooked in the film, a call to action of this magnitude requires widespread changes in behaviors and investments. People generally have worked hard to achieve what they have, and they are averse to changes that may severely erode their financial well-being. For example, as many as 1 in 5 private sector jobs are tied to automobiles and fossil fuel production. One might extol the economic benefits of switching to renewable electricity, but workers and investors in these sectors are uncertain about their futures, with no clear pathways to share in this new prosperity. Without a truly valid means of resolving these risks beyond the tired “retraining” shibboleth, this core and its sympathizers will resist meaningful change.

Effecting these solutions will likely require sacrifice from those who benefit from these changes. Pointing to benefit-cost analyses that rely on a “faux” hypothetical transaction to justify these solutions really is no better than the wealthy asserting that they deserve to keep most of their financial gains simply because that’s how the market works. Compensating owners of these assets and making what appear to be inefficient decisions to maintain impacted communities may seem unfair for a variety of reasons, but we need to overcome the biases embedded in our favored solutions to move forward.

What to do about Diablo Canyon?

The debate over whether to close Diablo Canyon has resurfaced. The California Public Utilities Commission, with support from the Legislature, decided in 2018 to close Diablo by 2025 rather than proceed to relicensing. PG&E had applied in 2016 to retire the plant rather than relicense it, due to the high costs that would make the energy uneconomic. (I advised the Joint CCAs in this proceeding.)

Now a new study from MIT and Stanford finds potential savings and emission reductions from continuing operation. (MIT in particular has been an advocate for greater use of nuclear power.) Others have written opinion articles on either side of the issue. I wrote the article below in the Davis Enterprise addressing this issue. (It was limited to 900 words so I couldn’t cover everything.)

IT’S OK TO CLOSE DIABLO CANYON NUCLEAR PLANT
A previous column (by John Mott-Smith) asked whether shutting down the Diablo Canyon nuclear plant is risky business if we don’t know what will replace the electricity it produces. John’s friend Richard McCann offered to answer his question. This is a guest column, written by Richard, a universally respected expert on energy, water and environmental economics.

John Mott-Smith asked several questions about the future of nuclear power and the upcoming closure of PG&E’s Diablo Canyon Power Plant in 2025. His main question is how we are going to produce enough reliable power for our economy’s shift to electricity for cars and heating. The answers are apparent, but they have been hidden for a variety of reasons.
I’ve worked on electricity and transportation issues for more than three decades. I began my career evaluating whether to close Sacramento Municipal Utility District’s Rancho Seco Nuclear Generating Station and recently assessed the cost to relicense and continue operations of Diablo after 2025.
Looking first at Diablo Canyon, the question turns almost entirely on economics and cost. When the San Onofre Nuclear Generating Station closed suddenly in 2012, greenhouse gas emissions rose statewide the next year, but then continued a steady downward trend. We will again have time to replace Diablo with renewables.
Some groups focus on the risk of radiation contamination, but that was not a consideration for Diablo’s closure. Instead, it was the cost of compliance with water quality regulations. The power plant currently uses ocean water for cooling. State regulations required changing to a less impactful method that would have cost several billion dollars to install and would have increased operating costs. PG&E’s application to retire the plant showed the costs going forward would be at least 10 to 12 cents per kilowatt-hour.
In contrast, solar and wind power can be purchased for 2 to 10 cents per kilowatt-hour depending on configuration and power transmission. Even if new power transmission costs 4 cents per kilowatt-hour and energy storage adds another 3 cents, solar and wind generation itself costs about 3 cents, for a total of about 10 cents, which is the low end of the cost range for Diablo Canyon.
What’s even more exciting is the potential for “distributed” energy resources, where generation and power management occurs locally, even right on the customers’ premises rather than centrally at a power plant. Rooftop solar panels are just one example—we may be able to store renewable power practically for free in our cars and trucks.
Automobiles are parked 95% of the time, which means that an electric vehicle (EV) could store solar power at home or work during the day for use at night. When we get to a vehicle fleet that is 100% EVs, we will have more than 30 times the power capacity that we need today. This means that any individual car likely will only have to use 10% of its battery capacity to power a house, leaving plenty for driving the next day.
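
A rough sketch of the arithmetic behind those two claims follows. Every input below is an assumption added for illustration (fleet size, per-vehicle discharge capability, pack size, overnight household use); none of these figures appear in the column itself.

```python
# Back-of-envelope for vehicle-to-grid storage in California.
ca_peak_load_gw = 50        # approximate statewide peak load (assumed)
light_duty_vehicles = 30e6  # roughly California's registered fleet (assumed)
discharge_kw_per_ev = 50    # assumed per-vehicle discharge capability

fleet_power_gw = light_duty_vehicles * discharge_kw_per_ev / 1e6
print(fleet_power_gw / ca_peak_load_gw)   # -> 30.0, i.e., ~30x today's peak

# Share of one battery needed to carry a house overnight.
battery_kwh = 75            # assumed typical EV pack size
overnight_home_kwh = 7      # assumed overnight household consumption
print(overnight_home_kwh / battery_kwh)   # -> ~0.09, i.e., about 10%
```
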
With these opportunities, rooftop and community power projects cost 6 to 10 cents per kilowatt-hour compared with Diablo’s future costs of 10 to 12 cents.
Distributed resources add an important local protection as well. These resources can improve reliability and resilience in the face of increasing hazards created by climate change. Disruptions in the distribution wires are the cause of more than 95% of customer outages. With local generation, storage, and demand management, many of those outages can be avoided, and electricity generated in our own neighborhoods can power our houses during extreme events. The ad that ran during the Olympics for Ford’s F-150 Lightning pick-up illustrates this potential.
Opposition to this new paradigm comes mainly from those with strong economic interests in maintaining the status quo reliance on large centrally located generation. Those interests are the existing utilities, owners, and builders of those large plants plus the utility labor unions. Unfortunately, their policy choices to-date have led to extremely high rates and necessitate even higher rates in the future. PG&E is proposing to increase its rates by another third by 2024 and plans more down the line. PG&E’s past mistakes, including Diablo Canyon, are shown in the “PCIA” exit fee that [CCA] customers pay—it is currently 20% of the rate. Yolo County created VCEA to think and manage differently than PG&E.
There may be room for nuclear generation in the future, but the industry has a poor record. While the cost per kilowatt-hour has gone down for almost all technologies, even fossil-fueled combustion turbines, that is not true for nuclear energy. Several large engineering firms have gone bankrupt due to cost overruns. The global average cost has risen to over 10 cents per kilowatt-hour. Small modular reactors (SMR) may solve this problem, but we have been promised these are just around the corner for two decades now. No SMR is in operation yet.
Another problem is the management of radioactive waste disposal and storage over the course of decades, or even millennia. Further, reactors fail on a periodic basis, and the cleanup costs are enormous. The Fukushima accident cost Japan $300 billion to $750 billion. No other energy technology presents such a degree of catastrophic failure risk. This liability needs to be addressed head-on, not ignored or dismissed, if the technology is to be pursued.