Science

How much more water and power does AI computing demand? Tech firms don't want you to know

Every time someone uses ChatGPT to write an essay, create an image or advise them on planning their day, the environment pays a price.

A query on the chatbot that uses artificial intelligence is estimated to require at least 10 times more electricity than a standard search on Google.

If all Google searches similarly used generative AI, they might consume as much electricity as a country the size of Ireland, calculates Alex de Vries, the founder of Digiconomist, a website that aims to expose the unintended consequences of digital trends.

Yet someone using ChatGPT or another artificial intelligence application has no way of knowing how much power their questions will consume as they are processed in the tech companies’ enormous data centers.

De Vries said the skyrocketing energy demand of AI technologies will no doubt require the world to burn more climate-warming oil, gas and coal.

“Even if we manage to feed AI with renewables, we have to realize those are limited in supply, so we’ll be using more fossil fuels elsewhere,” he said. “The ultimate outcome of this is more carbon emissions.”

AI is also thirsty for water. ChatGPT gulps roughly a 16-ounce bottle of water in as few as 10 queries, according to calculations by Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside, and his colleagues.
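Ren’s estimate implies a rough per-query figure, using only the two numbers above (a simple division, taken at the high-usage end of “as few as 10 queries”):

```python
bottle_oz = 16.0   # one standard bottled-water serving, in fluid ounces
queries = 10       # "as few as 10 queries" per bottle, per Ren's estimate

oz_per_query = bottle_oz / queries
ml_per_query = oz_per_query * 29.5735   # fluid ounces to milliliters

print(f"~{oz_per_query} oz (about {ml_per_query:.0f} mL) of water per query")
```

The real amount varies widely with how and where a given data center is cooled; this only restates the article’s numbers as a per-query rate.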

The increasing consumption of energy and water by AI has raised concerns in California and around the globe. Experts have detailed how it could stall the transition to green energy, while increasing consumers’ electric bills and the risk of blackouts.

To try to prevent those consequences, De Vries, Ren and other experts are calling on the tech companies to disclose to users how much power and water their queries will consume.

“I think the first step is to have more transparency,” Ren said. The AI developers, he said, “tend to be secretive about their energy usage and their water consumption.”

Ren said users should be told, on the websites where they type in their queries, how much energy and water their requests will require. He said this would be similar to how Google now tells people searching for airline flights how much carbon the trip will generate.

“If we had that knowledge,” he said, “then we could make more informed decisions.”

Data centers — enormous warehouses of computer servers that support the internet — have long been big power users. But the specialized computer chips required for generative AI use far more electricity because they are designed to read through vast amounts of data.

The new chips also generate so much heat that even more power and water is needed to keep them cool.

Even though the benefits and risks of AI aren’t yet fully known, companies are increasingly incorporating the technology into existing products.

In May, for example, Google announced that it was adding what it called “AI Overviews” to its search engine. Whenever someone now types a question into Google search, the company’s AI generates an answer from the search results, which is highlighted at the top.

Not all of Google’s AI-generated answers have been correct, including when it told a user to add Elmer’s glue to pizza sauce to keep cheese from sliding off the crust.

But searchers who don’t want those AI-generated answers or want to avoid the extra use of power and water can’t turn off the feature.

“Right now, the user doesn’t have the option to opt out,” Ren said.

Google did not respond to questions from The Times.

OpenAI, the company that created ChatGPT, responded with a prepared statement, but declined to answer specific questions, such as how much power and water the chatbot used.

“AI can be energy-intensive and that’s why we are constantly working to improve efficiencies,” OpenAI said. “We carefully consider the best use of our computing power and support our partners’ efforts to achieve their sustainability goals. We also believe that AI can play a key role in accelerating scientific progress in the discovery of climate solutions.”

Three years ago, Google vowed to reach “net-zero” — where its emissions of greenhouse gases would be equal to what it removed — by 2030.

The company isn’t making progress toward that goal. In 2023, its total carbon emissions increased by 13%, the company disclosed in a July report. Since 2019, its emissions are up 48%.

“As we further integrate AI into our products, reducing emissions may be challenging due to increasing energy demands from the greater intensity of AI compute,” the company said in the report.

Google added that it expects its emissions to continue to rise before dropping sometime in the future. It did not say when that may be.

The company also disclosed that its data centers consumed 6.1 billion gallons of water in 2023 — 17% more than the year before.

“We’re committed to developing AI responsibly by working to address its environmental footprint,” the report said.

De Vries said he was disappointed Google had not disclosed in the report how much AI was adding to its power needs. The company said in the report that such a “distinction between AI and other workloads” would “not be meaningful.”

By not separately reporting the power use of AI, he said, it is impossible to calculate just how much more electricity Google search was now using with the addition of AI Overviews.

“While capable of delivering the required info,” he said, “they are now withholding it.”


Science

When A.I.’s Output Is a Threat to A.I. Itself

The internet is becoming awash in words and images generated by artificial intelligence.

Sam Altman, OpenAI’s chief executive, wrote in February that the company generated about 100 billion words per day — a million novels’ worth of text, every day, an unknown share of which finds its way onto the internet.

A.I.-generated text may show up as a restaurant review, a dating profile or a social media post. And it may show up as a news article, too: NewsGuard, a group that tracks online misinformation, recently identified over a thousand websites that churn out error-prone A.I.-generated news articles.

In reality, with no foolproof methods to detect this kind of content, much will simply remain undetected.

All this A.I.-generated information can make it harder for us to know what’s real. And it also poses a problem for A.I. companies. As they trawl the web for new data to train their next models on — an increasingly challenging task — they’re likely to ingest some of their own A.I.-generated content, creating an unintentional feedback loop in which what was once the output from one A.I. becomes the input for another.

In the long run, this cycle may pose a threat to A.I. itself. Research has shown that when generative A.I. is trained on a lot of its own output, it can get a lot worse.

Here’s a simple illustration of what happens when an A.I. system is trained on its own output, over and over again:

This is part of a data set of 60,000 handwritten digits.

When we trained an A.I. to mimic those digits, its output looked like this.

This new set was made by an A.I. trained on the previous A.I.-generated digits. What happens if this process continues?

After 20 generations of training new A.I.s on their predecessors’ output, the digits blur and start to erode.

After 30 generations, they converge into a single shape.

While this is a simplified example, it illustrates a problem on the horizon.

Imagine a medical-advice chatbot that lists fewer diseases that match your symptoms, because it was trained on a narrower spectrum of medical knowledge generated by previous chatbots. Or an A.I. history tutor that ingests A.I.-generated propaganda and can no longer separate fact from fiction.

Just as a copy of a copy can drift away from the original, when generative A.I. is trained on its own content, its output can also drift away from reality, growing further apart from the original data that it was intended to imitate.

In a paper published last month in the journal Nature, a group of researchers in Britain and Canada showed how this process results in a narrower range of A.I. output over time — an early stage of what they called “model collapse.”

The eroding digits we just saw show this collapse. When untethered from human input, the A.I. output dropped in quality (the digits became blurry) and in diversity (they grew similar).

How an A.I. that draws digits “collapses” after being trained on its own output

If only some of the training data were A.I.-generated, the decline would be slower or more subtle. But it would still occur, researchers say, unless the synthetic data was complemented with a lot of new, real data.
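The collapse dynamic can be reproduced with a far simpler model than a digit-drawing A.I. In this toy sketch (our illustration, not the researchers’ actual experiment), each “generation” fits a normal distribution to a finite sample drawn from the previous generation’s model. Over many generations, the fitted spread shrinks toward zero, mirroring the loss of diversity described above:

```python
import random
import statistics

random.seed(0)  # fixed seed so the run is reproducible

mean, std = 0.0, 1.0   # generation 0: the "real" data distribution
sample_size = 50       # each generation trains on a finite sample

for generation in range(2000):
    # draw this generation's "training data" from the current model...
    sample = [random.gauss(mean, std) for _ in range(sample_size)]
    # ...then "train" the next model by fitting a mean and spread to it
    mean = statistics.fmean(sample)
    std = statistics.stdev(sample)

# the fitted spread ends up far below the original spread of 1.0
print(f"spread after 2000 generations: {std:.6f}")
```

Because each fit sees only a finite sample, small statistical errors compound from one generation to the next, and the rare tails of the distribution are the first thing to disappear.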

Degenerative A.I.

In one example, the researchers trained a large language model on its own sentences over and over again, asking it to complete the same prompt after each round.

When they asked the A.I. to complete a sentence that started with “To cook a turkey for Thanksgiving, you…,” at first, it responded like this:

Even at the outset, the A.I. “hallucinates.” But when the researchers further trained it on its own sentences, it got a lot worse…

An example of text generated by an A.I. model.

After two generations, it started simply printing long lists.

An example of text generated by an A.I. model after being trained on its own sentences for 2 generations.

And after four generations, it began to repeat phrases incoherently.

An example of text generated by an A.I. model after being trained on its own sentences for 4 generations.

“The model becomes poisoned with its own projection of reality,” the researchers wrote of this phenomenon.

This problem isn’t just confined to text. Another team of researchers at Rice University studied what would happen when the kinds of A.I. that generate images are repeatedly trained on their own output — a problem that could already be occurring as A.I.-generated images flood the web.

They found that glitches and image artifacts started to build up in the A.I.’s output, eventually producing distorted images with wrinkled patterns and mangled fingers.

When A.I. image models are trained on their own output, they can produce distorted images, mangled fingers or strange patterns.

A.I.-generated images by Sina Alemohammad and others.

“You’re kind of drifting into parts of the space that are like a no-fly zone,” said Richard Baraniuk, a professor who led the research on A.I. image models.

The researchers found that the only way to stave off this problem was to ensure that the A.I. was also trained on a sufficient supply of new, real data.

While selfies are certainly not in short supply on the internet, there could be categories of images where A.I. output outnumbers genuine data, they said.

For example, A.I.-generated images in the style of van Gogh could outnumber actual photographs of van Gogh paintings in A.I.’s training data, and this may lead to errors and distortions down the road. (Early signs of this problem will be hard to detect because the leading A.I. models are closed to outside scrutiny, the researchers said.)

Why collapse happens

All of these problems arise because A.I.-generated data is often a poor substitute for the real thing.

This is sometimes easy to see, like when chatbots state absurd facts or when A.I.-generated hands have too many fingers.

But the differences that lead to model collapse aren’t necessarily obvious — and they can be difficult to detect.

When generative A.I. is “trained” on vast amounts of data, what’s really happening under the hood is that it is assembling a statistical distribution — a set of probabilities that predicts the next word in a sentence, or the pixels in a picture.

For example, when we trained an A.I. to imitate handwritten digits, its output could be arranged into a statistical distribution that looks like this:

Distribution of A.I.-generated data, with examples of initial A.I. output. (The distribution shown here is simplified for clarity.)

The peak of this bell-shaped curve represents the most probable A.I. output — in this case, the most typical A.I.-generated digits. The tail ends describe output that is less common.

Notice that when the model was trained on human data, it had a healthy spread of possible outputs, which you can see in the width of the curve above.

But after it was trained on its own output, this is what happened to the curve:

Distribution of A.I.-generated data when trained on its own output

It gets taller and narrower. As a result, the model becomes more and more likely to produce a smaller range of output, and the output can drift away from the original data.

Meanwhile, the tail ends of the curve — which contain the rare, unusual or surprising outcomes — fade away.

This is a telltale sign of model collapse: Rare data becomes even rarer.

If this process went unchecked, the curve would eventually become a spike:

Distribution of A.I.-generated data when trained on its own output

This was when all of the digits became identical, and the model completely collapsed.

Why it matters

This doesn’t mean generative A.I. will grind to a halt anytime soon.

The companies that make these tools are aware of these problems, and they will notice if their A.I. systems start to deteriorate in quality.

But it may slow things down. As existing sources of data dry up or become contaminated with A.I. “slop,” researchers say, it will become harder for newcomers to compete.

A.I.-generated words and images are already beginning to flood social media and the wider web. They’re even hiding in some of the data sets used to train A.I., the Rice researchers found.

“The web is becoming increasingly a dangerous place to look for your data,” said Sina Alemohammad, a graduate student at Rice who studied how A.I. contamination affects image models.

Big players will be affected, too. Computer scientists at N.Y.U. found that when there is a lot of A.I.-generated content in the training data, it takes more computing power to train A.I. — which translates into more energy and more money.

“Models won’t scale anymore as they should be scaling,” said Julia Kempe, the N.Y.U. professor who led this work.

The leading A.I. models already cost tens to hundreds of millions of dollars to train, and they consume staggering amounts of energy, so this can be a sizable problem.

‘A hidden danger’

Finally, there’s another threat posed by even the early stages of collapse: an erosion of diversity.

And it’s an outcome that could become more likely as companies try to avoid the glitches and “hallucinations” that often occur with A.I. data.

This is easiest to see when the data matches a form of diversity that we can visually recognize — people’s faces:

This set of A.I. faces was created by the same Rice researchers who produced the distorted faces above. This time, they tweaked the model to avoid visual glitches.

A grid of A.I.-generated faces showing variations in their poses, expressions, ages and races.

This is the output after they trained a new A.I. on the previous set of faces. At first glance, it may seem like the model changes worked: The glitches are gone.

After one generation of training on A.I. output, the A.I.-generated faces appear more similar.

After two generations …

After two generations of training on A.I. output, the A.I.-generated faces are less diverse than the original image.

After three generations …

After three generations of training on A.I. output, the A.I.-generated faces grow more similar.

After four generations, the faces all appeared to converge.

After four generations of training on A.I. output, the A.I.-generated faces appear almost identical.

This drop in diversity is “a hidden danger,” Mr. Alemohammad said. “You might just ignore it and then you don’t understand it until it’s too late.”

Just as with the digits, the changes are clearest when most of the data is A.I.-generated. With a more realistic mix of real and synthetic data, the decline would be more gradual.

But the problem is relevant to the real world, the researchers said, and will inevitably occur unless A.I. companies go out of their way to avoid their own output.

Related research shows that when A.I. language models are trained on their own words, their vocabulary shrinks and their sentences become less varied in their grammatical structure — a loss of “linguistic diversity.”

And studies have found that this process can amplify biases in the data and is more likely to erase data pertaining to minorities.

Ways out

Perhaps the biggest takeaway of this research is that high-quality, diverse data is valuable and hard for computers to emulate.

One solution, then, is for A.I. companies to pay for this data instead of scooping it up from the internet, ensuring both human origin and high quality.

OpenAI and Google have made deals with some publishers or websites to use their data to improve A.I. (The New York Times sued OpenAI and Microsoft last year, alleging copyright infringement. OpenAI and Microsoft say their use of the content is considered fair use under copyright law.)

Better ways to detect A.I. output would also help mitigate these problems.

Google and OpenAI are working on A.I. “watermarking” tools, which introduce hidden patterns that can be used to identify A.I.-generated images and text.

But watermarking text is challenging, researchers say, because these watermarks can’t always be reliably detected and can easily be subverted (they may not survive being translated into another language, for example).

A.I. slop is not the only reason that companies may need to be wary of synthetic data. Another problem is that there are only so many words on the internet.

Some experts estimate that the largest A.I. models have been trained on a few percent of the available pool of text on the internet. They project that these models may run out of public data to sustain their current pace of growth within a decade.

“These models are so enormous that the entire internet of images or conversations is somehow close to being not enough,” Professor Baraniuk said.

To meet their growing data needs, some companies are considering using today’s A.I. models to generate data to train tomorrow’s models. But researchers say this can lead to unintended consequences (such as the drop in quality or diversity that we saw above).

There are certain contexts where synthetic data can help A.I.s learn — for example, when output from a larger A.I. model is used to train a smaller one, or when the correct answer can be verified, like the solution to a math problem or the best strategies in games like chess or Go.

And new research suggests that when humans curate synthetic data (for example, by ranking A.I. answers and choosing the best one), it can alleviate some of the problems of collapse.

Companies are already spending a lot on curating data, Professor Kempe said, and she believes this will become even more important as they learn about the problems of synthetic data.

But for now, there’s no replacement for the real thing.

About the data

To produce the images of A.I.-generated digits, we followed a procedure outlined by researchers. We first trained a type of neural network known as a variational autoencoder using a standard data set of 60,000 handwritten digits.

We then trained a new neural network using only the A.I.-generated digits produced by the previous neural network, and repeated this process in a loop 30 times.

To create the statistical distributions of A.I. output, we used each generation’s neural network to create 10,000 drawings of digits. We then used the first neural network (the one that was trained on the original handwritten digits) to encode these drawings as a set of numbers, known as a “latent space” encoding. This allowed us to quantitatively compare the output of different generations of neural networks. For simplicity, we used the average value of this latent space encoding to generate the statistical distributions shown in the article.

Science

Video: Boeing Starliner Astronauts Will Return to Earth in SpaceX Vehicle

transcript

NASA announced that two astronauts aboard the International Space Station will have their stay extended by several months and that they will return on a SpaceX capsule because of problems with the Boeing Starliner.

“NASA has decided that Butch and Suni will return with Crew 9 next February and that Starliner will return uncrewed. A test flight by nature is neither safe nor routine. And so the decision to keep Butch and Suni aboard the International Space Station and bring the Boeing Starliner home uncrewed is the result of a commitment to safety.” “I talked with Butch and Suni both yesterday and today. They support the agency’s decision fully, and they’re ready to continue this mission on board I.S.S. as members of the Expedition 71 crew. Their families are doing well. Their families understand, just like the crew members when they launch, there’s always an opportunity, there’s always a possibility that they could be up there much longer than they anticipate. So the families understand that. I’m not saying it’s not hard. It is hard. It’s difficult.”

Science

Earthquake risks and rising costs: The price of operating California's last nuclear plant

Under two gargantuan domes of thick concrete and steel that rise along California’s rugged Central Coast, subatomic particles slam into uranium, triggering one of the most energetic reactions on Earth.

Amid coastal bluffs speckled with brush and buckwheat, Diablo Canyon Nuclear Power Plant uses this energy to spin two massive copper coils at a blistering 30 revolutions per second. In 2022, these generators — about the size of school buses — produced 6% of Californians’ power and 11% of their non-fossil energy.

Yet it comes at almost double the cost of other low-carbon energy sources and, according to the federal agency that oversees the plant, carries a roughly 1 in 25,000 chance of suffering a Chernobyl-style nuclear meltdown before its scheduled decommissioning in just five years — due primarily to nearby fault lines.

As Gov. Gavin Newsom’s administration looks to the aging reactor to help ease the state’s transition to renewable energy, Diablo Canyon is drawing renewed criticism from those who say the facility is too expensive and too dangerous to continue operating.

Diablo is just the latest in a series of plants built in the atomic frenzy of the 1970s and ’80s seeking an operating license renewal from the federal Nuclear Regulatory Commission as the clock on their initial 40-year run ticks down. As the price of wind and solar continues to drop, the criticisms against Diablo reflect a nationwide debate.

Tom Jones, right, a regulatory and environmental senior director at PG&E, and Jerel Strickland, a senior licensing and spent nuclear storage consultant, walk past one of two massive turbine-generator units inside the turbine building at Diablo Canyon Nuclear Power Plant recently.

(Genaro Molina/Los Angeles Times)

The core of the debate lives in the quaint coastal town of San Luis Obispo, just 12 miles inland from the concrete domes, where residents expected Diablo Canyon to shut down over the next year after its license expired.

Instead, Newsom struck a deal on the last possible day of the state’s 2021-22 legislative session to keep the plant running until 2030, citing worries over summer blackouts as the state transitions to clean energy. The activists who had negotiated the shutdown with PG&E and the state six years prior were left stunned.

Today, the plant is still buzzing with life: Nuclear fission, in the deep heart of the plant, continues to superheat water to 600 degrees at 150 times atmospheric pressure. Generators continue to whir with a haunting and deafening hum that reverberates throughout the massive turbine deck.

Left untouched, nuclear fission erupts into a runaway chain reaction that can heat the core of a nuclear plant to thousands of degrees, liquefying the metal around it into radioactive lava.

So, operators have to constantly stifle the reaction to keep it under control.

In the event of an earthquake, they need to stop the reaction as quickly as possible. But if the shaking is so rapid and intense that the plant is critically damaged before it can shut down, operators could become helpless in preventing a meltdown.

Tom Jones, senior director of Regulatory Environmental and Repurposing at PG&E, talks about how the Diablo Canyon Nuclear Power Plant operates.

(Genaro Molina/Los Angeles Times)

Tom Jones, senior director of Regulatory Environmental and Repurposing at PG&E, is reflected in a display that explains the fission process at Diablo Canyon Nuclear Power Plant recently.

(Genaro Molina/Los Angeles Times)

Diablo Canyon is built to endure specific intensities and speeds of shaking — but predicting how likely an earthquake is to exceed those specifications is no easy task. Earthquakes are the result of deeply complex underground motion and forces, and they’re notoriously chaotic.

In order to start estimating the seismic safety of the plant, geophysicists have to understand: first, where the faults are; second, how much they’re slipping to trigger earthquakes; and finally, when those quakes hit, how much shaking they cause.

Earthquakes account for about 65% of the risk for a worst-case scenario meltdown. Potential internal fires at the plant make up another 18%. The last 17% is made up of everything from aircraft impacts and meteorites to sinkholes and snow.

In assessing the likelihood of all these threats, the Nuclear Regulatory Commission estimates that in any given year, each of Diablo Canyon’s two reactor units has a roughly 1 in 12,000 chance of experiencing a nuclear meltdown similar to Japan’s Fukushima disaster.

Likewise, there’s about a 1 in 127,000 chance a failure will cause the plant to release exorbitant amounts of radioactive material into the atmosphere before residents could evacuate, creating a Chernobyl-style disaster.

This means that, every year, nearby residents have roughly the same chance of seeing a nuclear meltdown as dying in a car crash. Also, in any given year, they’re about 50 times more likely to face a mass-casualty radioactive catastrophe than get struck by lightning.
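The per-year and whole-lifetime figures in this article are consistent with one another. Treating the annual 1-in-127,000 Chernobyl-style estimate as independent from year to year (a back-of-the-envelope check, not the Nuclear Regulatory Commission’s own method), five remaining years of operation work out to roughly the 1-in-25,000 odds cited earlier:

```python
annual_odds = 1 / 127_000   # NRC estimate: Chernobyl-style release, per year
years_remaining = 5         # until scheduled decommissioning

# probability of at least one such event over the remaining years,
# assuming each year is independent
cumulative = 1 - (1 - annual_odds) ** years_remaining

print(f"about 1 in {round(1 / cumulative):,}")  # ≈ 1 in 25,400
```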

Diablo Canyon employees work around the clock to ensure the risk is as small as possible. “Our safety culture, it’s always on the top of my mind,” said Maureen Zawalick, the vice president of business and technical services at Diablo. “It’s in my DNA.”

Maureen Zawalick, PG&E Business and Technical Services vice president, in her office at the Diablo Canyon Power Plant.

(Genaro Molina/Los Angeles Times)

The plant is the only one in the U.S. with a dedicated geoscience team that studies the region’s seismic landscape. And like other nuclear facilities, Diablo has done countless tests on its equipment, hosted walkthroughs with regulators to identify possible points of failure and generated thousands of pages of analysis on the facility’s ability to withstand the largest earthquake possible at the site.

Earthquake precautions include massive metal dampers that are fixed to essential infrastructure, such as the duct carrying the control rooms’ air supply. Monstrous concrete pillars penetrate deep into the bedrock to keep the building and essential infrastructure anchored in the event of a tremor. Hefty concrete walls, reinforced with steel rebar as thick as a human arm, safely distribute seismic forces throughout the structure to prevent critical cracks or collapses.

If the plant loses power, there are backup generators for the backup generators.

A worker pushes a utility cart past a billboard that lists employee goals at Diablo Canyon Nuclear Power Plant recently.

(Genaro Molina/Los Angeles Times)

Operators spend a fifth of their time on the job training for every possible nightmare. Diablo has a simulator on site that’s an exact replica of the Unit One control room. It’s capable of putting operators through the worst conditions imaginable. It shakes with the vigor of a real earthquake. The lights flicker and the analog dials spin back up as emergency power comes online.

For everyone working on site — including the senior leadership team — safety is personal. Should something go wrong, their lives are on the line.

“With any source of energy, there is risk,” said Zawalick. “All the independent assessments, all the audits, all the third party reviews, all of that … is what gives me the confidence and the security and the safety of why I’ve been out here almost 30 years.” Her office is no more than 500 feet from the reactors.

“If there ever was an earthquake of any magnitude in this community,” she said, “I would grab my two daughters and we’d come here.”

Maureen Zawalick, PG&E Business and Technical Services vice president, looks out her office window at Diablo Canyon Nuclear Power Plant recently.

(Genaro Molina/Los Angeles Times)

Many critics charge that the risks are understated — due in part to a cozy relationship between industry and regulators. (Some scientists involved with one of Diablo Canyon’s two independent review organizations have collaborated on scientific papers with PG&E staff, supported by PG&E funding.)

The Nuclear Regulatory Commission also oversees the plant and conducts its own investigations. In July, the government agency dismissed all three formal criticisms against Diablo’s seismic safety in the plant’s license renewal process.

Sam Blakeslee, a San Luis Obispo geophysicist and former state senator and Assembly member, has a list of technical concerns — primarily the lack of shaking data close to fault lines, which are used to inform the models that predict earthquake motion at the plant — but he likens the core of his concern to the NASA Challenger disaster.

NASA publicly touted a strong safety culture and low chances of things going wrong. Yet the investigation found that political and public pressures had corroded that safety culture from the top down.

He argues this is a possibility for any large organization dealing with complex and potentially dangerous systems. Therefore, people need to constantly hold the plant accountable.


“That’s why I tend to try to make sure that the community voice is present,” he said, “because we are the ones that will pay the price.”

In 2022, Newsom introduced a proposal to keep Diablo Canyon open past its two reactors’ 2024 and 2025 shutdown dates. His proposal, distributed to lawmakers just three weeks before the end of the legislative session, set off a flurry of negotiations among PG&E, the governor and the Legislature.

After discussions dragged on past midnight, the Legislature passed the bill.

But it comes at a cost.

While the average prices of solar and wind have dropped dramatically over the past 15 years, nuclear’s has been steadily rising. In 2009, solar cost three times what nuclear did, and wind was about even with it. Now, nuclear is more than twice the cost of either renewable.


Technical advancements have slashed the price of renewable energy, but nuclear power has faced more outages, equipment replacements and increasingly stringent and expensive safety requirements in the wake of the Fukushima disaster.

One study from MIT researchers found that about a third of the increasing cost could be attributed to safety requirements from the Nuclear Regulatory Commission. They attribute another third to research and development projects for efficiency, reliability and safety improvements, and they assign the final third to a decrease in worker productivity — perhaps in part due to lower morale.

Fog rises behind twin containment domes at a nuclear power plant.

Twin containment domes rise above the facility as seen through a windshield on the drive to the Diablo Canyon Nuclear Power Plant.

(Genaro Molina/Los Angeles Times)

PG&E estimates that Diablo Canyon will produce energy at $91 per megawatt-hour during its extension. (The average U.S. household buys about 10 megawatt-hours of electricity every year.)
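To put that estimate in household terms, here is a back-of-the-envelope calculation. It assumes, purely for illustration, that a household’s entire annual usage were billed at Diablo’s estimated wholesale price — real bills blend many generation sources and add delivery charges:

```python
# Illustrative only: what Diablo's estimated $91/MWh would mean if a
# typical U.S. household's ~10 MWh of annual usage were all billed at
# that rate. Real retail bills are higher and blend many sources.
price_per_mwh = 91           # PG&E's estimate for the extension period, $/MWh
household_mwh_per_year = 10  # approximate average U.S. household usage

annual_cost = price_per_mwh * household_mwh_per_year
print(f"${annual_cost} per year")  # → $910 per year
```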


However, the Alliance for Nuclear Responsibility argues the plant’s cost is even higher. David Weisman, the legislative director at the alliance, said PG&E is using optimistic predictions of its energy output for the extended period — 5% higher than previous years.

On top of that, the state gave PG&E a $1.4-billion loan to alleviate the initial costs of extended operations. But Weisman said the funds don’t necessarily need to go toward offsetting the cost of running Diablo. The federal government agreed to reimburse the state up to $1.1 billion — depending on whether the plant meets specific operating criteria — and PG&E is expected to pay off the rest of the loan with profits.

While the loan isn’t a cost that consumers would see on their energy bills, taxpayers across the country could foot the bill. Weisman argued that it brings Diablo’s cost to a maximum of $115 per megawatt-hour — roughly double the cost of solar.

Yet Newsom argues that if California is to meet its goals of 60% renewable energy by 2030, Diablo needs to stay online in the meantime to ensure the state has reliable power amid heatwaves and wildfires.

Diablo Canyon essentially runs 24/7, providing constant power to the state (assuming it doesn’t have any issues, which it sometimes does). For solar to provide similarly constant power, the electric grid will require a massive expansion of its battery infrastructure to store the energy between the midday peak of energy production and the evening peak of energy use.


However, new studies are finding that energy storage is a feasible approach to grid reliability — and that even when adding the price of that infrastructure, solar still costs less than nuclear.

Tom Jones talks inside the turbine building at Diablo Canyon Nuclear Power Plant.

Tom Jones, a regulatory and environmental senior director at PG&E, talks about the number of days that Turbine Unit One has operated to bring power to California while inside the turbine building at Diablo Canyon Power Plant recently.

(Genaro Molina/Los Angeles Times)

Since Diablo’s extension was signed into law, California has almost doubled its battery storage. The state now has enough to supplement about a quarter of the state’s power needs for about half an hour during peak energy usage (although, in practice, it would likely supplement much less for much longer).

“That’s four or five Diablo Canyons,” said Weisman. Newsom should “save the people of California [billions of dollars] thrown down PG&E’s rat hole, declare triumphant victory in the renewable race and accept the laurels.”
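Weisman’s comparison can be roughly sanity-checked. The figures below are assumptions, not from the article: California’s battery fleet is taken as able to discharge roughly 10 gigawatts at once, and Diablo’s two reactors as producing about 2.2 gigawatts combined:

```python
# Rough check of the "four or five Diablo Canyons" comparison.
# Assumed figures (not from the article):
battery_gw = 10.0  # assumed statewide battery discharge capacity, GW
diablo_gw = 2.2    # assumed combined output of Diablo's two units, GW

equivalents = battery_gw / diablo_gw
print(f"{equivalents:.1f} Diablo Canyons")  # → 4.5 Diablo Canyons
```

Note that this compares instantaneous power, not energy: batteries can sustain that output only until they drain, while the reactors run continuously.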


Instead, at a recent press event announcing California had reached a fifth of its storage capacity goal, Newsom laughed off the idea that Californians will no longer have to worry about blackouts.

“We have a lot of work to do still in moving this transition, with the kind of stability that’s required,” he said. “So no, this is not today announcing that blackouts are part of our past.”

Diablo Canyon’s leaders and advocates view the plant as supporting California through this challenging transition period: It’s not perfect, but it provides the state with much-needed reliable, clean power, they say.

In a conference call shortly after Diablo’s initial 2024 shutdown date was negotiated, PG&E’s then-chief executive, Tony Earley, acknowledged the plant would eventually become too expensive to operate.

“As we make this transition, Diablo Canyon’s full output will no longer be required,” he said.

Steam rises from the sea near a nuclear power plant.

Steam rises from the Pacific Ocean where an outfall of heated water from the Diablo Canyon Nuclear Power Plant pours into coastal waters.

(Genaro Molina/Los Angeles Times)

Zawalick said the Diablo team is ready to continue operating as long as the state needs it to. “Thinking about electrification, [electric vehicle] demand, continued drought, the temperatures we’re seeing, wildfires … tariffs — I mean, the list goes on,” she said. “That’s making the equation a bit challenging to see exactly when Diablo will shut down versus how long Diablo will be needed by the state.”
