Answers to FAQs about AI data centers, including water, energy usage

The rapid development of artificial intelligence is sparking the development of data centers that provide computing power for the technology. Water usage and energy consumption are among the concerns.

The Journal Sentinel asked readers to send us their questions about Wisconsin data centers. More than 300 responded.

We will be posting the answers to those questions here over the coming weeks as more are published. This story will also be updated with a video replay of our Feb. 23 town hall meeting. You can still get free tickets to the event, which will be held from 5 to 7 p.m. at the Turner Hall ballroom in Milwaukee.

Why are we just hearing about AI data centers now?

There are a few reasons why data centers have become so high profile across the country in the last few years — and in Wisconsin over the last year, especially. Much of the data center boom we’re seeing now is tied to the scale of computing needed to advance AI.

For one, the infrastructure needed for AI is much larger and more resource intensive than the data centers we’ve had for decades. That scale changes the cost-benefit calculation for the communities where they’re proposed, which makes them far more controversial and often ignites community-wide, even region-wide, debate.

This is playing out across the country. Still, Wisconsin, with 51 data centers, is not at the forefront of this buildout. Virginia (570) has the most data centers in the country, followed by Texas (407) and California (288). So when people are trying to figure out what a new data center proposal means for their community, they’re reading about how data centers are affecting other states.

How much land is being devoted to artificial intelligence data centers?

The answer depends on the project. In Wisconsin, the newer facilities range from 16 acres (the Meta site in Beaver Dam) to 250 acres (as proposed in Janesville) to 1,900 acres (the Vantage project in Port Washington). Those examples are a pretty good sample of what we’re seeing across the country. Most AI‑focused hyperscale data center campuses today are being planned on roughly 200–500 acres, with some of the headline projects at 1,000–2,000 acres.

For context, 16 acres is about 12 football fields. And 250 acres is comparable to one or two large 18‑hole golf courses, a big regional shopping center plus its parking lot, or a mid-sized university campus, like Northwestern University near Chicago. The University of Wisconsin–Madison’s main campus is around 1,000 acres, which is about the size of the Village of Shorewood.

Data centers also require land for supporting infrastructure

However, there’s a bit more complexity embedded into the original question. Data center sites themselves are a starting point for this conversation, but there’s a lot of additional infrastructure and land needed beyond those sites. Powering such facilities typically requires new energy generation in the form of new power plants and transmission lines, which also need land.

For instance, the first phase of construction in Port Washington, which will be used by OpenAI and Oracle, covers 672 acres and requires about 1.3 gigawatts of electricity (one gigawatt equals one billion watts). That could equate to the output of one or two big modern power plants, or a few mid-sized facilities. And utility infrastructure isn’t just about power plants. Data centers also often need transmission lines to move energy across the grid from the plant to the facility, if power isn’t generated on-site. For example, a transmission line project proposed to bring power to the Port Washington facility spans five counties in eastern Wisconsin and would be between 100 and 120 miles long.

With newer technology, why is so much water and power needed?

Answering that requires a brief explanation of how artificial intelligence works. In the past, data centers were used to power the internet and for cloud storage, software and business records management.

AI requires a vast amount of data and computing power to perform numerous computations, train chatbots and build enterprise tools for companies. The scale far exceeds that of traditional computing and data storage, requiring vast warehouses of interconnected computers and servers running around the clock. This demands a lot more power, typically at least 1 gigawatt per data center campus.

That equipment generates heat and needs to be cooled, which requires additional power — and water. Proponents point to the use of closed-loop cooling systems in the Port Washington and Mount Pleasant projects, which use considerably less water than older cooling systems.

What are the life spans of data centers? How soon will they be obsolete?

Generally, data centers are designed to operate for around 10 to 20 years before they need major upgrades or full replacement, but different components have a range of life cycles. As with most commercial buildings, the underlying building shell can last much longer than 20 years, but the internal systems, including specialized IT gear and power and cooling equipment, are typically designed for 10 to 20 years of use before they start to become obsolete.

However, the servers, which contain the chips that store and process data, have a much shorter life cycle and are typically replaced every three to five years, though they can often function for seven to 10 years with good maintenance. There are several reasons for that. The frontiers of chip technology are constantly evolving, so using the newest hardware usually provides higher performance and energy efficiency, which reduces the risk of system failure.
