Data centers don't create jobs; they pit municipalities against one another in a race to the bottom, typically demand tax abatements, and almost never deliver a net positive for the places where they operate.
But if you create a "water" monster and pivot the conversation to water being the issue, you can then show that water consumption isn't a big deal. Water is the framing the data centers want, because they can win the fight on that topic.
Don't let your enemy choose the terrain.
FYI on the jobs, CoreWeave's new AI center in Pennsylvania: 100 MW capacity, ~70–75 full-time technical roles initially, scaling to 175 full-time roles at ~300 MW.
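For scale, the staffing-per-megawatt ratio implied by those reported CoreWeave figures can be worked out directly. (The figures are from press coverage and taken at face value here; real staffing varies by site.)

```python
# Rough jobs-per-megawatt arithmetic using the CoreWeave figures
# quoted above (assumed accurate as reported, not independently verified).
for jobs, mw in [(70, 100), (75, 100), (175, 300)]:
    print(f"{jobs} jobs / {mw} MW = {jobs / mw:.3f} jobs per MW")
```

Note that the ratio falls as the site scales, from roughly 0.70–0.75 jobs per MW initially to about 0.58 at full build-out.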
>pit municipalities against one another in a race to the bottom, and typically demand abated taxes and almost never deliver a net positive for where they operate.
If datacenters are net negatives, why would municipalities compete to get them?
Short term they sound good and promising: they're techy, they promise sizable employment, and the investment figures sound big.
The reality is a lot worse. Putting up the building isn't much investment in local labour, and most of the value is in components that come from somewhere else. After installation, they run on a handful of guards and techs. Not the worst jobs, but in general almost any type of factory, or even small-scale industry, would be better.
[dead]
The uproar over AI data center resource use has been rather bizarre to see and feels vaguely luddite. As this article points out, frivolous things like golf courses are far worse users of fresh water (and land) than any amount of AI. And on the electricity side, forcing the US to actually build more power generating capacity and infrastructure is a good thing in my book. Once the AI hype dies down we can use that for BEVs and other useful things.
> And on the electricity side, forcing the US to actually build more power generating capacity and infrastructure is a good thing in my book.
Electricity use is fungible. Every extra TWh of marginal demand is one coal plant that is delayed an extra year from being mothballed, spewing one extra quantum of CO2 into the atmosphere, adding one increment to the greenhouse effect.
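To put a rough number on that "one extra TWh of coal" claim: assuming a commonly cited ballpark of about 1 kg of CO2 per kWh for coal generation (not a precise figure for any particular plant), one terawatt-hour works out to about a megatonne of CO2.

```python
# Rough scale of one extra TWh of coal generation, using an assumed
# ~1 kg CO2 per kWh -- a commonly cited ballpark, not a plant-specific figure.
co2_per_kwh_kg = 1.0
kwh_per_twh = 1e9                    # 1 TWh = 1 billion kWh
tonnes_co2 = kwh_per_twh * co2_per_kwh_kg / 1000
print(f"1 TWh of coal generation ~= {tonnes_co2:,.0f} tonnes CO2")
# ~= 1,000,000 tonnes (one megatonne) of CO2
```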
AI is becoming a partisan issue in the US with all of the attendant consequences.
It’s because a bunch of the tech elite backed Trump, therefore anything tech related must now be evil according to people left of center.
If they’d all opposed Trump you’d see MAGA people making up any reason for anything tech to be evil and calling for AI to be outlawed, and lefty puff pieces about how wonderful and liberating AI is.
Reality is now subordinate to political hyper partisanship. If Trump says the sky is blue, the left thinks it must be green. If Trump says it’s green, MAGA people will swear they see green and seeing blue would become “woke.”
This is a big oversimplification. First, the 20% figure (for LLMs' share of all datacenter usage) is based on 2024 estimates, while all of the LLM players are putting billions of dollars into building more and larger datacenters. That number is undoubtedly already an underestimate of total LLM power usage, and if all of the planned datacenters actually materialize (which is a big if), it will be an underestimate by an order of magnitude, maybe two.
Second, water issues are localized, and building datacenters in dry areas (like Texas), where aquifers are already being depleted, is going to be an issue there, even if it's a drop in the bucket of the Great Lakes or whatever.
This article was first published over a month ago, but Andy just added the section on potable water as a result of this Hacker News comments conversation from a couple of days ago: https://news.ycombinator.com/item?id=45926469#45926914
The AI water issue isn't fake, though it's often overstated.
The key problem is that data center evaporative cooling permanently removes water from local systems. Unlike irrigation or golf courses, where some water returns to groundwater, evaporated cooling water is lost to the atmosphere and must be continuously replenished.
While 0.008% of national freshwater seems tiny, the author misses the local impact. In water-stressed regions, even "small" demands matter. Comparing to golf courses in Phoenix sets the bar absurdly low; "less wasteful than the worst example" shouldn't be the standard.
The author dismisses 905M gallons in Maricopa County as "only 0.12%" of county use, but in a desert already overdrawing groundwater, that's 905M gallons unavailable for human needs.
The media has exaggerated, sure. But calling legitimate resource concerns "fake" swings way too far the other way. We need careful planning for data center locations, not dismissal of water consumption because other industries use more.
> The author dismisses 905M gallons in Maricopa County as "only 0.12%" of county use, but in a desert already overdrawing groundwater, that's 905M gallons unavailable for human needs.
Can you help explain what 905M gallons of water means?
My biggest problem with the data center water debate continues to be people throwing around big scary numbers like that without attempting to provide context for them.
(I found one estimate that the average US resident uses 30,000 gallons per year, which would make 900,000,000 gallons the same as 30,000 people.)
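The population-equivalent comparison in that parenthetical is easy to sanity-check. (The 30,000 gallons per resident per year is the commenter's own found estimate, taken at face value here.)

```python
# Sanity-check the population-equivalent comparison above.
# 30,000 gallons/person/year is the estimate quoted in the comment.
total_gallons = 900_000_000
gallons_per_resident_per_year = 30_000
people_equivalent = total_gallons / gallons_per_resident_per_year
print(f"{total_gallons:,} gallons ~= annual use of {people_equivalent:,.0f} residents")
# ~= annual use of 30,000 residents
```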
So imagine 30,000 people suddenly appear in the empty lot next door needing water.
>The key problem is that data center evaporative cooling permanently removes water from local systems, unlike irrigation or golf courses where some water returns to groundwater, evaporated cooling water is lost to the atmosphere and must be continuously replenished.
But farms and golf courses suffer from evaporation as well, so that argument really only means you can discount farm/golf water usage by some fraction (e.g. 50%). Considering the consumption figures are 0.08% for datacenters and 8% for golf courses, the argument still holds up.
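Applying the commenter's own 50% evaporation discount to both sides shows why the ratio survives. (The percentage shares are the ones quoted in the comment, assumed as stated.)

```python
# Even if evaporation only "counts" for half of golf-course use,
# the gap between the two consumption shares barely moves.
datacenter_share = 0.08    # % of freshwater, as quoted in the comment
golf_share = 8.0           # % of freshwater, as quoted in the comment
evaporation_discount = 0.5

adjusted_golf = golf_share * evaporation_discount
print(f"golf (discounted): {adjusted_golf}%  vs  data centers: {datacenter_share}%")
print(f"ratio: {adjusted_golf / datacenter_share:.0f}x")  # still 50x
```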
nice to see a post with lots of data and sources
what comes out of the AI datacenters, and what that will do to society, is far more concerning to me than the water and electricity, which are trivial to address by comparison
> The AI water issue is fake
What a relief!
So, I guess we can expect advocates to avoid any hilariously weak strawmen then, right?
So, the regulation making evaporative cooling illegal in datacenters is being voted into law as we speak, right?
Since it's such a fake issue, regulation capping datacenter water use is being voted into law as we speak, right? Should be straightforward enough, since we know exactly how much water is needed and being used, right?
Using nationally aggregated statistics for water is extremely misleading. And pretending that one datacenter using 2% of a county's water supply isn't a huge deal demonstrates a complete disconnect from reality. Spin the numbers all you want; this is nuts.
Well, that's why I cover things at the national, local, and personal level separately. I think if this were any other industry using 2% of a county's water, you wouldn't have much of a reaction. In most countries only ~20% of water is used by households; the rest goes to farms, industry, and commercial buildings.
I see the energy issue as kind of fake too. Either we get our power from polluting sources or we don’t. Highlighting one specific use of energy detracts from that. Now it’s about that specific use and not where we get all energy for all uses.
The net is full of loud anti-AI people who will scream about power use and carbon emissions and then order tacos through DoorDash and crank up their heat or A/C. It’s all energy use.
It is a real concern mostly due to the time frame in which energy production needs to be increased. It can't be met solely by building new clean (or fossil fuel) energy, and thus must result in delaying the shutdown of coal plants in addition to new production.
Increasing energy demand increases incentives for less clean sources. Is it really that hard for you to understand? Your either-or false dichotomy is some high-school debate-bro level ignorant shit.
> it gives the utility more money to spend on drawing more water and improve infrastructure.
Deeply unserious, gradeschool-level economics. “Infrastructure” isn’t a marginal cost you can smoothly ramp up when a big new consumer comes online.
Why then is there a general negative correlation between the cost of potable water and the population of an area in places where freshwater is plentiful? There's a lot of literature on water economics and this pattern usually holds.
Of course it’s fake. The datacenters aren’t consuming the water as part of their operation; they are using it as a supersized version of a custom-loop PC.
I worry, though, that the fact that people seem to see political upside in this claim will lead to data center NIMBYism, and a future where building more compute will be as hard as building more housing, with all of the follow-on effects on prices.
It's not a closed loop, though; many use evaporative cooling towers (wet towers).
But that water remains in the water cycle. With agriculture the water goes into the crops and is then shipped off to other places, exiting the water cycle of its origin.
That's backwards. When data centers evaporate water for cooling, it becomes vapor that blows away to fall as rain somewhere else; it's gone from the local area, or else it's discharged as wastewater. Farm water mostly stays put: plants release it back into the local air, excess irrigation soaks into local groundwater, and only a fraction leaves in the harvested crops.
Farmers can reuse the same local water year after year. Data centers need fresh water constantly because their evaporated water doesn't come back.
“But the water cycle” is the most Dunning-Kruger counterargument of them all. It assumes the reader doesn’t remember 4th-grade science class, while misapplying that same basic knowledge.
There’s a fundamental difference between water ending up in a tomato that is shipped across the world and leaves permanently, and water that evaporates and rains down later. Regardless of whatever names you call me, that is true.
The datacenters ARE consuming the water as part of their operation.
A constant growth rate of datacenters will consume water at a fixed rate, though. And the growth rate is more likely to be exponential.
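The distinction between a steady build rate and compounding growth can be illustrated with made-up numbers (purely illustrative, not a forecast of actual datacenter water use):

```python
# Linear (constant additions per year) vs exponential (constant growth
# rate) trajectories for annual water draw. All numbers are invented
# for illustration only.
base_draw = 100.0   # arbitrary units of water per year today
years = 5
linear = [base_draw + 20 * t for t in range(years)]          # fixed additions
exponential = [base_draw * 1.3 ** t for t in range(years)]   # 30%/yr growth
print("linear:     ", [round(x) for x in linear])       # [100, 120, 140, 160, 180]
print("exponential:", [round(x) for x in exponential])  # [100, 130, 169, 220, 286]
```

Under compounding growth, the draw in year five is already nearly triple the starting rate, which is the point the comment is making.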