Now the only promises are a strained grid, higher energy bills and loud noise. It doesn't help that over the past few years CEOs have falsely attributed layoffs to AI when they were actually just cutting costs or moving jobs offshore.
This situation probably gets worse before it gets better for the companies deploying new data centers.
It takes less energy to take fresh water at 85F and cool it to 80F than to recycle 90F water and cool it to 80F.
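A quick back-of-envelope makes the comparison concrete: the heat you have to pull out of the water scales linearly with the temperature drop, so the recycled 90F stream carries twice the heat per tonne as the fresh 85F stream. The masses and temperatures here are illustrative, not from any real facility.

```python
# Heat that must be removed to chill water to 80F, comparing a fresh
# 85F intake against recycled 90F water. Illustrative numbers only.

C_WATER = 4.186  # kJ/(kg*K), specific heat of liquid water

def heat_removed_kj(mass_kg, t_in_f, t_out_f):
    """Heat (kJ) to cool mass_kg of water from t_in_f to t_out_f."""
    delta_k = (t_in_f - t_out_f) * 5.0 / 9.0  # convert F degrees to K degrees
    return mass_kg * C_WATER * delta_k

per_tonne_fresh = heat_removed_kj(1000, 85, 80)     # 5F drop, ~11,600 kJ
per_tonne_recycled = heat_removed_kj(1000, 90, 80)  # 10F drop, twice the heat

print(per_tonne_fresh, per_tonne_recycled)
```

For a chiller running at a fixed COP, electrical energy is proportional to heat removed, so the recycled stream costs roughly twice as much to chill per tonne.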
Also, I think the only truly "consumed" water is what goes through evaporative coolers. Unless I'm mistaken, the rest starts as potable water and ends up as warmer potable water, and I don't see a reason it couldn't be fed back into the water grid, where it should cool down naturally. I guess the problem is when the datacenter requires more water than the rest of the water grid, so you end up producing excess potable water.
You can use water or air internally, but to get the heat out of the facility there aren't many choices. You either put it into the air, which is cheap; into nearby water bodies, which raises other environmental concerns; or into the ground, which is expensive. Air is the simplest, cheapest option, and using water for evaporative cooling in drier climates makes it even better.
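To get a feel for the water side of that trade-off, here's a rough estimate of how much water evaporates when rejecting data-center heat evaporatively. It assumes all heat leaves as latent heat (real cooling towers also shed some heat sensibly, so treat this as an upper-end sketch, not a measured figure).

```python
# Rough water consumption of evaporative heat rejection, assuming
# every kJ of facility heat is carried away as latent heat.

LATENT_HEAT = 2260.0  # kJ/kg to vaporize water (~100C value; ~2450 at ambient)

def water_evaporated_kg_per_hour(heat_mw):
    """Water (kg/h) evaporated to reject heat_mw megawatts of heat."""
    heat_kj_per_hour = heat_mw * 1000 * 3600  # MW -> kJ/h
    return heat_kj_per_hour / LATENT_HEAT

print(water_evaporated_kg_per_hour(1))  # ~1600 kg/h, i.e. ~1.6 m^3/h per MW
```

So a 100 MW facility rejecting all its heat this way would evaporate on the order of 160 m^3 of water per hour, which is why this only pencils out where water is cheap relative to the chiller electricity it displaces.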
No. https://andymasley.substack.com/p/the-ai-water-issue-is-fake