31 points by moneycantbuy | 1 comment
semiquaver No.45647228
Why can’t the water for cooling these be a closed-loop system?
1. everforward No.45647877
They probably could, but then electricity consumption goes up, because in a lot of places the required coolant temperature is below ambient. I'm seeing quotes that put the desired water temperature around 80F, which is below ambient temperatures in much of the US for at least part of the year.

It takes less energy to take fresh water at 85F and cool it to 80F than to recycle 90F return water and cool it to 80F.
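A rough sketch of that comparison, assuming a chiller with a COP of 4 and a hypothetical 1,000 kg batch of water (neither number is from the thread; this only illustrates that sensible-heat removal scales linearly with the temperature drop):

```python
# Sketch: chiller electricity to cool water to an 80 F target, comparing
# 85 F fresh makeup water vs. 90 F closed-loop return water.
# COP and mass are assumed values for illustration.

C_P = 4.186  # kJ/(kg*K), specific heat of liquid water

def chiller_energy_kj(mass_kg, inlet_f, target_f, cop=4.0):
    """Electricity (kJ) to remove the sensible heat from `mass_kg` of water."""
    delta_k = (inlet_f - target_f) * 5.0 / 9.0  # Fahrenheit span -> Kelvin span
    heat_kj = mass_kg * C_P * delta_k           # Q = m * c_p * dT
    return heat_kj / cop                        # chiller work = Q / COP

fresh = chiller_energy_kj(1000, 85, 80)     # 5 F drop
recycled = chiller_energy_kj(1000, 90, 80)  # 10 F drop: twice the work
print(f"fresh: {fresh:.0f} kJ, recycled: {recycled:.0f} kJ")
```

Under these assumptions the recycled stream needs exactly twice the chiller work of the fresh stream, since the temperature drop is twice as large.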

Also, I think the only water truly "consumed" is what evaporative coolers evaporate. Unless I'm mistaken, the rest starts as potable water and ends up as warmer potable water. I don't see a reason it couldn't be fed back into the water grid, where it should cool back down naturally. I guess the problem is when the datacenter requires more water than the rest of the grid consumes, so you end up producing excess potable water.
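To put a rough number on the evaporative-cooling claim, here is a back-of-the-envelope estimate of how much water an evaporative cooler consumes per kWh of heat it rejects. The latent-heat figure is an assumption for near-ambient conditions, not a value from the thread:

```python
# Rough estimate: water evaporated per kWh of heat rejected by an
# evaporative cooler. Each kilogram evaporated carries away its latent
# heat of vaporization, h_fg (~2450 kJ/kg near 30 C -- assumed value).

H_FG = 2450.0       # kJ/kg, approx. latent heat of vaporization near ambient
KJ_PER_KWH = 3600.0  # 1 kWh expressed in kJ

def litres_evaporated_per_kwh():
    kg = KJ_PER_KWH / H_FG  # mass of water needed to absorb 1 kWh as latent heat
    return kg               # ~1 kg of water is ~1 litre

print(f"{litres_evaporated_per_kwh():.2f} L per kWh of heat rejected")
```

So on the order of 1.5 litres evaporates per kWh of heat rejected, which is the water that genuinely leaves the system rather than returning as warm potable water.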