(Disclaimer: I am not an anti-AI guy — I am just listing the common talking points I see in my feeds.)
My strong intuition at the moment is that the environmental impact is greatly exaggerated.
The energy cost of executing prompts has dropped enormously over the past two years, something reflected in this report when it says "Driven by increasingly capable small models, the inference cost for a system performing at the level of GPT-3.5 dropped over 280-fold between November 2022 and October 2024". I wrote a bit about that here: https://simonwillison.net/2024/Dec/31/llms-in-2024/#the-envi...
We still don't have great numbers on training costs for most of the larger labs, which are likely extremely high.
Llama 3.3 70B cost "39.3M GPU hours of computation on H100-80GB (TDP of 700W) type hardware", which they calculated as 11,390 tons CO2eq. I tried to compare that to fully loaded passenger jet flights between London and New York and got a figure of between 28 and 56 flights, but I then completely lost confidence in my ability to credibly run those calculations because I don't understand nearly enough about how CO2eq is calculated in different industries.
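For what it's worth, here's the back-of-envelope version of that comparison. The GPU hours, TDP, and CO2eq figures are from Meta's model card; the per-flight emissions range is my own rough assumption (a commonly cited ballpark of ~200-400 tons CO2eq for one fully loaded transatlantic flight, whole aircraft), which is exactly the part I'm unsure about:

```python
# Back-of-envelope sketch of the Llama 3.3 70B training comparison.
# Per-flight emissions are an assumed ballpark, not a measured value.

gpu_hours = 39.3e6        # from Meta's Llama 3.3 model card
tdp_watts = 700           # H100-80GB TDP

# GPU energy alone, ignoring cooling / datacenter PUE overhead
energy_gwh = gpu_hours * tdp_watts / 1e9   # about 27.5 GWh

training_tco2eq = 11_390  # Meta's reported figure, tons CO2eq

# Assumed range for one fully loaded London-New York flight,
# total for the whole aircraft (hypothetical ballpark figures)
per_flight_low_t, per_flight_high_t = 200, 400

flights_low = training_tco2eq / per_flight_high_t   # ~28 flights
flights_high = training_tco2eq / per_flight_low_t   # ~57 flights
```

That reproduces the 28-56 range, but the whole answer hinges on the assumed per-flight number, which is where the methodology questions live.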
The "LLMs are an environmental catastrophe" messaging has become so firmly ingrained in our culture that I think it would benefit the AI labs themselves enormously if they were more transparent about the actual numbers.
There is a lot of heated debate over the "correct" methodology for calculating CO2e in different industries. I calculate it in my job and I have to update the formulas and variables very often. Don't beat yourself up over it. :)