210 points blackcat201 | 1 comment
softwaredoug ◴[] No.45772588[source]
Everyone is worried about AI data centers destroying the planet with their extreme energy needs. Yet it seems we still have a steep learning curve ahead in making AI inference and training more efficient.

How likely are we to NOT see the AI data center apocalypse through better algorithms?

replies(4): >>45772794 #>>45773341 #>>45774160 #>>45774881 #
wongarsu ◴[] No.45773341[source]
We have already seen huge efficiency increases over the last two years. Small models have become increasingly capable, the minimum viable model size for simple tasks keeps shrinking, and proprietary model providers have long since stopped announcing new milestones in model size. Instead they have achieved massive price cuts through methods they largely keep quiet about (but that almost certainly include smaller models and intelligent routing of requests to different model sizes).
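The routing idea mentioned above can be sketched roughly like this. This is purely illustrative, not any provider's actual logic; the model names and the complexity heuristic are made up for the example:

```python
# Illustrative sketch of model routing: send requests to a cheap small
# model unless the prompt looks complex. Model names are placeholders.
def pick_model(prompt: str) -> str:
    """Pick a (hypothetical) model tier from a crude complexity guess."""
    complex_markers = ("prove", "refactor", "multi-step", "analyze")
    long_prompt = len(prompt.split()) > 200
    if long_prompt or any(m in prompt.lower() for m in complex_markers):
        return "large-model"   # expensive, more capable tier
    return "small-model"       # cheap tier, good enough for simple tasks

print(pick_model("Summarize this sentence."))
print(pick_model("Prove that the algorithm halts."))
```

Real routers presumably use learned classifiers rather than keyword lists, but the economics are the same: every request that can be served by the small tier is a direct cost saving.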

But so far this has just led to more induced demand. There are a lot of things we would use LLMs for if they were just cheap enough, and every increase in efficiency makes more of those use cases viable.

replies(1): >>45774165 #
1. naasking ◴[] No.45774165[source]
At some threshold, though, efficiency gains let models move out of the data center entirely.