
625 points lukebennett | 1 comment | source
russellbeattie ◴[] No.42140888[source]
Go back a few decades and you'd see articles like this about CPU manufacturers struggling to improve processor speeds and questioning if Moore's Law was dead. Obviously those concerns were way overblown.

That doesn't mean this article is irrelevant. It's good to know whether LLM improvements are going to slow down a bit because the low-hanging fruit has seemingly been picked.

But in terms of the overall effect of AI and questioning the validity of the technology as a whole, it's just your basic FUD article that you'd expect from mainstream news.

replies(2): >>42141152 #>>42142789 #
NateEag ◴[] No.42142789[source]
> Go back a few decades and you'd see articles like this about CPU manufacturers struggling to improve processor speeds and questioning if Moore's Law was dead. Obviously those concerns were way overblown.

Am I missing something? I thought the general consensus was that Moore's Law did in fact die:

https://cap.csail.mit.edu/death-moores-law-what-it-means-and...

The fact that we've still found ways to speed up computations doesn't contradict that.

We've mostly done that by parallelizing and applying different algorithms. IIUC that's precisely why graphics cards are so good for LLM training: they have highly parallel architectures well-suited to the problem space.
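The point about parallel architectures can be made concrete with matrix multiplication, the core operation in neural-network training. A minimal sketch in plain Python (an illustration, not anything from the linked article): every output cell depends only on one row of A and one column of B, so all cells can be computed independently, which is exactly the kind of work a GPU's thousands of cores can do concurrently.

```python
# Sketch: why matrix multiply parallelizes so well.
# C[i][j] depends only on row i of A and column j of B, so every
# cell is an independent computation -- a GPU runs them all at once.
def matmul_cell(A, B, i, j):
    return sum(A[i][k] * B[k][j] for k in range(len(B)))

def matmul(A, B):
    # Each (i, j) call below is independent of every other call;
    # on a GPU these would be dispatched as parallel threads.
    return [[matmul_cell(A, B, i, j) for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```

This independence is why adding more cores keeps helping for this workload long after single-core clock speeds stopped climbing.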

All that seems to me like an argument that LLMs will hit a point of diminishing returns, and maybe the article gives some evidence we're starting to get there.

replies(1): >>42143562 #
1. russellbeattie ◴[] No.42143562[source]
I wrote "a few decades".

The article you pointed to says the end came in 2016, eight years ago.

My point is that those types of articles have been popping up every few years since the 1990s. Sure, at some point these sorts of predictions will be proven correct about LLMs as well, probably in a few decades.