In the meantime, keep learning and practicing CS fundamentals, ignore the hype, and build something interesting.
I don't really agree with the reasoning [1], and I don't think we can expect this same rate of progress indefinitely, but I do understand the concern.
All recent, relevant evidence points to logarithmic improvement, not the exponential growth we were promised at the beginning.
At this point we're likely waiting for another breakthrough on the level of the attention paper. That could come next year, in 5-10 years, or in 50 years. There's no point in trying to predict it.
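To make the "logarithmic, not exponential" distinction concrete, here's a toy sketch in Python. The numbers and curve parameters are entirely invented for illustration; nothing here is fit to real benchmark data. It just shows how differently the two shapes behave over the same number of scaling steps:

```python
import math

# Toy illustration (invented numbers): compare how a benchmark-style
# score would evolve under exponential vs. logarithmic improvement.
# Exponential: each step multiplies the score by a constant factor.
# Logarithmic: each step buys less improvement than the last.

def exponential_gain(step: int, rate: float = 0.5) -> float:
    """Score under hypothetical exponential improvement."""
    return math.exp(rate * step)

def logarithmic_gain(step: int, scale: float = 10.0) -> float:
    """Score under hypothetical logarithmic improvement."""
    return scale * math.log(step + 1)

for step in range(0, 11, 2):
    print(f"step {step:2d}: exp={exponential_gain(step):10.1f}  "
          f"log={logarithmic_gain(step):6.1f}")
```

After ten steps the exponential curve is in the hundreds while the logarithmic one has barely cleared twenty, and its per-step gains keep shrinking. That shrinking marginal gain is what the "diminishing returns" camp is pointing at.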
People like to assume that progress is this steady upward line, but I think it's more like a staircase. Someone comes up with something cool, there's a lot of amazing progress in the short-to-mid term, and then things kind of level out. I mean, hell, this isn't even the first time that this has happened with AI [1].
The newer AI models are pretty cool, but I think we're getting into the "leveling out" phase.
Your exponential problems have exponential problems. Scaling this system is factorially hard.
Any citations for this pretty strong assertion? And please don't reply with "oh you can just tell by feel".