
3337 points keepamovin | 2 comments
Bjartr
For comparison, here's the front page from ten years ago:

https://news.ycombinator.com/front?day=2015-12-09

seizethecheese
Today's front page is not a clean ten-year extrapolation from that one. That's where the AI gets it wrong: the future is weird and zigzags; it's not as linear as the Gemini-generated page.
1. neuronic
This is a problem with nearly all predictions about the future: everything is just a linear extrapolation of the status quo. How could a system in 2010 have predicted the invention of the transformer model? At best it would have made some wild guess about deep-learning possibilities.

Or the impact of smartphones in 2003? Sure, smartphones were considered, but not the entire app ecosystem and the planet-wide behavioral adaptation that followed.

2. seizethecheese
Yes, of course this is right. However, I do think LLMs suffer from linear extrapolation even more than people do.