
242 points simonebrunozzi | 1 comment
analogpixel ◴[] No.46237814[source]
I've been noticing lately, at least for myself, that useful technology stopped happening like 10-20 years ago. If all you could use was tech from 2000 and before you would have a pretty stable stack that just worked (without a monthly subscription.)

There is also this article today: https://jon.recoil.org/blog/2025/12/an-svg-is-all-you-need.h... about how great good ol' svg is. And then there's the recurring article about using RSS instead of all the other siloed products.

textfiles, makefiles, perl, php, rss, text-based email, newsgroups, irc, icq, vim/emacs, sed, awk: all better than the crap they have spawned that is supposed to be "better".

Out of curiosity, what technology in the past 5 years do you use that you actually find better than something from 20 years ago?

1. jumploops ◴[] No.46241408[source]
Not to be the “ai” guy, but LLMs have helped me explore areas of human knowledge I had otherwise postponed

I am of the age where the internet was pivotal to my education, but the teachers still said “don’t trust Wikipedia”

Said another way: I grew up on Google

I think many of us take free access to information for granted

With LLMs, we’ve essentially compressed humanity’s knowledge into a magic mirror

Depending on what you present to the mirror, you get back some recombined reflection of the training set

Is it perfect? No. Does it hallucinate? Yes. Is it useful? Extremely.

As a kid who often struggled with questions I didn’t have the words for, I found Google to be my salvation

It allowed me to search with words I did know, to learn about words I didn’t know

These new words both carried answers and opened new questions

LLMs are like Google, but you can ask your exact question (and another)

Are they perfect? No.

The benefit of having expertise in some area is that I can see the limits of the technology.

LLMs are not great for novelty, and sometimes struggle with the state of the art (necessarily so).

Their biggest issue is that when you walk in blindly, LLMs will happily lead the unknowing junior astray.

But so will a blog post about a new language, a new TS package with a bunch of stars on GitHub, or a new runtime that “simplifies devops”

The biggest tech from the last five years is undoubtedly the magic mirror

Whether it can evolve into Strong AI remains to be seen (and I think it unlikely!)