222 points dougb5 | 5 comments
mystraline No.45132430
So, it's a "kids these days" piece, written by a kid.

I've seen the same commentary about:

Spellcheck

Typed material

Computer art programs

Calculators

replies(2): >>45132433 #>>45132460 #
1. nyc_data_geek1 No.45132433
This is fundamentally different. It replaces, and thus atrophies, cognitive faculties in a way the other tools you mentioned never aspired to.
replies(1): >>45132521 #
2. mystraline No.45132521
No, it isn't.

"Spellcheck removes the ability to spell"

"Calculators prevent you from doing math."

"Computer art will destroy real art"

"Typing text will destroy cursive and handwritten".

Same idea: some form of tech will destroy something we should value.

replies(3): >>45133424 #>>45136568 #>>45141046 #
3. nyc_data_geek1 No.45133424
You are missing the point. Assistive tools do not replace the fundamental cognitive process. This does.
4. sensanaty No.45136568
The calculator example everyone keeps bringing up is such a massive misunderstanding of what the issue with LLMs is.

I did IGCSE/A-levels (international school following the British system). For IGCSE, around grade 8 (if memory serves), you're allowed to use calculators, including during all exams. For AS/A Levels (grades 11 and 12), you're also allowed calculators in every subject that involves any kind of mathematics at all, like Physics, Chemistry, Maths etc.

On the front page of the exam papers, you also get a list of all the formulae that might be relevant. For Physics this will be things like the formula for force, or the ones relating to electricity or gravity. Similarly for Maths, you'll get common formulae even for the simple things everyone is expected to know, like Pythagoras' theorem.

The thing is, the calculators and the formulae were of very little use if you didn't know the theory behind them in the first place. The same goes for basics like arithmetic: if you've never done arithmetic without the aid of a calculator, you can plug in all the numbers you want, but you have no intuition for what looks roughly right, and you won't get very far. I still remember punching numbers into the calculator and being confused by the result, because years of practicing mental arithmetic had gotten me to the point where I could tell the number just looked different from what I was expecting.

The thing here is that those years studying arithmetic weren't only relevant to the maths class itself; they were useful whenever I dealt with numbers at all, including in other subjects like Physics (where you're often working with incomprehensibly large numbers on the order of 10^11, so having a feel for orders of magnitude really mattered) or Chemistry (the same, except it's 10^-11). Even in less STEM-focused subjects like Business Studies, an intuition for basic mathematics makes a huge difference.
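
To make the order-of-magnitude point concrete, here's a rough back-of-the-envelope sketch (the example and the approximate constants are mine, from memory, not anything off an exam sheet): estimating the gravitational pull between the Earth and the Moon.

    # Back-of-the-envelope: gravitational force between Earth and Moon.
    # All constants are approximate, quoted from memory.
    G = 6.67e-11       # gravitational constant, m^3 kg^-1 s^-2
    m_earth = 5.97e24  # kg
    m_moon = 7.35e22   # kg
    r = 3.84e8         # average Earth-Moon distance, m

    F = G * m_earth * m_moon / r**2
    print(f"F ~ {F:.1e} N")  # ~2e20 N

If the calculator instead shows something like 10^27, the only thing that catches the fat-fingered exponent is the intuition built from years of doing arithmetic by hand; without it, you just write the wrong answer down.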

LLMs/AIs in this context would be replacing all those foundation-building years, for basically everything. You, as the student relying on AI for everything, will take one look at that exam sheet and that list of formulae and have absolutely no clue what any of it means, because you haven't practiced any of it yourself. You'll see a number spat out by the calculator that's negative when it should be positive, or 10^21 when it should be 10^11, and you'll lack the foundational knowledge to know intuitively that the number, and therefore the answer, is wrong. Except this will apply to everything an LLM can answer, not just numbers.

We already live in a world where people take anything they read online as gospel. People already don't know how to read graphs and have no intuition for numbers. If something is said in an authoritative-enough tone, many people will just go with it, even when the data shows the opposite of what is being said. And now we want to introduce hallucination machines, make future generations dependent on them, and have them outsource all their thinking to them? Even though it's hilariously easy to get these systems to spit out absolute nonsense, just in a confident and well-written tone?

> "Spellcheck removes the ability to spell"

This has factually happened. How many people do you see who don't know the difference between "your" and "you're", or who constantly mispell [sic] common words? Whether that's important is a different discussion, but it is true.

> "Typing text will destroy cursive and handwritten".

This has also objectively happened. Again, whether it's truly important is a different topic, but I can use myself as an example: I basically have a child's handwriting because I haven't really written anything by hand since high school. I still have to write from time to time, for example when filling out government forms, and it's a genuine pain in the ass how terrible my handwriting is.

5. schiem No.45141046
People who heavily use any of those _do_ see the associated skills atrophy. It's just that in all of the listed cases, the skill that atrophied may not have actually been important.

The younger people at my job have atrocious spelling.

My ability to do mental math is much worse than it was when I was regularly doing math without a calculator.

People who have exclusively learned digital art do not have the muscle memory built up to seamlessly transition to analog art.

Almost everyone I know has awful handwriting.

So the question then is "What is the actual skill that AI tools are replacing?" And if the answer is "thinking," then that should be terrifying.