
Playing in the Creek

(www.hgreer.com)
346 points by c1ccccc1 | 3 comments
DrSiemer No.43651677
So many articles and comments claim AI will destroy critical thinking in our youth. Is there any evidence that this widely shared conviction is even remotely true?

To me it just seems like the same old knee-jerk Luddite response people have had, since the dawn of time, to any powerful new technology that challenges the status quo. The calculator did not erase math wizards, the television did not replace books, and so on. These tools just made us better, faster, more productive.

Sometimes there is an adjustment period (we still haven't figured out how to deal with short dopamine hits from certain types of entertainment and social media), but things will balance themselves out eventually.

Some people may go full-on Wall-E, but I for one will never stop tinkering, and many of my friends won't either.

The things I could have done if I had had an LLM as a kid... I think I've learned more in the past two years than ever before.

replies(5): >>43651964 #>>43651975 #>>43652300 #>>43652603 #>>43657305 #
hacb No.43652603
> The calculator did not erase math wizards

The major difference is that in order to use a calculator, you need to know and understand the math you're doing. It's a tool you can work with. I always had a calculator for my math exams and I always had bad grades :)

You don't have to know how to program to ask ChatGPT to build yet another app for you. It's a substitute for your brain. My university students get good grades on their take-home exams, but can't spot an off-by-one error in a three-line Golang for loop during an in-person exam.
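
To give a concrete example of the kind of loop I mean (a hypothetical illustration, not an actual exam question):

    package main

    import "fmt"

    func main() {
        nums := []int{1, 2, 3}
        // Off-by-one: "<=" runs one iteration too many, so the last
        // pass indexes past the end of the slice and panics.
        for i := 0; i <= len(nums); i++ {
            fmt.Println(nums[i])
        }
    }

The fix is just i < len(nums), but spotting it requires actually reading the loop condition rather than pasting it into a chatbot.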

replies(1): >>43656731 #
1. DrSiemer No.43656731
This is incorrect. You very much need to know how to program to make an AI build an app for you. Language models are not capable of creating anything new without significant guidance and at least some understanding of the code, unless you're asking them to build projects that tutorials have already been written about. AI in its current form is also just "a tool you can work with".

Like with the calculator: why would you need to be able to calculate things on paper if you can just have a machine do it for you? The same goes for more advanced AI: what's the point of being able to do things without it?

Not to offend, but in my opinion that's nothing more than a romantic view of what humans "should be capable of". 10 years from now we can all laugh at the idea of people defending doing stuff without AI assistance.

replies(1): >>43657122 #
2. hacb No.43657122
Of course an AI will not "magically" code an app the way 10 developers would in a year; I don't think we disagree on this.

However, it allows you to do things you don't understand. I'm again taking examples from what I see at my university (n=1): almost all students deliver complex programming projects involving multi-threading, but can't answer a basic quiz about the same language in person. And by basic I mean "select, among the propositions listed below, the correct keyword used to declare a variable in Golang". I'm not kidding: at least a third of the class gets that one wrong.
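
For reference, a minimal sketch of what that question is testing (my own illustration, not the actual quiz):

    package main

    import "fmt"

    func main() {
        var count int = 3 // "var" is the declaration keyword the quiz asks about
        name := "gopher"  // short-form declaration; only valid inside functions
        fmt.Println(count, name)
    }

This is day-one material, which is exactly why the results worry me.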

So yeah, maybe we as a society agree that those people will not be software engineers but prompt engineers. They'll send instructions to an agent that displays text in a strange and cryptic language, and maybe when they press "Run" the lights will turn green. But as a professional, why should I hire them once they've earned their diploma? They are far from ready for the professional world, can't debug systems without an LLM (and maybe the LLM can't even help them there, because too much depends on company-specific context), and most importantly they are far less capable than freshly graduated engineers from a few years back.

> 10 years from now we can all laugh at the idea of people defending doing stuff without AI assistance.

I hope so, but unfortunately I'm quite pessimistic. Expertise and the ability to focus are dying out, and we are relying more and more on artificial "intelligence" and its biases. But time will tell.

replies(1): >>43658209 #
3. DrSiemer No.43658209
Isn't it irrelevant, though, that students do not have the answer to a basic quiz? In a real-life situation, they can just _ask an LLM_ if they need to know something.

I don't believe having this option will make people a lot less functional. Sure, some may slip through the cracks by faking it, but we'll soon develop different metrics to judge somebody's true capabilities. Actually, we'll probably create AI for that as well.

As a professional, you hire people who get things done. If that means hiring skilled LLM users who do not fully understand what they produce, but whose output works about as consistently as classic dev output does, and who deliver it in a fraction of the time... you would be crazy _not_ to hire them.

It's true that inexperienced developers will probably generate massive tech debt during the period when AI is good enough to produce code, but not yet good enough to fish out hidden bugs. It will soon surpass humans at that skill too, though, and can then quickly clean up all the spaghetti.

Over the last two years, my knowledge of how to perform and automate repetitive, predictable tasks has gradually worn away, replaced by a higher-level understanding of software architecture, which I use to guide language models to a desired outcome. For those who want to learn, LLMs excel at explaining code. For this, and plenty of other subjects, they are the greatest learning tool we have ever had! All it takes is a curious mind.

We are in a transitional time, and we simply need to figure out how to deal with this new technology, warts and all. It's not like there is an alternative scenario; it's not going to go away...