ratedgene No.42129665
I was talking at length today with a teacher I work with about the impact LLMs are having on students' attitudes toward learning.

When I was young, I refused to learn geography because we had map applications. I could just look it up. I did the same for anything I could, offloading the cognitive overhead to something better -- I think this is something we all do, consciously or not.

That attitude seems common among students now: "Why do I need to do this when an LLM can just do it better?"

This led us to two questions:

1. How do you construct challenges that AI can't solve?

2. What skills will humans need next?

We talked about "critical thinking", "creative problem solving", and "comprehension of complex systems" as the next step, but even as we discussed them we had to ask: how long until newer models or workflows catch up to those too?

I think this should lead to a fundamental shift in how we work WITH AI in every facet of education. How can a human be a facilitator and shepherd of these workflows in a way that complements the model and grows the human?

I also think there should be an introductory course for students of all ages on how these models work at a basic level, especially on how much to trust their output.

We'll need to rethink education, and what we really desire from humans, to figure out how this makes sense in the face of its traditional rituals.

rurp No.42130568
> When I was young, I refused to learn geography because we had map applications. I could just look it up. I did the same for anything I could, offloading the cognitive overhead to something better -- I think this is something we all do, consciously or not.

This is certainly useful to a point, and I don't recommend memorizing a lot of trivia, but it's easy to go too far with it. Having a basic mental model of many aspects of the world is extremely important to thinking deeply about complex topics. Many subjects worth thinking about involve interactions between multiple domains, and being able to quickly work through various ideas in your head without having to stop umpteen times can make a world of difference.

To stick with the maps example, if you're reading an article about conflict in the Middle East it's helpful to know off the top of your head whether or not Iran borders Canada. There are plenty of jobs in software or finance that don't require you to be good at mental math, but you're going to run into trouble if you don't at least grok the concept of exponential growth or have a sense for orders of magnitude.
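
To make the exponential-growth point concrete, here's a quick back-of-envelope in Python (the 5% rate and 100 periods are made up for illustration):

    # Same 5% "step" per period, wildly different outcomes.
    rate = 0.05
    periods = 100
    linear = 1 + rate * periods           # adds a fixed amount each period
    exponential = (1 + rate) ** periods   # compounds each period
    print(f"linear:      {linear:.1f}x")        # 6.0x
    print(f"exponential: {exponential:.1f}x")   # ~131.5x

If your gut says those two numbers land in the same ballpark, no amount of looking things up mid-read will save you.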

casey2 No.42130817
Helpful in terms of what? Understanding some forced meme? "Force this meme so you can understand this other forced meme" is not education, it's indoctrination. And even if you wanted, for some unknown reason, to understand the article, you can look at a (changing and disputed) map, as the parent said.

This is the opposite of deep knowledge; it's API knowledge at best.

achierius No.42130984
Are you referring to this?

> if you're reading an article about conflict in the Middle East it's helpful to know off the top of your head whether or not Iran borders Canada

Perhaps, but if you are, I think it's a stretch to say that the only utility of this is 'indoctrination' or 'understanding this other forced meme'. The point is that lookups (even to an AI) cost time, and if you have to do one for every other line in a document, you will either spend a ton of time reading or (more likely) do an insufficient number of lookups and come away with a distorted view of the situation. This 'baseline' level of knowledge is IMO a reasonable thing to expect in any field, not 'indoctrination' in anything but the most diluted sense of the term.
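
A rough sketch of that cost, with entirely invented numbers:

    # How per-fact lookups inflate reading time (all numbers are guesses).
    facts_needed = 40        # background facts the article leans on
    lookup_seconds = 30      # time to check one fact via search or an AI
    base_minutes = 10        # reading time with the facts already in your head
    total = base_minutes + facts_needed * lookup_seconds / 60
    print(f"{total:.0f} min with lookups vs {base_minutes} min without")  # 30 vs 10

Most people won't triple their reading time, so they skip the lookups and end up with exactly the distorted picture described above.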