
395 points pseudolus | 3 comments
dtnewman ◴[] No.43633873[source]
> A common question is: “how much are students using AI to cheat?” That’s hard to answer, especially as we don’t know the specific educational context where each of Claude’s responses is being used.

I built a popular product that helps teachers with this problem.

Yes, it's "hard to answer", but let's be honest: it's a very, very widespread problem. I've talked to hundreds of teachers about this, and it's a ubiquitous issue. For many students, it's literally "let me paste the assignment into ChatGPT, see what it spits out, change a few words, and submit that".

I think the issue is that it's so tempting to lean on AI. I remember long nights struggling to implement complex data structures in CS classes. I'd work on something for an hour before I'd have an epiphany and figure out what was wrong. But that struggling was ultimately necessary to really learn the concepts. With AI, I can simply copy/paste my code and say "hey, what's wrong with this code?" and it'll often spot it (never mind the fact that I can just ask ChatGPT "create a b-tree in C" and it'll do it). That's amazing in a sense, but also hurts the learning process.
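For context on what that kind of assignment involves: a B-tree keeps sorted keys in wide nodes so lookups descend only a few levels. Below is a minimal sketch of a node and its search routine, in C since that's the language the comment mentions. The names, the minimum degree, and the recursive style are my own illustrative choices, not anything from the thread:

```c
#include <stdbool.h>
#include <stddef.h>

#define MIN_DEGREE 2  /* t: a non-root node holds t-1 .. 2t-1 keys */

/* Illustrative B-tree node; a real implementation would allocate
   these dynamically and track key counts during insert/delete. */
typedef struct BTreeNode {
    int keys[2 * MIN_DEGREE - 1];
    struct BTreeNode *children[2 * MIN_DEGREE];
    int num_keys;
    bool is_leaf;
} BTreeNode;

/* Return the node containing `key`, or NULL if absent. */
BTreeNode *btree_search(BTreeNode *node, int key) {
    if (node == NULL) return NULL;
    int i = 0;
    /* Find the first key >= `key` in this node. */
    while (i < node->num_keys && key > node->keys[i]) i++;
    if (i < node->num_keys && node->keys[i] == key) return node;
    if (node->is_leaf) return NULL;         /* nowhere left to descend */
    return btree_search(node->children[i], key);
}
```

The search is the easy part; the learning struggle the comment describes usually comes from insertion and node splitting, which this sketch deliberately omits.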

1. psygn89 ◴[] No.43634339[source]
Agreed, the struggle often leads us to poke and prod at an issue from many angles until things finally click. It lets us think critically. In that journey you might've learned other related concepts, which further solidify your understanding.

But when the answer flows out of thin air right in front of you with AI, you get the "oh duh" or "that makes sense" moments and not the "a-ha" moment that ultimately sticks with you.

Now does everything need an "a-ha" moment? No.

However, I think core concepts and fundamentals need those "a-ha" moments to build a solid and in-depth foundation of understanding to build upon.

2. taftster ◴[] No.43636793[source]
Absolutely this. AI can help reveal solutions you wouldn't have seen on your own. An a-ha moment can be as instrumental to learning as the struggle that came before it.

Academia needs to embrace this concept and not try to fight it. AI is here, it's real, it's going to be used. Let's teach our students how to benefit from its (ethical) use.

3. porridgeraisin ◴[] No.43640690[source]
Yep. People love to cut down this argument by saying that a few decades ago, people said the same thing about calculators. But that was a problem too! People losing a large portion of their mental math faculty is definitely a problem. If mental math was required daily, we wouldn't see such obvious BS numbers in every kind of reporting(media/corporate/tech benchmarks) that people don't bat an eye at. How much the problem is _worth_ though, is what matters for adoption of these kinds of tech. Clearly, the problem above wasn't worth much. We now have to wait and see how much the "did not learn through cuts and scratches" problem is worth.