395 points pseudolus | 7 comments
dtnewman No.43633873
> A common question is: “how much are students using AI to cheat?” That’s hard to answer, especially as we don’t know the specific educational context where each of Claude’s responses is being used.

I built a popular product that helps teachers with this problem.

Yes, it's "hard to answer", but let's be honest: it's a very, very widespread problem. I've talked to hundreds of teachers about this and it's a ubiquitous issue. For many students, it's literally "let me paste the assignment into ChatGPT and see what it spits out, change a few words and submit that".

I think the issue is that it's so tempting to lean on AI. I remember long nights struggling to implement complex data structures in CS classes. I'd work on something for an hour before I'd have an epiphany and figure out what was wrong. But that struggling was ultimately necessary to really learn the concepts. With AI, I can simply copy/paste my code and say "hey, what's wrong with this code?" and it'll often spot the bug (never mind that I can just ask ChatGPT "create a b-tree in C" and it'll do it). That's amazing in a sense, but it also hurts the learning process.

replies(34): >>43633957 #>>43634006 #>>43634053 #>>43634075 #>>43634251 #>>43634294 #>>43634327 #>>43634339 #>>43634343 #>>43634407 #>>43634559 #>>43634566 #>>43634616 #>>43634842 #>>43635388 #>>43635498 #>>43635830 #>>43636831 #>>43638149 #>>43638980 #>>43639096 #>>43639628 #>>43639904 #>>43640528 #>>43640853 #>>43642243 #>>43642367 #>>43643255 #>>43645561 #>>43645638 #>>43646665 #>>43646725 #>>43647078 #>>43654777 #
1. taftster No.43634251
I don't think asking "what's wrong with my code" hurts the learning process. In fact, I would argue it helps. You don't learn once you've hit your frustration point and just want the dang assignment completed. But before reaching that point, having a tutor or assistant you could ask, "hey, I'm just not seeing my mistake, do you have ideas?", goes a long way toward fostering learning. ChatGPT, used in this way, can be extremely valuable and can unlock learning in ways we probably haven't even seen yet.

That being said, I agree with you: if you just ask ChatGPT to write a b-tree implementation from scratch, you haven't learned anything. So like all things in academia, AI can be used to foster education or to cheat around it. There were examples of these "cheats" long before ChatGPT or Google existed.

replies(2): >>43634324 #>>43635568 #
2. SoftTalker No.43634324
No, I think the struggle is essential. If you can just ask a tutor (real or electronic) what's wrong with your code, you stop thinking and become dependent on it. Learning to think your way through a roadblock that seems like a showstopper is huge.

It's sort of the mental analog of weight training. The only way to get better at weightlifting is to actually lift weights.

replies(1): >>43636708 #
3. No.43635568
4. taftster No.43636708
If I were to go and try to bench 300 lbs, I would absolutely need a spotter to rescue me. Taking on more weight than I can possibly lift is a setup for failure.

Sure, I should probably practice benching 150 lbs. That would be a good challenge for me and I would benefit from that experience. But 300 lbs would crush me.

replies(1): >>43639506 #
5. theamk No.43639506 {3}
Sadly, ChatGPT is like a spotter that takes over at the smallest hint of struggle. Yes, you are not going to get crushed, but you won't get any workout done either.

You really want to start with a smaller weight and increase it in steps as you progress. You know, like a class or something. And when you do those exercises, you really want to be lifting the weights yourself, not relying on the spotter for every rep.

replies(1): >>43648929 #
6. taftster No.43648929 {4}
We're stretching the metaphor here. I know, kind of obnoxious.

If I have accidentally lifted too much weight, I want a spotter who can immediately give me relief. But yes, you're right: if I'm always getting a spot, then I'm not really lifting my own weight and indeed not making any gains.

I think the question was, "I'm stuck on this code, and I don't see an obvious answer." Now, the lazy student is going to ask for help prematurely. But that doesn't mean ChatGPT is useful only to the lazy.

If I'm stuck and asking for insight, I think it's brilliant that ChatGPT can act as a spotter and give some immediate relief. It's no different from asking a tutor. Yes, maybe ChatGPT gives away the whole answer when all you needed was a hint. That's the difference between pure human intelligence and the glorified search engine that is AI.

And quite probably, this could be a really awesome way for AI learning models to evolve in the context of education. Maybe ChatGPT doesn't give you the whole answer; instead, it gives you just the hint you need to move forward.

Microsoft put out a demo/video of a grad student using Copilot in very much this way. Basically the student was asking questions and Copilot was giving answers that were in the frame of "did you think about this approach?" or "consider that there are other possibilities", etc. Granted, mostly a marketing vibe from MSFT, but this really demonstrates a vision for using LLMs as a means for true learning, not just spoiling the answer.

replies(1): >>43650360 #
7. theamk No.43650360 {5}
Sure, this is possible. Also Chegg is an "innovative learning tool", not a way to cheat.

I agree that it's not that different from asking a tutor, though, assuming it's a personal tutor whom you are paying so they will never refuse to answer. I've never had access to someone like that, but I can totally believe that if I had, I would have graduated without learning much.

Back to ChatGPT: during college there were plenty of times when I was really struggling. I remember feeling extremely frustrated when my projects would not work, and spending long hours in the labs. I was able to solve those problems myself, without any outside help, be it tutors or AI, and I think that was the most important part of my education, probably at least as important as all the lectures I went to. As they say, "no pain, no gain".

That said, our discussion is kind of moot: it's not like we can convince college students to stop using AI. The bad colleges will pass everyone (this already happens); the good colleges will adapt, probably by assigning less weight to homework and more to in-class exams. Students will have another reason for failing the class: in addition to the classic "I spent the whole semester partying/playing computer games instead of studying", they'll also say "I never opened the books and had ChatGPT do all my assignments for me; why am I failing the tests?"