
395 points pseudolus | 8 comments
dtnewman No.43633873
> A common question is: “how much are students using AI to cheat?” That’s hard to answer, especially as we don’t know the specific educational context where each of Claude’s responses is being used.

I built a popular product that helps teachers with this problem.

Yes, it's "hard to answer", but let's be honest: it's a very widespread problem. I've talked to hundreds of teachers about this and it's a ubiquitous issue. For many students, it's literally "let me paste the assignment into ChatGPT, see what it spits out, change a few words, and submit that".

I think the issue is that it's so tempting to lean on AI. I remember long nights struggling to implement complex data structures in CS classes. I'd work on something for an hour before I'd have an epiphany and figure out what was wrong. But that struggling was ultimately necessary to really learn the concepts. With AI, I can simply copy/paste my code and say "hey, what's wrong with this code?" and it'll often spot it (never mind that I can just ask ChatGPT "create a b-tree in C" and it'll do it). That's amazing in a sense, but it also hurts the learning process.

1. sally_glance No.43638980
I think this is a structural issue. Universities today are struggling to justify their existence; universities of the past were sites of innovation.

Using ChatGPT doesn't dumb down your students. Not knowing how it works and where to use it does. Stop building exams around silly textbook challenges; reestablish a culture of scientific innovation!

2. acbart No.43639000
Incorrect. Fundamentals must be taught in order to provide the context for the more challenging open-ended activities. Memorization is the base of knowledge, a starting point. Cheating (whether through an LLM, hiring someone, or whatever) skips the journey. You can't just take students along the exciting routes; sometimes they have to go through the boring, tedious, repetitive stuff, because that's how human brains learn. Learning is, literally, a stressful process for the brain. Students try to avoid it, but that's not good for them. At least in the introductory core classes.
3. nixpulvis No.43639022
You claim using AI tools doesn't dumb you down, but it very well could be, and arguably already is. Take the calculator: I'm overly dependent on it, and slower at arithmetic than I would have been without it. But knowing how to use one lets me do more complex math more quickly. So I'm "dumber" in one way and "smarter" in others. AI could be the same... except our education system doesn't seem ready for it. We still learn arithmetic, even if we later rely on tools to do it. Right now, teachers don't know how to teach in a way that AI doesn't trivialize.

You need to know how to do things so you know when the AI is lying to you.

4. sally_glance No.43643671
I guess I should have phrased it differently - what I meant was just stop testing the tedious stuff, make it clear to students that learning the fundamentals is expected. Then examine them on hard exploratory problems which require the fundamentals.
5. sally_glance No.43643734
I agree that you should learn the fundamentals before taking shortcuts. I just don't view it as the universities' job to repeatedly remind their students of this; that's elementary/high school style. In university, just give them hard problems requiring fundamental knowledge and cross-checking capabilities, but don't restrict their tools.
6. nixpulvis No.43648078
I TA'd for Fundamentals of Computer Science I in college. In addition to being a great class for freshmen, teaching it every year really did help keep me sharp.

High schools are a long way off from that level of education. I took AP CS in high school and it was a joke by comparison. Of course YMMV: the best high school CS course might be better than the worst university-level offerings. We would always have know-it-all students who had learned Java in high school. They either appreciated the new perspective on the fundamentals and did well, or they blew off the class and failed when it got harder.

7. sally_glance No.43650872
We could keep the same teaching offerings, my main gripe is with the assignments/examinations. It just feels wrong to complain about students using AI while at the same time continuing to hand out tasks that are trivial to solve using AI.

I also worked for the faculty for the better part of my university studies, and I know that ultimately changing the status quo is most likely impractical. There are not enough resources to continuously grade open-ended assignments for so many students and they probably need the pedagogical pressure to learn fundamentals. Still makes me a bit bitter from time to time.

8. never_inline No.43651691
> Using ChatGPT doesn't dumb down your students. Not knowing how it works and where to use it does.

LLMs can't produce intellectual rigour. They routinely get fine details wrong. So using ChatGPT to do your reasoning for you does produce inferior results. By normalising non-rigorous yet correct-sounding answers, we drive down expectations.

To take a concrete example: if a student asks ChatGPT to implement memcpy, it will just hand back an answer that uses uint64 copying. The student never thinks it through from first principles (copy byte by byte? How to improve performance? How to handle alignment?). Trading that insight for immediate gratification will bite later.
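To make the point concrete, here is a rough sketch of the first-principles progression a student would otherwise walk through: a naive byte-by-byte copy, then a word-at-a-time variant that has to confront alignment. (The function names are illustrative, not from any libc; real implementations also have to respect strict aliasing, which this sketch glosses over.)

```c
#include <stddef.h>
#include <stdint.h>

/* Step 1: the naive baseline. Correct for any pointers, but copies
 * one byte per iteration. */
void *memcpy_bytes(void *dst, const void *src, size_t n) {
    unsigned char *d = dst;
    const unsigned char *s = src;
    while (n--)
        *d++ = *s++;
    return dst;
}

/* Step 2: copy 8 bytes at a time -- but now alignment matters.
 * First copy leading bytes until dst is 8-byte aligned; if src then
 * happens to be aligned too, bulk-copy in uint64_t chunks; finish
 * (or fall back entirely) byte by byte. */
void *memcpy_words(void *dst, const void *src, size_t n) {
    unsigned char *d = dst;
    const unsigned char *s = src;

    /* Align the destination to the word size. */
    while (n && ((uintptr_t)d % sizeof(uint64_t)) != 0) {
        *d++ = *s++;
        n--;
    }

    /* Bulk copy only if the source is now aligned as well. */
    if (((uintptr_t)s % sizeof(uint64_t)) == 0) {
        uint64_t *dw = (uint64_t *)d;
        const uint64_t *sw = (const uint64_t *)s;
        while (n >= sizeof(uint64_t)) {
            *dw++ = *sw++;
            n -= sizeof(uint64_t);
        }
        d = (unsigned char *)dw;
        s = (const unsigned char *)sw;
    }

    /* Tail bytes, or the whole copy when src stayed unaligned. */
    while (n--)
        *d++ = *s++;

    return dst;
}
```

The questions the ChatGPT answer skips are exactly the ones the second function forces you to answer: what happens when dst and src have different alignments, and why the byte-wise fallback is still needed.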

It's maybe not a problem for non-STEM fields, where this kind of rigor and insight is not required to excel. But in STEM fields we write programs and prove theorems for insight, and that insight, and the process of obtaining it, is gone with AI.