395 points by pseudolus | 16 comments
1. zebomon No.43634458
The writing is irrelevant. Who cares if students don't learn how to do it? Or if the magazines are all mostly generated a decade from now? All of that labor spent on writing wasn't really making economic sense.

The problem with that take is this: it was never about the act of writing. What we lose, if we cut humans out of the equation, is writing as a proxy for what actually matters, which is thinking.

You'll soon notice the downsides of not-thinking (at scale!) if you have a generation of students who weren't taught to exercise their thinking by writing.

I hope that more people come around to this way of seeing things. It seems like a problem that will be much easier to mitigate than to fix after the fact.

A little self-promo: I'm building a tool to help students and writers create proof that they have written something the good ol' fashioned way. Check it out at https://itypedmypaper.com and let me know what you think!

replies(7): >>43634700 #>>43634723 #>>43634750 #>>43634975 #>>43638598 #>>43641238 #>>43641279 #
2. spongebobstoes No.43634700
Writing is not necessary for thinking. You can learn to think without writing. I've never had a brilliant thought while writing.

In fact, I've done a lot more thinking and had a lot more insights from talking than from writing.

Writing can be a useful tool to help with rigorous thinking. In my opinion, it is mostly about augmenting the author's effective memory to be larger and more precise.

I'm sure the same effect could be achieved by having AI transcribe a conversation.

replies(1): >>43635829 #
3. janalsncm No.43634723
How does your product prevent a person from simply retyping something that ChatGPT wrote?

I think the prevalence of these AI writing bots means schools will have to start doing things that aren't scalable: in-class discussions, in-person writing (with pen and paper or locked-down computers), and way less weight given to remote assignments on Canvas or other software. Attributing authorship from text alone (or keystroke patterns) is not possible.

replies(2): >>43634905 #>>43636208 #
4. ketzu No.43634750
> The writing is irrelevant.

In my opinion this is not true. Writing is a form of communicating ideas. Structuring and communicating ideas with others is really important, not just in written contexts, and it needs to be trained.

Maybe the way universities do it is not great, but writing in itself is important.

replies(1): >>43634805 #
5. zebomon No.43634805
Kindly read past the first line, friend :)
replies(1): >>43666872 #
6. zebomon No.43634905
It may be that, with enough data from the two categories (text copied from ChatGPT versus text composed by the writer), keystroke dynamics differ in detectable ways. This is an open question that my co-founder and I are currently running experiments on.

So while I wouldn't fully dispute your claim that attributing authorship from text alone is impossible, it isn't yet clear one way or the other (to us, at least -- we'd welcome any outside research).

Long-term -- and that's long-term in AI years ;) -- gaze tracking and other biometric tracking will undoubtedly be necessary. At some point in the near future, many people will be wearing agents inside earbuds that are not obvious to the people around them. That will add another layer of complexity that we're aware of. Fundamentally, it's more about creating evidence than creating proof.

We want to give writers and students the means to create something more detailed than they would get from a chatbot out-of-the-box, so that mimicking the whole act of writing becomes more complicated.
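
To make that concrete, here's a very rough sketch of the kind of classifier such an experiment could use: per-session timing features fed into a plain logistic regression. The features, thresholds, and labels below are illustrative assumptions only, not a description of our actual pipeline.

    import statistics
    from sklearn.linear_model import LogisticRegression  # assumes scikit-learn is available

    def session_features(events):
        """Summarize one typing session given a list of (timestamp_seconds, key) events.
        These are generic keystroke-dynamics measures, chosen here for illustration."""
        gaps = [b[0] - a[0] for a, b in zip(events, events[1:])]
        backspaces = sum(1 for _, key in events if key == "Backspace")
        return [
            statistics.mean(gaps),                        # average inter-key interval
            statistics.pstdev(gaps),                      # variability of typing rhythm
            sum(1 for g in gaps if g > 2.0) / len(gaps),  # share of long pauses (composing vs. copying)
            backspaces / len(events),                     # revision rate
        ]

    def train_classifier(labeled_sessions):
        """labeled_sessions: list of (events, label) pairs, where label 1 means the
        session was transcribed from another source and 0 means composed from scratch."""
        X = [session_features(events) for events, _ in labeled_sessions]
        y = [label for _, label in labeled_sessions]
        return LogisticRegression().fit(X, y)

The open question is whether any such features separate the two categories reliably across different writers, devices, and keyboards, which is exactly what we're trying to measure.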

replies(1): >>43634985 #
7. knowaveragejoe No.43634975
Paul Graham had a recent blog post about this, which I find hard to disagree with.

https://www.paulgraham.com/writes.html

8. pr337h4m No.43634985{3}
At this point, it would be easier to stick to in-person assignments.
replies(1): >>43635124 #
9. zebomon No.43635124{4}
It certainly would be! I think for many students, though, there's something lost there. I was a student who got a lot more value out of my take-home work than out of my in-class work. I don't think I ever would have taken the interest in writing that I did if it hadn't been such a solitary, meditative thing for me.
10. Unearned5161 No.43635829
I'm not settled on transcribed conversation being an adequate substitute for writing, but maybe it's better than nothing.

There's something irreplaceable about the absoluteness of words on paper and the decisions one has to make to write them out. Conversational speech is, almost by definition, more relaxed and casual. The bar is lower, and as such the bar for thoughts is lower; in order of ease of handwaving, I think it goes: mental, speech, writing.

Furthermore, there's the concept of editing, which I'm not sure could be carried out gracefully in conversation. Being able to revise words, delete them, and move them around can't be done in conversation, unless you count "forget I said that, it's actually more like this..." as suitable.

11. logicchains No.43636208
>I think the prevalence of these AI writing bots means schools will have to start doing things that aren’t scalable

It won't be long 'til we're at the point where embodied AI can be used for scalable face-to-face assessment that can't be cheated any more easily than a human assessor can.

12. karn97 No.43638598
I literally never write while thinking lol stop projecting this hard
13. aprilthird2021 No.43641238
What we lose if we cut humans out of the equation is the soul and heart of reflection, creativity, drama, comedy, etc.

All those have, at the base of them, the experience of being human, something an LLM does not and will never have.

replies(1): >>43648919 #
14. jillesvangurp No.43641279
Students will work in a world where they have to use AI to do their jobs. This is not going to be optional. Learning to use AIs effectively is an important skill and should be part of their education.

And it's an opportunity for educators to raise the ambition level quite a bit. It indeed obsoletes some of the tests they've been using to evaluate students. But they too now have the AI tools to do a better job and come up with more effective tests.

Think of all the time freed up from having to actually read all those submitted papers. I can tell you from experience (I taught a few classes as a postdoc way back): not fun. At minimum, you can instantly fail the ones that are obviously poorly written, full of grammatical errors, and riddled with flawed reasoning, and most decent LLMs do a good job of flagging those. Is using an LLM for that cheating if a teacher does it? I think it should just be expected at this point. And if it is OK for the teacher, it should be OK for the student.
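
To be concrete, a first-pass screen could be as small as the sketch below. This is Python with the OpenAI client as one example only; the model name and rubric are placeholders, and a teacher would still read everything that passes and spot-check anything that gets flagged.

    from openai import OpenAI  # assumes the OpenAI Python client; any chat-style API would work

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    RUBRIC = (
        "You are doing a first-pass screen of a student paper. Reply FLAG if it is "
        "obviously poorly written (riddled with grammatical errors, unstructured, or "
        "full of flawed reasoning), otherwise reply PASS. Add a one-sentence reason."
    )

    def first_pass_screen(paper_text: str) -> str:
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder; use whatever model is available
            messages=[
                {"role": "system", "content": RUBRIC},
                {"role": "user", "content": paper_text},
            ],
        )
        return response.choices[0].message.content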

If you expect LLMs to be used, the bar for acceptable quality of submitted papers goes up. They should be readable, well structured, well researched, etc.; there really is no excuse anymore for papers not being like that. The student needs to be able to tell the difference, and it actually takes skill to ask for the right things. You can also grill students on knowledge of their own work in a little ten-minute conversation, which is about the time a teacher would otherwise have spent evaluating the paper manually and is definitely more fun (I used to do that: give people an opportunity to defend their work).

And if you really want to test writing skills, put students in a room with pen and paper. That's how we did things in the eighties and nineties, when most people did not have PCs and printers. Poor teachers had to actually sit down and try to decipher my handwriting, which wasn't great even before that skill had atrophied for a few decades.

LLMs will force change in education one way or another, and most of that change will be good. People trying to cheat is a constant; we just need to force them to be smarter about it, which at a meta level isn't that bad a skill to learn when you are educating people.

15. zebomon No.43648919
I agree!
16. ketzu No.43666872{3}
I did. :)

(And I am aware of the irony of failing to communicate while arguing that practicing writing is important for communicating well.) Maybe I should also have cited this part:

> writing as a proxy for what actually matters, which is thinking.

In my opinion, writing is important not (only) as a proxy for thinking, but as a direct form of communicating ideas. (Also applies to other forms of communication though.)