
648 points | bradgessler | 1 comment
1. analog31 (No.44009965)
Something I'm on the fence about, and am still trying to figure out from observation, is whether AI can decide what is worthwhile. It seems like most of the AI successes I've seen are cases where someone was tasked with writing something that's not worth reading.

Granted, that happened before AI. The vast majority of text in my inbox, I never read. I developed heuristics for deciding what to ignore. "Stuff that looks like it was probably generated" will be a new heuristic. It's subjective for now. One clue is if the text seems more literate than the person who wrote it.

Stuff that's written for school falls into that category. It exists for some reason other than being read, such as the hope that the activity of writing confers some educational benefit. That was a heuristic too: a rule of thumb for how to teach, and one that has been broken by AI.

Sure, AI can be used to teach a job skill, namely writing text that's not worth reading. But who wants to be the one to look the kids in the eye and explain this to them?

On the other hand, I do use Copilot now, where I would have used Stack Overflow in the past.