
504 points puttycat | 4 comments | source
theoldgreybeard ◴[] No.46182214[source]
If a carpenter builds a crappy shelf “because” his power tools are not calibrated correctly - that’s a crappy carpenter, not a crappy tool.

If a scientist uses an LLM to write a paper with fabricated citations - that’s a crappy scientist.

AI is not the problem; laziness and negligence are. There need to be serious social consequences for this kind of thing, otherwise we are tacitly endorsing it.

thaumasiotes ◴[] No.46182401[source]
> If a scientist uses an LLM to write a paper with fabricated citations - that’s a crappy scientist.

Really? Regardless of whether it's a good paper?

1. Aurornis ◴[] No.46182500[source]
Citations are a key part of the paper. If the paper isn’t supported by the citations, it’s not a good paper.
2. withinboredom ◴[] No.46182557[source]
Have you ever followed citations before? In my experience, they often don't support the claim being cited; some say the opposite or aren't even related. Probably only 60%-ish actually cite something relevant.
3. WWWWH ◴[] No.46184339[source]
Well yes, but just because that's already bad doesn't mean this isn't far worse.
4. Aurornis ◴[] No.46192533[source]
I follow them a lot, and I've also run into cases where they don't support the paper.

That doesn't make it okay. Sloppy citation practice by human writers and reviewers is bad too.