
504 points | puttycat | 2 comments
theoldgreybeard No.46182214
If a carpenter builds a crappy shelf “because” his power tools are not calibrated correctly, that’s a crappy carpenter, not a crappy tool.

If a scientist uses an LLM to write a paper with fabricated citations, that’s a crappy scientist.

AI is not the problem; laziness and negligence are. There need to be serious social consequences for this kind of thing, otherwise we are tacitly endorsing it.

rectang No.46182944
“X isn’t the problem, people are the problem.” — the age-old cry of industry resisting regulation.
codywashere No.46183019
What regulation are you advocating for here?
kibwen No.46183149
At the very least, authors who have been caught publishing proven fabrications should be barred from ever publishing in those journals again. Mind you, this should apply regardless of whether an LLM was involved.
JumpCrisscross No.46183849
> authors who have been caught publishing proven fabrications should be barred by those journals from ever publishing in them again

This is too harsh.

Instead, their papers should be required to disclose the transgression for a period of time, and their institution should have to disclose it publicly, as well as to the government, students, and donors whenever it asks them for money.