
504 points | puttycat | 1 comment | source
theoldgreybeard:
If a carpenter builds a crappy shelf “because” his power tools are not calibrated correctly - that’s a crappy carpenter, not a crappy tool.

If a scientist uses an LLM to write a paper with fabricated citations - that’s a crappy scientist.

AI is not the problem; laziness and negligence are. There need to be serious social consequences for this kind of thing, otherwise we are tacitly endorsing it.

photochemsyn:
Yeah, I can't imagine not being familiar with every single reference in the bibliography of a technical publication with one's name on it. It's almost as bad as those PIs who rely on lab techs and postdocs to generate research data using equipment whose workings they don't understand. But then, I've seen that kind of thing repeatedly in research academia, along with outright fabrication of data in the name of getting another paper out the door, another PhD granted, and so on.

Unfortunately, a large fraction of academic fraud has historically been caught through sloppy duplication of data and images, and with LLMs and similar image-generation tools, fabrication has never been easier to do or harder to detect.