
388 points pseudolus | 1 comment
carabiner ◴[] No.43485544[source]
If you look at fields like mechanical engineering, no, it hasn't, because there isn't much training data for that type of work available on the internet. (CAD isn't engineering.) It's locked up in corporations like Ford, SpaceX, Toyota. There isn't open-source mechanical analysis work available at a professional level, which might be an FEA output that references 5 messy spreadsheets, gets written up in a 20-page report with charts, and references 20 other internal docs and specs. And every company does analysis differently, so I'm not sure you could train a model that generalizes it well.
replies(1): >>43486140 #
mywittyname ◴[] No.43486140[source]
I think this is a flawed outlook.

My wife works in healthcare and they just announced that AI would generate 50% of their documents. According to my wife, in every review she's done of the work, there are critical errors in logic or reasoning (hallucinations), but leadership doesn't care because they are "grading" the AI on things like grammar and spelling. So, yay, it spelled "fusospirochetosis" correctly, but unfortunately it attributed that to a 23 year old woman who came in for a broken arm.

I think we need to be prepared for "vibe coding" taking over every industry. Leadership is going to blindly implement broken AI to replace workers.

Now, you might think that threat of lawsuits, regulations, and ethics standards will save your industry. And maybe that's true, but I personally don't think so. I personally think we are going to see country-scale enshitification of everything as lazy people use AI to generate everything which is then reviewed by lazy people who also use AI to review everything and no one notices or cares until it's too late (and maybe not even then).

I hope you're right, but I'm not convinced.

replies(6): >>43486506 #>>43486572 #>>43487094 #>>43488180 #>>43489345 #>>43494286 #
mrweasel ◴[] No.43486506[source]
> I personally think we are going to see country-scale enshitification of everything as lazy people use AI to generate everything which is then reviewed by lazy people who also use AI to review everything and no one notices or cares until it's too late

Sounds about right to me. A large quantity of documentation is written solely to be written, not read. If it's your job to write reports, articles, memos, whatever, and you deep down know that they aren't being read, then why not have the AI do it?

I think we're producing way too many documents/reports/content in general and we need to slow down. Humans can't keep up, neither on the production side nor the consumption side, so we employ computers to "help", but it's all busy work.

replies(3): >>43488364 #>>43489806 #>>43494335 #
johnnyanmac ◴[] No.43494335[source]
>documentation is written solely to be written, not read. If it's your job to write reports, article, memos, whatever, and you deep down know that this isn't being read, then why not have to AI do it?

Medical records seem to be the opposite of this. Nurses document to report to doctors so they can diagnose and advise. That seems like the worst step to cut corners on. Clinicians document prescriptions and diagnoses for patients that can affect their lives. Heck, as a CYA, you document records in case a law firm comes in demanding papers to read. Do they think lawyers don't care about this stuff?

Just because not everyone is gonna pore over every word doesn't mean no one will.