
388 points pseudolus | 13 comments
1. carabiner ◴[] No.43485544[source]
If you look at fields like mechanical engineering, no, it hasn't, because there isn't much training data for that type of work available on the internet. (CAD isn't engineering.) It's locked up in corporations like Ford, SpaceX, Toyota. There isn't open source mechanical analysis work available at a professional level, which might be an FEA output that references 5 messy spreadsheets that gets written up in a 20 page report with charts, and that references 20 other internal docs, specs. And every company does analysis differently so I'm not sure you could adequately train a model that can generalize it well.
replies(1): >>43486140 #
2. mywittyname ◴[] No.43486140[source]
I think this is a flawed outlook.

My wife works in healthcare and they just announced that AI would generate 50% of their documents. According to my wife, in every review she's done of the work, there are critical errors in logic or reasoning (hallucinations), but leadership doesn't care because they are "grading" the AI on things like grammar and spelling. So, yay, it spelled "fusospirochetosis" correctly, but unfortunately it attributed that to a 23 year old woman who came in for a broken arm.

I think we need to be prepared for "vibe coding" taking over every industry. Leadership is going to blindly implement broken AI to replace workers.

Now, you might think that the threat of lawsuits, regulations, and ethics standards will save your industry. And maybe that's true, but I personally don't think so. I think we are going to see country-scale enshittification of everything as lazy people use AI to generate everything, which is then reviewed by lazy people who also use AI to review everything, and no one notices or cares until it's too late (and maybe not even then).

I hope you're right, but I'm not convinced.

replies(6): >>43486506 #>>43486572 #>>43487094 #>>43488180 #>>43489345 #>>43494286 #
3. mrweasel ◴[] No.43486506[source]
> I personally think we are going to see country-scale enshittification of everything as lazy people use AI to generate everything which is then reviewed by lazy people who also use AI to review everything and no one notices or cares until it's too late

Sounds about right to me. A large quantity of documentation is written solely to be written, not read. If it's your job to write reports, articles, memos, whatever, and you deep down know that this isn't being read, then why not have the AI do it?

I think that we're producing way too many documents/reports/content in general and we need to slow down. Humans can't keep up, neither on the production side nor the consumption side, so we employ computers to "help", but it's all busy work.

replies(3): >>43488364 #>>43489806 #>>43494335 #
4. thechao ◴[] No.43486572[source]
Idiocracy was supposed to be funny, not terrifying.
5. carabiner ◴[] No.43487094[source]
Profit and a competitive market are powerful incentives that keep people from dying in the human-transport industry, incentives that don't exist as much in hospitals. Also, if you think the "threat of lawsuits, regulations, and ethics standards" don't encourage safe cars or airplanes currently, what do you think does? In engineering today there are 1,000 ways to cheat (and sometimes companies do, like Volkswagen's dieselgate), but reports are actually checked rigorously multiple times by other engineers at a company and by federal regulators (who are also engineers). Things get rejected and changed based on these reviews. Why would AI engineering make this different?

If AI could cause a fiasco on the level of Toyota's unintended acceleration debacle from 2008, which caused the largest auto recall in history, that is something that would give pause to using AI, no? If AI leads to bad products that kill people, companies will lose money and they will not want to use AI.

6. pianoben ◴[] No.43488180[source]
"Medical (vibe) Coding"

Oh lord, that is a phrase I did not need to think about today

replies(1): >>43491196 #
7. eulers_secret ◴[] No.43488364{3}[source]
This reminds me of my first internship: I was running a test suite and filling out a web form every week. It was never mentioned in any meetings or other comms.

After about 3 months of doing this, I asked my manager why I was doing it if no one cared or noticed. He told me to stop and see if anyone said something. I stopped, interned another 15 months, and it was never mentioned again.

Hell of a lesson to learn as a newbie

replies(1): >>43494372 #
8. BlarfMcFlarf ◴[] No.43489345[source]
Certainly you won’t be saved in the US, where things like consumer protections are quickly becoming a thing of the past. And the US has traditionally exported its business practices to the world.
9. mmcconnell1618 ◴[] No.43489806{3}[source]
Idiocracy in real life. The AI told me to drink Brawndo. Why? Because it has electrolytes. My AI said electrolytes are good.
10. Karsteski ◴[] No.43491196{3}[source]
I am quite literally sleeping in a hospital while I read that. Horrifying.
11. johnnyanmac ◴[] No.43494286[source]
Why is it that lazy people are kicking out the motivated? I thought you had to be motivated to move up in the world? Or to understand and interpret law? Or to not take complete tripe from a 20 yo MBA who just sees numbers instead of lives?
12. johnnyanmac ◴[] No.43494335{3}[source]
> documentation is written solely to be written, not read. If it's your job to write reports, articles, memos, whatever, and you deep down know that this isn't being read, then why not have the AI do it?

Medical records seem to be the opposite of this. Nurses document to report to doctors so they can diagnose and advise. That seems like the worst step to cut corners on. Records are also kept for patients, for prescriptions and diagnoses that can affect their lives. Heck, as a CYA you document records in case a law firm comes in demanding papers to read. Do they think lawyers don't care about this stuff?

Just because not everyone is gonna pore over every word doesn't mean no one will.

13. johnnyanmac ◴[] No.43494372{4}[source]
3 months later a showstopper occurs and heads are rolling. The poor release manager wants to pinpoint when the breaking change occurred, but the last test-suite run was almost 2 years ago. They didn't care until it was too late. As fitting for American culture.

Not saying you were wrong to drop it, but multiple people dropped the ball there if that's how they treat their testing.