416 points floverfelt | 23 comments
jeppester ◴[] No.45057505[source]
In my company I feel that we're getting totally overrun with code that's 90% good, 10% broken, and almost exactly what was needed.

We are producing more code, but quality is definitely taking a hit now that no-one is able to keep up.

So instead of slowly inching towards the result, we get 90% there in no time, and then spend lots and lots of time getting to know the code, fixing it, and fine-tuning everything.

Maybe we ARE faster than before, but it wouldn't surprise me if the two approaches are closer than one might think.

What bothers me the most is that I much prefer building stuff to fixing code I'm not intimately familiar with.

replies(8): >>45057537 #>>45058508 #>>45061118 #>>45061272 #>>45061732 #>>45062347 #>>45065856 #>>45070745 #
1. epolanski ◴[] No.45057537[source]
As Fowler himself states, there's a need to learn to use these tools properly.

In any case, poor work quality is a failure of tech leadership and culture; it's not AI's fault.

replies(1): >>45058751 #
2. FromTheFirstIn ◴[] No.45058751[source]
It’s funny how nothing seems to be AI’s fault.
replies(5): >>45060536 #>>45060602 #>>45060685 #>>45061146 #>>45062496 #
3. johnnienaked ◴[] No.45060536[source]
No one seems to be able to grasp the possibility that AI is a failure
replies(3): >>45060690 #>>45060691 #>>45061671 #
4. epolanski ◴[] No.45060602[source]
If poor work gets merged, the responsibility lies with whoever wrote it, whoever merged it, and whoever allows such a culture.

The tools used do not bear responsibility; they are tools.

replies(1): >>45061715 #
5. colordrops ◴[] No.45060685[source]
How could a tool be at fault? If an airplane crashes is the plane at fault or the designers, engineers, and/or pilot?
replies(1): >>45061424 #
6. raducu ◴[] No.45060690{3}[source]
> No one seems to be able to grasp the possibility that AI is a failure.

Do you think that by the time GPT-9 comes we'll say, "That's it, AI is a failure, we'll just stop using it!"?

Or do you speak in metaphorical/bigger picture/"butlerian jihad" terms?

replies(1): >>45061367 #
7. colordrops ◴[] No.45060691{3}[source]
You've failed to figure out when and how to use it. It's not a binary failed/succeeded thing.
replies(2): >>45060823 #>>45060849 #
8. nicce ◴[] No.45060823{4}[source]
None of the copyright issues or suicide cases have been handled in court yet. There are many aspects.
9. johnnienaked ◴[] No.45060849{4}[source]
Metaverse was...
10. Cthulhu_ ◴[] No.45061146[source]
That's because it's software / an application. I don't blame my editor for broken code either. You can't put blame on software itself; it just does what it's programmed to do.

But also, blameless culture is IMO important in software development. If a bug ends up in production, whose fault is it? The developer that wrote the code? The LLM that generated it? The reviewer that approved it? The product owner that decided a feature should be built? The tester that missed the bug? The engineering organization that has a gap in their CI?

As with the Therac-25 incident, it's never one cause: https://news.ycombinator.com/item?id=45036294

replies(3): >>45061350 #>>45063842 #>>45083830 #
11. wk_end ◴[] No.45061350{3}[source]
Blameless culture is important for a lot of reasons, but many of them are human. LLMs are just tools. If one of the issues identified in a post-mortem is "using this particular tool is causing us problems", there's not a blameless culture out there that would say "We can't blame the tool..."; the action item is "Figure out how to improve/replace/remove the tool so it no longer contributes to problems."
12. johnnienaked ◴[] No.45061367{4}[source]
I don't see the use case now; maybe there will be one by GPT-9.
replies(1): >>45061977 #
13. wk_end ◴[] No.45061424{3}[source]
Designers, engineers, and/or pilots aren't tools, so that's a strange rhetorical question.

At any rate, it depends on the crash. The NTSB will investigate and release findings that very well may assign fault to the design of the plane and/or pilot or even tools the pilot was using, and will make recommendations about how to avoid a similar crash in the future, which could include discontinuing the use of certain tools.

14. jama211 ◴[] No.45061671{3}[source]
“There’s no use for this thing!” - said the farmer about the computer
15. discreteevent ◴[] No.45061715{3}[source]
"I got rid of that machine saw. Every so often it made a cut that was slightly off line but it was hard to see. I might not find out until much later and then have to redo everything."
16. Kiro ◴[] No.45061977{5}[source]
Absence of your need isn't evidence of no need.
replies(1): >>45066598 #
17. xtracto ◴[] No.45062496[source]
If your toaster burns your breakfast bread, do you ultimately blame "it"?

You get mad, swear at it, maybe even throw it against the wall in a fit of rage, but at the end of the day, deep inside, you still know you screwed up.

replies(2): >>45062623 #>>45063140 #
18. skydhash ◴[] No.45062623{3}[source]
Devices can be faulty and technology can be inappropriate.
19. sarchertech ◴[] No.45063140{3}[source]
If I bought an AI powered toaster that allows me to select a desired shade of toast, I select light golden brown, and it burns my toast, I certainly do blame “it”.

I wouldn’t throw it against a wall because I’m not a psychopath, but I would demand my money back.

20. Jensson ◴[] No.45063842{3}[source]
> You can't put blame on software itself, it just does what it's programmed to do.

This isn't what AI enthusiasts say about AI, though. They only bring that up when they get defensive, but then they go around and say it will totally replace software engineers and is not just a tool.

21. johnnienaked ◴[] No.45066598{6}[source]
This is true, but I've never heard of a use case. To which you might reply, "doesn't mean there isn't one," which you would also be right about.

Maybe you know one.

replies(1): >>45076787 #
22. Kiro ◴[] No.45076787{7}[source]
I presume your definition of use case is something that doesn't include what people normally use it for. And I presume me using it for coding every day is disqualified as well.
23. FromTheFirstIn ◴[] No.45083830{3}[source]
Blame is purely social and purely human. "Blaming" a tool or process and root-causing are functionally identical. Misattributing an outage to a single failure is certainly one way to fail to fix a process. Failing to identify faulty tools or faulty applications is another.

I was being flippant when I said it's never AI's fault, but due to board/C-suite pressure it's harder than ever to point out the ways that AI makes processes more complex, harder to reason about, stochastic, and expensive. So we end up with problems that have to be attributed to something other than AI.