
388 points by replyifuagree | 2 comments
nmstoker No.37966119
I get the point, and with irresponsible parties (fairly widespread in most companies) there's a real risk here.

However, the analogy of a meteorologist seems poor, as that job is focused on predicting the weather; the typical dev is focused on operating in that weather and is comparatively inexperienced at predicting with great accuracy.

What's frustrating as a stakeholder is ludicrous estimates, which don't even start with the work time, let alone end up with a realistic duration. This is particularly true (and frustrating) at the micro-task level, where I often need items that take at most 30 minutes to complete and are usually things I could do in less time if only I had access... You get a weeks-long estimate back, even when it's incurring a serious cost in production and falls into the drop-everything category (which obviously one wants to avoid, but it does come up). I get that none of those 30-minute tasks will take only 30 minutes, as there's testing and documentation to add, but the more BS-level the estimate, the more it damages the trust relationship.

replies(9): >>37966150 #>>37966194 #>>37966480 #>>37966493 #>>37966648 #>>37966946 #>>37967117 #>>37967327 #>>37968617 #
Aeolun No.37966150
To some extent, but there are tasks that I could do in 30 minutes at a company with 15 employees that I still haven't accomplished after 2 months in our 15k-employee enterprise.
replies(3): >>37966496 #>>37966780 #>>37966920 #
briHass No.37966780
Large orgs almost become like a bloated government in that way: groups start to construct little fiefdoms with rules and policies that are ostensibly constructed to improve quality. However, those rules end up becoming bludgeoning tools used by nefarious actors in those groups.

The real problem is that nobody ever steps back and asks: are all these rules actually helping to improve the quality of the software? Is the cost of the reduced velocity and overhead actually worth it in the end?

Then the org does layoffs, and all that policy is still in place without the people necessary to support the bloated workflow.

replies(2): >>37966834 #>>37970883 #
pixl97 No.37970883
Even this isn't really looking at the problem correctly.

In your two-man organization, if one person steals the source code or slips a back door into it, it should be somewhat obvious who did it.

On the other hand, in that large enterprise, someone doing the same could lead to an international incident. And with that many people you are going to have nefarious actors at some point, so you need to think about how to minimize their damage.

It's likely you're falling into the trap of not understanding why Chesterton's fence was put up in the first place.

replies(1): >>37971102 #
briHass No.37971102
The problem is that the whole mess becomes so abstracted away from the real goal: shipping software to customers and making money. Government can get away with it, because it's the only game in town, but a company doesn't have the luxury of operating with a crippling level of inefficiency for long.

Whatever 'deal with the devil' existed when the fence was erected may no longer be relevant or worth the overhead, but the policies live on. There may even now be individuals whose jobs are directly tied to enforcing/implementing the policy, and they have an interest in perpetuating it at any cost.