
388 points | pseudolus | 1 comment
Bukhmanizer ◴[] No.43485838[source]
I’m surprised not many people talk about this, but a big reason corporations are able to do layoffs is just that they’re doing less. At my work we used to have thousands of ideas of small improvements to make things better for our users. Now we have one: AI. It’s not that we’re using AI to make all these small improvements, or even planning on it. We’re just… not doing them. And I don’t think my experience is very unique.
baazaa ◴[] No.43488436[source]
I think people need to get used to the idea that the West is just going backwards in capability. Go watch CGI in a movie theatre and it's worse than 20 years ago, go home to play video games and the new releases are all remasters of 20 year old games because no-one knows how to do anything any more. And these are industries which should be seeing the most progress, things are even worse in hard-tech at Boeing or whatever.

Whenever people see old systems still in production (say, things that are over 30 years old), the assumption is that management refused to fund a replacement. But if you look at replacement projects, so many of them are such dismal failures that management's reluctance to engage in fixing stuff is understandable.

From the outside, decline always looks like a choice, because the exact form the decline takes was chosen. The issue is that all the choices are bad.

nisa ◴[] No.43488894[source]
My personal theory is that this is the result of an incompetent management class where no self corrections are happening.

In my work experience I've realized everybody fears honesty in their organization, be it big or small.

Customers can't admit the project is failing, so it churns on. Workers/developers want to keep their job and either burn out or adapt and avoid talking about obvious deficits. Management is preoccupied with softening words and avoiding decisions because they lack knowledge of the problem or process.

Additionally, there has been a growing pipeline of people who move directly from university, where they were taught only to manage other people and not to care about the subject, into positions of power where they are helpless and can't admit it.

Even in university, working for the administration, I've watched people congratulate themselves on doing design-thinking seminars every other week and work on preserving their jobs instead of doing useful things, while the money for teaching assistants or technical personnel isn't there.

I've seen that so often that I think it's almost universal. The result is mediocre, broken stuff where everyone pretends everything is fine. Everyone wants to manage; nobody wants to do the work or, god forbid, improve processes and solve real problems.

I've got some serious ADHD symptoms, and as a sysadmin, when you fail to deliver it's pretty obvious. I messed up big time more than once, and it was always sweet-talked, excused, or bullshitted away by higher-ups.

Something is really off and everyone is telling similar stories about broken processes.

Feels like a collective passivity that captures everything, where nobody is willing to admit that something doesn't work. And a huge misallocation of resources.

Not sure how it used to be, but I'm pessimistic about how this will end.

lenerdenator ◴[] No.43494326[source]
> My personal theory is that this is the result of an incompetent management class where no self corrections are happening.

Close. They're not incompetent; we just redefined competence.

It used to be that competence was a mix of a lot of distinct, but interdependent, qualities. The end result was synergy that allowed for people and organizations (including companies) to compete and move society forward.

In the 1970s, we started to allow a bunch of psychopaths (I'm saying this in the clinical sense) to redefine competence. Instead of this array of distinct qualities, they just defined it in terms of ability to create monetary value, particularly if that value was then transferred to shareholders. That was it.

We also switched to quarterly reporting for for-profit companies, shrinking the window to evaluate this new definition of competence to 90 days. Three months.

An end result of this was that you could simply do whatever made the most money in 90 days and be considered competent.

Jack Welch was the paragon of this. GE shareholders saw massive gains during the latter half of his tenure at the helm. This wasn't because of groundbreaking new products or services; quite the opposite: Jack realized that selling off divisions and cutting costs by any means necessary was a good way to make money in the 90 day period. Institutional knowledge and good business relationships in the market - two of the elements of the former definition of competence - were lost, while money - the sole element under which competence was judged in the new definition - went up.

You also had managers avoiding real management, as you describe. Instead of betting on a new product or trying to enter a new market, they took a Six Sigma course, learned a bunch of jargon, and cut costs at the expense of the business past the 90-day period.

If you do this enough (and we did, far beyond just GE), that expense is taken at the societal level. Existence extends beyond 90 days. You can't mortgage the future forever. It's now the future, the payment is due, and we have an empty account to draw from.

Theoretically, we could go back to a more in-depth evaluation of competence and reward its display over the long term. In practice, there are a bunch of people who got unfathomably wealthy off of the shift to the "new" competence, and now they're in charge and don't want to switch back, so we won't.

gen220 ◴[] No.43495319[source]
In the Haudenosaunee system of governance, whenever they needed to make a consequential decision, the family-clan-appointed leaders would nominate some sub-group of the circle to represent the interests of the unborn, 7 generations in the future. That's far enough into the future, ~100+ years, that the youngest person alive today to experience the decision would certainly be deceased before that generation is born.

On a long enough time scale, short-term oriented systems naturally-select themselves out of existence. The U.S. Constitution didn't survive 7 generations. The Civil War was in 1865 (77 years, ~4 generations). Reconstruction Era made it maybe 60 years (3 generations), as far as the Great Depression / Dust Bowl.

The current post-war ordering of the interest of short-term capital above all else doesn't have a well-defined start date, but 1968 (MLK Jr, RFK, nomination of Humphrey) is a solid one. We're hardly 3 generations in, and it doesn't feel great.

Really, when you look at American history, the periods endowed with bouts of long-term thinking are really quite rare (1770-1810, 1880s-1900s, 1930s-1950s). Maybe we're due for another one.

immibis ◴[] No.43507420{4}[source]
And the current incarnation of capitalism began in the 2008 financial crisis.