Whenever people see old systems still in production (say, things that are over 30 years old), the assumption is that management refused to fund a replacement. But if you look at replacement projects, so many of them are such dismal failures that management's reluctance to engage in fixing stuff is understandable.
From the outside, decline always looks like a choice, because the exact form the decline takes was chosen. The issue is that all the choices are bad.
In my work experience I've realized everybody fears honesty in their organization, be it big or small.
Customers can't admit the project is failing, so it churns on. Workers/developers want to keep their jobs and either burn out or adapt, avoiding talk of obvious deficits. Management is preoccupied with softening words and avoiding decisions because they lack knowledge of the problem or the process.
Additionally, there's a growing pipeline of people who move directly from university, where they've been told to only manage other people and not care about the subject, into positions of power where they are helpless and can't admit it.
Even in university, working for the administration, I've watched people congratulate themselves on doing design-thinking seminars every other week and work on preserving their jobs instead of doing useful things, while the money for teaching assistants or technical personnel is not there.
I've seen that so often that I think it's almost universal. The result is mediocre, broken stuff where everyone pretends everything is fine. Everyone wants to manage; nobody wants to do the work or, god forbid, improve processes and solve real problems.
I've got some serious ADHD symptoms, and as a sysadmin, when you fail to deliver it's pretty obvious. I messed up big time more than once, and it was always sweet-talked, excused, or bullshitted away by higher-ups.
Something is really off and everyone is telling similar stories about broken processes.
It feels like a collective passivity that captures everything, where nobody is willing to admit that something doesn't work. And a huge misallocation of resources.
Not sure how it used to be, but I'm pessimistic about how this will end.
Close. They're not incompetent; we just redefined competence.
It used to be that competence was a mix of a lot of distinct, but interdependent, qualities. The end result was synergy that allowed for people and organizations (including companies) to compete and move society forward.
In the 1970s, we started to allow a bunch of psychopaths (I'm saying this in the clinical sense) to redefine competence. Instead of this array of distinct qualities, they just defined it in terms of ability to create monetary value, particularly if that value was then transferred to shareholders. That was it.
We also switched to quarterly reporting for for-profit companies, shrinking the window to evaluate this new definition of competence to 90 days. Three months.
An end result of this was that you could simply do whatever made the most money in 90 days and be considered competent.
Jack Welch was the paragon of this. GE shareholders saw massive gains during the latter half of his tenure at the helm. This wasn't because of groundbreaking new products or services; quite the opposite: Jack realized that selling off divisions and cutting costs by any means necessary was a good way to make money in the 90-day period. Institutional knowledge and good business relationships in the market - two of the elements of the former definition of competence - were lost, while money - the sole element by which competence was judged in the new definition - went up.
You also had managers doing a lot of the avoidance of real management, like you speak of. Instead of betting on a new product or trying to enter a new market, they took a Six Sigma course, learned a bunch of jargon, and cut costs at the expense of business beyond the 90-day period.
If you do this enough (and we did, far beyond just GE), that expense is taken at the societal level. Existence extends beyond 90 days. You can't mortgage the future forever. It's now the future, the payment is due, and we have an empty account to draw from.
Theoretically, we could go back to a more in-depth evaluation of competence and reward its display over the long term. In practice, there are a bunch of people who got unfathomably wealthy off of the shift to the "new" competence, and now they're in charge and don't want to switch back, so we won't.
On a long enough time scale, short-term-oriented systems naturally select themselves out of existence. The U.S. Constitution didn't survive 7 generations. The Civil War was in 1865 (77 years, ~4 generations). The Reconstruction-era order made it maybe 60 years (3 generations), as far as the Great Depression / Dust Bowl.
The current post-war ordering of the interest of short-term capital above all else doesn't have a well-defined start date, but 1968 (MLK Jr, RFK, nomination of Humphrey) is a solid one. We're hardly 3 generations in, and it doesn't feel great.
Really, when you look at American history, the periods marked by bouts of long-term thinking are quite rare (1770-1810, 1880s-1900s, 1930s-1950s). Maybe we're due for another one.
I know the 13th, 14th, and 15th Amendments to the US Constitution are often considered America's Second Founding because they legally eliminated [1] all the elements of racism within the United States Constitution, but saying that the Constitution "didn't survive" doesn't seem accurate...
[1] That being said, we all know that it took many, many decades after those three amendments for the laws in the United States to accurately reflect the principles embodied within them.
The 3/5ths compromise, and its implicit enshrinement of slavery as an American institution, is an example of short-term thinking (compromising on the legal definition of a human being in order to get the Constitution ratified) that eventually caused the greater system to unravel. Hundreds of thousands of people died in the Civil War; millions of people experienced slavery. It could have been avoided if longer-term thinking had prevailed.
I hear you that the Constitution (inclusive of its self-mutating property) survived as a useful document of federal governance. This purported maintenance of a federal union was a huge legitimizer of northern domination of the post-ACW United States. But I think you'd agree that the "system of governance" that begat the Constitution did not survive; that's more what I was getting at. That each successive system of governance can still legitimately claim to be implementing the U.S. Constitution is indeed impressive.
I am not a lawyer but many years ago I read about the following doctrine.
https://en.wikipedia.org/wiki/Incorporation_of_the_Bill_of_R...
Basically, prior to the American Civil War, the Bill of Rights was considered to apply only to the Federal government and not to the state governments.
After the Civil War, the US Supreme Court interpreted the 14th Amendment such that, over time, all the amendments of the Bill of Rights were considered to apply to the states as well.
So what you are saying about one system being dominant over the other (the Federal government over the state governments) makes sense, and it seems to have happened more extensively after the Civil War.