If it was truly a decline in knowledge work in general, it should be visible in other economies as well, but I don't think that's the case, at least not here in Denmark. Arguably we typically trend a bit behind the US, so it could be looming on the horizon.
If your idea is only "use AI for X", your company doesn't really have anything, at least not unless AI is its core business, like OpenAI's.
We can't possibly have run out of consumer app ideas in a decade or two, right?
And those are just off the top of my head
- Slack: 2013
- Zoom: 2011
- TikTok: 2016
Based on "Initial Release" on Wikipedia.
None of these big tech companies really need to grow bigger. The smartphone is essentially done. AWS just prints money. Social/consumer apps are "done". What more is there for them to do but collect rent?
The US government needs to break them all up. That'll oxygenate the entire tech sector, unlock value for investors, and kickstart the playing field for startups.
Google, Apple, Amazon, Meta, and maybe Microsoft. Break them up.
For every small startup trying to build innovative robotics to solve a healthcare or agriculture problem, there are 10 startups getting 100x the funding because they figured out how to put jpegs on the blockchain, and the last guys that figured out how to do that had a nice exit...
I forget what economists call this, but it's the characterization of the housing market. The value of your current house is determined by the last few local sales and little else. All of the startups using the currently in-vogue technology feel like that...
Microsoft fumbled Skype, and nobody was putting together convenient APIs on top of IRC...
That’s all it is. That’s part of why Silicon Valley is so clout chasing and cargo culty. It’s entirely due to investor pressure to get immediate returns. Immediate returns mean you have to follow whatever the current hype is.
Is it IoT, crypto, nft, Uber for X, self driving, etc. etc.? That’s what you do. You follow whatever the hype is and bail after you get your desired return.
There’s no desire for a sustainable business. There’s a desire for other investors to be a sucker that holds the bag at the end.
We’re coming out of a long growth period fueled by two decades of war and the inflation that came with it, first in asset values from ZIRP and then from the COVID capital infusions.
Look at where we sit economically, at least in the US. Real estate, a core economic and political engine is a bomb waiting to go off. Commercial real estate is totally underwater. Residential real estate is in another bubble. I won’t go into the madman in DC that’s gonna light the fuse.
Everyone knows it at some level, so projects are getting cancelled. Changes add value but create problems. Take away change and you implode demand for labor. Avoiding the need for marginal labor lowers the marginal cost, so you start purging expensive people.
Whenever people see old systems still in production (say, things that are over 30 years old), the assumption is that management refused to fund the replacement. But if you look at replacement projects, so many of them are such dismal failures that management's reluctance to engage in fixing stuff is understandable.
From the outside, decline always looks like a choice, because the exact form the decline takes was chosen. The issue is that all the choices are bad.
How much easier is it to manage and operate technology in 2025 than it was in 2005 or 2015? I have three core tech teams with 12-18 people. I’d need 500+ to do what I do today in 2005, assuming the tech could do it.
Breakthrough B2C products don’t appear annually. But everything is better. Apple Maps can estimate my travel time for a 300 mile drive within 5 minutes. I bought a last minute flight to Rome last summer knowing nothing about Rome and not speaking any Italian, and I did fine, thanks to the iPhone and the mobile app ecosystem.
In other words, the capitalists won.
I think about it quite often, and there are a LOT of apps out there, and really, humans don't need that much to be happy.
I mean, like, Disney has been getting worse at CGI, but only because the whole company has given up. This is just normal companies shifting around, though.
I remember being bored and having to create my own fun. I remember being aware of my surroundings and being curious about them because I didn't have my favorite entertainment media attached to my palm. I remember learning about things such as what was in my Cheerios, because the box was the only thing in front of me when I ate my breakfast.
It would be a joke to say that AI exists to fill the void from what I mentioned above, but it does kinda sorta feel correct in a weird sci-fi conspiracy way.
In my work experience I've realized everybody fears honesty in their organization be it big or small.
Customers can't admit the project is failing, so it churns on. Workers/developers want to keep their job and either burn out or adapt and avoid talking about obvious deficits. Management is preoccupied with softening words and avoiding decisions because they lack knowledge of the problem or process.
Additionally, there is a growing pipeline of people who move directly from university, where they've been told to only manage other people and not care about the subject matter, into positions of power where they are helpless and can't admit it.
Even in university, working for the administration, I've watched people self-congratulate on doing design thinking seminars every other week and work on preserving their jobs instead of doing useful things, while the money for teaching assistants or technical personnel is not there.
I've seen that so often that I think it's almost universal. The result is mediocre broken stuff where everyone pretends everything is fine. Everyone wants to manage, nobody wants to do the work or god forbid improve processes and solve real problems.
I've got some serious ADHD symptoms, and as a sysadmin, when you fail to deliver it's pretty obvious. I messed up big time more than once, and it was always sweet-talked, excused, or bullshitted away by higher-ups.
Something is really off and everyone is telling similar stories about broken processes.
Feels like a collective passivity that captures everything, and nobody is willing to admit that something doesn't work. And a huge misallocation of resources.
Not sure how it used to be, but I'm pessimistic about how this will end.
> There’s a desire for other investors to be a sucker that holds the bag at the end.
I agree with everything you said except for these two sentences. If you take VC funding to build a sustainable business frankly you’re doomed from the get go. That’s not what a VC wants and it won’t get you funded. There are other routes to that.
What VCs want is, if 99 of those startups fail, to be holding the Airbnb, Uber, or Anthropic at the end. Because holding that from the beginning will make you more money than any other option.
I used to deliver pizzas in the early 2000s. I would get paid
- $4/hour (later bumped to $5/hour)
- $1/delivery (passed through to the customer)
- tips
I had good days / times where I was pretty much always busy and made around $20/hour by the end.
So delivery cost the customer $1 + tip (usually ~$3), cost the business maybe $40 a night (~2.5 drivers for 3 hours), and I made out pretty well.
I can't compare exactly but I feel like today the business pays more, the customer pays more, the drivers get paid less and it's all subsidized by investors to boot. Am I totally wrong on this? But I feel like delivery got so much worse and I don't know where the money is going.
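The old-school economics above can be sanity-checked with a quick back-of-the-envelope script (all inputs are the rough figures from the comment, not exact data; the deliveries-per-hour rate is an assumption chosen to match the "~$20/hour on busy nights" recollection):

```python
# Back-of-the-envelope check of early-2000s pizza delivery economics.
# All inputs are rough recollections from the comment, not exact figures.

base_wage = 5.00         # $/hour paid by the store (after the bump from $4)
per_delivery = 1.00      # $/delivery fee, passed through to the customer
avg_tip = 3.00           # typical tip per delivery
deliveries_per_hour = 4  # assumed rate on a busy night

# Driver's effective hourly earnings when busy:
driver_hourly = base_wage + deliveries_per_hour * (per_delivery + avg_tip)
print(f"driver makes ~${driver_hourly:.0f}/hour on a busy night")

# What the store itself pays for a night of delivery coverage:
drivers = 2.5
hours = 3
store_cost = drivers * hours * base_wage
print(f"store pays ~${store_cost:.2f}/night in delivery wages")
```

With those assumptions the driver clears about $21/hour and the store spends roughly $37.50 a night, consistent with the "~$20/hour" and "maybe $40 a night" figures above.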
It’s led me to learn to DIY as much as possible, making my own fun and experiences so to say.
I'm struggling to think of a part of the OS that hasn't had ads shoved into it... the terminal I guess... There have been ads in the start menu, the lock screen, in pop up notices, in the file explorer, in search results, in the control panel, on the task bar, in the share pane, in windows update and in a bunch of windows apps like ink workspace. They've even just force-installed random programs to people's systems.
Most VCs making 99 investments hoping for the 1 big hit are akin to YC. They’re not doing a Series C for $100m and expecting 99 of those to fail. They’re expecting to get a return back fairly soon.
Holding the stock isn’t helpful for a VC unless they’re going to be using some financial mechanism for leverage - which means they’ve given it up for the other institution who now essentially owns it. A lot of these stocks aren’t giving you meaningful dividends. You have to sell or give up some form of control on them to be a successful VC. How else would you continue to invest?
I don't think AI specifically has done this, compared to the broader, constant stream of digitalisation of every department's function.
Orgs don't need grads to learn the ropes from the bottom and make their way up the career ladder, when the ladders might only be 3-4 rungs high now.
I'd wager, too, that the addition of the garbage you're describing has coincided with the OS's worsening performance. File Explorer performance is so abysmal that it may as well be an Electron app.
On the other hand (edit: regarding your first paragraph), Microsoft seems very serious about not falling afoul of the law, probably because of the cost of the anti-trust litigation they faced in the 90s and 2000s(?). It wouldn't surprise me at all if there were nothing for a whistleblower to blow the whistle on.
Dopamine addiction isn’t the problem.
Isn't that how self checkout happens in every part of the world that has self checkout? I'm failing to see what's special about self checkout in Japan.
Crude oversimplification: if all you’ve ever known are slow and bloated web app UIs on mobile phones, you’re simply not going to know how to make good design/development choices outside that environment.
The main culprits I've seen are cheaping out on quality, replacing traditional controls with touch screens or "AI" magic buttons, and squeezing in more monetization streams or adding gimmicky features that actively make the product worse.
Maybe things will turn around someday. There are a few rays of hope, like the touchscreen fad in cars gradually losing its luster, but it seems like we've been on the wrong path for a long time and I'm not sure it will ever correct.
Some of this is inevitable as new products and services move from being high end to mass-market, and it’s perhaps a bit chicken-and-egg to determine whether we accept this because most people never really cared about quality that much anyway or because we just learn to accept what we’re given.
But it feels like there could be a world where automation still reduces costs while still maintaining a high level of quality, even if it’s not quite as cheap as it is now.
Today it is a commodity. So we are flooded with low effort productions.
With that being said, we have more capability than ever, at the cheapest cost ever. Whether businesses use that wisely is a different story.
There will always be outliers. I see many comments with people who derived value from whatever they perceived as something uncommon and unique they could do. Now AI has made those skills a commodity. So they lose their motivation since it becomes harder to attain some sort of adoration.
In any case, going forward, no matter what, there will be those who adopt the new tools and use them passionately to create things that are above and beyond the average. And folks will be on HN reminiscing about those people, 30 years from now.
Where I work in government, we've stopped paying for important data from vendors (think sensors around traffic, etc.) because the quotes are eye-wateringly expensive. But I've worked in data long enough to know the quotes probably reflect genuine costs, because data engineering is so often done incompetently (and if it's a form of price gouging, it's not working, because gov isn't paying up). So it looks like we're choosing to be in the dark about important data, but it's not entirely a choice.
Saying we can do stuff but it's unaffordable is imo just another way of saying we can't do stuff.
You are probably getting more, and more than the entire difference goes into rent.
Real estate is destroying the world's economy.
I have not yet figured out how to manually change the settings, as the buttons don't do anything when you press them.
I leave it on "normal" and it seems fine, and surely there is a way to activate those buttons, but I haven't found it.
I could probably install the app on my android device and use it to connect them to wifi, where I could presumably configure them.
Instead, however, I am looking at electronics-free diesel trucks.
There’s more music out there than there’s ever been. More tv shows and movies than I could possibly watch, but I still find new things to watch and listen to.
But tech? Maybe other than my robot vacuum, I don’t think there’s anything in the last 5 years I’ve seen that I’ve felt is going to make my life easier or better. Which seems odd, because the pace at which technology is improving only seems to be accelerating. We can do more than we ever could before, but it feels like the appetite to improve things is no longer there.
Did our quality and capability get worse or did everyone become a journalist that can document every flaw and distribute it globally in minutes?
Hmmm....
But consider an example which can't be blamed on that. My city (Melbourne) has a big century-old tram network. The network used to cover the city, now it covers only the inner city because it hasn't ever been expanded. We can't expand it because it's too expensive. Why could we afford to cover the whole city a century ago when we were 10x poorer? With increasing density it should be even more affordable to build mass-transit.
Obviously people blame the latter example on declining state capacity, but I'm not sure state capacity is doing any worse than Google capacity or General Electric capacity.
So it's a slightly less safe (in the grand scheme of things) airliner that's vastly more fuel efficient and cheaper to run than any in the past. Obviously this is of no comfort to the families of the people who died in the crash!
But to suggest that Boeing has somehow regressed decades in technical capabilities is just plain wrong.
1: https://en.wikipedia.org/wiki/List_of_accidents_and_incident...
https://www.msn.com/en-us/news/politics/biden-allies-call-tr...
The loaded term for that is simply "cutting corners."
It is exactly that! Food delivery is an excellent example of 'things just got worse'.
In 2019, 'delivery' was a specialty a restaurant would have to focus on to offer. Pizza places (Papa Johns, Pizza Hut, etc) and other specific delivery-focused restaurants (such as Panera Bread, Jimmy Johns, or your local Chinese restaurant) would have actual W2 employees who did delivery driving, as part of their job. The restaurant would want deliveries to go well (for both the customer, as well as the driver), so would make sure their own staff had reasonable access to food, some light training, and would ensure they could deliver it somewhat well. (They would reject orders too far away, they wouldn't serve food that wouldn't survive a delivery trip well, etc)
In post-COVID 2025, "every" restaurant offers delivery, but almost no restaurant still employs their own delivery drivers (locally, Jimmy Johns might be the only one left). Everyone else just outsourced to DoorDash. DoorDash drivers are employees who are 'legally-not-employees' (1099 employees), so they no longer have any direct access to the restaurants, and they can't train well for any specific service, because they might have to visit any-of-50 restaurants on any given day, all of which have entirely different procedures (even if they are the same brand or chain). Restaurants have zero incentive to ensure deliveries go well (the drivers aren't their employees, so they no longer care about turnover, and customers have to use DoorDash or Uber Eats or equivalent, because almost every restaurant uses it, so there's no downside to a DoorDash delivery going bad).
Prices to consumers are double or more what they were in 2019, depending on the restaurant. Wages are down, and employment security is entirely eliminated. Quality and service have tanked.
Presumably, investors make slightly more money off of all of this?
30 years may be a stretch but 20-25 certainly isn't.
Don't replace an existing solution with exactly the same thing on a different platform.
Think larger. Solve today's / (near) tomorrow's problems BETTER. That's probably going to require changes to process too. A full evaluation of what's the most effective way, given the capabilities and needs that exist now.
Then bring up interfaces that provide what the old system did, verify the data round trips, and when it's approved cut over.
There are people out there who are pretty conflict-avoidant by nature, and any group tends toward pretty significant levels of cohesion because of it. There are some classic stories out there about when it goes particularly bad and spirals into a bad case of groupthink.
In the economy there are supposed to be some slightly cruel feedback mechanisms where companies (effectively big groups) that get off track are defunded and their resources reallocated to someone more competent. The west has been on a campaign to disable all those feedback mechanisms and let companies just keep trudging on. We've pretty much disabled recessions by this point. A bunch of known-incompetent management teams have been bailed out so they can just keep plodding along destroying value. There is not so much advantage in being honest about competence in this environment, if anything it is a bad thing because it makes it harder to take bailout money with a straight face.
I cite the Silicon Valley Bank collapse as an interesting case study. A looot of companies should have gone bust with that one because they were imprudent with their money. They didn't.
The staff will also instantly materialize if they are needed to confirm you can buy alcohol, or there is some kind of problem; which is also not my experience elsewhere.
It's not a worldshattering difference, but it is noticeable.
The problem with CGI today is that it's over-used and mis-applied in areas that still have Uncanny Valley type issues (fight scenes, car chases/crashes, etc).
> Aladdin (Asset, Liability and Debt and Derivative Investment Network)[1] is an electronic system built by BlackRock Solutions, the risk management division of the largest investment management corporation, BlackRock, Inc. In 2013, it handled about $11 trillion in assets (including BlackRock's $4.1 trillion assets), which was about 7% of the world's financial assets, and kept track of about 30,000 investment portfolios.
For any one firm to have this much direct and/or indirect assertion over the world’s financial assets is ripe for problems of all sorts.
Seems rather indicative of the general consolidation of power and decline of social equality across the West.
Tried it again and hit the same issue. Now he is going for a chargeback. There is nobody at QuickBooks who can solve this problem, as most of the support (from India) seems to be there just to read from scripts.
But hey you should buy into the stock as they are going into an AI transition: https://www.tipranks.com/news/intuit-stock-nasdaqintu-layoff...
I only checked its production budget while writing my comment.
When we funded the majority of the big infrastructure pushes, our rate of growth was lower, and GDP per capita (and revenue per capita) was exploding. This generally ended with the start of the big multicultural Australia policy in the late 60's.
So in comparison, the amount of infrastructure we need to build is greater per capita, as it has to try to cover the future population predictions, and it needs to be done over fewer years as well.
Then we can get into the migration policy that's causing a decline in GDP per capita.
Being a great engineer or researcher doesn't pay. You won't get your name known for your work. All your achievements will be attributed to whoever manages you at best, or attributed to the corporation above you with not a single human name at worst.
People like being recognized for their work. Every great achiever wants to have their name remembered long after they leave this world. Everyone wants to be the next Isaac Newton. The next Bill Gates. The next Steve Jobs. The next Elon Musk. It's a constant downhill path from being known for using your brain and busting your ass to discover or create something, to being known for managing someone who created something, to being known as someone who bought the company that managed people who created something. Motivations are all fucked up. No matter what you discover or create these days, there's a feeling that you're not going to have your name written in history books. Your best options are join a grift or manage someone who's doing the hard work.
The traditional term for this is cobra effect. [1] When the Brits were occupying India they wanted to reduce the cobra population, so they simply created a bounty on cobra heads. Sounds reasonable, but you need to have foresight to think about what comes next. This now created a major incentive for entrepreneurial Indians to start mass breeding cobras to then turn in their heads. After this was discovered, the bounty program was canceled, and the now surging cobra farm industry mostly just let their cobras go wild.
I think the fundamental problem is that things just don't work so well at scale, after a point. This is made even worse by the fact that things work really well at scale before they start to break down. So we need a large economy that remains relatively decentralized. But that's not so easy, because the easiest way to make more money is to just start assimilating other companies/competitors with your excess revenue. Anti-trust is the knee jerk answer but even there, are we even going to pretend there's a single person alive who e.g. Google (or any other mega corp) doesn't have the resources to 'sway'?
But I think if you look at modern light-rail projects there really has been such insane cost-inflation it wouldn't be worth covering the city with trams even with a much bigger budget. Also because such a large fraction of the price is admin etc., it creates a bias towards more expensive infra (heavy rail) because the paperwork overhead is similar either way so you get more bang for your buck.
Incidentally, this also applies to risk issues. The biggest risk in a flight is not in cruising, but in takeoff/landing. This is why the commonly cited deaths/mile metric is not only misleading but disingenuous on the part of the people/organizations that release it, knowing full well that the vast majority of people don't understand this. If some person replaced their car with a plane (and could somehow land/take off anywhere), their overall risk of death in transit would be significantly higher than it was with the car. "Air travel being safer than cars" relies on this misleading deaths/mile statistic.
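The point about takeoff/landing dominating flight risk can be made concrete with a toy model (all rates below are hypothetical, chosen only to show the shape of the argument, not real accident statistics): if a flight carries a roughly fixed per-trip risk while driving carries a roughly per-mile risk, then which mode is "safer" depends on trip length, and deaths/mile hides that.

```python
# Toy model (all numbers hypothetical) of why deaths-per-mile and
# deaths-per-trip can rank the same two modes oppositely when a
# flight's risk is concentrated in takeoff and landing.

flight_risk_per_trip = 1e-7  # hypothetical fixed risk per flight
car_risk_per_mile = 1e-9     # hypothetical risk per mile driven

for miles in (50, 500, 3000):
    # Dividing the flight's fixed risk over more miles makes the
    # per-mile figure look better the longer the flight is.
    flight_per_mile = flight_risk_per_trip / miles
    car_trip_risk = car_risk_per_mile * miles
    safer = "plane" if flight_risk_per_trip < car_trip_risk else "car"
    print(f"{miles:5d} mi trip: flight per-mile risk {flight_per_mile:.1e}, "
          f"per-trip risk plane {flight_risk_per_trip:.1e} vs "
          f"car {car_trip_risk:.1e} -> {safer} safer")
```

Under these assumed rates the car wins on a 50-mile hop while the plane wins on a 3000-mile haul, which is exactly the distinction the per-mile statistic erases.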
In recent years lying has been normalized. Black is White etc. 1984 is here.
The moment you admit failure as an employee, you are out of the company. And no for most people it is not easy to find a job that will not disrupt their lives (aka move cities, change financial planning, even health insurance).
So employees do what they have to do. They will lie till the last moment and pretend that the initiatives they are working on are huge value add for the company.
In the past you knew you would retire from your company, also the compensation differential was not that huge across levels, so there was little incentive to BS.
Today everything is optimized with a horizon of a financial quarter. Then a pandemic hits, and we realize that we don't even know how to make freaking masks and don’t even have supplies of things for more than a week.
But more importantly, the other half of my point was that $250 million ought to be enough to pay for a high effort production. It's not like "well Blender is free now so of course theatres are flooded with amateur CG films since their production has been commoditized".
So did we just run out of useful things to do with people? Or did we concentrate the wealth away from the masses and blame the same immigration that created Australia in the first place?
This is most definitely still the fault of management.
Justify to whom? Shareholders just care about metrics like earnings and revenue. If they don't need workers to optimize these metrics, they have the right to hire fewer workers.
We had a hugely restrictive immigration policy, (have a look at the rate of growth over time) followed by multiple wars that meaningfully reduced the population... We were winning the Malthusian game, just by having lots of resources per person available.
The policies you have probably heard called "white Australia" were more accurately understood as immigration restriction policy. If you read anything published at the time, there was only slightly less animosity for white English migration than for migration from the rest of the world. This was the era of communism and workers' rights, and the workers absolutely understood that their labour was being devalued.
The problem is not refusing to fund replacements. The problem is refusing to fund maintenance.
A lot of managers in old school business were sold on IT as a tool. And tools? You buy them, use them and replace only when they break. Maintenance is minimal and you sure don't evolve them.
That's how you get decades-old software chugging along, so key to operations that everything you want to add has to be aware of it and its warts, which then infect whatever touches it. And replacement projects cannot work, because usually they mean changing how things are done.
But 20 years of rot are a symbiosis between users and tools:
- some tool does not allow a workflow, so users manage and find a workaround
- there is a workaround so next version of the software landscape cannot break it
- people want to do some new thing which is not in the software, changing it could break the previous workaround. So either people don't do the new thing or adapt and create other workarounds
Multiple rounds of this and you have a fossilized organization and IT where nothing can be easily changed. The business cannot adapt. The software cannot be modified to allow adaptation because it could break the business. Now a new competitor emerges, the business is losing and that's when everyone starts blaming everyone for the problems. But in reality? The cause is 20 years ago when some management decided to add IT as a cost center.
My solution to this problem? Create your own competitor and kill the old business.
The problem with replacement projects is when and why they're usually started. They're usually started once there's a fixed deadline on some technology ceasing to exist, creating the appropriate urgency.
Usually the people that wrote that original software have long gone, the last few people that were able to maintain it are also nearing retirement age or already gone as well, you have some ancient technologies used for which it's hard to get documentation on the internet today.
Now you're tasked with writing a replacement, and everything that doesn't work on day 1 is deemed a failure. It might have worked if you started earlier. Because if your original codebase is COBOL and assembly written for mainframe, it's really hard to find anyone that can understand what it does fully and rewrite it now cleanly.
If you had updated from COBOL and mainframe assembly to C, and from C to 90s Java, and from 90s Java to modern Java/Go/Rust/Node, you'd have plenty of institutional knowledge available at each step, and you would have people who know the old and the new world at each step. Jumping half a century in computing technology is harder than doing small jumps every 10-15 years.
Construction is not massively more efficient today compared to a century ago, while salaries have massively increased.
Objectively this isn't true as CGI technology has improved by leaps and bounds (think e.g. subsurface skin scattering in new vs old Gollum), however there's a lot of other factors at play; old CGI used film tricks to make it blend better, new CGI uses full CGI and digital whatsits and doesn't care anymore. It also depends on budget and what studio takes care of it. Good CGI is invisible, and there's a number of non-superhero films where the CGI just isn't visible / you're not even aware of it. Anyway, what 20 year old CGI are you thinking about, and what are you comparing it with? I'm thinking The Spirits Within (2001) or Beowulf (2007); the former did not age well, the latter was already panned as having poor CGI when it came out. Avatar (2009) pushed the frontier again I think.
> go home to play video games and the new releases are all remasters of 20 year old games because no-one knows how to do anything any more.
This is a blinkered view of reality; there's thousands of game developers outside of this bubble, from single person developers making modern classics like Stardew Valley or even Minecraft when it first came out, to teams of developers that are bigger than those that made the games of 20 years ago.
Also, your opinion isn't fact; in the top 20 best selling games of 2024 [0] there is only one arguable remaster (GTA 5, which is on its 3rd remaster) and two complete remakes (FFVII Rebirth and CoD 3), with the former being a completely different game compared to the original. I share your cynicism about the "top of the line" video game market today, but you're not correct.
(meanwhile I'm playing 2007 video game (Supreme Commander))
[0] https://www.gamespot.com/gallery/2024s-best-selling-games-in...
With broad changes in the distribution of wealth, and government spending on education sharply declining, levels of critical thinking and open-mindedness have declined.
So now, if something can be made that's part of an existing franchise or consumer-favoured product, then that's lower risk. It attracts more capital. Full-on remakes again and again, with idiots generally accepting bad games on nostalgia value, mean sales even of a bad game remain palatable.
I don't think the West is going backwards in capability, but people seem incapable of highlighting what has changed.
Outside of places like Meta, who are printing money at a ridiculous rate, finance acts as a brake on any long-term or big bets. There can be no risk taking.
I feel like this is one of Google's problems now. Once upon a time they were willing to take big swings with their piles of cash; now it's all about revenue maximisation at the low level. I forget which change it was, but they started charging for something, or limiting quotas on something, and the email contained the phrase "in line with industry norms", and I just thought that was very telling. Back in the early 2000s Google was constantly defying and upturning "industry norms"; now they are just like everyone else, squeezing every last drop from the smallest stones. Getting rid of the previously grandfathered-in free Google Workspaces was a good example. I find it hard to imagine that the cost of those even registers in their accounts compared with everything else.
When I see those statistics I think about flights like Austria to Finland and I imagine that is indeed safer by plane.
But since it's all investor and profit driven for the bigger company, costs get cut on every side.
They are stuck with expensive legacy union employee contracts, while smaller and more efficient operations like TJs and Aldi and Lidl and Costco and Walmart and Winco eat all the consistent low margin sales, leaving the big grocery stores with only volatile high margin sales.
People per household has been trending down for a long time, which also impacts the amount/variety of cooking.
Also, traveling and routing utilities and police and ambulances and all of society around a larger plot of land costs (in time and energy and materials) at least a power of 2 more than a smaller plot of land.
Not only are there no marginal tax rates for land value tax, but there are tax breaks for the elderly, caps on tax increases the longer the land is owned, and tax deferrals (such as the 1031 exchange).
I will leave it to the reader to figure out which “tribe” is most represented amongst land owners. And old people receiving Social Security and Medicare.
Fact of the matter: communications is everything for humans, including dealing with one's own self. Communications are how our internal self conversation mired in bias encourages or discourages behavior, communications are how peers lead, mislead, inform, misinform, and omit key information - including that critical problem information that people are too often afraid to relate.
An effective communicator can talk to anyone, regardless of stature, and convey understanding. If the information is damningly negative, the effective communicator is thanked for their insight and not punished nor ignored.
Effective communications is everything in our complex society, and this critical skill is simply ignored.
It was, but 2021-2024 was the first three year period that didn't end lower than it started since the 1960s (starting and ending at 2.51 average per household); it is possible that trend has arrested.
Sorry, but that's just not true. Sure there are shit VFX films, but I guarantee that the "serious" movies that people hold up as "all in-camera effects" have hundreds of shots with digital set extensions and all sorts of VFX magic.
If you look at TV, where there has been huge competition, the use of VFX has exploded, mainly as a cost saving, but also as a story enhancer. Stuff that would have cost £20m ten years ago is being done for £500k. That's huge innovation.
> remasters of 20 year old games because no-one knows how to do anything any more
They are remasters because the people putting the money up are conservative.
Innovation is happening, just not where you expect. Look at the indie games market.
Much as I don't like it, a huge amount of innovation is happening in the world of YouTube and TikTok. New editing styles, and almost a complete new genre of moving picture, have emerged.
Where there is competition, there is innovation.
This is really a cultural problem that has infected management along with everyone else.
It used to be that you were expected to be able to fix your own car or washing machine, and moreover that one you couldn't fix would be rejected by the customers. It was expected to come with documentation and be made of modular parts you could actually obtain for less than three quarters of the price of the entire machine.
Now everything is a black box you're expected to never open and if it breaks and the manufacturer doesn't deign to fix it you go to the store and buy another one.
The problem with this is that it poisons the well. Paying money to make the problem go away instead of learning how to fix it yourself means that, at scale, you lose the ability to fix it yourself. The knowledge and infrastructure to choose differently decays, so that you have to pay someone else to fix the problem, even if that's not what you would have chosen.
The result is a helplessness that stems from a lack of agency. Once the ability to do something yourself has atrophied, you can no longer even tell whether the person you're having do it for you is doing it well. Which, of course, causes them to not. And in turn to defend the opacity so they can continue to not.
Which brings us back to management. The C suite doesn't actually know how the company works. If something bad happens, they may not even find out about it, or if they do it's through a layer of middle management that has put whatever spin on it necessary to make sure the blame falls on the designated scapegoat. Actually fixing the cause of the problem is intractable because the cause is never identified.
But to fix that you'd need an economy with smaller companies, like a machine with modular parts and documented interfaces, instead of an opaque monolith that can't be cured because it can't be penetrated by understanding.
That’s doubly difficult because the complexity is what lets the system produce so much output, and if you produced less people would experience that as having less and would riot. The only way out would be if the whole society consumes less, including, visibly, the elites. Feeling taken advantage of is a far more powerful force on the non elites compared to, up to a point, their material ups and downs.
Trump’s ability to create a widely accepted narrative focused specifically on elites who are opposed to his power, but also who are doing extra well relative to the non-elites, is what let him harness the raw force of wage stagnation et al. for political power.
Now it is in the best shape ever and progress seems to be unstoppable. And the West thoroughly dominates it in every dimension, and that dominance seems to only be accelerating.
Boeing just failed in what was an inherently unfair game: they tried to compete with state-funded Airbus that could just burn unlimited cash not worrying about real profitability, Boeing tried doing it by cutting costs, and failed.
The first prioritized engineering 'most everything that a modern nation might need. The latter prioritized only engineering its own financial statements.
Both did very well...at least at their top priorities.
>Karl Marx's theory of alienation describes the separation and estrangement of people from their work, their wider world, their human nature, and their selves. Alienation is a consequence of the division of labour in a capitalist society, wherein a human being's life is lived as a mechanistic part of a social class.[1]
There is of course the trend of influencers, quiet quitting, gig economy and people going to work for themselves. That assumes that there is enough capital floating around for everybody to survive on their own... If you can't land a decent job in an established company, maybe you should think twice about being able to float above the waterline when you go at it by yourself.
You've got to offer good quality and stand out, which isn't easy without capital.
This is very insightful and, in my mind, a good preview of what is happening with AI right now. We will forget how to use the skills that built these systems in the first place.
Closer to tech, I feel we have had a big influx of non-tech people joining the tech workforce, and the quality has suffered as a result of a lack of fundamentals and passion.
These services basically don't work with Western-level wages. The economics are just not there.
So a lot of this, unfortunately, is a choice that consumers have made. Even in terms of media, a lot of modern viewers watch media more as self-insert fantasies, so quality of writing or novelty is often going to be worthless or even detrimental to them. I don't share that mindset, but having talked to many on /a/, /v/, or reddit, there are many who are just there to consume rather than out of actual interest.
None of the things you said are actually true. They're only superficially true, because you've only seen the mass-market crap.
Good movies are still around, and you don't even notice the CGI, because it's cleverly done. For crap like the recently released Snow White, it's obvious that the CGI is badly done, but that doesn't make it an indictment against all movies released of late!
Same with games - just because there's lots of AAA studio flops that look terrible, doesn't mean the medium is all terrible. There's so many good indie games that you can never truly play them all.
But if your exposure to these products are only the mass market crap, then you might certainly feel that way.
In the web development community there is a near-linear correlation between the number of "influencers" who sell courses that prey on this influx to make money and the influx of such folks.
I miss the days when developers generally had a passion for this work vs. seeing only a big paycheck, though without artificial barriers we should have expected a large influx of people, given how well it generally paid for a long time.
Even more glaring is TV shows, where you now get an 8-episode 'season' every 2-3 years rather than the old days of 20+ episode seasons every year, often non-stop for 5 or more years.
It's not so much about capability/competence as pushing production values to unsustainable levels. You could get away with much less expensive VFX, sets, and costume when filming in standard definition. Now every pixel is expected to look flawless at 4K.
Another more controversial factor is that everyone brings their politics/activism to work and injects them into everything that they do. Now everything has to be pushing for social change, nothing can just be entertainment for the sake of entertainment.
That's a really interesting claim. Do you have any sources that explain this further?
Compare 1997 to today.
Major hit after major hit was being released that year, and they were overwhelmingly original and creative. There had been a boom in independent filmmaking and many of the big production houses had started up smaller studios to attract the talent. Unfortunately, Hollywood did what Hollywood does and killed everything that made them good.
Nowadays, we have endless releases of super hero sequels that are, fundamentally, the same movie over and over. We have endless remakes and reboots because nobody wants to take a chance.
Yes, you can find creativity if you look hard enough, but in 1997 it was everywhere, and in your face. You can't pretend that it doesn't matter or that it doesn't mark an enormous shift in culture (business and society).
For example, when the internet emerged, everyone wanted to be online; when smartphones appeared, everyone wanted to have an app; and when VR emerged, Facebook changed its name and lost half of its value in the stock market. Now, it's AI. Capitalists do not focus on the details of the technologies; to them, every new technology looks the same. They see new tech as a growing opportunity and old tech as a saturated market. Obviously, this perspective is flawed, but it doesn’t matter.
In my opinion, AI is not going to create more value. The only real impact it will have is reducing the amount of workforce needed to generate that value, which will ultimately push the economy into a recession. As consumption declines, I don’t see what new technology could come after AI to offset this effect through further investment.
People do, but they aren't in AAA studios. They're doing indie games, because their large corporations were captured by the professional glom-onto-success management class.
I don't think this is a great example, because saving water (and thus the energy needed to heat the water) is both a social good and a private good.
Your new dishwasher program might take longer because, for example, (a) it's more efficient to soak residues than to keep blasting away at them, but it takes longer, and (b) if you alternate between shooting water at the top and bottom drawers (but not both at once) then you can get away with using half the water, in twice the time.
Most dishwashers have an 'express' programme that uses more water and energy to finish faster, so if that matters you can still have it. If it doesn't matter to you (e.g. because you're running the dishwasher overnight, or while you're at work), you and everyone else benefits from the greater efficiency.
So I think this is an unambiguous improvement. :)
The average quality of appliances is a separate question. Anecdotally, I finally had to replace a 22-year-old Neff dishwasher. I got a new Bosch one (same firm, different logo), and have been pleasantly surprised that the new model is still made in Germany, seems pretty solid, washes well, and is guaranteed for 5 years.
This makes it sound as if management only decide whether to engage in modernizing or not. I think it's only fair to also give them full credit for the failures - profit over people, dogma over pragmatism, etc is their fault.
I don't think talent is the problem either. There's a lot more talent now than in the 90's.
It was also because development budgets were microscopic compared to today's, so a bad release from a dev team of 5 people and 12 months won't bomb as badly as a 500-person, 5-year "blockbuster" release. So yeah, Superman 64 was laughably bad but didn't sink a company the way Concord or even a not-that-bad game like Saints Row would.
Economy is different, as is the environment. There's still quality, but when a game flops, it's a tsunami level flop and not just a painful belly flop.
All this discussion assumes that Boeing engineers didn't catch this stuff and weren't sounding alarm bells over how these parts completely failed inspection. The problem was that the people in power ignored it. This is an entirely social issue constructed by business demands, not one of lacking expertise or standards.
Do you own a PinePhone?
Or do you own a higher-spec, more familiar iPhone or Android that can't be opened up?
It's the second one, isn't it. Who made you choose it?
Capitalism needs a deadly threat to be good.
The cost of living, in the fallout of '08, simply skyrocketed, and most of the country didn't increase compensation to make up for that. Despite that, companies simply charged more while cutting costs at the same time. So the driver and the customer both lost out.
My employer buys a crap load of crap stuff from Broadcom just because the procurement is easy.
Edit: ooo someone's mad I don't like DoorDash
Maybe arts shouldn't have been industries. Look at sculpture or painting from the Renaissance and then postmodern sculpture and painting and you'll see a similar decline, despite the improvement of tools. We still have those techniques, and occasionally someone will produce a beautiful work as satire. We could be CNC milling stone buildings more beautiful and detailed than any palace or cathedral, ones that would last for generations, but brutalism killed the desire to do so, despite the technology and skill being available. There's something about industrialized/democratized art being sold to the masses that leads to a decline in quality, and it's not "because no-one knows how to do anything any more." It's because no one cares about, nor wants to pay for, anything beautiful when there are cheaper yet sufficient alternatives.
> This is really a cultural problem that has infected management along with everyone else.
Because every time a natural correction happens, the government bails them out
I think that's the main point, yes. There was a sense before that companies were trying to push the envelope. These days it's just a shrug and cynical min-maxing of funds to the shareholders. CGI 20 years ago was objectively worse, but you can tell they had ways to hide the flaws or redirect the eye away from them. Now... Ehh, who cares? Just get the first pass through.
If you want a relevant example: some people say the Lilo & Stitch live action has a weird-looking Stitch model. Part of that is because, way back in 2002, the original Stitch was simply never meant to be looked at in a side profile for an extended time. Art directors made sure to avoid that angle in every frame they drew. Twenty-odd years later... meh. Ship it. Never mind the outsourced CGI trying to model something better, the cinematography being careful of angles, or any reaction from "nitpickers". We got the IP, it'll make money.
It's not a franchise killer, but it's just one example of the many broken windows.
The reality is that you can have it both ways. I own an iPhone, I know how to build a computer, I buy software, and I know how to code. There is value in understanding how the things you have work, but that doesn't mean that you can't or shouldn't buy a high quality product just because you can't take it apart.
Your average gacha may look lower-effort, but it has to sustain that effort for longer, instead of patching the game for a few months and moving on. It has to do a lot more marketing to get players in, because many are this pseudo-MMO experience, complete with PvP and guild content to manage.
At the highest end, Hoyoverse's operating costs would even make Activision blush. But those games make billions to compensate.
But there were companies shipping DVDs before Netflix, and also companies streaming movies online before them. So really a marketing / operations / distribution revolution.
Treating this mentality of "taking money out that you put in" as "taking handouts" is the exact reductive mentality being used to try and have the government steal the money you earned from under your nose.
It's tiring to read again and again about evil external forces wrecking the world, when the choices are our own, and right in front of our faces.
Means testing will also make paying into these programs even less popular. Upper middle class people will ask why, exactly, they’re expected to pay more into a retirement program they will get less out of. That’s a recipe for political change from a party who promises not to do that.
If you hire Canadian software engineers, you can dodge this and deduct the expenses in your Canadian subsidiary. If you outsource software dev to another company you can usually get away with expensing it.
Thing is those high profile disasters are still supposedly the "cream of the crop". That's why they get compared to the cream of before.
Popular examples are easier to exemplify as well instead of taking the time to explain what Blinx the Cat or Midnight Club are (examples of good but not genre-defining entries)
This annoyed me, because it's so manifestly untrue. The games of the year of the last few years (https://en.wikipedia.org/wiki/List_of_Game_of_the_Year_award...)
- 2024: Astro Bot
- 2023: Baldur's Gate 3
- 2022: Elden Ring
- 2021: No consensus pick, but It Takes Two stands out to me
- 2020: Hades
All of these, with the exception of BG3 are original IP. A lot of them have really unique game mechanics that I haven't seen before. Hades has some of the tightest combat that never gets old even after hundreds of runs. It also has extraordinary music and voice acting. Truly a labour of love.
It Takes Two is a co-op story adventure. Every single level has a new fun mechanic. In one of them you literally control time. Please, do tell me which game from 20 years ago was a co-op adventure where every level was unique? The best co-op was probably Halo 2 (2004), but that's just shooting from beginning to end.
You're thinking "well, ok there's one sequel in there. That's proof that video game companies want to play it safe". But you'd still be incorrect. BG3 is inspired by its predecessors BG1 and 2, but those released 20 years ago. Open YouTube and check out how different they are in every single way. I'll bet there isn't a single line of code shared between BG3 and the originals. BG3 exists because the developers grew up playing BG1 and 2 and wanted to make a homage to the games that shaped them. And they succeeded; good for them.
I will admit that I didn't play Elden Ring. I didn't even attempt to, because I already have a full time job. But that's great too, because it shows that there are games being made for people who love a punishingly difficult challenge. That's not me, but you can find that now if you want.
Your comment is just rose-tinted whingeing. It's so easy to write a comment like "man, the good old days were really good weren't they". But ... no. I can play all of the games from the good old days and I can also play Hades, It Takes Two and BG3. And that's just the surface! There are so many incredible games being made and released. Factorio is great in many ways, but the most remarkable part is how they've optimised their game to a mind-boggling extent.
No one knows how to do anything anymore? Then how did these incredibly innovative, flawlessly executed games get made?
Even with people that work with/in software roles there's often shocking knowledge gaps in areas that they work in. I've worked with more than one front-end "engineer" that only understood React--they had no conception of DOM APIs or that React was an abstraction on top of that whole underlying environment.
Even creating a static page with a simple form meant create-react-app for them.
Management and executives had almost all worked their way up the ladder. Toward the end I think some of the higher-up ones were encouraged to get an MBA as they advanced, but they didn't do much hiring of MBAs.
The company got bought by another in, IIRC, the late 90s, and this other one had already been taken over by the "professional managerial class", and they quickly replaced most of the folks from the top down to the layer just above him with their own sort.
His description of what followed was incredible amounts of waste. Not just constant meetings that should have been emails (though, LOTS of that) but entire business trips that could have been emails. Lots of them fucking things up because they had no idea how anything worked, but wouldn't listen to people who did know. Just, constant.
The next step was they "encouraged" his layer to retire early, for any who were old enough, which was lots of them since, again, most of them had worked their way up the ladder to get where they were, not stepped straight into management as a 25-year-old with no clue how actual work gets done. I haven't asked, but I assume they replaced them with a bunch of young business school grads.
There are sometimes posts on HN suggesting that our dislike of business school sorts is silly or overblown, but if anything I think it's too weak. The takeover by them and, relatedly, the finance folks has been disastrous for actual productivity and innovation. Companies should be run by people who've done the work that the company does, and not just for an internship or something.
Stardew Valley is 9 years old.
Minecraft is almost 16 years old. The current version of the game has not dramatically changed in terms of the experience of most players of the game in over 10 years. (Hardcore players of any game will always make a big deal of any minor changes).
I was born in the 1990’s. I was playing games regularly in the 2000’s and the 2010’s although I don’t play as much today.
Hardly anyone in 2005 was playing 1996 games or 1989 games regularly.
Even in 2015 not many were playing 2006 or 1999 games regularly. (I think World of Warcraft was the only very popular old game in 2015)
But now in 2025 you bring up a 2016 game and 2009 game to argue with that other guy?
Hell what happened to the major big budget games? I remember playing Witcher 3, Red Dead Redemption 2 and Cyberpunk 2077…but even those games are ancient now. Witcher 3 is 10 years old, RDR 2 is 7 years old, Cyberpunk is 5 years old…
In 2015 I was playing games more often but I was playing games that were more recently released…. Not really games from 2010, 2008 and 2005….
Hell the most popular game for kids now is Fortnite which is 8 years old and came out in 2017! I wasn’t playing Mass Effect (2007) too much in 2015. The difference between Mass Effect 1 or Elder Scrolls Oblivion and The Witcher 3 is the same time difference as when Fortnite was released and 2025!
https://news.ycombinator.com/item?id=43493740
I don’t want to retype everything I posted in that reply but it kind of applies to your comment as well.
In 2015, if we were having this discussion, I could easily pull out dozens of groundbreaking, innovative games from 2010 to 2015.
In 2005, if we were having this discussion, I could have easily pulled out dozens of groundbreaking, innovative games from 2000 to 2005.
But we are having this discussion in 2025, and I know both you and I would struggle to pull out a dozen high-quality, innovative new games that have come out in the past 5 years.
Clearly things have gone worse.
Elden Ring is just Demon's Souls 4 from 2009. It's good to the point that I'll still preorder its successor, but nothing is original there any more.
Edit: not 4, more like 7?
Edit 2: Hades seemed more difficult to me than Elden Ring. Maybe you shouldn't trust the marketing and check for yourself.
And certainly some of these games are useful; abilities of this kind are highly correlated with other abilities, and having masterful language and perception manipulators act for the interest of your company or nation is valuable.
But it's not the only useful skill at the upper tier of organizations, and emphasizing it over all else is costly. So are internal political games-- when your organization plays too many of them, the benefits one gets from selecting these people and efforts are dwarfed by the infighting and wasted effort. It can also result in severe misalignment between individual and organizational incentives.
Back when I was doing food delivery before the pandemic, we would actively promote placing orders with our restaurant directly. I would tell repeat doordash customers that they can save 15% if they just call or use the website.
None of them converted. The convenience of the app is just too strong for people to care.
I love trader joes, don't get me wrong, but I wouldn't be happy if it was the only grocery store I had access to. For me it's an awesome second-in-line grocery store, more like a specialty grocery than a main grocery.
Enshittification in action.
What's worse is that comedy is a minefield, as somebody somewhere is bound to be offended and launch a cancel campaign. So comedy films, including the once-beloved rom com, just don't get produced anymore like they used to. Any attempts at humor in movies have to be rolled into something else -- superheroes talking in aggressively annoying Whedonese and the like -- and housewives must content themselves with Hallmark Channel glurge. And what humor is there is cringey as fuck because it's either entirely toothless or it's a "Straight white men, am I right?" type of thing because you are still allowed -- and encouraged -- to mock that group.
I mean, the normally sequel-averse Jim Carrey came back to do three movies about a video game hedgehog because those are the only movies being made in which he gets to flat-out do Jim Carrey stuff.
So there has been something of a renaissance with television, starting around the time of The Sopranos' release in 1999, I think, in which there was a market for shows which didn't 'reset' somewhat between episodes.
The handy men of the future were like today’s tech bros. They were loaded, because people couldn’t perform basic tasks around the house. When a father was looking to teach his son how to fix the oven, he showed him how to call the handy man.
I would prefer a scenario where monopolists were broken up and regulators mandated open designs that can be repaired.
In 2005 I could play a game for 12 hours straight and then hardly be able to sleep I would be so excited about playing it the next day.
Today, even for a game like BG3 that is objectively an incredible game, I can do maybe 2 hours every few days and feel fulfilled.
I don't think this an outlying example either. Most of my friends are now the same way, and frankly when you login to play games online, it's not exactly overflowing with the 35-40yr olds who saturated servers 20 years ago.
Close. They're not incompetent; we just redefined competence.
It used to be that competence was a mix of a lot of distinct, but interdependent, qualities. The end result was synergy that allowed for people and organizations (including companies) to compete and move society forward.
In the 1970s, we started to allow a bunch of psychopaths (I'm saying this in the clinical sense) to redefine competence. Instead of this array of distinct qualities, they just defined it in terms of ability to create monetary value, particularly if that value was then transferred to shareholders. That was it.
We also switched to quarterly reporting for for-profit companies, shrinking the window to evaluate this new definition of competence to 90 days. Three months.
An end result of this was that you could simply do whatever made the most money in 90 days and be considered competent.
Jack Welch was the paragon of this. GE shareholders saw massive gains during the latter half of his tenure at the helm. This wasn't because of groundbreaking new products or services; quite the opposite: Jack realized that selling off divisions and cutting costs by any means necessary was a good way to make money in the 90 day period. Institutional knowledge and good business relationships in the market - two of the elements of the former definition of competence - were lost, while money - the sole element under which competence was judged in the new definition - went up.
You also had managers doing a lot of the avoidance of real management, like you speak of. Instead of betting on a new product or trying to enter a new market, they took a Six Sigma course, learned a bunch of jargon, and cut costs at the expense of business past the 90 day period.
If you do this enough (and we did, far beyond just GE), that expense is taken at the societal level. Existence extends beyond 90 days. You can't mortgage the future forever. It's now the future, the payment is due, and we have an empty account to draw from.
Theoretically, we could go back to a more in-depth evaluation of competence and reward its display over the long term. In practice, there are a bunch of people who got unfathomably wealthy off of the shift to the "new" competence, and now they're in charge and don't want to switch back, so we won't.
What I suspect is the problem is that you also want them to be groundbreaking and innovative. This is an impossibly high bar to meet in a mature industry. There are some games that still meet this bar. Half Life Alyx is from 2020; ever played anything like it? Have you truly built all the possible contraptions in Tears of the Kingdom (2023)? Last of Us Part II (2020) is going to premiere on TV in a couple of weeks. How many older video games have a story that was shot so perfectly that it could be translated shot-for-shot into a hit TV show or movie?
Check out these two videos of a guy horsing around in Tears of the Kingdom - https://www.youtube.com/watch?v=lpFXlkjAurc and https://www.youtube.com/watch?v=VyQdn5bwF_Q. Look at how much fun he's having! Yes, it takes an incredible game to enable such creativity. But he's having fun because he wants to have fun.
If you're finding less joy in games than you used to, you should be open to the idea that it's not the games that are causing that effect.
Anyway, stuff like Dirty Harry or a bunch of traditional Westerns are extremely political in the same ways that "woke" movies are (presenting and normalizing certain roles and behaviors, presenting politicized views of history and of certain groups, ways of life, and attitudes, and using caricatures of their political opponents as bad guys), they're just not liberal so that means they "aren't political".
Hell, most of the silent films that were good enough that anyone still gives a shit about them are plenty political, and often (but not always) rather liberal.
Historically there has been a brain drain from Canada to the US, but if Canada can set up favourable policies for companies maybe they can start reversing that.
Is this not A) ubiquitous, B) rich with incentives, and C) not downright implied in "They are masterful language and perception manipulators, in a strategic game of corporate dominance." and "the understanding of what makes others in their management circles feel good."
The fact that so many companies play tricks with CAPEX and OPEX completely misses the point that almost all corporate spending should be seen as investment or spending to support investment at some level.
The past 50 years of business school has taught people that outsourcing your core competency is a good idea because it gets things "off the books" and makes quarterly reports look better. The end result was shifting huge swaths of our economy to a hostile country.
Here in tech, I've literally seen companies shift stuff into the cloud even though it's more expensive, because OPEX can be written off right away and they don't want CAPEX on the books, only for a year later to want to shift back because they decided it's now better to optimize for actual cashflow. It's infuriating.
Those all sound vastly more positive both on a personal and societal level vs getting dopamine from your phone
https://www.grantthornton.com/insights/alerts/tax/2023/flash...
The TCJA amended Section 174 by removing the option to expense SRE expenditures, instead requiring taxpayers to capitalize and amortize SRE expenditures over a period of five years (attributable to domestic research) or 15 years (attributable to foreign research)
I dunno, there’s something in the fact that Isaac Newton the imaginary cultural figure was hit on the head by an apple, and then invented calculus.
Meanwhile Isaac Newton the actual guy (recalling from memory so feel free to correct) was a bit eccentric (dabbled in alchemy and other mystic arts), had some academic posts, some government jobs, and built Calculus on work that was ongoing in the academic community…
The imaginary Isaac Newton and the imaginary Elon Musk look sort of similar. Because we ignore the boring work that Newton did and the fact that Musk just bought his way around it—their real versions look very different of course! But if you want the actual day to day experience of being Isaac Newton, you can, just go be a professor and make some quirky friends.
After a few loads I bought a hand crank clothes wringer, basically two rollers that squeeze the water out. That thing honestly works better than a spin cycle: clothes come out drier than they do from the washer, and I've noticed the dryer finishing faster (I usually run it on an auto-sense mode rather than a timer).
Compare+contrast with 'The Last Jedi'. Turning the male characters into total idiots and sending them off on a massive wild goose chase, before the day is saved by completely breaking the physics of the Star Wars universe, making all the previous heroes look like idiots for not using a relativistic kill vehicle against the Death Star!
I don't remember hearing any complaining about strong female characters in the era of Leia, Ellen Ripley, Sarah Connor, Major Kira, Susan Ivanova, and so on.
Also, it is a little odd that people are dining out more often if the experience is worse.
But, yeah, wages and employment being down is the most relevant change.
You're right that an important reason it's hard to replace those 30+ year old systems is that the current devs are not necessarily at the same level as those who built the originals. But at least in part, this is due to survivorship bias.
Plenty of the systems that were built 30-50 years ago HAVE been shut down, and those that were not tend to be the most useful ones.
A more important tell, though, is that you see traditional IT systems as the measuring stick for progress. If you do a review of history, you'll see that what is seen as the measuring stick changes over time.
For instance, in the 50s and 60s, the speed of cars and airplanes was a key measuring stick. Today, we don't even HAVE planes in operation that match the SR-71 or Concorde, and car improvements are more incremental and practical than spectacular.
In the 70s and into the 80s, space exploration and flying cars had the role. We still don't have flying cars, and very little happened in space from 1985 until Elon (who grew up in that era) resumed it, based on his dream of going to Mars.
In the 90s, as Gen-X'ers (who had been growing up with C64/Amiga's) grew up, computers (PC) were the rage. But over the last 20 years little has happened with the hardware (and traditional software) except that the number of cores/socket has been going up.
In the 2000s, mobile phones were the New Thing, alongside apps like social media, uber, etc. Since 2015, that has been pretty slow, too, though.
Every generation tends to devalue the breakthroughs that came after they turned 30.
Boomers were not impressed by computers. Many loved their cars, but remained nostalgic about the old ones.
X-ers would often stay with PCs as the millennials switched to phones-only. Some X-ers may still be a bit disappointed that there are no flying cars, no Moon base, and no Mars colony yet (though Elon, an X-er, is working on those).
And now, some Millennials do not seem to realize that we're in the middle of the greatest revolution in human history (or pre-history, for that matter).
And developers (both X'ers and millennials) in particular seem to resist it more than most. They want to keep their dependable von Neumann architecture computing paradigm. The skills they have been building up over their career. The source of their pride and their dignity.
They don't WANT AI to be the next paradigm. Instead, they want THEIR paradigm to improve even further. They hold on to it as long as they can get away with it. They downplay how revolutionary it is.
The fact, though, is that every kid today walks around with R2D2 and C3PO in their pockets. And production of physical robots has gone exponential, too. A few more years at this rate, and it will be everywhere.
Walking around today, 2025 isn't all that different from 2015. But 2035 may well be as different from 2025 as 2025 is to 1925.
And you say the West is declining?
Well, for Europe (including Russia), this is true. Apart from DeepMind (London), very little happens in Europe now.
Also, China is a competitor now. But so was the USSR a couple of generations ago, especially with Sputnik.
The US is still in the leadership position, though, if only barely. China is catching up, but they're still behind in many areas.
Just like with Sputnik, the US may need to pull itself together to maintain the lead.
But if you think all development has ended, you're like a boomer in 2010 who, using planes and cars as the measuring stick, thinks that nothing significant has happened since 1985.
Not only that, but often with whole government backed companies where the government will gladly support them and even participate in espionage to gain competitive advantages. Huawei is the classic example, but is just the tip of the iceberg.
Meanwhile in most of the western world, executives are focusing on the next quarterly results...
Apple was selling a desktop operating system that was a competitor to Microsoft Windows, Corel WordPerfect was a thing and so was Lotus 1-2-3.
DoorDash adds upwards of a couple dollars to every item. They charge a 5-10% service fee depending on if you pay them monthly. The default tip options are pretty egregiously high - it's not uncommon to see double-digit tips in all three options. I once saw a $22 tip in the top option for a single bag of food with no drinks less than a 10 minute drive away but that's likely an outlier.
All in if you don't have DashPass you're easily looking at a 30-40% increase if you get cheaper items which are more likely to be marked up only $.5-1 but represent a larger percentage of the total.
Nobody in their right mind would tip a delivery driver 40% of their entire meal, why are you happy to give most of that to a corporation that is doing very little for the transactions?
Edit: I just did this for an example order for a nearby restaurant - 1 appetizer and 1 entree so probably good for two people not super hungry to share or one hungry person to eat.
$31.59 in food, $2 delivery fee and $5.50 in fees (I subtracted sales taxes manually). This restaurant is 5 miles away and it's 11:30 local time. Tip suggestions are $9.50 (30%), $11.50, and $13.50 (42%). So at the lowest suggested tip amount, which is offensively high, you're looking at $49 before sales tax.
The exact same order is $26.36 including an online order service fee but before sales tax. Even if you were going to get it delivered and tip the driver 30% you're still saving a ton of money and this is on one meal with enough food for 1-2 people.
The appetizer alone is $10 on their website and $14 on DoorDash. It's a crazy system and I can't believe how much money people burn on this every year.
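To make the markup concrete, here is a quick sketch of the arithmetic using only the numbers quoted in the example order above (pre-sales-tax; none of this is DoorDash's actual fee schedule, just the figures from this one order):

```python
# All figures are taken from the example order in the comment above.
food = 31.59          # item prices as listed on DoorDash (marked up vs the menu)
delivery_fee = 2.00
service_fees = 5.50   # other fees, sales tax subtracted manually
lowest_tip = 9.50     # lowest suggested tip option (~30%)

delivered_total = food + delivery_fee + service_fees + lowest_tip
direct_total = 26.36  # same order on the restaurant's own site, incl. its service fee

markup = (delivered_total - direct_total) / direct_total

print(f"delivered: ${delivered_total:.2f}")   # ~$48.59, i.e. "looking at $49"
print(f"markup vs ordering direct: {markup:.0%}")
```

On these numbers the delivered order costs roughly 84% more than ordering the identical food directly, which is where the "30-40% increase" claim starts to look conservative once the suggested tip is included.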
The fact that two brand new MAXs crashed, killing all aboard, within 2 years of the model's commercial introduction (out of only ~600 flying at the time) is a brutal safety record for the jet age, especially as the cause of the crashes was the plane itself. The list you posted includes any and all reported incidents that merely involve 737s (including incidents caused by factors that aren't necessarily related to the safety of the plane itself).
Maybe it's really about wrong incentives and lack of technical excellence.
Government money keeps coming in, and keeping it broken and buggy actually assures ongoing contracts. Investment in skilled workers or in solving technical issues is not paid for, and everyone - company and customer alike - is completely disconnected from the end user, while feedback mechanisms are broken or manipulated.
It's maybe a mix of all the different answers my post got.
On a long enough time scale, short-term oriented systems naturally-select themselves out of existence. The U.S. Constitution didn't survive 7 generations. The Civil War was in 1865 (77 years, ~4 generations). Reconstruction Era made it maybe 60 years (3 generations), as far as the Great Depression / Dust Bowl.
The current post-war ordering of the interest of short-term capital above all else doesn't have a well-defined start date, but 1968 (MLK Jr, RFK, nomination of Humphrey) is a solid one. We're hardly 3 generations in, and it doesn't feel great.
Really, when you look at American history, the periods endowed with bouts of long-term thinking are really quite rare (1770-1810, 1880s-1900s, 1930s-1950s). Maybe we're due for another one.
From an employee perspective, let's say I am a computer scientist: why should I spend precious time developing myself in the fundamentals of the web when my manager just wants me to pump out React and Express.js code 24/7?
And for my promo? Well I will just point out that the system became slow and unmaintainable, propose adopting a new set of frameworks, cash the checks and move on to other pastures.
All the incentives are wrong.
That very much is an indictment of ambition and progress here.
Yes. It's a piece of junk. Why do I own it? I like to throw my money away on ideals I never actually follow. It's sitting next to my unplayed guitar, my list of books on how to effectively get A's in college (I ended up a C+ student), and my Raspberry Pi that has only ever been powered on once.
If you're on macOS, you can use Al Dente[1] to limit SoC (state of charge) to 70 or 80% while the laptop is plugged in, which reduces battery aging. There may be similar settings on Windows depending on your laptop's manufacturer.
If you can maintain a limited SoC rather than running the battery down, that's preferable.
Otherwise, discharging lightly (but not below 20% or so) and then charging back to 80% or so is a good usage pattern.
It's helpful to know that many chargers are designed to achieve a 1C charge rate (this excludes "fast chargers"). That essentially means they go from 0 to 100% SoC in one hour. So start a 30 minute timer when you plug in electronics to charge, and you'll gain about 50% SoC.
[1] https://github.com/AppHouseKitchen/AlDente-Charge-Limiter
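The 1C rule of thumb above can be sketched as a tiny helper. This is only the linear approximation from the comment (SoC gained is proportional to time at the charger's C-rate) and ignores the taper that real chargers apply near full; the function name and default are mine, not from any battery API:

```python
def soc_gained(minutes_on_charger: float, c_rate: float = 1.0) -> float:
    """Approximate percentage points of SoC gained while charging.

    At a 1C rate the charger moves the battery from 0 to 100% SoC
    in one hour, so the gain is roughly linear in time (capped at 100).
    """
    return min(100.0, c_rate * 100.0 * minutes_on_charger / 60.0)

print(soc_gained(30))              # 50.0 -> a 30-minute timer gains about 50% SoC
print(soc_gained(30, c_rate=0.5))  # 25.0 -> a gentler half-C charger gains ~25%
```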
Apple also had a much smaller desktop OS market share at the time. They nearly disappeared but are in a much stronger position today, which makes it harder to argue that Microsoft has a monopoly. There's no strict threshold for market share in these cases, but it's one of the factors taken into account.
It increasingly applies to nearly all aspects of the economy. Everybody wants to lock you in and take a cut. Almost all new innovation these days is just rent-seeking gatekeeping. Even genuine innovators are unable to get their innovations out without recreating entire software stacks (or supply chains) that are under feudalistic/parasitic control, so they often remain niche and undermonetized. This will have an effect on the economy like the effect a yearly percentage reduction in atmospheric oxygen would have on biodiversity.
This can be changed in software, setting it to 70-80% or having a toggle is best for the battery.
Users want good fresh food delivered at a reasonable price. But they are willing to tolerate shitty cold food at an expensive price because it’s a tiny bit easier to do than picking up the phone.
We are lazy by nature to preserve energy, and many, many companies have perfected finding the right ratio of how fucked we will allow ourselves to be to tickle that lazy button.
Only because the service is disrupting the business model for those wages.
It worked well(ish) in 2019, it failed by 2022. It's not some kind of mystery around wages or inflation, the introduction of these services (and their popularity and growth, due to COVID closing in-person restaurants for a while) is the thing that killed the economics around delivery, for much of the US.
I know the 13th, 14th, and 15th amendments to the US Constitution are often considered America's Second Founding because they legally eliminated [1] all the elements of racism within the Constitution, but saying that the Constitution "didn't survive" doesn't seem accurate...
[1] That being said we all know that it took many many decades after those 3 amendments for the laws in the United States to accurately reflect the principles embodied within these amendments.
The Canadian government also heavily subsidises this. Smart of them to do so.
I like to use Star Trek in the 90s as a good example of what I mean. While there are episodes where the writers got preachy (they're only human I suppose), most of the time the writers were very careful to not openly take sides on the issues they raised. Even if you got the sense that the writer for an episode might feel a certain way about the topic, the characters wouldn't tell the audience how to feel. They didn't call other characters who disagreed with them names. They didn't just bully their way to victory in the story. The topics were treated as complicated issues where reasonable adults could disagree.
Compare that to shows/movies/books today. The writers treat the story primarily as a vehicle to express their opinions on issues. They have characters tell people "this is how a decent person behaves", with the understanding that the message is really meant for the audience. They have characters who agree with them call their opponents bigots or worse insults. They portray said opponents as villains or morons who only hold their beliefs because of how evil/stupid they are. They have the "good guys" run roughshod over anyone who disagrees with them, and they get to win despite their bad behavior. And often, the writers (and even other people involved like actors) will openly express their contempt for their audience when speaking about the work. They pick fights where none needed to happen, saying stuff like "if you don't like this then I don't want you as an audience member anyway". They are, in short, bad writers who don't have the skill to successfully let their social views influence their work.
The result of all this is that these writers don't succeed at persuading anyone. In years past writers could actually make progress on advancing the things they believed in because they had the wisdom to not openly preach to people and call them names. They respected people enough to let them draw their own conclusions, and as a result were successful. But writers today aren't good enough to persuade people to continue breathing, let alone something more controversial than that.
There is also an uptick in how much politics get forced into art, with people trying to claim "everything is political" and the like. But that isn't nearly as big a factor as how bad today's artists are at using political themes in their work.
If you order food directly, you won't have the delivery tracking on the map. Even within the app, if the restaurant provides their own couriers, you lose the visibility and arrival ETA info.
And 15% might look impressive, but if you are getting your food from a delivery app, you probably don't care that much about food price in the first place.
The 3/5ths compromise, and its implicit enshrinement of slavery as an American institution, is an example of short-term thinking (compromising on the legal definition of a human being in order to get the Constitution ratified) that eventually caused the greater system to unravel. Hundreds of thousands of people died in the Civil War, and millions of people experienced slavery. It could have been avoided if longer-term thinking had prevailed.
I hear you that the Constitution (inclusive of its self-mutating property) survived as a useful document of federal governance. This purported maintenance of a federal union was a huge legitimizer of northern domination of the post-ACW United States. But, I think you'd agree that the "system of governance" that begat the constitution did not survive, that's more what I was getting at. That each successive system of governance can still legitimately claim to be implementing the U.S. Constitution is indeed impressive.
Compare the market success of the PinePhone to the Framework laptop. Their laptops are technically competitive with the Dells and the HPs of the world, while also being repairable.
The PinePhone doesn't even beat the until-recently-current iPhone SE in performance. It's a terrible choice, technically speaking.
That is what a 0201 capacitor looks like.
I know much of the answer is a mix of private equity and an overload of debt taken on at insane valuations.
That's often a good change. Less filler for the sake of having another full season.
Toy Story was a good idea because attempts at depicting humans with CGI at the time had a very plastic look.
It's been like this for a while now.
I think I even saw someone in a conservative subreddit suggest that everyone should work on a farm for a few years after college before they get real jobs. I'm still unable to determine if this was a troll or if a well-meaning conservative actually reinvented Mao's Down to the Countryside movement.
So this is why it's a cultural issue.
Let's consider a market that still works basically like it's supposed to: Desktop PCs. You have your ATX standard PC, it came with a Core i3 processor which is getting a little long in the tooth, but you can drop in a Core i7 and double the number of cores. Not only that, the parts are all modular and standard. You take your ten year old i3 6100 dual core, swap out the motherboard and CPU and now it's a 16-core Ryzen 9 5900XT from 2024, but it still supports the same memory, GPU, SSD, chassis, power supply, etc., any of which you could also have independently replaced before or after this.
So now I go and buy a PinePhone, and after a couple years the CPU seems a little anemic. No problem, it's modular, I'll just buy one of those fancy chips they put in the iPhones and put that in there. Or at least the top end things from Samsung or Qualcomm. No? That's not available?
Okay, but at least I can put whatever software I want on it. Now the way this works is, people can improve their own devices in collaboration with other people. Adding a new subsystem to your phone would be a full time job, but it could also be a dozen part time jobs. Somebody does a barebones implementation and throws it on github, then you personally only need it to do one extra thing and all you have to do is add the extra thing instead of starting from scratch, which is a tractable problem instead of a hopeless pipe dream. But when each person contributes a little part, you ultimately end up with a complete implementation. Most of the users don't even have to contribute anything, as long as there is a large enough community of people who do.
Except that 99% of people have locked down devices, so the community is suppressed and then even if you buy the device that allows you to do it, you're the only one working on that subsystem and it's too much work for you to do yourself, so you don't even make the attempt. And then what good is the device?
It's an ecosystem problem. A cultural issue. It can't be just you. You need the default attitude of the common customer to be "this despotism will not stand" and to give the finger to any company that locks you out of your own property. Regardless of whether you personally actually upgrade your own device or write your own code, you need everyone to have the ability to do it, because the alternative is a friction that erodes the community and in turn destroys a backstop against involuntary captivity.
Boeing being better now than in the 90s doesn't mean that the stock shouldn't drop, because competitors and expectations are higher now than in the 90s.
I am not a lawyer but many years ago I read about the following doctrine.
https://en.wikipedia.org/wiki/Incorporation_of_the_Bill_of_R...
Basically prior to the American Civil War the Bill of Rights was considered to only apply to the Federal government and not the state governments.
After the Civil War, the US Supreme Court interpreted the 14th amendment such that overtime all the amendments of the Bill of Rights were considered to apply to the states as well.
So what you are saying about one system being dominant over the other system (Federal government being dominant over the state governments) makes sense and it is something that seems to have happened more extensively after the Civil War.
It's misleading to say they prioritized making it fly like its predecessor over safety.
In theory there was absolutely nothing wrong with a system LIKE MCAS. In fact the 737 MAX is still approved to fly with it.
The flaws were in the specific implementation and documentation around it, not with the idea of the system itself.
> The fact that two brand new MAX's crashed killing all aboard within 2 years of its commercial introduction (out of only ~600 models flying at the time) is a brutal safety record for the jet age, especially as the cause of the crash was the plane itself.
If you want to be pedantic about it, the reason for the crashes is that the pilots failed to recognize trim runaway during takeoff. The trim runaway was caused by MCAS, but this is not a new failure mode for ANY aircraft and pilots get extensive training on how to manage it [1].
MCAS failing was not an unrecoverable error [2]. It failed several times in the US as well, but American pilot training standards are very high compared to the places where there WERE disasters, and the US pilots recognized the failure and recovered quickly.
I say this not to deflect blame from MCAS. Its original implementation was unsafe and should never have been approved.
A large part of why modern jetliners are so safe is exactly because of flight control augmentations like this - both Boeing and Airbus have been implementing these for decades and they have made flying much safer. Your suggestion that any system like MCAS is always unsafe (or that Boeing was somehow doing something wrong by adding it) is totally wrong.
1: https://www.aopa.org/news-and-media/all-news/2017/july/pilot... 2: https://www.nytimes.com/2019/09/18/magazine/boeing-737-max-c...
Just quoting from Wikipedia:
>Han argues that subjects become self-exploiters: "Today, everyone is an auto-exploiting labourer in his or her own enterprise. People are now master and slave in one. Even class struggle has transformed into an inner struggle against oneself."[12] The individual has become what Han calls "the achievement-subject"; the individual does not believe they are subjugated "subjects" but rather "projects: Always refashioning and reinventing ourselves" which "amounts to a form of compulsion and constraint—indeed, to a "more efficient kind of subjectivation and subjugation." As a project deeming itself free of external and alien limitations, the "I" subjugates itself to internal limitations and self-constraints, which are taking the form of compulsive achievement and optimization.[13]
They're managing capital. If they get bailed out because they turned out to be completely irresponsible in managing their capital then nobody can claim to be surprised that management tend not to be of the highest standard on any axis.
What is supposed to be the incentive here for appointing competent managers for most companies? It literally doesn't matter. Even company-bankrupting performance will turn a profit once the effects of money printing are factored in.
People also seem to try to shoehorn him into every topic, even when it really doesn't fit. For instance, this issue is not one about some group of melancholy workers being alienated from the product, but about 'capitalists' who have become so detached from their product that they are left looking at things through a sort of compression lens that leaves them with a deeply distorted view of reality. Even with your example - I agree that learning 'life skills' is extremely important for solid development, but Mao wasn't doing that - he was effectively exiling people to rural areas, largely to replenish populations after massive famines that were created by his other harebrained schemes.
1. Pizza travels very very very well
2. Pizza is pretty cheap to make
3. Wages (and costs of transportation) were lower 20 years ago.
More generally, delivery as a model can work, but not when you have an organisation of really expensive engineers/salespeople working on a frontend to it.
Note: The implementation is out of sight, the vision of what it could be is the actual competitor.
The internal Boeing emails literally say otherwise.
> In theory there was absolutely nothing wrong with a system LIKE MCAS. In fact the 737 MAX is still approved to fly with it.
I never said that MCAS had any issues in theory. And the 737 MAX was mostly "approved" by Boeing's self-regulators, where email trails (again, literally) show anybody raising questions or concerns being sidelined.
> The flaws were in the specific implementation and documentation around it, not with the idea of the system itself.
Yes, because Boeing's top priority was making it so that no expensive extra training was required to fly the MAX, despite the fact that MCAS was designed to deal with some situations that could cause the plane to fly differently.
> If you want to be pedantic about it...
Yes, I am being pedantic about it. The trim issues in the crashes were (intermittently) caused by MCAS, but there was no specific documentation or training on how to deal with faulty MCAS sensors. There were indeed several MCAS incidents on western flights, but those failures were different. The two crashed crews did indeed attempt disabling MCAS, but the intermittent failures masked the problem, and Boeing's checklists were insufficient, because such checklists, had they existed, could have alluded to the fact that these situations might need new simulator time.
The MCAS issue was totally and completely recoverable if it were properly documented, but doing that would have almost certainly guaranteed the simulator time that was Boeing's top priority to avoid.
Almost all the reports about pilot capability differences had more to do with experience than with training. These "developing" countries have younger airlines and don't have the same pipeline of pilots with decades of experience, including military experience, as in the US. MCAS "acted up" on several other Lion Air flights that the pilots corrected for as well, but again those were different failure modes.
The fact that Ethiopian Airlines had a perfectly acceptable safety record on other planes negates the claim that these are "poorly trained" pilots. They've had one major accident, in 2010, that was attributed to pilot error, but most of the rest were due to bad luck (e.g. bird strikes) or hijackings.
> A large part of why modern jetliners are so safe is exactly because of flight control augmentations like this...
I never even mentioned MCAS by name. Yes, modern jetliners are safe because of these kinds of systems. Airbus planes will not allow pilots to do many things no matter what, even. But these systems are documented, pilots trained on them, and go through rigorous testing because in most cases they're designed to make a plane safer, not try to deal with aerodynamic changes.
Boeing wanted no new simulator training despite the MAX being a very different aircraft due to changed engine placement. That was the cause. If Boeing wasn't trying to avoid new simulator training the 737 MAX is a perfectly fine aircraft as far as we know.
The 787 had similar issues, as the overriding goal of that program was to get as much capital expenditure as possible off of Boeing's books, but all of the outsourcing led to a nightmare when trying to assemble the plane, and there was no unified quality control program, or even a straight line of responsibility.
The common person often doesn't realize this at all. Every modern plane is flying itself essentially, with hints from the pilot on what to actually do.
>MCAS failing was not an unrecoverable error [2]
Also this is frustrating, especially in the case of the second crash where every max pilot knew the procedure (including the one that crashed), they even performed the procedure but then disabled it a minute later. Both the NTSB and the BEA (French equivalent) agreed pilot error/CRM played a role in the second.
Uh. 1997 had Oz, Buffy the Vampire Slayer, Stargate SG-1, King of the Hill, Just Shoot Me, Ally McBeal, The X-Files, Friends, 3rd Rock from the Sun, and MTV still showed music videos. Cable television hadn't yet been completely overrun with 'reality' television. We joked about The History Channel becoming the WWII channel, but it hadn't yet become the Ancient Aliens, cheap, pseudo-reality parody of itself.
I get your point about serialized stories, but I'd still take the great entertainment of the 90s over today's over-reliance on digital effects and low-quality writing to generate cheap drama. Besides, most shows aren't written with a set arc, they just keep writing more so long as the numbers stay up. So we get a couple of seasons of increasing drama and mystery, then it gets cancelled with no payoff. I'd rather have the amnesia-based reset system than that!
The difference is access to capital. Just like it was 150 years ago. Workers don't have enough holdings to sustain themselves without selling their body. Capitalists have enough holdings to not have to sell their body and can instead put their money to work through various means like entrepreneurship.
Also, I didn't even bring up the Down to the Countryside program as a good aspect of Mao... But since you brought it up, I figured I'd mention that his "harebrained schemes" doubled the life expectancy in China rather quickly. Like all world leaders I've studied, he did great things, and he did horrible things.
Turns out that "will" is a vague concept and doesn't have great neurological or animal models.
However, we can use some reasonable proxies!
I would argue that "libido" is the most obvious one. I recently heard a multimillionaire admit (with some embarrassment) that "we really do all this to get girls."
( I assume "libido is a function of testosterone" requires no citation ;)
Testosterone directly affects dopamine levels, dopamine sensitivity, and willingness to engage in competitive behavior:
https://www.edenclinic.co.uk/post/testosterone-and-the-brain
Another factor is "goal-directed behavior", which is mediated indirectly by "increased sense of agency"
> these results further imply that through an embodied SoA, testosterone can ultimately modulate higher-order experiences of social power and goal-directed behaviour.
https://www.semanticscholar.org/paper/The-Effect-of-Testoste...
At the societal level there is a fascinating (and deeply disturbing) book by J. D. Unwin, who studied thousands of civilizations:
>The book concluded with the theory that as societies develop, they become more sexually liberal, accelerating the social entropy of the society, thereby diminishing its "creative" and "expansive" energy.
https://en.wikipedia.org/wiki/Sex_and_Culture
Notably, conscientiousness and executive function are not enhanced by testosterone. However, deficiency is associated with fatigue, depression, brain fog etc. So it supports "will" by supporting overall health, and a population-wide ~50% decline does not sound healthy to me.
Consider the example of Amazon marketplace. You still have one payment button and you can still shop for things from different vendors. Yet the order fulfillment can be done by Amazon or the seller directly.
If such an arrangement is possible on Amazon, it must mean that there are shops that trust their own fulfillment more than they do Amazon's. It is entirely possible that some restaurants will want to own that delivery experience as well.
Well paid workers can amass the means to become capitalists.
>Let alone with the stereotypes Marx depended upon for his arguments?
Marx called these types of people that make enough money to own their own means of production "petit bourgeoisie". This is in contrast to the "haute bourgeoisie".
This isn't some exception to Marxist thought; this is literally one of the core components of Marxist thought.
The prepper nerds seem to advocate for Cummins 12-valve engines from the 1990s or the Toyota 1HZ.
There's a whole lot of old diesel LandCruisers out there. I'm guessing that's the sweet spot for it still being a normal car that mechanics can maintain while still being comfy and looking cool.
---
"Karl Marx and other Marxist theorists used the term petite bourgeoisie to academically identify the socio-economic stratum of the bourgeoisie that consists of small shopkeepers and self-employed artisans.
The petite bourgeoisie is economically distinct from the proletariat social-class strata who rely entirely on the sale of their labour-power for survival. It is also distinct from the capitalist class haute bourgeoisie, defined by owning the means of production and thus deriving most of their wealth from buying the labour-power of the proletariat..."
---
The critical distinction being that they aren't 'selling their labor-power' to others.
And I just don't see how one can claim this makes any sense in modern times! Proles selling their 'labor power' are out-earning the bougies; anybody (even relatively low-wage workers) can hire the 'labor-power of the proletariat' with things like Fiverr (among many others). And basically everybody owns the most valuable means of production in modern society - a computer. If you don't, you can buy one with a day or so of minimum-wage work.
For that matter, bougies in modern times don't make their wealth from buying labor power - they mostly just dump money into investments, bonds, and other such financial vessels. Bonds right now are at near 5%! And again the distinctions really fail, because the same is also true of retail investors with a Robinhood account or whatever.
No, they generally are not. There is obviously overlap in income, as there was in Marx's time, but that's not a problem with the theory -- class isn't about income but about one's mode of participation in the economy.
> For that matter, bougies in modern times don't make their wealth from buying labor power - they mostly just dump money into investments, bonds, and other such financial vessels.
The “financial vessels” are instruments of other entities, most of which exist by rented labor power.
> And again the distinctions really fail, because the same is also true of retail investors with a Robinhood account or whatever.
The distinctions have never been hard lines. In the most simplistic analysis, class is determined by the predominant mode of interaction with the economy, while a more nuanced view sees class membership as essentially a fuzzy membership function, depending on the degree to which one interacts in the manner (selling labor to capitalists, vs. applying your own labor to your own capital, vs. owning capital to which rented labor is applied) archetypical of a given class. (Both these modes of analysis have been around for quite a while, though the fuzzy-membership-function language would only be used fairly recently.)
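The fuzzy-membership view can be sketched as a toy model. This is my own illustration, not anything from the thread: the function name and the choice of "share of income per mode" as the membership measure are assumptions.

```python
# Toy sketch of class as a fuzzy membership function (my own illustration):
# membership in each class is the share of income earned in that archetypal
# mode of economic interaction, rather than a hard yes/no label.

def class_membership(wage_income, own_business_income, capital_income):
    """Return fuzzy membership in (proletariat, petit bourgeois, haut
    bourgeois) as each mode's share of total income."""
    total = wage_income + own_business_income + capital_income
    if total == 0:
        return (0.0, 0.0, 0.0)
    return (wage_income / total,
            own_business_income / total,
            capital_income / total)

# A well-paid engineer with some dividend income is still mostly
# "proletarian" under this model, despite a high absolute income:
print(class_membership(180_000, 0, 20_000))  # → (0.9, 0.0, 0.1)
```

The point of the sketch is that the classes order people by *how* they earn, not *how much*: raising `wage_income` never moves anyone toward the capitalist end of the spectrum.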
It’s popular because they figured out an unbelievably low-friction method to get users into the ecosystem. Being able to do everything online without installing anything was huge at the time. You had an account in like 2 clicks. THAT was a thing of beauty.
We can challenge this assertion by reductio ad absurdum. Imagine somehow all bougies earned less than all workers. Everything Marx said would be absolutely and completely nonsensical. There's nothing inherently impossible about such a world existing and it makes clear the point that income levels do absolutely matter. And in Marx's time I think it is fairly safe to say there would have been exactly 0 proles earning more than bougies. The concept of a 'factory' worker earning more than a factory owner would have been entirely alien to him, and most of the world, until fairly recently.
The most paradoxical thing about all of this is that the people most drawn to Marxist stuff are disproportionately in tech, the exact sort who, in many cases, already earn more than many, and likely most, business owners, work far fewer hours, and generally have dramatically nicer working conditions. I think it's mostly misidentified discontent. It's not the economic system that's at fault, but somehow building things in the digital world is fundamentally unsatisfying and unfulfilling, even if you get drowned in money, massages, bean bag chairs, and ping pong tables.
If you want a fulfilling life (so far as work is concerned), don't work in ad-tech. If you want stupid amounts of money, work in ad-tech. You get the stupid amounts of money precisely because the work is awful and empty. It's a rather dramatically different world from Marx's time, where, in general, work was awful and compensation was awful.
I mean, it wouldn't, if they still exercised power. But... they don't. While there is overlap on the boundaries, the classes defined by modes of interaction do, across every capitalist economy (including modern mixed economies, which are not the same system as the capitalism that Marx named and addressed, but share important features with it), form on aggregate a hierarchy of both power and income in the same order as the hierarchy of power in which Marx describes them, even though the ranges of individual incomes overlap.
> And in Marx's time I think it is fairly safe to say there would have been exactly 0 proles earning more than bougies.
No, definitely the most well-paid person living by rented labor would have had a higher income than the least-successful owner of capital to which rented labor was applied. Capitalists (then no less than now) are capable of losing money continuously, eventually reaching the point where they fall out of the bourgeoisie entirely. And even among those more fortunate than that, there would have been many who were technically haut bourgeois because they relied primarily on renting others' labor to apply to their capital, and many more who were petit bourgeois, applying their own labor to their own capital -- like homesteaders with small holdings -- who would earn less than the most successful hired experts.
> It's a rather dramatically different world from Marx's time where, in general, work was awful and compensation was awful.
Yes, in modern mixed economies the condition of the median worker is better than in the capitalism of Marx's time, but, in general, work is awful and compensation is awful. Sure, the small percentage of the workers in well-compensated positions like the ad-tech you point to may do amazingly well -- but that's a minute fraction of workers.
Factor in the fact that a business owner is going to be working far more hours, on average, than a 'worker', and it turns out that we do live in this apparently not-so-hypothetical world where proles make more than bougies, if we just define classes by their 'modes of economic interaction'! We can argue/nitpick the specifics in Marx's time, but I don't think you can claim in good faith that the situation was even remotely like this, and his logic was largely based on the conditions he lived in. Even the most fundamental concepts, like means of production, are obsolete, because in modern times everybody owns the most valuable (by a very wide margin) means of production.
And the pleasure or pain of labor is always relative to itself. For most people there are about a million things they'd rather be doing than working (and that includes business owners), but everybody has to put food on the plate, and in modern times that's so much more pleasant an endeavor that it can't really be overstated - and this holds even relative to fairly recent times. When I, and I assume you, were growing up, don't you remember getting endlessly spammed on TV with the non-stop 'Hurt on the job? Call Mr. Ambulance Chaser at 123-4567 today, and get what you deserve!'
And our government is busy prattling on about putting tariffs on Canadian maple syrup or something…
SVB has been a vital supporter of startups for decades. Why would a resource constrained startup spend time worried about it? Money goes in and out the bank, great, that’s all most startups should need to worry about.
The startups had a strategy of pooling their money - their huge amount of money, as it turns out - into a fund run by people who couldn't keep a bank solvent. If you want to shield the people doing that from consequences then, frankly, you don't have an interest in running a high-integrity system geared to competence, because there need to be direct and painful consequences for an action that stupid. Oh, there are only 5 of them! Well, there is only 1 of me, and I can tell you how dumb they were in isolation. The only reason to act this way and keep all the eggs in one high-risk basket is an assumption that the government will come in and conduct bailouts if any risk eventuates - i.e., a management class that doesn't ever expect to succeed on its own merits. Although, since the bailouts did happen, that suggests that sticking to a dumb strategy is what winners should do.
The entire capital management system here is out of control.
Investments into what? Businesses?
Businesses that have employees? Employees that are selling their labour? Who are they selling their labour to?
Google was initially revolutionary simply because their search engine actually worked incredibly well, back before people started trying to game their rankings.