
492 points Lionga | 1 comment
ceejayoz No.45672187
Because the AI works so well, or because it doesn't?

> "By reducing the size of our team, fewer conversations will be required to make a decision, and each person will be more load-bearing and have more scope and impact," Wang writes in a memo seen by Axios.

That's kinda wild. I'm shocked they put it in writing.

dekhn No.45673060
I'm seeing a lot of frustration at the leadership level about product velocity, and much of it is pointed at internal gatekeepers who mainly seem to say no to product releases.

My leadership is currently promoting "better to ask forgiveness", or put another way: "a bias towards action". There are definitely limits on this, but it's been helpful when dealing with various internal negotiations. I don't spend as much time looking to "align with stakeholders", I just go ahead and do things my decades of experience have taught me are the right paths (while also using my experience to know when I can't just push things through).

noosphr No.45675276
Big tech is suffering from incumbent's disease.

What worked well for extracting profits from stable cash cows doesn't work in fields that are moving rapidly.

Google et al. were at one point pinnacle technologies too, but this was 20 years ago. Everyone who knew how to work in that environment has moved on or moved up.

Were I the CEO of a company like that, I'd reduce headcount in the legacy orgs, transition them to maintenance mode, and start new orgs within the company that are as insulated from legacy as possible. This will not be an easy transition, and it will probably fail. The alternative, however, is certain failure.

For example, Google is in the amazing position that its search can become a commodity that prints a modest amount of money forever as the default search engine for LLM queries, while at the same time its flagship product can be a search AI that uses those queries as citations for the answers people look for.

janalsncm No.45675751
Once you have a golden goose, the risk taking innovators who built the thing are replaced by risk averse managers who protect it. Not killing the golden goose becomes priority 1, 2, and 3.

I think this is the steel man of the “founder mode” conversation people were obsessed with a year ago. People obsessed with “process” are happy if nothing is accomplished because at least no policy was violated, ignoring the fact that policies were written by humans to serve the company's goals.

tharkun__ No.45677722
This, but also: it's not the managers on the teams that build/"protect" it.

But really, it's the leadership above them, echoing your parent comment.

I just went through this exercise. I had to estimate the entirety of 2026 for a huge suite of products based on nothing but a title and a very short conversation about it. Of course none of these estimates make any sense, but all of 2026 is going to be decided on them. Sort of.

Now, if you just let competent people build things as they come up (the kind of thing I'd do if you just told me what was important and let me get on with it, with both a team and the various AI tooling we're allowed to use), we'd be able to build way more than if you made us estimate everything up front and then commit to it later.

It's way different if you make me commit to building feature X when I have zero idea whether and how it's possible, versus just telling me you need something that solves problem X and letting me figure it out as we go.

Case in point: in my "spare" time (some of which has been made possible by AI tooling), I've achieved more for our product in certain neglected areas than I ever would have with years' worth of accumulated arguing for team capacity. All in a few weeks.