
492 points | by Lionga | 2 comments
ceejayoz | No.45672187
Because the AI works so well, or because it doesn't?

> "By reducing the size of our team, fewer conversations will be required to make a decision, and each person will be more load-bearing and have more scope and impact," Wang writes in a memo seen by Axios.

That's kinda wild. I'm shocked they put it in writing.

dekhn | No.45673060
I'm seeing a lot of frustration at the leadership level about product velocity, and much of that frustration is pointed at internal gatekeepers who mainly seem to say no to product releases.

My leadership is currently promoting "better to ask forgiveness", or put another way, "a bias towards action". There are definitely limits on this, but it's been helpful when dealing with various internal negotiations. I don't spend as much time trying to "align with stakeholders"; I just go ahead and do the things my decades of experience have taught me are the right paths (while also using my experience to know when I can't just push things through).

noosphr | No.45675276
Big tech is suffering from incumbent's disease.

What worked well for extracting profits from stable cash cows doesn't work in fields that are moving rapidly.

Google et al. were at one point pinnacle technologies too, but this was 20 years ago. Everyone who knew how to work in that environment has moved on or moved up.

Were I the CEO of a company like that, I'd reduce headcount in the legacy orgs, transition them to maintenance mode, and start new orgs within the company that are as insulated from legacy as possible. This will not be an easy transition, and it will probably fail. The alternative, however, is to fail for certain.

For example, Google is in the amazing position that its search can become a commodity that prints a modest amount of money forever as the default search engine for LLM queries, while at the same time its flagship product can be a search AI that uses those queries as citations for the answers people look for.

nopurpose | No.45675757
> Google et al. were at one point pinnacle technologies too, but this was 20 years ago.

In 2017, Google literally gave us the transformer architecture that the entire current AI boom is based on.

noosphr | No.45675795
And what did they do with it for the next five years?
seanmcdirmid | No.45676005
Used it to do things? This seems like a weird question. OpenAI took about the same amount of time to go big as well (Sam was excited about OpenAI in 2017, but it took 5+ years for it to pan out into something people actually used).
keeda | No.45677015
I think the point is that they hoarded the technology for internal use instead of opening it up to the public, as OpenAI did with ChatGPT, which kicked off the current AI revolution.

As sibling comments indicate, reasons may range from internal politics to innovator's dilemma. But the upshot is, even though the underlying technology was invented at Google, its inventors had to leave and join other companies to turn it into a publicly accessible innovation.

seanmcdirmid | No.45677078
So I started at Google in 2020 (after Sam closed our lab down in 2017 to focus on OpenAI), and if they were hoarding it, I at least had no clue about it. To be clear, my perspective is still limited.
keeda | No.45677513
Fair enough, so maybe a better way to put it is: why was the current AI boom sparked by ChatGPT and not by something from Google? It's clear in retrospect that Google had similar capabilities in LaMDA, the precursor to Gemini. As I recall, it was even announced a couple of years before ChatGPT but wasn't released (as Bard?) until after ChatGPT.

LaMDA is probably more famous for convincing a Google employee that it was sentient, which got him fired. When I heard that story, I could not believe anybody could be deceived to that extent... until I saw ChatGPT. In hindsight, it was probably the first-ever case of what is now called "AI psychosis". (Which may be a valid reason Google did not want to release it.)

dekhn | No.45677999
Google had been burned badly in multiple previous launches of ML-based products, and its leadership was extremely cautious about moving too quickly. It was convenient for Google that OpenAI acted as a first mover, so Google could enter the field after there was some level of cultural acceptance of the negative behaviors. There's a whole backstory in which Noam Shazeer had come up with a bunch of nice innovations and wanted to launch them, but was only able to do so by leaving and launching through his startup; he then returned to Google, negotiating a phenomenal deal. (Noam has been at Google for 25 years and has been doing various ML projects for much of that time.)
Jyaif | No.45678842
> badly in multiple previous launches of ML-based products

Which ML-based products?

> It was convenient for Google that OpenAI acted as a first mover

That sounds like something execs would say to fend off critics: "We are #2 in AI, and that's all part of the plan."