
Google is winning on every AI front

(www.thealgorithmicbridge.com)
993 points | vinhnx | 3 comments
thunderbird120 ◴[] No.43661807[source]
This article doesn't mention TPUs anywhere. I don't think it's obvious to people outside Google's ecosystem just how extraordinarily good the JAX + TPU ecosystem is. Google has several structural advantages over other major players, but the largest one is that they roll their own compute solution, which is actually very mature and competitive. TPUs are extremely good at both training and inference[1], especially at scale. Google's ability to tailor its mature hardware to exactly what it needs gives it a massive leg up on the competition. AI companies fundamentally have to answer the question "what can you do that no one else can?". Google's hardware advantage provides an actual answer to that question which can't be erased the next time someone drops a new model onto Hugging Face.

[1]https://blog.google/products/google-cloud/ironwood-tpu-age-o...
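
To make the JAX + TPU point concrete, here is a minimal sketch (not from the original comment) of how the same jitted function runs on whatever accelerator the runtime exposes. It assumes a working JAX install; on a TPU VM, jax.devices() would report TPU devices.

    import jax
    import jax.numpy as jnp

    @jax.jit  # XLA compiles this for the local backend: CPU, GPU, or TPU
    def matmul(a, b):
        return jnp.dot(a, b)

    key = jax.random.PRNGKey(0)
    a = jax.random.normal(key, (1024, 1024))
    b = jax.random.normal(key, (1024, 1024))

    print(jax.devices())       # e.g. [TpuDevice(id=0), ...] on a TPU VM
    print(matmul(a, b).shape)  # (1024, 1024)

The point of the sketch: the model code doesn't change per device; XLA targets whatever hardware is underneath, which is part of why owning the hardware stack pays off.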

noosphr ◴[] No.43661870[source]
And yet google's main structural disadvantage is being google.

ModernBERT, with its extended context, has solved natural-language web search. I mean it as no exaggeration that _everything_ Google does for search is now obsolete. The only reason Google search isn't dead yet is that it takes a while to index all web pages into a vector database.
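
For illustration, a rough sketch of the retrieval pattern being described (not the commenter's code): embed pages once with a long-context encoder, then answer queries by nearest-neighbour search. The embed() function below is a hypothetical stand-in for a real encoder such as ModernBERT.

    import numpy as np

    def embed(texts):
        # hypothetical stand-in: a real system would call an encoder model here
        rng = np.random.default_rng(0)
        return rng.normal(size=(len(texts), 768))

    docs = ["page one ...", "page two ...", "page three ..."]
    doc_vecs = embed(docs)
    doc_vecs /= np.linalg.norm(doc_vecs, axis=1, keepdims=True)

    query = embed(["user query"])[0]
    query /= np.linalg.norm(query)

    scores = doc_vecs @ query              # cosine similarity against every page
    print(docs[int(np.argmax(scores))])    # best-matching page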

And yet it wasn't Google that released the architecture update; it was Hugging Face, as a summer collaboration between a dozen people. Google's version came out in 2018 and languished for the better part of a decade because it would destroy their business model.

Google is too risk-averse to do anything, but completely doomed if they don't cannibalize their cash-cow product. Web search is no longer a crown jewel; it's plumbing that answering services, like Perplexity, need. I don't see Google pulling off an iPhone moment the way Apple killed the iPod to win the next 20 years.

petesergeant ◴[] No.43662277[source]
> Google is too risk averse to do anything, but completely doomed if they don't cannibalize their cash cow product.

Google's cash-cow product is relevant ads. You can display relevant ads in LLM output or in natural-language web search. As long as people are interacting with a Google property, I really don't think it matters what that product is, provided there are ad views. Also:

> Web search is no longer a crown jewel, but plumbing that answering services, like perplexity, need

This sounds like a gigantic competitive advantage if you're selling AI-based products. You don't have to give everyone access to the good search via API, just your in-house AI generator.

michaelt ◴[] No.43662697[source]
Kodak was well placed to profit from the rise of digital imaging: in the late 1970s and early 1980s, Kodak's labs pioneered colour image sensors and produced some of the highest-resolution CCDs out there.

Bryce Bayer worked for Kodak when he invented and patented the Bayer pattern filter used in essentially every colour image sensor to this day.

But the problem was: Kodak had a big film business - with a lot of film factories, a lot of employees, a lot of executives, and a lot of recurring revenue. And jumping into digital with both feet would have threatened all that.

So they didn't capitalise on their early lead - and now they're bankrupt, reduced to licensing their brand to third-party battery makers.

> You can display relevant ads in LLM output or natural language web-search.

Maybe. But the LLM costs a lot more per response.

Making half a cent is very profitable if it only takes 0.2 s of CPU time. Making half a cent with 30 seconds on multiple GPUs, consuming 1000 W of power... isn't.

1. dgacmu ◴[] No.43663969[source]
1/2 kW·minute of electricity costs about $0.001, so you technically could make a profit at that rate. The real problem is the GPU cost - a $20k GPU amortized over five years costs $0.046 per second. :)
2. pingou ◴[] No.43666278[source]
How do you get that? I get $0.0001 per second over 5 years to reach 20k.
3. dgacmu ◴[] No.43666426[source]
Because I'm an idiot and left off a factor of 365. Thank you! A $20k GPU for 30 seconds is about 1/3 of a cent. Still more than the power, but also potentially profitable under this scenario, ignoring all the other overhead and utilization.
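
For reference, a quick back-of-the-envelope check of the corrected numbers in this subthread (the $20k GPU price and the 1000 W power draw come from the comments above; the $0.12/kWh electricity price is an assumption):

    gpu_cost = 20_000                          # dollars, figure from the thread
    seconds_in_5_years = 5 * 365 * 24 * 3600   # ~1.58e8 s
    gpu_per_second = gpu_cost / seconds_in_5_years
    print(f"GPU: ${gpu_per_second:.6f}/s, ${30 * gpu_per_second:.4f} per 30 s")  # ~$0.000127/s, ~$0.0038 per 30 s

    kwh_price = 0.12                           # assumed $/kWh
    power_kw = 1.0                             # 1000 W, figure from the thread
    energy_cost = power_kw * (30 / 3600) * kwh_price
    print(f"Power: ${energy_cost:.4f} per 30 s")  # ~$0.001 per 30 s

So for a 30-second generation, the amortized GPU hardware dominates the electricity by roughly 4x, and both together are still under the half-cent of ad revenue assumed above.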