
Google is winning on every AI front

(www.thealgorithmicbridge.com)
993 points by vinhnx
thunderbird120 ◴[] No.43661807[source]
This article doesn't mention TPUs anywhere. I don't think it's obvious to people outside Google's ecosystem just how extraordinarily good the JAX + TPU ecosystem is. Google has several structural advantages over the other major players, but the largest is that they roll their own compute stack, which is genuinely mature and competitive. TPUs are extremely good at both training and inference[1], especially at scale. Google's ability to tailor that hardware to exactly what they need gives them a massive leg up on the competition. AI companies fundamentally have to answer the question "what can you do that no one else can?". Google's hardware advantage is an actual answer to that question, and it can't be erased the next time someone drops a new model onto Hugging Face.

[1]https://blog.google/products/google-cloud/ironwood-tpu-age-o...
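
A minimal sketch of what that ecosystem point looks like in practice (hypothetical example; it assumes a recent jax install and simply falls back to CPU if no TPU is attached): the same jitted function shards across however many cores the runtime exposes, with no device-specific code.

    import jax
    import jax.numpy as jnp
    from jax.experimental import mesh_utils
    from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

    # Build a 1-D mesh over whatever accelerators are attached (TPU cores, or CPU).
    devices = mesh_utils.create_device_mesh((jax.device_count(),))
    mesh = Mesh(devices, axis_names=("data",))

    @jax.jit
    def forward(w, x):
        return jnp.tanh(x @ w)  # toy "layer"; XLA compiles and partitions it

    x = jnp.ones((jax.device_count() * 128, 512), dtype=jnp.float32)
    w = jnp.ones((512, 256), dtype=jnp.float32)

    # Shard the batch across the mesh; replicate the weights on every device.
    x = jax.device_put(x, NamedSharding(mesh, P("data", None)))
    w = jax.device_put(w, NamedSharding(mesh, P(None, None)))

    out = forward(w, x)  # one compiled, sharded computation across all cores
    print(out.shape, out.sharding)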

replies(12): >>43661870 #>>43661974 #>>43663154 #>>43663455 #>>43663647 #>>43663720 #>>43663956 #>>43664320 #>>43664354 #>>43672472 #>>43673285 #>>43674134 #
krackers ◴[] No.43661974[source]
Assuming DeepSeek continues to open-source its models, there probably won't be any "secret sauce" left in model architecture going forward, only in data and in training/serving infrastructure, and Google is in a good position on both.
replies(4): >>43662137 #>>43662710 #>>43663691 #>>43694115 #
fulafel ◴[] No.43662137[source]
Making your own hardware would also seem to yield freedom in model architecture, since performance depends heavily on how well the architecture fits the hardware.
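
As an illustration of that fit (a made-up micro-benchmark: the shapes and the 128-wide tile are assumptions about the accelerator, and absolute numbers will depend entirely on what you run it on), matmuls whose dimensions line up with the hardware's tile size tend to run much closer to peak than "ragged" ones doing roughly the same work.

    import time
    import jax
    import jax.numpy as jnp

    def bench(m, k, n, iters=10):
        a = jnp.ones((m, k), dtype=jnp.bfloat16)
        b = jnp.ones((k, n), dtype=jnp.bfloat16)
        f = jax.jit(lambda a, b: a @ b)
        f(a, b).block_until_ready()  # compile + warm up
        t0 = time.perf_counter()
        for _ in range(iters):
            f(a, b).block_until_ready()
        return (time.perf_counter() - t0) / iters

    # Same order of FLOPs, but one set of shapes is a multiple of the (assumed)
    # 128-wide systolic-array tile and the other is deliberately misaligned.
    aligned = bench(4096, 4096, 4096)
    ragged = bench(4095, 4093, 4091)
    print(f"aligned: {aligned * 1e3:.2f} ms   ragged: {ragged * 1e3:.2f} ms")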