It's always seemed to me that AI is going to be a commodity business: any company with enough money can compete, and current LLM-based AI seems to be leveling off in capability, with the focus shifting to layers of services built on top of the models (e.g. deep research agents).
In a commodity business cost is key, and Google, with its Nth-generation home-grown TPUs and AI-optimized datacenters, has a big advantage over anyone paying NVIDIA's markups for accelerators or lacking that level of vertical integration.