443 points | jaredwiener | 1 comment
drewbeck No.45030618
Whenever people say that Apple is behind on AI, I think about stories like this. Is this the Siri people want? And if this failure was easy to prevent, why didn't OpenAI prevent it?

Some companies actually have a lot to lose if these things go off the rails; they can't just 'move fast and break things' when the things being broken are their customers, or the trust those customers have in them.

My hope is that OpenAI actually does have a lot to lose; my fear is that the hype and the sheer amount of capital behind them will make them immune to real repercussions.

replies(1): >>45032337 #
bigyabai No.45032337
When people tell you that Apple is behind on AI, they mean money: not AI features, not AI hardware, but AI revenue. And Apple is behind on that - they've got the densest silicon in the world and still play second fiddle to Nvidia. Apple's GPU designs aren't conducive to non-raster workloads; they fell pretty far behind by obsessing over a less-profitable consumer market.

For whatever it's worth, I also hope that OpenAI takes a fall and sets an example for any other business that follows their model. But I also know that's not how justice works here in America. When there's money to be made, the US federal government will happily ignore the abuses to prop up American service industries.

replies(3): >>45032414 #>>45032600 #>>45033225 #
drewbeck No.45033225
Apple is a consumer product company. “There’s a lot of money in selling silicon to other companies therefore Apple should have pivoted to selling silicon to other companies” is a weird fantasy-land idea of how businesses work.

Idk, maybe it's legit if your only view of the world is through capital and, like, financial narratives. But it's not how Apple has ever worked, and very, very few consumer companies would attempt that kind of switch, let alone make it successfully.

replies(1): >>45043732 #
bigyabai No.45043732
It's not a fantasy-land idea at all. Apple has already tried penetrating the datacenter before; they've proven they can ship a product to market if they want to. They just don't. They don't want to support Nvidia drivers, or complex GPGPU primitives, or non-raster GPU architectures, or cross-platform acceleration libraries. Which is frankly a braindead decision from the opportunity-cost side of things: if your consumers don't care, why not deliver what developers want?
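
For a sense of what "GPGPU primitives" means in practice, here is a minimal sketch of dispatching a trivial compute kernel through Metal in Swift. It is illustrative only - the kernel name addArrays and its MSL source are made up for the example, not taken from any Apple sample. The rough CUDA equivalent is a one-line addArrays<<<blocks, threads>>>(a, b, out) launch, and CUDA runs on Linux datacenter boxes; Metal compute only runs on Apple platforms, which is the cross-platform gap being described.

    import Metal

    // Illustrative MSL kernel: add two float arrays elementwise.
    let source = """
    #include <metal_stdlib>
    using namespace metal;
    kernel void addArrays(device const float* a [[buffer(0)]],
                          device const float* b [[buffer(1)]],
                          device float* out     [[buffer(2)]],
                          uint i [[thread_position_in_grid]]) {
        out[i] = a[i] + b[i];
    }
    """

    // Compile the kernel at runtime and build a compute pipeline.
    let device = MTLCreateSystemDefaultDevice()!
    let library = try! device.makeLibrary(source: source, options: nil)
    let pipeline = try! device.makeComputePipelineState(
        function: library.makeFunction(name: "addArrays")!)

    // Allocate GPU-visible buffers for the inputs and output.
    let n = 1024
    let bytes = n * MemoryLayout<Float>.stride
    let a = [Float](repeating: 1, count: n)
    let b = [Float](repeating: 2, count: n)
    let bufA = device.makeBuffer(bytes: a, length: bytes, options: [])!
    let bufB = device.makeBuffer(bytes: b, length: bytes, options: [])!
    let bufOut = device.makeBuffer(length: bytes, options: [])!

    // Encode the dispatch and block until the GPU finishes.
    let queue = device.makeCommandQueue()!
    let cmd = queue.makeCommandBuffer()!
    let enc = cmd.makeComputeCommandEncoder()!
    enc.setComputePipelineState(pipeline)
    enc.setBuffer(bufA, offset: 0, index: 0)
    enc.setBuffer(bufB, offset: 0, index: 1)
    enc.setBuffer(bufOut, offset: 0, index: 2)
    enc.dispatchThreads(MTLSize(width: n, height: 1, depth: 1),
                        threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
    enc.endEncoding()
    cmd.commit()
    cmd.waitUntilCompleted()

    // Read back the result: each element should be 3.0.
    let out = bufOut.contents().bindMemory(to: Float.self, capacity: n)
    print(out[0])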

Apple can have their cake and eat it here. If they had reacted fast enough (e.g. ~2018), they could have had a CUDA competitor in time for the crypto and AI crazes - that's a fact! But they missed out on those markets because they had their blinders on, laser-focused on the consumer market they're losing control over. It's starting to verge on pathetic how gimped the Mac is as a "real computer" product.