
490 points | jarmitage | 2 comments
dudus No.40681617
Gotta keep digging that CUDA moat as hard and as fast as possible.
replies(2): >>40681968 >>40687332
1. tomjen3 No.40687332
That's the part I don't get. When you are developing AI, how much code are you really running on GPUs? How bad would it be to write it for something else if you could get 10% more compute per dollar?
replies(1): >>40688879
2. incrudible No.40688879
That 10% is going to matter when you have an established business case and can start optimizing. The AI space is not like that at all; nobody cares about losing money 10% faster. You cannot risk a 100% slowdown from running into issues on an exotic platform for a 10% speedup.
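
A rough sketch of what the exchange above is debating, assuming a high-level framework such as PyTorch (neither commenter names one): the user-facing code that "runs on GPUs" is often just a device selection, while the performance, and the platform risk, lives in the vendor-specific kernels the framework dispatches to.

  # Minimal sketch, assuming PyTorch. Switching the accelerator is one line of
  # user code; the optimized CUDA kernels underneath are where the moat is.
  import torch

  # Fall back to CPU if no CUDA device is present.
  device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

  model = torch.nn.Linear(1024, 1024).to(device)
  x = torch.randn(32, 1024, device=device)
  y = model(x)  # on CUDA this dispatches to cuBLAS; elsewhere, to whatever backend kernels exist
  print(y.shape)  # torch.Size([32, 1024])

The one-line switch is why "write it for something else" sounds cheap; the reply's point is that the risk of a maturity gap on the exotic backend dwarfs the 10% saving.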