
311 points by melodyogonna | 3 comments
1. JonChesterfield No.45141867
ML seems to be doing just fine with Python and CUDA.
replies(2): >>45142176, >>45148850
2. poly2it No.45142176
Python and CUDA are not very well adapted for embedded ML.
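For a sense of what "embedded" means here: on a microcontroller there is typically no Python interpreter and no CUDA runtime at all; inference runs from a C/C++ binary against statically allocated memory. A rough sketch in the TensorFlow Lite Micro style (exact API details vary by version; g_model_data and run_inference are placeholder names for illustration):

    #include <cstdint>
    #include "tensorflow/lite/micro/micro_interpreter.h"
    #include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
    #include "tensorflow/lite/schema/schema_generated.h"

    // Model flatbuffer baked into the firmware image (placeholder name).
    extern const unsigned char g_model_data[];

    // Static scratch memory: no heap, no GC, no interpreter startup cost.
    constexpr int kArenaSize = 16 * 1024;
    alignas(16) static uint8_t tensor_arena[kArenaSize];

    int run_inference(const float* features, int n, float* out, int m) {
      const tflite::Model* model = tflite::GetModel(g_model_data);

      // Register only the ops this model needs, to keep the binary small.
      static tflite::MicroMutableOpResolver<2> resolver;
      resolver.AddFullyConnected();
      resolver.AddSoftmax();

      static tflite::MicroInterpreter interpreter(
          model, resolver, tensor_arena, kArenaSize);
      if (interpreter.AllocateTensors() != kTfLiteOk) return -1;

      // Copy features in, run the graph, copy results out.
      TfLiteTensor* input = interpreter.input(0);
      for (int i = 0; i < n; ++i) input->data.f[i] = features[i];
      if (interpreter.Invoke() != kTfLiteOk) return -1;
      TfLiteTensor* output = interpreter.output(0);
      for (int i = 0; i < m; ++i) out[i] = output->data.f[i];
      return 0;
    }

The point of the contrast: the Python/CUDA stack assumes a host OS, dynamic memory, and a GPU driver, none of which are available on a typical embedded target.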
3. davidatbu No.45148850
Yeah, the rate of progress in AI definitely makes it seem that way from the outside to me too.

But having never written CUDA, I have to rely on authority to some extent on this question. And it seems to me that few are better positioned to opine on whether there's a better story to be had at the software-hardware boundary in ML than the person who created MLIR and Swift for TensorFlow (along with making that work on TPUs and GPUs), ran ML at Tesla for some time, was a VP at SiFive, ... etc.