186 points nserrino | 5 comments
1. nikolayasdf123 No.45119495
> non 100% correctness of kernels

wouldn't the model not work properly if the kernels are even slightly off?

weren't the kernels part of the training stack for these models? am I missing anything?

replies(2): >>45119795 #>>45120966 #
2. ymsodev No.45119795
The article is referring to GPU compute kernels (https://en.wikipedia.org/wiki/Compute_kernel), not the term "kernel" as used in ML/NN/etc.
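For the distinction: a compute kernel in this sense is just the function a GPU runs once per output element, in parallel over a grid. A minimal serial sketch (names and structure are illustrative, not from the article — real kernels would be CUDA/Triton code):

```python
import numpy as np

def matmul_kernel(row, col, a, b):
    """The per-element body: computes one (row, col) entry of a @ b."""
    return sum(a[row, k] * b[k, col] for k in range(a.shape[1]))

def launch(a, b):
    """Serial stand-in for a parallel kernel launch over the output grid."""
    m, n = a.shape[0], b.shape[1]
    out = np.empty((m, n))
    for i in range(m):       # on a GPU, every (i, j) runs concurrently
        for j in range(n):
            out[i, j] = matmul_kernel(i, j, a, b)
    return out

a = np.arange(4.0).reshape(2, 2)
b = np.eye(2)
assert np.allclose(launch(a, b), a @ b)
```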
replies(1): >>45124876 #
3. arjvik No.45120966
I believe their speedup is computed _assuming they can easily fix the correctness bugs in the kernels_.

In practice, even slightly-off kernels will make the model feel almost lobotomized.
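A quick sketch of why "slightly off" matters (assumed linear layers and illustrative numbers, not from the article): a small systematic error in one kernel compounds multiplicatively across layers.

```python
import numpy as np

rng = np.random.default_rng(0)
layers = [rng.standard_normal((32, 32)) / np.sqrt(32) for _ in range(24)]
x0 = rng.standard_normal(32)

def forward(x, eps):
    # eps models a small systematic relative error in each matmul kernel
    for w in layers:
        x = (w @ x) * (1.0 + eps)
    return x

exact = forward(x0, 0.0)
buggy = forward(x0, 1e-3)  # a 0.1% error per kernel call
drift = np.linalg.norm(buggy - exact) / np.linalg.norm(exact)
# drift == (1 + 1e-3)**24 - 1 ≈ 0.024: 0.1% per layer becomes ~2.4% at the output
```

Nonlinearities and attention can amplify this far more than the linear case above; they only have to push a few logits across a decision boundary.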

4. saagarjha No.45124876
…aren't they the same thing
replies(1): >>45143386 #
5. ymsodev No.45143386
They're not, but I also misunderstood the original question: they were referring to the correct definition of kernel. I thought they were confusing the GPU compute kernel with https://en.wikipedia.org/wiki/Kernel_method or https://en.wikipedia.org/wiki/Kernel_(image_processing)
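To make the three senses concrete, a hedged illustration (all names here are my own, for exposition only):

```python
import numpy as np

# 1. Kernel method: a similarity function between points, e.g. the RBF kernel
def rbf_kernel(x, y, gamma=1.0):
    return np.exp(-gamma * np.sum((x - y) ** 2))

# 2. Image-processing kernel: a small convolution filter
blur = np.full((3, 3), 1 / 9)  # 3x3 box blur; weights sum to 1

# 3. GPU compute kernel: a function launched once per output element
#    (conceptual stand-in only — real ones are CUDA/Triton code)
def add_kernel(i, a, b, out):
    out[i] = a[i] + b[i]

x = np.zeros(2)
assert rbf_kernel(x, x) == 1.0  # identical points have maximal similarity
```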