
Pixar's Render Farm

(twitter.com)
382 points by brundolf | 1 comment
mmcconnell1618 | No.25616372
Can anyone comment on why Pixar uses standard CPUs for processing instead of custom hardware or GPUs? I'm wondering why they haven't invested in FPGAs or completely custom silicon that speeds up common operations by an order of magnitude. Is each show so different that no common operations are targets for hardware optimization?
brundolf | No.25616546
In addition to what others have said, I remember reading somewhere that CPUs give more reliably accurate results, and that that's part of why they're still preferred for pre-rendered content
dahart | No.25616939
> I remember reading somewhere that CPUs give more reliably accurate results

This is no longer true, and hasn't been for around a decade. It's a left-over memory of when GPUs weren't using IEEE 754-compatible floating point. That changed a long time ago, and today all GPUs are fully up to par with the IEEE standard. GPUs even took the lead for a while with the FMA instruction, which was more accurate than what CPUs had, and Intel and others have since added FMA instructions to their CPUs.
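
To make the FMA point concrete, here is a minimal, self-contained C sketch (not from the original comment) of why a fused multiply-add can be more accurate than a separate multiply and add: the fused form rounds only once, so it can recover a low-order term that the two-step version throws away. The specific constants are chosen purely for illustration and aren't tied to anything Pixar does.

    #include <math.h>   /* fma, ldexp; link with -lm on Linux */
    #include <stdio.h>

    int main(void) {
        /* a = 1 + 2^-52, the smallest double greater than 1 */
        double a = 1.0 + ldexp(1.0, -52);
        /* b = 1 + 2^-51, which is exactly what a*a rounds to in double */
        double b = 1.0 + ldexp(1.0, -51);

        /* Two roundings: a*a rounds to b, so the subtraction leaves 0 */
        double naive = a * a - b;

        /* One rounding: a*a - b is evaluated exactly inside the FMA,
           recovering the 2^-104 term the naive version discarded */
        double fused = fma(a, a, -b);

        printf("naive: %.17g\n", naive);  /* expected: 0 */
        printf("fused: %.17g\n", fused);  /* expected: ~4.93e-32, i.e. 2^-104 */
        return 0;
    }

Build with something like cc -std=c11 -ffp-contract=off fma_demo.c -lm so the compiler doesn't quietly fuse the "naive" expression into an FMA on its own; on current CPUs and GPUs alike, fma() typically maps to a single hardware instruction.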