Can anyone comment on why Pixar uses standard CPUs for rendering instead of custom hardware or GPUs? I'm wondering why they haven't invested in FPGAs or fully custom silicon that speeds up common operations by an order of magnitude. Is each show so different that no common operations are targets for hardware optimization?
In addition to what others have said, I remember reading somewhere that CPUs give more reliably accurate results, and that this is part of why they're still preferred for pre-rendered content.
I believe this was historically true: GPUs often “cheated” on floating-point math to optimize hardware pipelines for game rasterization, where only the final look matters. It's probably no longer true now that GPGPU has taken hold over the last decade.
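As a toy illustration of the reproducibility angle (nothing Pixar-specific, just standard IEEE 754 behavior): floating-point addition isn't associative, so any pipeline that reorders or fuses operations, as older GPUs and fast-math compiler modes are free to do, can produce a different result than a strict CPU code path. A minimal C sketch:

    #include <stdio.h>

    int main(void) {
        /* Floating-point addition is not associative; the grouping
           of operations changes the rounded result. */
        float a = 1e8f, b = -1e8f, c = 1.0f;
        printf("(a + b) + c = %.1f\n", (a + b) + c);  /* prints 1.0 */
        printf("a + (b + c) = %.1f\n", a + (b + c));  /* prints 0.0: b + c rounds back to -1e8 */
        return 0;
    }

With strict IEEE semantics you get those same two answers every run; hardware or compilers that silently reassociate can flip between them, which is the kind of nondeterminism a render farm wants to avoid.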