
Pixar's Render Farm

(twitter.com)
382 points | brundolf | 1 comment
klodolph | No.25615970
My understanding (I am not an authority) is that for a long time, the time it takes Pixar to render one frame of film has stayed roughly constant: something on the order of 24 hours. I don’t know what the real units are, though (core-hours? machine-hours? simple wall clock?)

I am not surprised that they “make the film fit the box”, because managing compute expenditures is such a big deal!

(Edit: When I say "simple wall clock", I'm talking about the elapsed time from start to finish for rendering one frame, disregarding how many other frames might be rendering at the same time. Throughput != 1/latency, and all that.)
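
A toy sketch of that throughput-vs-latency distinction, with made-up numbers (nothing here is a real Pixar figure):

    # Made-up numbers, just to illustrate throughput vs. latency.
    frame_latency_hours = 24    # wall-clock time to finish one frame
    concurrent_frames = 2000    # frames rendering at once across the farm

    # Latency: any single frame still takes a full day.
    # Throughput: the farm as a whole finishes frames continuously.
    throughput_per_hour = concurrent_frames / frame_latency_hours
    print(f"~{throughput_per_hour:.0f} frames completed per hour")  # ~83/hour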

ChuckNorris89 | No.25616015
Wait, what? 24 hours per frame?!

At the standard 24 fps, that's 24 days of rendering per second of film, which works out to roughly 473 years for an average 2-hour film. That can't be right.
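
Spelling that fully serial arithmetic out in a throwaway Python sketch (same assumptions as above: 24 fps, 24 hours per frame, a 2-hour film):

    # Naive serial math: one frame at a time, 24 hours each.
    fps = 24
    film_seconds = 2 * 60 * 60                # a 2-hour film
    frames = fps * film_seconds               # 172,800 frames
    total_hours = frames * 24
    print(total_hours / 24, "days")           # 172,800 days
    print(total_hours / (24 * 365), "years")  # ~473 years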

dagmx | No.25616213
It's definitely not 24 hours per frame outside of gargantuan shots, at least by wall time. If you're going by core time, that figure assumes the work is serial, which is never the case.

That also doesn't include rendering multiple shots at once. It's all about parallelism.

Finally, those frame counts for a film only cover the final render. There's a whole slew of work-in-progress renders too, so a given shot may be rendered 10-20 times. Often they'll render every other frame to spot-check, or render at lower resolution to get results back quickly.
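
A back-of-the-envelope Python sketch of how those work-in-progress passes multiply the total frame count (every number here is invented for illustration, not a Pixar figure):

    # Invented numbers: iteration multiplies the rendering workload.
    final_frames = 24 * 2 * 60 * 60    # ~172,800 final frames for a 2-hour film
    wip_passes = 15                    # each shot rendered ~10-20 times over
    coverage = 0.5                     # spot checks render every other frame
    total_rendered = final_frames * (1 + wip_passes * coverage)
    print(f"{total_rendered:,.0f} frames rendered overall")  # ~1.5 million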