
Pixar's Render Farm (twitter.com)

382 points by brundolf
klodolph No.25615970
My understanding (I am not an authority) is that for a long time, it has taken Pixar roughly the same amount of time to render one frame of film, from one movie to the next: something on the order of 24 hours. I don't know what the real units are, though (core-hours? machine-hours? simple wall clock?).

I am not surprised that they “make the film fit the box”, because managing compute expenditures is such a big deal!

(Edit: When I say "simple wall clock", I'm talking about the elapsed time from start to finish for rendering one frame, disregarding how many other frames might be rendering at the same time. Throughput != 1/latency, and all that.)
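To make that concrete, here's a minimal Python sketch of the distinction; the farm size is a made-up number for illustration, not anything Pixar has published:

    # Latency vs. throughput for a render farm (illustrative numbers only).
    frame_latency_hours = 24.0   # wall-clock time to finish ONE frame
    cores_per_frame = 1          # assume one core works on one frame at a time
    farm_cores = 50_000          # hypothetical farm size

    # Latency: you still wait a full day before any single frame is done.
    # Throughput: with farm_cores / cores_per_frame frames in flight,
    # the farm completes that many frames every frame_latency_hours.
    slots = farm_cores / cores_per_frame
    frames_per_day = slots * 24.0 / frame_latency_hours
    print(f"{frames_per_day:.0f} frames/day despite 24 h/frame latency")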

replies(6): >>25615994 >>25616015 >>25616474 >>25617115 >>25617883 >>25618498
ChuckNorris89 No.25616015
Wait, what? 24 hours per frame?!

At the standard 24 fps, that's 24 days per second of film, which works out to roughly 473 years for an average 2-hour film. That can't be right.
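(For anyone checking that arithmetic, a quick Python sanity check:)

    # Naive sequential estimate: one frame at a time, 24 h each.
    fps = 24
    film_seconds = 2 * 60 * 60                  # 2-hour film
    hours_per_frame = 24
    total_hours = fps * film_seconds * hours_per_frame
    print(total_hours / 24 / 365.25)            # ~473 years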

replies(7): >>25616045 >>25616061 >>25616115 >>25616213 >>25616559 >>25616561 >>25617639
1. mattnewton No.25616115
Not saying it's true, but I assume this is all parallelizable, so 24 cores would complete that 1 second of film in 1 day, and 3600*24 = 86,400 cores would complete the first hour of the film in a day, etc. And each frame might have parallelizable sub-tasks that bring it under 1 day of wall time, while still costing a "day" (24 core-hours) of compute.
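A sketch of that arithmetic in Python (the core counts are taken from the reasoning above; the totals are illustrative, not actual Pixar figures):

    # Same total work at different degrees of parallelism:
    # wall-clock time shrinks, core-hours stay constant.
    fps = 24
    film_seconds = 2 * 60 * 60            # a 2-hour film
    frames = fps * film_seconds           # 172,800 frames
    core_hours_per_frame = 24             # "1 day of core-hours" per frame

    total_core_hours = frames * core_hours_per_frame  # 4,147,200

    for cores in (1, 24, 3600 * 24):
        wall_days = total_core_hours / cores / 24
        print(f"{cores:>6} cores -> {wall_days:>9.1f} days of wall clock")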