Pixar's Render Farm

(twitter.com)
382 points | brundolf
klodolph No.25615970
My understanding (I am not an authority) is that for a long time, it has taken Pixar roughly the same amount of time to render one frame of film: something on the order of 24 hours. I don’t know what the real units are, though (core-hours? machine-hours? simple wall clock?)

I am not surprised that they “make the film fit the box”, because managing compute expenditures is such a big deal!

(Edit: When I say "simple wall clock", I'm talking about the elapsed time from start to finish for rendering one frame, disregarding how many other frames might be rendering at the same time. Throughput != 1/latency, and all that.)

replies(6): >>25615994 #>>25616015 #>>25616474 #>>25617115 #>>25617883 #>>25618498 #
brundolf No.25615994
Well, it can't just be one frame total every 24 hours, because an hour-long film would take 200+ years to render ;)
replies(5): >>25616010 #>>25616035 #>>25616054 #>>25616125 #>>25616154 #
chrisseaton No.25616010
I’m going to guess they have more than one computer rendering frames at the same time.
replies(1): >>25616073 #
brundolf No.25616073
Yeah, I was just (semi-facetiously) pointing out the obvious: it can't be simple serial wall-clock time.
replies(2): >>25616150 #>>25616184 #
masklinn No.25616184
It could still be 24 hours of wall clock per frame, because each frame can be rendered independently, in parallel with the others.
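The arithmetic in this thread can be sketched quickly. This is a hypothetical back-of-the-envelope model (the 24-hour figure and machine counts are the thread's speculation, not Pixar's numbers): serial rendering at 24 wall-clock hours per frame reproduces brundolf's "200+ years", while rendering frames independently across many machines brings the wall-clock total down even though per-frame latency stays the same.

```python
import math

FRAME_LATENCY_H = 24.0   # klodolph's rough figure: ~24 wall-clock hours per frame
FPS = 24                 # standard film frame rate
FILM_MINUTES = 60        # brundolf's hypothetical hour-long film

frames = FILM_MINUTES * 60 * FPS   # 86,400 frames in an hour of film

# Serial rendering: one frame at a time, latency dominates everything.
serial_years = frames * FRAME_LATENCY_H / (24 * 365)   # well over 200 years

def wall_clock_days(machines: int) -> float:
    """Wall-clock days if frames render independently, `machines` at a time.

    Per-frame latency is unchanged; only throughput improves.
    """
    waves = math.ceil(frames / machines)   # each wave of frames takes FRAME_LATENCY_H
    return waves * FRAME_LATENCY_H / 24
```

With, say, 1,000 machines, the same film finishes in about 87 days of wall clock, which is masklinn's point: "24 hours per frame" and "films ship on schedule" are compatible because throughput != 1/latency.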