
Pixar's Render Farm (twitter.com)

382 points by brundolf | 1 comment
klodolph | No.25615970
My understanding (I am not an authority) is that for a long time, it has taken Pixar roughly the same amount of time to render one frame of film: something on the order of 24 hours. I don’t know what the real units are, though (core-hours? machine-hours? simple wall clock?).

I am not surprised that they “make the film fit the box”, because managing compute expenditures is such a big deal!

(Edit: When I say "simple wall clock", I'm talking about the elapsed time from start to finish for rendering one frame, disregarding how many other frames might be rendering at the same time. Throughput != 1/latency, and all that.)
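To make the throughput-vs-latency distinction concrete, here is a toy sketch in Python (the numbers are invented for illustration, not Pixar's): a farm with thousands of frames in flight can sustain a high completion rate even when any single frame takes a full day of wall clock.

    # Hypothetical numbers for illustration only.
    frames_in_flight = 2000   # frames rendering concurrently across the farm
    latency_hours = 24.0      # wall-clock time from start to finish of one frame

    # Throughput is frames completed per hour across the whole farm,
    # not the inverse of per-frame latency.
    throughput_per_hour = frames_in_flight / latency_hours
    print(f"~{throughput_per_hour:.0f} frames/hour despite 24h per-frame latency")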

CyberDildonics | No.25617115
Not every place talks about frame rendering times the same way. Some quote the time it takes to render every pass for one frame sequentially; others talk about the time of the hero render, or of the longest dependency chain, since that is the latency to turn around a single frame. Core-hours are usually quoted separately, because most of the time what you want to know is whether something will be done overnight, or whether broken frames can be rendered during the day.
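One way to read "longest dependency chain" is as the critical path through the passes that feed a finished frame. A toy sketch of that bookkeeping in Python (pass names, timings, and dependencies are invented, not any studio's actual pipeline):

    import functools

    # Each pass maps to (render hours, passes it depends on). Hypothetical data.
    passes = {
        "shadow":    (1.0, []),
        "hero":      (6.0, ["shadow"]),
        "volumes":   (3.0, []),
        "composite": (0.5, ["hero", "volumes"]),
    }

    @functools.cache
    def latency(name):
        # Longest dependency chain ending at this pass: its own render time
        # plus the slowest of its dependencies (deps can run in parallel).
        hours, deps = passes[name]
        return hours + max((latency(d) for d in deps), default=0.0)

    core_hours = sum(hours for hours, _ in passes.values())
    print(f"critical path: {latency('composite'):.1f}h")    # 7.5h to turn around a frame
    print(f"total core-hours: {core_hours:.1f}h")           # 10.5h of compute billed

The gap between those two numbers is exactly why core-hours get reported separately from turnaround latency.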

24 hours of wall clock time is excessive, and the reality is that anything over 2 hours starts to get painful. If you can't render reliably overnight, your iterations slow to molasses, and the more iterations you can do, the better something will look. These times are usually inflated in articles. I would never accept 24 hours to turn around a typical frame as necessary. If I saw people working with that, my top priority would be to figure out what was going on, because with zero doubt there would be a huge amount of nonsense under the hood.
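The "will it be done overnight" question is ultimately a capacity check. A back-of-the-envelope sketch in the same spirit (all numbers hypothetical):

    # Will tonight's queue finish before morning? Hypothetical numbers.
    farm_cores = 20000
    overnight_hours = 12.0
    core_hours_per_frame = 80.0
    frames_queued = 2500

    capacity = farm_cores * overnight_hours          # core-hours available tonight
    demand = frames_queued * core_hours_per_frame    # core-hours the queue needs
    print("fits overnight" if demand <= capacity else
          f"over budget by {demand - capacity:.0f} core-hours")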