Conference · Changing the Game
ON SITE · König-Karl-Halle · presented by Unreal Engine · Tuesday, May 03, 15:45

Virtual Production: Getting the Lighting Right

Video on demand

Today's virtual production stages surround actors with the light of millions of LEDs, creating real-world image-based lighting in addition to in-camera backgrounds. But these stages usually fall short of effortlessly recording actors in a studio as they would actually appear in real sets and locations. In-camera backgrounds often need to be touched up or replaced, requiring the actors to be rotoscoped out or shot against greenscreen in the camera frustum. Camera tracking latency limits the kinds of camera moves that are safe to attempt. LED screens are prone to exhibiting moiré patterns, and the way they go out of focus differs from genuine lens bokeh. LED panels lack the dynamic range and maximum intensity needed for many lighting environments, especially ones with direct sunlight, and stages rarely cover all the angles light can come from. Finally, LED panels struggle with accurate color rendition, having just red, green, and blue spectral peaks along the visible spectrum. Those three peaks are sufficient to show a camera practically any color it can record, but they miss key parts of the spectrum needed to illuminate actors and set pieces the way they would appear under daylight or incandescent light. In this talk, I'll describe each of these problems and point the way to potential solutions, drawing upon examples from classic cinema technology, the lighting reproduction research at the USC Institute for Creative Technologies, and a new LED stage lighting reproduction study at the University of Southern California's Entertainment Technology Center.
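
To make the color rendition point concrete, here is a minimal numerical sketch (Python, using entirely synthetic Gaussian spectra rather than measured camera or LED data) of how an RGB LED mix can be balanced so a camera records it identically to a broadband illuminant, yet a material whose reflectance peaks between the LED emission lines still photographs differently under the two sources:

```python
import numpy as np

# Wavelength samples across the visible spectrum (nm).
wl = np.arange(400, 701, 10)

def gauss(center, width):
    """Synthetic Gaussian spectral curve (illustrative only)."""
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Hypothetical camera sensitivities: broad R, G, B response curves.
cam = np.stack([gauss(600, 40), gauss(540, 40), gauss(460, 40)])

# Broadband "daylight-like" illuminant: flat across the visible range.
daylight = np.ones_like(wl, dtype=float)

# RGB LED illuminant: three narrow emission peaks, with channel weights
# solved so the camera records the same raw RGB for both light sources.
led_basis = np.stack([gauss(630, 12), gauss(525, 15), gauss(465, 12)])
A = cam @ led_basis.T                    # camera response per LED channel
w = np.linalg.solve(A, cam @ daylight)   # channel weights matching daylight
led = w @ led_basis

print("camera sees daylight:", cam @ daylight)
print("camera sees LED mix: ", cam @ led)  # identical by construction

# A hypothetical material whose reflectance peaks near 575 nm, in the
# gap between the green and red LED emission lines.
material = 0.2 + 0.8 * gauss(575, 25)

print("material under daylight:", cam @ (daylight * material))
print("material under LED mix: ", cam @ (led * material))
```

Even though the bare LED mix and the broadband source are indistinguishable to the camera, the material's reflected color differs because its spectral detail falls where the three LED peaks emit almost nothing; panels with more than three emitter types (for example, adding white or amber channels) narrow this gap.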

Paul Debevec, Director of Research, Creative Algorithms and Technology, Netflix

Paul Debevec is Netflix's Director of Research for Creative Algorithms and Technology, where he oversees R&D in computer vision, computer graphics, and machine learning with applications in visual effects, virtual production, and animation. His 2002 Light Stage 3 system at the USC Institute for Creative Technologies was the first LED stage to illuminate live-action actors with imagery of digital sets for virtual production. Techniques from Paul's work have been used to create key visual effects sequences in The Matrix, Spider-Man 2, Benjamin Button, Avatar, Gravity, Furious 7, Blade Runner 2049, Gemini Man, Free Guy, and numerous video games, and to record a 3D portrait of US President Barack Obama. His light stage facial capture technology has helped numerous technology companies, video game studios, and visual effects companies create photoreal digital actors and advance ML datasets for facial appearance. Paul's work in HDR imaging, image-based lighting, and light stage facial capture has been recognized with two technical Academy Awards and SMPTE's Progress Medal. Paul is a Fellow of the Visual Effects Society and a member of the Television Academy's Science and Technology Peer Group, and has served on the Motion Picture Academy's Visual Effects Executive Committee and Science and Technology Council, and as Vice President of ACM SIGGRAPH. More info at: www.debevec.org.