
A light stage is an active illumination system used for shape, texture, reflectance and motion capture, often combined with structured light and a multi-camera setup.

[Image: Light Stage X - Facial Relighting and Scanning]

Reflectance capture


The reflectance field over a human face was first captured in 1999 by Paul Debevec, Tim Hawkins et al. and presented at SIGGRAPH 2000. The method they used to find the light that travels under the skin was based on the existing scientific knowledge that light reflecting off the air-to-oil interface of the skin retains its polarization, while light that travels under the skin loses its polarization.[1]
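A reflectance field captured this way supports image-based relighting: the face is photographed one light at a time, and a new illumination environment is reproduced as a weighted sum of those basis photographs. The sketch below illustrates that linear-combination principle only; the array shapes, names and the toy environment map are illustrative assumptions, not the actual pipeline of Debevec et al.

```python
import numpy as np

def relight(olat_images, light_dirs, env_map_fn):
    """Relight a reflectance field captured one light at a time (OLAT).

    olat_images : (N, H, W, 3) array, one photograph per light direction
    light_dirs  : (N, 3) array of unit vectors toward each light
    env_map_fn  : callable mapping a direction to an RGB radiance value
                  (a stand-in for sampling a real environment map)
    """
    weights = np.array([env_map_fn(d) for d in light_dirs])  # (N, 3)
    # Scale each basis image by the environment's radiance from its light
    # direction; summing over all lights gives the face under new lighting.
    relit = np.einsum('nhwc,nc->hwc', olat_images.astype(np.float64), weights)
    return np.clip(relit / len(light_dirs), 0.0, 1.0)

# Toy usage with random data standing in for real captures.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    olat = rng.random((32, 4, 4, 3))            # 32 lighting directions
    dirs = rng.normal(size=(32, 3))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    warm_sky = lambda d: np.array([1.0, 0.8, 0.6]) * max(d[2], 0.0)
    print(relight(olat, dirs, warm_sky).shape)  # (4, 4, 3)
```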

 
BSSRDF: BRDF + subsurface scattering

Bidirectional scattering distribution function (BSDF): BRDF + BTDF
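Written out, the relationships the two captions above allude to are the standard textbook ones (the notation below is generic and not taken from the cited work): the BSDF is the sum of a reflection term (BRDF) and a transmission term (BTDF), while the BSSRDF generalises the BRDF by letting light enter and exit the surface at different points, which is what models light travelling under the skin.

```latex
% BSDF as the sum of reflectance and transmittance distribution functions
f_s(\omega_i, \omega_o) = f_r(\omega_i, \omega_o) + f_t(\omega_i, \omega_o)

% BSSRDF S: outgoing radiance at surface point x_o integrates incident
% light over both incoming directions \omega_i and entry points x_i
L_o(x_o, \omega_o) = \int_A \int_{\Omega}
  S(x_i, \omega_i; x_o, \omega_o)\, L_i(x_i, \omega_i)\,
  (n \cdot \omega_i)\, \mathrm{d}\omega_i\, \mathrm{d}A(x_i)
```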

Using this information, a light stage was built by Debevec et al., consisting of:

  1. A movable digital camera
  2. A movable simple light source (full rotation with adjustable radius and height)
  3. Two polarizers set at various angles in front of the light and the camera
  4. A computer running relatively simple programs performing relatively simple tasks[1]

The setup enabled the team to isolate the subsurface scattering component of the bidirectional scattering distribution function over the human face (a separation sketched below), which is required for fully virtual cinematography with photorealistic digital look-alikes, similar to the effects seen in the films The Matrix Reloaded, The Matrix Revolutions and others since the early 2000s.
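The polarization trick can be summarised in a few lines: with the polarizers crossed, the camera records only light that has been depolarized by travelling under the skin, while the parallel-polarized image additionally contains the surface (specular) reflection. Below is a minimal sketch of that separation, assuming two already-registered photographs; the variable names and clamping are illustrative, not taken from the original system.

```python
import numpy as np

def separate_reflection(parallel_img, cross_img):
    """Split a face photo into surface and subsurface components.

    parallel_img : image taken with polarizers parallel
                   (surface/specular reflection plus subsurface light)
    cross_img    : image taken with polarizers crossed
                   (subsurface light only, since it is depolarized)
    Both are float arrays of identical shape, e.g. (H, W, 3) in [0, 1].
    """
    subsurface = cross_img
    # The surface (specular) component is what the crossed polarizer removed.
    specular = np.clip(parallel_img - cross_img, 0.0, None)
    return specular, subsurface

# Toy usage with synthetic images.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    cross = rng.random((4, 4, 3)) * 0.5
    parallel = cross + rng.random((4, 4, 3)) * 0.3  # add a "specular" layer
    spec, sub = separate_reflection(parallel, cross)
    print(spec.mean(), sub.mean())
```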

Following this success, Debevec et al. constructed more elaborate versions of the light stage at the University of Southern California's (USC) Institute for Creative Technologies (ICT). Ghosh et al. built the seventh version, Light Stage X. In 2014, President Barack Obama had his image and reflectance captured with the USC mobile light stage.[2]

Examples of use

  • Human image synthesis that is hard to tell apart from a real human captured with an imaging technology:
    • Digital Emily, presented at SIGGRAPH 2008, was a project in which the reflectance field of actress Emily O'Brien was captured using USC Light Stage 5,[3] and the pre-rendered digital look-alike was made in association with Image Metrics. Video footage came from USC Light Stage 5 and Light Stage 6.
    • Digital Ira, presented at SIGGRAPH 2013 in association with Activision, was a fairly convincing digital face rendered in real time.[4] While Digital Emily was a pre-computed simulation, Digital Ira ran in real time and remained fairly realistic even as live-rendered animation; the field is rapidly moving from movies to computer games and leisure applications. Video footage includes USC Light Stage X.
    • The Presidential Portrait, by USC ICT in conjunction with the Smithsonian Institution, was made using the latest mobile light stage. It included texture, feature and reflectance capture with a high-resolution multi-camera setup and additional handheld scanners. A 3D-printed bust of the President was also produced.[5]

References

  1. Debevec, Paul; Hawkins, Tim; Tchou, Chris; Duiker, Haarm-Pieter; Sarokin, Westley; Sagar, Mark (2000). "Acquiring the reflectance field of a human face". Proceedings of the 27th Annual Conference on Computer Graphics and Interactive Techniques - SIGGRAPH '00. ACM. pp. 145–156. doi:10.1145/344779.344855. ISBN 1581132085. S2CID 2860203.
  2. "Scanning and printing a 3D portrait of President Barack Obama". University of Southern California. 2013. Archived from the original on 2015-09-17. Retrieved 2015-11-04.
  3. "Paul Debevec animates a photo-real digital face" - Digital Emily, 2008.
  4. Debevec, Paul (2013). "Digital Ira - A real-time animatable face demonstration". University of Southern California. Archived from the original on 2018-10-01. Retrieved 2013-08-10.
  5. "Scanning and Printing a 3D Portrait of President Barack Obama". usc.edu. Archived from the original on 2017-05-17. Retrieved 2017-02-24.