Making light work: part 4
Published 07 June 2007
In the final part of Darren Brooker's four-part comparison of the interior lighting methods available within 3ds Max, he looks at the role of HDR lighting and at relighting your scene with both 3ds Max and combustion
Having started this four-part series with the Global Illumination tools available within 3ds Max's scanline renderer, then moving on to the equivalent photon mapping techniques within mental ray, we progressed to using standard lights to fake the look of Global Illumination. In this, the last part of the series on lighting in 3ds Max, we'll look at using mental ray again to create High Dynamic Range (HDR) images, which can then be used to light and relight your scene with the help of combustion.
Pros:
1. Simple to set up
2. Perfect for match lighting
3. Versatile, especially in a compositing pipeline

Cons:
1. Generation of HDR maps can be tricky
2. Sampling settings can be difficult to optimise
3. Relatively slow to calculate
Tips & Tricks
Regular 8-bit RGB images store an 8-bit value for each channel, giving us 256 increments of Red, Green and Blue, which means that all whites are clipped at R:255, G:255, B:255. HDR images, on the other hand, are floating-point images, and are capable of storing a far higher luminance range. When both HDR and 8-bit images are displayed on our 8-bit monitors, we cannot see the difference between the white of a wall and the white of the sun, but internally, when working with HDR images, the renderer sees a luminance value of 300 allocated to the wall, but a luminance value of 10,000 allocated to the sun.
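The difference between clipping and floating-point storage can be sketched in a few lines of Python (the luminance values 300 and 10,000 are the wall/sun figures from the paragraph above; `to_8bit` is a hypothetical helper, not part of any renderer):

```python
# An 8-bit channel clips at 255: anything brighter than "white" is lost.
def to_8bit(luminance):
    return min(int(luminance), 255)

wall, sun = 300.0, 10000.0

# Both values collapse to the same clipped white in an 8-bit image:
print(to_8bit(wall), to_8bit(sun))  # prints "255 255" -- indistinguishable

# A floating-point (HDR) pixel keeps the real values, so the renderer
# still knows the sun is roughly 33 times brighter than the wall:
print(sun / wall)
```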
HDR images are generally created from photographs taken at different exposure settings. By exposing for the brightest light in the scene and working incrementally down to the dimmest, all highlights and shadows are captured at a mid-tone exposure level somewhere in the sequence. These photographs are then assembled into a single image using software such as HDRshop (www.hdrshop.com), as shown in Figure 1; the luminance levels are essentially layered, which gives the brightest highlights their super-white values in the resultant HDR image. Within a film pipeline, it took a relatively long time for HDR images to catch on (the first real application was in 2000's X-Men), but it is now commonplace for chrome spheres to be photographed at multiple exposures on set, so that CG content can easily be lit to match the live action backplate.
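The assembly step can be illustrated with a much-simplified weighted-average merge. This is only a sketch of the idea, not HDRshop's actual algorithm (real tools also recover the camera's response curve); `merge_exposures` and its weighting scheme are assumptions for illustration:

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Merge bracketed 8-bit exposures into one HDR radiance map.

    Simplified weighted average: each pixel's value divided by its
    exposure time estimates scene radiance; mid-tone pixels are
    trusted most, clipped or near-black pixels least.
    """
    images = [np.asarray(img, dtype=np.float64) for img in images]
    num = np.zeros_like(images[0])
    den = np.zeros_like(images[0])
    for img, t in zip(images, exposure_times):
        # Weight peaks at mid-grey (128) and falls to zero at 0 and 255.
        weight = 1.0 - np.abs(img / 255.0 - 0.5) * 2.0
        num += weight * (img / t)
        den += weight
    return num / np.maximum(den, 1e-6)

# One bright scene pixel photographed at three shutter speeds;
# it clips to 255 on the slowest exposure, so that sample is ignored.
hdr = merge_exposures(
    [np.array([8.0]), np.array([64.0]), np.array([255.0])],
    exposure_times=[1 / 500, 1 / 60, 1 / 8],
)
print(hdr)  # a radiance value well above the 8-bit ceiling of 255
```

The recovered radiance comes almost entirely from the two unclipped exposures, which is exactly why bracketing captures the super-whites a single photograph would lose.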
Just as chrome spheres can be used to capture a live action environment, HDR output can also be rendered as spherical images, which does essentially the same thing: capturing what would be reflected if a chrome ball were photographed on set. Calculating floating-point values in the first place requires a floating-point renderer, which is where mental ray comes in.