One of the best updates in iOS 4.1, to me, was the ability to automatically take and merge HDR photos right on the iPhone. For those unfamiliar with HDR (High Dynamic Range) photography, it is a technique that combines multiple images to capture a tonal range beyond what most digital camera sensors (or film, for that matter) can record in a single exposure. The idea itself isn’t new at all; film photographers have used this bracketing technique since the invention of photography. But digital tools have made it far easier to achieve pleasing results, so HDR feels new again.
High Dynamic Range (HDR) Explained
In brief, when you take images bracketed across an exposure range and then combine them, either in camera or in software like Photoshop CS5 (which made huge improvements to its HDR process over CS2, CS3, and CS4) or Photomatix, you get a much richer, deeper tonal range than is possible with a single image. Digital SLR cameras can “see” about 6-8 stops of light, which works out to roughly 3-4 stops underexposed and 3-4 stops overexposed from the middle exposure value, whatever that middle value is deemed to be. The human eye, on the other hand, can see far more “stops” of light, some say 3-4 times that of a digital camera, so HDR is a little closer to what we see with our own eyes. It still doesn’t match the light sensitivity of the eye, but it’s closer than a single exposure.
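To put rough numbers on those stop counts: each photographic stop doubles the amount of light, so a range of n stops spans a contrast ratio of 2 to the n between the brightest and darkest tones recorded. A minimal sketch (the stop values below are the rough figures quoted above, not measurements):

```python
# Each photographic "stop" doubles the light, so n stops of dynamic
# range span a 2**n brightest-to-darkest contrast ratio.
def stops_to_contrast_ratio(stops: int) -> int:
    return 2 ** stops

print(stops_to_contrast_ratio(8))   # a ~8-stop DSLR sensor: 256:1
print(stops_to_contrast_ratio(24))  # an eye-like ~24 stops (3x the sensor): 16777216:1
```

Even at the low end of these estimates, the eye is covering a contrast range many thousands of times wider than the sensor, which is the gap HDR tries to narrow.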
With HDR, you generally take one middle (properly exposed) frame, one frame that is 2 stops underexposed (-2 EV), and another that is 2 stops overexposed (+2 EV). When combined, detail appears in both the dark shadows and the blown-out areas (white, overexposed spots). The iPhone now does this automatically. I took a few test shots to see how a camera phone would handle these types of exposures. The images below are straight out of the camera with zero processing, and all were taken with my iPhone. Each pair shows the non-HDR image first and the HDR image second.
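The merge step can be sketched as a simple exposure fusion: weight each pixel of each bracketed frame by how well exposed it is (how close it sits to mid-gray), then blend. This is a simplified illustration, not Apple’s actual algorithm; the function name and the Gaussian weighting are my own choices for the sketch:

```python
import numpy as np

def fuse_bracketed(images):
    """Blend bracketed exposures (float arrays in [0, 1], same shape) by
    weighting each pixel by its 'well-exposedness' -- nearness to mid-gray.
    A toy stand-in for real exposure-fusion / HDR merge algorithms."""
    stack = np.stack([np.asarray(im, dtype=np.float64) for im in images])
    # Pixels near 0.5 (mid-gray) get high weight; crushed blacks and
    # blown whites get low weight. 0.2 is an arbitrary Gaussian width.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2)) + 1e-8
    weights /= weights.sum(axis=0, keepdims=True)  # normalize per pixel
    return (weights * stack).sum(axis=0)

# Toy one-pixel bracket: underexposed (-2 EV), middle, overexposed (+2 EV)
under, mid, over = np.array([0.05]), np.array([0.4]), np.array([0.95])
fused = fuse_bracketed([under, mid, over])
```

In the toy example the fused value lands close to the well-exposed middle frame, because the nearly black and nearly white frames get tiny weights; in a real scene, different regions of the image draw on different frames, which is how shadow and highlight detail survive together.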
iPhone HDR Image Examples
These first two shots were actually the ones where I preferred the non-HDR image, mainly because it had deeper colors and a little more contrast, simply because of how the shot was framed and where the backlight fell.
These two shots were taken in an impossible condition for a camera: a bright light in a dark room. The second shot gives you an idea of how HDR opens up the range of exposed light. Even with the room light on, the image on the screen stays visible, where most of the time it would be blown out to white.
These two shots show the great potential of HDR. This shot of my friend Lee, though he wasn’t expecting to be the HDR subject, shows how well it works. The HDR version renders much more natural facial features, his white shirt shows more realistic texture and color, and the overall image looks better balanced.
Overall, HDR seems to be a fantastic addition to the iOS 4.1 package. Some well-known photographers, like Jarvis, are now getting incredible shots with the iPhone, and the HDR setting should only make those images better and better.