Maggi Hambling’s ‘Scallop’ tribute to Benjamin Britten on Aldeburgh beach, on which he often used to walk. Photographed this morning.
This morning’s Observer column:
If you’re a keen photographer (which this columnist is), one of the things you prize most is a strange property called bokeh. It’s the aesthetic quality of the blur produced in the parts of an image that are not of central interest – the way a lens renders out-of-focus points of light. You often see it in great portraits: the subject’s eyes are razor-sharp but the – potentially distracting – background is fuzzy.
In the era when all photography was analogue, the only way to get good bokeh was to use lenses that produced a narrow depth of field at wide apertures. Since the optical performance of most lenses decreased at such apertures, serious photographers faced a trade-off: their lust for bokeh involved compromising on overall image quality. And the only way round that was to spend money on lenses of complex design and exceedingly high optical quality. Neither of these came cheap: a photo buff of my acquaintance, for example, recently laid out a small fortune for a Leica Noctilux f/0.95 aspherical lens, which, its manufacturer claims, provides “unique bokeh”. (At a retail price of £9,100 it jolly well ought to.)
Enter Apple, which was once a struggling computer company…
It’s hard to believe, but Apple has 800 people working just on the iPhone camera. Every so often, we get a glimpse of what they are doing. Basically, they’re using computation to enhance what can be obtained from a pretty small sensor. One sees this in the way HDR (High Dynamic Range) seems to be built into every iPhone X photograph. And now we’re seeing it in the way the camera can produce the kind of convincing bokeh (the blur in the out-of-focus parts of an image) that could hitherto be got only from particular kinds of optical lenses at wide apertures.
Matthew Panzarino, who is a professional photographer, has a useful review of the new iPhone XS in which he comments on this:
Unwilling to settle for a templatized bokeh that felt good and leave it at that, the camera team went the extra mile and created an algorithmic model that contains virtual ‘characteristics’ of the iPhone XS’s lens. Just as a photographer might pick one lens or another for a particular effect, the camera team built out the bokeh model after testing a multitude of lenses from all of the classic camera systems.
Particularly striking, though, is an example Panzarino gives of how a post-hoc adjustable depth of focus can be genuinely useful. He shows a photograph of himself with his young son perched on his shoulders.
And an adjustable depth of focus isn’t just good for blurring, it’s also good for un-blurring. This portrait mode selfie placed my son in the blurry zone because it focused on my face. Sure, I could turn the portrait mode off on an iPhone X and get everything sharp, but now I can choose to “add” him to the in-focus area while still leaving the background blurry. Super cool feature I think is going to get a lot of use.
Yep. Once, photography was all about optics. From now on it’ll increasingly be about computation.
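To make the idea concrete, here is a minimal sketch of how depth-based synthetic bokeh might work in principle. This is emphatically not Apple’s algorithm – the function names, the per-pixel depth map, and the simple box blur are all illustrative assumptions – but it shows the core trick: because the blur is computed from depth after capture, the focal plane can be moved (or widened to “add” a subject) at any time.

```python
import numpy as np

def box_blur(img, radius):
    """Average each pixel over a (2r+1)x(2r+1) window (edge-padded).
    A stand-in for the far fancier blur kernels a real camera would use."""
    if radius == 0:
        return img.astype(float)
    p = np.pad(img.astype(float), radius, mode="edge")
    acc = np.zeros(img.shape, dtype=float)
    h, w = img.shape
    for dy in range(2 * radius + 1):
        for dx in range(2 * radius + 1):
            acc += p[dy:dy + h, dx:dx + w]
    return acc / (2 * radius + 1) ** 2

def synthetic_bokeh(img, depth, focus, tolerance, max_radius=3):
    """Blur each pixel in proportion to how far its depth lies from the
    chosen focal plane; pixels within `tolerance` of it stay sharp.
    `img` is a 2-D greyscale array, `depth` a per-pixel depth map
    (hypothetical inputs - a real phone derives depth from dual
    cameras or dual-pixel sensors)."""
    dist = np.abs(depth - focus)
    # map depth distance to a blur radius, clipped to max_radius
    radius = np.clip(((dist - tolerance) / tolerance).astype(int) + 1,
                     0, max_radius)
    radius = np.where(dist <= tolerance, 0, radius)
    # precompute each blur level, then pick per pixel by its radius
    blurred = {r: box_blur(img, r) for r in range(max_radius + 1)}
    out = np.zeros(img.shape, dtype=float)
    for r in range(max_radius + 1):
        out[radius == r] = blurred[r][radius == r]
    return out
```

Because `focus` and `tolerance` are just parameters, “refocusing” after the fact – as in Panzarino’s selfie example – is simply a matter of calling the function again with a wider tolerance band that takes in both faces.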
We go to Provence every summer, and for many years we flew there and rented cars. Then one year we decided that this was silly, and so now we drive down south — slowly, on those wonderful secondary roads that the French excel at. By the time we get to Arles after three leisurely days on the road, we have completely acclimatised to life in the South.
The thing that proved crucial in making the change was Eurotunnel. It enables us to drive to Folkestone and then onto a train (as in the photograph). 25 minutes later we drive off and are on our way to Burgundy. It still seems like magic. And in a way it is.