Photography, as your parents or grandparents knew it, is a dying art.
If 20 years ago the idea of a photograph was to capture an important moment in one's life as authentically as possible, today we're living in a different world. Fair enough, not everyone owned a camera in 1980, and phones have made this once-unattainable thing very accessible, which is wonderful!
However, as it turns out in 2022, the world is less about authenticity and more about “making everything better” – whatever that’s supposed to mean. Nowadays, photos (amongst other things) are supposed to enhance our reality and make it “cool and fun”. Your baby can have bunny ears, and you can puke rainbows.
But there’s more beyond Snapchat filters that enhance the way your photos look, and it all boils down to something called computational photography. It’s the “hidden filter” that makes photos taken with your phone look “ready to share online”.
This little experiment will try to show off the pros and cons of modern computational photography-enabled phone cameras, and the phone I've chosen is Apple's iPhone 13 – one of the most popular phones of the past ten months.
Before I show you a bunch of “before and after” sample photos, let me establish something: I’m well aware that people like photos that are ready to be shared online. And while I might not be one of them, I think I might know what happened here…
In a nutshell, social media played a huge role in the demand for "Instagram-ready" photos (a term we actually use in the tech community). Speaking of The Gram, ever since it emerged in 2010, the photo- and video-sharing social network has encouraged the use of bold filters with exaggerated colors. People simply couldn't resist them, which, of course, meant Apple and Android makers would jump on board…
For instance, Instagram was the reason Apple felt the need to include a Square photo mode in the iPhone 5S (2013), which remained part of the iPhone's camera for nearly a decade. Even more importantly, this was around the time when iPhone and Android makers started adding photo filters to their stock camera apps, because the Instagram fever had made it clear that people liked filters.
Technically, the Nexus 6P's HDR+ algorithm was available on the Motorola-made Nexus 6 too, but it was the Nexus 6P that really helped establish Nexus as one of the best smartphone lines for taking photos.
What HDR+ did was "advanced image stacking": it was part of the post-processing stage of taking a photo with the Nexus 6P/Nexus 5X, and its role was to balance out the highlights and shadows in high-contrast scenes – one of the biggest challenges for phones back in 2014-2015 (alongside the sheer inability to produce usable night photos).
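The core idea behind that kind of image stacking can be sketched in a few lines of Python. To be clear, this is not Google's actual HDR+ pipeline – just a minimal, hypothetical illustration of the two steps the article describes: merging a burst of aligned short exposures to cut noise, then applying a shadow-lifting tone curve so the result balances highlights and shadows:

```python
import numpy as np

def merge_and_tonemap(frames, gamma=0.6):
    """Toy sketch of burst stacking: average a burst of aligned
    short-exposure frames (values in [0, 1]) to reduce noise, then
    apply a gamma curve (gamma < 1) to lift the shadows while the
    short exposures keep the highlights from clipping."""
    stack = np.stack(frames).astype(np.float64)  # shape: (N, H, W)
    merged = stack.mean(axis=0)                  # averaging N frames cuts noise ~sqrt(N)
    merged = np.clip(merged, 0.0, 1.0)
    return merged ** gamma                       # brightens shadows more than highlights
```

The reason this works is that underexposing protects highlights, and averaging many underexposed frames recovers the shadow detail that a single short exposure would bury in noise – which is the "balance out the highlights and shadows" effect described above.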
Anyway, a short verdict on HDR+: it made the Nexus 6P one of the best phones for taking photos. Sure, my bias plays a role in that statement (I never bought a Nexus 6P, only because I couldn't afford it), but there was no denying that the somewhat darker photos Google's 2015 flagships took had something very appealing to them. Other tech enthusiasts loved them too.
Light, highlights and shadows: What photography really should be about
Taken on iPhone 13 and edited with a monochrome filter and adjusted highlights/shadows.
It wasn't until about a year ago that I watched a brilliant 24-minute video by David Imel, which helped me verbalize what I was feeling about the time when the Nexus 6P's and original Google Pixel's cameras ruled the phone camera industry.
To sum up 24 minutes of storytelling, David draws a parallel between modern computational photography and classical art, all in an attempt to explain the importance of light for both photography and paintings.
What he's trying to explain is that in the early days of photography, the artistic control/element (in photos) was founded entirely "on the intensity of the highlights and the deepness of the shadows" – like in paintings. Those are used to evoke feelings and create depth through tonality in our photos. This is especially evident in monochrome photography, where light, shadows, and highlights are pretty much the only elements that create nuance and perspective.
But, as he says, “computational speed was advancing a lot faster than physics was changing”, and it looks like this is why I don’t like many of the photos that my super-powerful iPhone 13 takes and wish they were more like the original Google Pixel’s images.
Apple, Samsung, and Google have abandoned the original idea of HDR. There's no High Dynamic Range in my photos, because there's not much range left between the highlights and the shadows, or in the overall brightness and exposure.
iPhone 13, Galaxy S22, Pixel 6 take photos that don’t represent reality and aren’t always more appealing than what the real scene looks like
What we see here are a bunch of photos I’ve taken with the iPhone 13 in full auto mode. It’s important to note that I didn’t start taking photos in order to make my point, but the photos the iPhone 13 gave me became the reason to write this story…
Anyway, iPhone 13 photos taken in Auto mode are on the left, and the same photos, which I've edited, are on the right. I've adjusted them not to my liking, but to match the authenticity of the scene at the time (and to the best of my ability).
I chose to edit the photos using the iPhone’s photo editing abilities because that’s what most people have access to. Of course, Lightroom would’ve given me a lot more (and better) control over the different properties of the images (which weren’t taken in RAW format), but that’s not the idea here.
If you're curious, what helped me most in my attempt to get the iPhone 13's photos to look truer to the scene was dragging the Brightness and Exposure sliders way back – which tells me photos taken with modern phones are simply too bright. Then, some Brilliance, Highlight, and Shadow adjustments helped me get an even more accurate result.
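For illustration, pulling exposure back is conceptually just a multiplicative scale on linear pixel values. The sketch below is my own simplification (the function name and the linear-light assumption are mine, not how Apple's editor is actually implemented), showing why a negative "stops" adjustment darkens an overly bright image:

```python
import numpy as np

def pull_exposure(img, stops):
    """Scale linear-light pixel values (in [0, 1]) by 2^stops.
    Negative stops darken the image, mimicking dragging an
    Exposure slider down; results are clipped back to [0, 1]."""
    return np.clip(img * (2.0 ** stops), 0.0, 1.0)
```

Pulling one stop (`stops=-1`) halves every linear pixel value, which is roughly the "way back" adjustment described above.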
iPhone 13, Galaxy S22 and Pixel 6 showcase the problems of modern HDR and computational photography
The iPhone 13’s videos are also way too bright. The default video screenshot is on the left, and the edited video screenshot is on the right – that’s what I was seeing when I took the video.
The results tell me that computational photography on phones today is a hit or miss.
On the one hand, some people will like the default output of the iPhone 13, Galaxy S22, and Pixel 6 (the Galaxy also takes photos that are too bright, while the Pixel's are incredibly flat), because they are "sharable". But even if we leave authenticity aside, I'd argue the iPhone's processing doesn't actually make photos look "better" than the scene itself did. Take another glance at the samples shown above. Which photos do you like more – the ones on the left or the ones on the right?
As Ramesh Raskar of the MIT Media Lab explains, there are three elements to (modern) photography: Capture, Process, and Display.
Photos and even videos taken with the iPhone 13 and other modern phones often appear too bright, oversharpened, too flat, and ultimately "lifeless". Sure, they might capture both the highlights and shadows incredibly well and even turn night into day thanks to Night Mode, but without balance and natural contrast, photos taken with most phones won't evoke any feelings…
But hey! They look fine on Instagram.
In the end: There’s light at the end of the tunnel of computational photography thanks to Sony and Xiaomi
On the left is the iPhone 13’s interpretation of the scene, and on the right is what I was seeing with my own eyes – clearly defined highlights and shadows! I had to drag the exposure slider way down before snapping the photo on the right, which is what you can do in certain high-contrast scenes.
To end on a positive note, there’s light (pun intended) at the end of the tunnel!
Unlike Apple and Samsung, companies like Sony have always tried to stick to the basics of photography, and that's evident in the fact that the Sony Xperia 1 IV has incredible processing power but doesn't even include a Night Mode in its camera. The phone also brings the first continuous optical zoom on a modern smartphone, which is as close to a "real camera zoom" as we've ever gotten.

And then, of course, we have the Xiaomi 12S Ultra, which uses a full 1-inch sensor and Leica's magic to deliver some of the best photos I've ever seen come out of a phone camera (if not the very best). Xiaomi and Leica chose to let the shadows be shadows, avoid oversharpening, and rely on groundbreaking hardware, which (shocker!) results in photos with incredible depth and natural detail.
The Xiaomi 12S Ultra is the Nexus 6P of 2022.
So, I call for Apple, Samsung, and even Google to go back and look at the original Pixel; go back and look at the iPhone 4S (as unimpressive as its camera might seem today), and bring back the realism in our photos. I’m sure that with the increasing power of hardware and software, a touch of authenticity can go a long way!
And you know – for those who want bright and saturated photos… Give them filters!
Source: phonearena.com