Why the iPhone 14 Pro Camera Is a Big Leap for Photo Enthusiasts – CNET

This story is part of Focal Point iPhone 2022, CNET’s collection of news, tips and advice around Apple’s most popular product.

Thanks to the iPhone 14 Pro, it’s a good time to be a serious photographer. Apple’s newest smartphone, which packs a 48-megapixel sensor, delivers significant improvements in image quality not just for the average person, but also for photo enthusiasts and pros trying to get the most out of their smartphone cameras.

I’m one of those people, shooting professionally some of the time and as a hobbyist the rest of the time. I’ve been putting my iPhone 14 Pro through its paces, and after six days of shooting and pixel peeping, I’m impressed with the technology improvements.

Three highlights in the iPhone 14 Pro and Pro Max stand out: the main camera's 48-megapixel resolution; better image quality on the main and ultrawide cameras, so photos look more natural; and improved low-light performance on all three rear cameras. Mainstream folks should appreciate them, as my colleague Patrick Holland observes in his iPhone 14 Pro review and camera testing, but serious shooters can really benefit.

The ability to shoot better photos is one of the more obvious ways you can see advancements in the latest iPhone. You might not notice processor speeds or display quality improving from one year to the next, but camera quality shows progress more visibly. And competitively, with Samsung offering powerful 10x zoom lenses and Google pioneering computational photography, Apple has to work hard to keep iPhone fans loyal.

Fortunately, Apple has raised its game too. I’ve scrutinized hundreds of photos to compare my iPhone 14 Pro with the iPhone 13 Pro. Here’s what I’ve learned.

The iPhone 14 Pro’s 48-megapixel camera is great

My favorite improvement to the iPhone 14 Pro is the 48-megapixel sensor on the main camera, the one that gets the most use. I love diving into the details of each photo.

I love peering at the details the iPhone 14 Pro's main camera captures at 48 megapixels. Even though high megapixel counts can impair image quality at the pixel level, Apple's sensor is big enough to capture good color and dynamic range. (Stephen Shankland/CNET)

For most folks, the iPhone 14 Pro models will shoot at 12 megapixels, combining four pixels on the image sensor into one through a process called pixel binning. Because Apple increased the sensor size, image quality improves compared with 12-megapixel shots on earlier phones.
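As a toy illustration of what 2x2 pixel binning does (not Apple's actual pipeline, which merges raw sensor data with far more sophistication than a plain average), here's the basic operation sketched in NumPy:

```python
import numpy as np

# Toy illustration of 2x2 pixel binning: each 2x2 block of sensor values
# is combined into one output pixel, quartering the resolution.
# (Apple's real pipeline is much more sophisticated than averaging.)
def bin_2x2(sensor: np.ndarray) -> np.ndarray:
    h, w = sensor.shape
    return sensor.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

raw = np.arange(16, dtype=float).reshape(4, 4)  # pretend 4x4 sensor
binned = bin_2x2(raw)
print(binned.shape)  # (2, 2) -- a quarter of the pixels
```

Each output value pools the light gathered by four neighboring photosites, which is why binned pixels behave like bigger, more light-sensitive ones.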

But the more adventurous can shoot at the full 48-megapixel resolution. That quadruples the pixel count and roughly triples file sizes, but gives you the flexibility to crop or rotate your photos without losing detail and resolution.

If you like viewing or printing your photos in large sizes, having 48 megapixels is great. At 240 pixels per inch, a common setting for high-quality prints, you can print 48-megapixel photos at a 25.2×33.6 inch size instead of 12.6×16.8 inches for 12 megapixels.
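The print-size arithmetic is simple to check, assuming the iPhone's 4:3 aspect ratio and Apple's published resolutions of 8064x6048 (48 megapixels) and 4032x3024 (12 megapixels):

```python
# Largest print size at a given pixels-per-inch, for the iPhone's 4:3 sensors.
# Resolutions are Apple's published figures: 8064x6048 (48MP), 4032x3024 (12MP).
def print_size(width_px: int, height_px: int, ppi: int = 240) -> tuple[float, float]:
    """Return the largest print, in inches, at the given pixel density."""
    return (round(width_px / ppi, 1), round(height_px / ppi, 1))

print(print_size(8064, 6048))  # 48 MP -> (33.6, 25.2) inches
print(print_size(4032, 3024))  # 12 MP -> (16.8, 12.6) inches
```

Doubling the linear resolution doubles each print dimension, which is why the 48-megapixel print covers four times the area.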

To take 48-megapixel shots, you must use Apple’s ProRaw format, an option enabled through the camera app’s format settings. Many serious photographers already prefer that for its advantages in editing: better flexibility with color, exposure, sharpening. ProRaw is a computational raw format, meaning that it combines multiple frames into one photo and performs other tricks to squeeze as much image quality as possible out of a smartphone’s relatively small sensor.

At left, the 12.6×16.8-inch print you can make with a 12-megapixel photo printed at 240 pixels per inch. In the middle, the iPhone 14 Pro for scale. At right, the 25.2×33.6-inch print you can make with the iPhone 14 Pro's 48-megapixel main camera. (Stephen Shankland/CNET)

Low-light shooting is better on the iPhone 14 Pro

The bigger sensor on the iPhone 14 Pro's main camera improves shooting at nighttime or when conditions are dim. I compared a lot of low-light photos, many of them taken with Apple's night mode and some with the phone mounted on a tripod to eliminate problems from shaky hands.

The 14 Pro’s main camera takes appreciably better shots, preserving more shadow detail and color than the 13 Pro. The dynamic range is also better, capturing a broader range between bright and dark. It’s not what you’d get out of a full-frame SLR or mirrorless camera from the likes of Sony, Nikon or Canon, but it’s impressive.

The comparison above shows the same nighttime shot, deliberately overexposed to reveal shortcomings in darker parts of the scene. The iPhone 13 Pro photo at left suffers from more noise, less detail, and worse color than the iPhone 14 Pro shot at right. Both were shot at 12-megapixel resolution with Apple’s night mode.

The ultrawide camera gets a similar improvement from the 13 Pro to the 14 Pro, though its performance isn’t as good as the main camera’s. Nighttime shots peering into my house show less noise where it’s dark and better detail and color everywhere.

If you edit your photos, that translates to more flexibility. You can boost shadows and ease blown-out highlights without introducing as many artifacts like noise speckles or posterization, where there’s not enough data for smooth tonal gradations.

And the better dynamic range helps when it's bright, too, for example with bright skies that look more natural.

Nighttime shot of a city with dark trees, bright house lights, and a deep blue sky
I'm impressed with the iPhone 14 Pro's dynamic range, capturing bright highlights and deep shadows pretty well in this nighttime scene shot at 12 megapixels with pixel binning technology. It's more natural looking and less afflicted by noise speckles than the same shot with the iPhone 13 Pro. (Stephen Shankland/CNET)

The telephoto camera captures more detail

For several years, Apple has used an AI-based image analysis technique called Deep Fusion to preserve details and color in dim and dark lighting. In the latest iPhone 14 generation, Apple’s Photonic Engine technology runs Deep Fusion earlier in the image processing pipeline to preserve texture and color better.

It works on all the cameras, but I appreciate it most on the telephoto camera, which otherwise doesn't appear to have changed from the 13 Pro to the 14 Pro.

In one shot of a houseplant I took in the evening, I could clearly see fine detail on the 14 Pro’s shot that was absent with the 13 Pro. You can compare the two above.

For another photo of a dark oil painting I shot when it was dim, the iPhone 13 Pro chose to use its main camera and upscale the photo digitally, with predictably mushy results. The iPhone 14 Pro used its telephoto camera and captured vastly more detail, aided perhaps by the Photonic Engine and by improved image stabilization.

Apple didn’t sacrifice image quality for 48 megapixels

Notably, Apple upgraded the main camera with a good balance of resolution and picture quality.

Increasing pixel count can require decreasing pixel size to fit them all on a sensor. The problem is that smaller pixels are worse when it comes to color, noise, and capturing a broad dynamic range between bright and dark areas of an image.

When using pixel binning to produce 12-megapixel images, the iPhone 14 Pro’s pixels are effectively 65% larger than on the iPhone 13 Pro, and image quality improves accordingly. But while shooting at 48 megapixels, even though the pixels are 59% smaller than on the 13 Pro, they’re still big enough to produce photos with marvelous detail and good color.
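Those percentages are comparisons of pixel area, not edge length. Working from the published pixel pitches (1.9 microns on the 13 Pro; 1.22 microns native and 2.44 microns binned on the 14 Pro), the arithmetic checks out:

```python
# Pixel *area* comparison from pixel pitch (photosite edge length, in microns).
# Pitches per Apple's specs: 1.9um (13 Pro); 1.22um native / 2.44um binned (14 Pro).
def area_change_pct(new_pitch_um: float, old_pitch_um: float) -> int:
    """Percent change in pixel area going from the old pitch to the new one."""
    return round(((new_pitch_um / old_pitch_um) ** 2 - 1) * 100)

print(area_change_pct(2.44, 1.9))  # binned 14 Pro vs. 13 Pro: +65%
print(area_change_pct(1.22, 1.9))  # native 14 Pro vs. 13 Pro: -59%
```

Because area scales with the square of the pitch, a modest change in edge length produces the large-sounding percentage swings quoted above.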

The iPhone 14 Pro's main camera photographed our dog's fur with crisp detail even at 48 megapixels. This is a crop 2,048 pixels wide. (Stephen Shankland/CNET)

The small pixels would be a problem when conditions are dim. But when shooting in night mode or using a flash, the iPhone 14 Pro sidesteps the problem by shooting only 12-megapixel photos.

The 2x telephoto camera is a cool trick

A clever trick with the 48-megapixel camera is just using the central quarter of the image to take 12-megapixel shots with a narrower field of view. Apple shows this option as a 2x camera in the camera app.
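The crop math works out neatly, assuming Apple's published 8064x6048 sensor resolution and the main camera's 24mm-equivalent lens: half the width and half the height is a quarter of the pixels, and halving the field of view doubles the effective focal length.

```python
# The 2x mode is a center crop: half the width, half the height.
# That's a quarter of the pixels and double the effective focal length.
FULL_W, FULL_H = 8064, 6048      # 48MP main sensor, per Apple's specs
MAIN_FOCAL_MM = 24               # main camera's 35mm-equivalent focal length

crop_w, crop_h = FULL_W // 2, FULL_H // 2
megapixels = crop_w * crop_h / 1e6
print(crop_w, crop_h)            # 4032 3024 -> ~12.2 MP
print(MAIN_FOCAL_MM * 2)         # acts like a 48mm-equivalent lens
```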

It’s a good idea because 2x zoom is often nicer than the more dramatic 3x telephoto camera for midrange subjects. It works with video too.

Optics nerds will rightly point out that the lens properties haven’t changed, which means you get some wide-angle issues like somewhat bulbous noses in portraits. Whatever. It’s still a useful option, and the image quality is good enough when it’s not dim.

The main camera has a usefully wider view

Apple broadened the main camera’s field of view from an equivalent of 26mm focal length to a wider angle 24mm. Especially given the option to shoot at 48 megapixels, I’m happy with that.

Many of us shoot indoors where it’s impractical to walk backward to get everybody in a group shot, so a wider field of view is justified. You can always shoot with the ultrawide camera, but its worse image quality is really apparent when it’s dim.

Shooting in 48 megapixels is slower

One downside to the high-resolution photos: It typically takes more than a second to take a 48-megapixel shot. After you tap the shutter button, it snaps a photo and the phone churns away for a moment before the shutter button becomes available again.

In contrast, taking the same scene at 12 megapixels is much snappier. If you’re shooting fast-moving subjects, stick with the lower resolution.

I’d like an on-screen setting to switch between 12 and 48 megapixels so I don’t have to dig into the camera app’s format settings to change it. But I understand Apple’s preference for a simpler interface, uncluttered with buttons photographers might regret accidentally pressing. I’ll mostly shoot at 48 megapixels.

Apple dials back the oversharpening

One of my longstanding complaints about iPhone photos is that Apple sharpens images too much, cranking up the mathematical algorithm that emphasizes contrasting edges. The result is a crispy look that's artificial and distracting to my eye. Indeed, one reason I shoot ProRaw is so I can choose a lower level of sharpening.

I'm happy to report that Apple has eased back. Adobe Lightroom, the software I use to edit and catalog photos, shows sharpening set to 40 instead of the 50 Apple has used for years. I often dial it back further, to 20 or 30, to produce a more natural, less digitally processed look.

I like that the iPhone 14 Pro doesn't sharpen images as much. In this cropped portion of a photo shot with the main camera at 48 megapixels, the blades of grass aren't distractingly crispy from oversharpening, a problem I've had with earlier iPhones. This is the default level, but I might dial it down a bit in editing. (Stephen Shankland/CNET)

Your preferences may vary, of course. Adobe points out that people like more sharpening, contrast and color saturation when they’re looking at photos on smaller screens. Even if Apple is optimizing more for phone viewing, I still prefer sharpening that looks more natural to me.

Those 48-megapixel photos take up more space

If you’re pondering how much storage to buy with a new iPhone, factor in that 48-megapixel images take up roughly three times as much space.

Apple says the 12-megapixel shots are about 25MB and 48-megapixel shots are about 75MB. That varies depending on whether you’re shooting simple or complex scenes. The biggest sizes I found were, for one scene busy with lots of leaves, 43MB for 12 megapixels and 125MB for 48 megapixels. Both shots were in ProRaw and framed the same with a tripod.
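Those file sizes make rough storage planning easy. A back-of-the-envelope sketch using the article's ballpark figures (actual sizes swing a lot with scene complexity):

```python
# Rough storage planning with ballpark ProRaw file sizes from this article.
# Actual sizes vary widely with scene complexity, so treat these as estimates.
PRORAW_48MP_MB = 75
PRORAW_12MP_MB = 25

def shots_per_gb(free_gb: float, shot_mb: float) -> int:
    """How many shots of a given average size fit in the free space."""
    return int(free_gb * 1000 / shot_mb)  # 1GB = 1000MB, as storage makers count

print(shots_per_gb(128, PRORAW_48MP_MB))  # ~1,706 48-megapixel shots in 128GB
print(shots_per_gb(128, PRORAW_12MP_MB))  # ~5,120 12-megapixel shots
```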

And if you’re shooting ProRes video, another high-end format from Apple, factor in even more space. One 18-minute video I just shot in ProRes gobbled up 27GB of storage space.

The iPhone 14 Pro’s camera bump is huge

You can’t get a big sensor into a phone without using a big lens, and the price you pay for the iPhone 14 Pro’s better image quality is a chunkier phone. I carry a DSLR around to hikes, birthday parties and conferences, so you won’t be surprised to hear I’m fine with Apple’s choice.

The three rear cameras of a deep purple iPhone 14 Pro point upward
Three rear-facing cameras protrude 4.18mm from the back of Apple's iPhone 14 Pro. (Stephen Shankland/CNET)

Ryan Jones has tracked iPhone camera thickness over the years, and the 14 Pro has the biggest optics package yet. It protrudes 4.18mm beyond the rest of the 7.85mm thick iPhone 14 Pro body. In comparison, the iPhone 13 Pro’s body was about the same at 7.65mm thick, but the cameras protruded 3.6mm.

Apple styles its three cameras to look like traditional cameras. It’s a pain cleaning the pocket lint out from between the barrels, but the intermediate material would add more weight to an already hefty phone. And the protruding cylinders convey a message of photographic seriousness in the same way hulking telephoto lenses on traditional cameras do. 

Just try not to let the camera size go to your head.