I bought my most expensive camera system last week: an iPhone 11 Pro. While the screen and battery life were things I was looking forward to, I was most excited about massively upgrading my smartphone camera. The potential to shoot portraits with a 52mm-equivalent lens (as well as landscapes, street shots, and architecture…50mm is my preferred focal length), plus general shots with a 26mm and a 13mm equivalent, was exciting. I’ve printed iPhone photos in the past and been happy with them, but would the new camera system live up to the marketing hype?
To be clear, I am by any definition a very amateur photographer. Which, I think, actually makes this review a bit more useful than most. I’m not reviewing the iPhone 11 Pro as a phone or assessing the entirety of the underlying operating system. I’m just focused on how well this device helps me make photos.
For the past few years I’ve shot with a bunch of cameras, including: an iPhone 6 and 7, Fuji x100, Sony rx100ii, and Olympus EM10ii. I’ve printed my work in a book, in photos of various sizes that are now hanging on my walls, and travelled all over the world with a camera in tow. I have historically tended towards street photography (broadly defined), some ‘travel’ photography (usually nature and landscape shots), abstracts, and admittedly relatively few portraits. If you want to get a rough assessment of the kinds, and quality, of photos that I take then I’d suggest you wander over to my Instagram profile.
I should be pretty clear, upfront: I make photos, not videos, and so have pretty well zero comments about the video camera functionalities on the iPhone 11 Pro. Also, if you’re looking for some raw technical stats on the iPhone cameras, I’d suggest you check out Halide’s assessment.
Body, Controls, and Handling
The iPhone 11 Pro is considerably larger in hand than the iPhone 7 that I came from. It’s also, with the Apple-branded clear case, quite slippery. This means that I’ve been super cautious about taking photos where dropping the phone might mean losing it forever (e.g., shooting outstretched over rivers and major highways). The buttons are significantly more solid than those on my iPhone 7 and, as such, I’m disinclined to use them as a shutter button for fear of messing up my composition or introducing camera shake. Though if I’m being honest, it was pretty rare that I used anything other than the on-screen shutter button on my iPhone 7.
The screen of the iPhone 11 Pro, itself, is bright and beautiful. It’s night and day between it and the iPhone 7. To activate the camera from the lock screen you press and hold the camera icon; after a second or so, the camera app will open and you’re probably ready to shoot. Probably, you may ask? Yes: there’s a glitch in iOS 13 where the camera app sometimes launches but the image of what you’re trying to capture isn’t shown on the display. The solution is to take a shot; afterwards, the display should show what the camera is seeing. Usually. But not always.
If you used burst mode a lot to get the right shot, get used to a lot of missed shots. In iOS 13, you press the shutter button in the camera app and slide to the left to initiate a burst; holding down on the shutter button starts recording a short video (slide to the right if you want to keep recording video without holding the button down). In actual use, I’ve ended up accidentally taking a bunch of short videos instead of bursts of shots, which meant I missed capturing what I wanted to capture. A ‘Pro’ camera should let me set photo controls. The iPhone 11 Pro fails, seriously and significantly, in this regard.
When composing a shot, you’ll routinely see what lies beyond the focal length you’re using. This means that, as an example, when you’re shooting with the 26mm lens, you’ll also see what would be captured by the 13mm lens. On screen, the extended parts of the scene which would be captured by the wider camera are slightly desaturated and sit outside the grid you can enable in the Camera app settings. Some reviewers have said that this looks like what you might see when looking through a rangefinder-style camera, like a Fuji x100. I fundamentally disagree: those reviewers have clearly not used a rangefinder for extended periods of time. With a rangefinder, you can see to the left and right of the frame when looking through the viewfinder, which is helpful in a camera you’ve raised to your eye, because the rest of your vision may be obscured and you may not realize what’s about to step into your frame. This is less of an issue when shooting with a smartphone. Much less of an issue.
If you rely on a tilting screen on a mirrorless camera or DSLR to get the shots you like, well, you’re going to be out of luck. It’s a camera phone without an articulating screen. Maybe Samsung’s folding phones will integrate this kind of feature into their camera app…
I haven’t shot using the flash, so I can’t comment on what it’ll be like to use.
If you’ve used the iPhone Camera app before, you’ll find that few things have meaningfully changed. The ‘big’ changes include a notification in the top left corner when night mode is activated (along with how many seconds the exposure will take) and an arrow along the top of the app that, if tapped, lets you toggle some of the default features (e.g., flash on/off/auto, live images on/off, timer, or filter). Despite being a ‘professional’ device—which has a pile of internal gyroscopes!—the camera app doesn’t include a horizon level, though if you’re taking flat-lay shots you’ll get an indicator to show whether you’re perfectly level.
I tend to see the stock Photos app as part of the controls of an iPhone camera. Some of the additions are good—tilt shifts in particular!—but I loathe losing how iOS 12 ‘grouped’ features into categories like light, colour, and black and white. And I really miss being able to adjust neutrals and tones in the black and white setting. Why’d you take those away, Apple? WHY!?
The battery life when I’ve taken the iPhone 11 Pro out for a day of shooting has been great; I was out for about 7 hours one day just to shoot and took about 250 photos, while listening to podcasts and reading news and such. I had 17% left after a full day’s normal use plus shooting, but I was shooting with a brand new battery in ideal temperatures for batteries (20–24 degrees). The real test will be when winter hits in countries like Canada or the northern USA and we see how well the batteries hold up in semi-hostile environmental conditions.
Image Quality

I’ve been super impressed with the camera system included in the iPhone 11 Pro. Despite being impressed, there are definitely areas where computational photography is still very much a work in progress.
I’ve been taken aback by just how much dynamic range this camera captures when I’ve been making photos. This is especially the case when I’ve used the camera in low-light or sheer dark conditions. As is true of almost all cameras, it generally performs admirably in well-lit situations. What follows is a (broad) selection of shots taken over a three-day period in indoor, midday, and sunset conditions; they are all edited to my taste, using just the stock Photos app.
I also did a late evening photowalk. It was pitch black (for a major city, anyway…) and so the following images are good representations of what urban photographers can probably pull off without a tripod. In many of the images I was bracing the camera either tightly against my body or against something in the natural environment (e.g., a tree trunk) to reduce camera shake.
I did run into some computational…weirdness…in some of the shots. When shooting the Cinesphere, I sometimes got a weird yellow arc that stretched along the top of the frame. When shooting scenes with the Cinesphere and the Japanese Temple Bell, there were also times when it looked like the upper right of the frame (proximate to the Cinesphere in the shot) had extremely severe vignetting. Finally, I noticed that I got lens flare when shooting at night; while this could be corrected in post using something like Snapseed, I can’t recall ever dealing with flare on a regular basis on prior iPhones.
Also, don’t buy this camera expecting to get cool light trails using the default camera application. While night mode takes a lot of exposures to create the final shot, you’ll only get the slightest blur from moving vehicles. Similarly, due to the fixed apertures of the cameras you’re not going to get any cool light flares or sun stars, nor can you seriously control the depth of field as you can with a camera that offers much more manual control.
Conclusion

The iPhone 11 Pro is a marvel of a camera system. Seriously: it’s spectacular for the size of the sensor, though it damn well better be given its sheer cost!
I can see this camera fitting into the lives of a lot of creative amateurs. (Probably professionals, too, but with grumbles.) For me, and for people at my skill level with photography, this is a major equipment investment that I think will be pretty great: it’s a supplement to, not a replacement for, the aging Sony rx100ii I carry with me on a day to day basis, and it’s genuinely fun to shoot on. The Photos app, while annoying in some of its reconfiguration, is generally more powerful than its last version. And the ability to easily and quickly shift between the 13–52mm focal range can’t be appreciated enough: it’s like having a permanent kit lens attached to your smartphone, and that’s just awesome.
Should you upgrade to or buy this camera system? I dunno. I had an older phone and totally could have stuck with it for another year or so, but I’m happy with my upgrade. For around $2,000 (CAD), though, you could get some really nice new glass, which might be a better investment if you’re always carrying your mirrorless camera or DSLR with you, or if better control of aperture, camera levels, or other ‘niceties’ is the core thing you’re looking for. But if you’ve increasingly been leaving your ‘big’ camera and glass at home, still want a lot of functionality when making photos on your smartphone, and have the disposable income, then you’ll probably be pretty happy with the iPhone 11 Pro.