An Amateur Photographer’s Long-Term Review of the iPhone 12 Pro Camera System

I bought an iPhone 12 Pro mid-cycle in March 2021 and have been shooting with it for the past several months in a variety of weather conditions. I was very pleased with the iPhone 11 Pro, with the exception of the green lens flares that too frequently erupt when shooting with it at night. Consider this a longish-term review of the 12 Pro with comparisons to the 11 Pro, interspersed with photos taken exclusively with the 12 Pro and edited in Apple Photos and Darkroom on iOS.

Background

I’m by definition an amateur photographer; I shoot using my iPhone as well as a Fuji X100F, and get out to take photos at least once or twice a week during photo walks that last a few hours. I don’t earn any money from making photos and shoot purely for my own personal enjoyment. Most of my photos are street or urban photography, with a smattering of landscape shots and photos of friends and family thrown in.

To be clear up front: this is not a review of the iPhone 12 Pro, proper, but just the camera system. This said, it’s worth noting that the hardware differences between the iPhone 11 Pro and 12 Pro are pretty minor. The 26mm lens is now f/1.6 and the 13mm can now be used with Night Mode. At a software level, the 12 Pro introduced the ability to shoot Apple ProRAW and Smart HDR 3, and brought Deep Fusion to all of the cameras to improve photographic results in low to middling light. Deep Fusion, in particular, has no discernible effect on the shots I take, but maybe I’m just not pixel peeping enough to see what it’s doing.

For the past few years I’ve shot with a number of cameras, including: an iPhone 6, 7, and 11 Pro, a Fuji X100 and X100F, a Sony RX100 II, and an Olympus EM10ii. I’ve printed my work in a couple of personal books, printed photos from all these systems at various sizes, and hung the results on my walls. When I travel it’s with a camera or two in tow. If you want a rough gauge of the kinds of photos I take you might want to take a gander at my Instagram.

Also, while I own a bunch of cameras, photos are my jam. I’ll be focusing mostly on how well the iPhone 12 Pro makes images with a small aside to talk about its video capabilities. For more in-depth technical reviews of the 12 Pro I’d suggest checking out Halide’s blog.

The Body

The iPhone 11 Pro had a great camera system, but it was always a bit awkward to hold the phone when shooting because of its rounded edges. Don’t get me wrong: those edges made the phone feel more inviting than the 12 Pro, but they were less ideal for actual daily photography, and I find it easier to get, and retain, a strong grip on the 12 Pro. Your mileage may vary.

I kept my 11 Pro in an Apple silicone case and I do the same for the 12 Pro. One of the things I do with some regularity is press my phone right against glass to reduce glare when I’m shooting through a window or other transparent surface. With the 12 Pro’s silicone case I can do this without the glass I’m pressed against actually touching the lens, because there are a few millimetres between the case and the lens element. The same was also true of my 11 Pro and the Apple silicone case I had attached to it.

I like the screen of the 12 Pro, though I liked the screen in the 11 Pro as well. Is there a difference? Yeah, a bit, insofar as blacks are blacker on the 12 Pro, but I wouldn’t notice the difference unless the 11 Pro and 12 Pro were right against one another. I can see both clearly enough to frame shots on sunny days while shooting, which is what I really care about.

While the phone doesn’t have a tilting screen to help frame shots, you can put it on a tripod to angle it and then frame and shoot remotely using an Apple Watch if you have one. It’s a neat function, and the Watch can also act as a paired screen when you’re taking video with the main lenses. I tend to shoot handheld, however, and so have only used the Apple Watch trick when shooting a video with the main cameras on the back of the 12 Pro.

I don’t ever really use the flash so I can’t comment on it, though I do occasionally use the flash as a light to illuminate subjects I’m shooting with another camera. It’s not amazing but it works in a pinch.

The battery is so-so based on my experience. The 12 Pro’s battery is a tad smaller than the one in my 11 Pro, which means less capacity, though in the five months I’ve owned the 12 Pro the battery health hasn’t degraded at all, which wasn’t the case with the 11 Pro. This said, if I’m out shooting exclusively with the 12 Pro I’m going to bring a battery pack with me, just like when I went out for a day of shooting with the 11 Pro. If it’s not a heavy day of shooting, however, I reliably end the day with 20% or more battery after the 12 Pro has been off the charger for about 14-17 hours of middling usage.

Probably the coolest feature of the new 12-series iPhones is their ability to use magnetic attachments. I’ve been using a Luma Cube Telepod Tripod stand paired with a Moment Tripod Mount with MagSafe; it’s been pretty great for video conferences and, in my opinion, is the best hardware addition to the 12 line of phones. It’s a shame that there isn’t a wider ecosystem supporting this hardware feature this many months after release.

Camera App

The default Apple camera app is fine, I guess. I like that you can now set the exposure and the app will remember it, which has helpfully meant that I can slightly under-expose my shots by default, as is my preference. However, the default app still lacks a spirit level, which is really, really, really stupid, especially in a so-called “Pro” camera that costs around $2,000 (CAD) after AppleCare, a case, and taxes. It’s particularly maddening given that the phone includes a gyroscope that is used for so many other things in the default camera app, like providing guidance when taking pano shots or top-down shots, and so forth.

It’s not coming back, but I’m still annoyed at how Apple changed burst mode in iOS. It used to be that you could hold the shutter button in the native camera app, or the volume rocker, to activate a burst; now you hold the shutter button and pull it to the left. It’s not a muscle memory I’ve developed, and the gesture risks screwing up my compositions when I’m shooting on the street, so I don’t really use burst anymore, which is a shame.

As a note, I edit almost all my photos in the Darkroom extension for Photos. It crashes all the damn time, and it is maddening. I’d hoped these crashes would go away when I upgraded from the 11 Pro to the 12 Pro, but they haven’t. It is very, very, very frustrating.

Image Quality

In a theoretical world upgrading my camera would lead to huge differences in image quality, but in practice that’s rarely the case. It is especially not the case when shifting from the 11 Pro to the 12 Pro, save for in very particular situations. The biggest improvement I notice in daily shooting comes in scenes with significant dynamic range, such as when you’re outside on a bright day; the sky and the rest of the scene are kept remarkably intact without your highlights or shadows being blown out. Even when compared to a camera with an APS-C or Micro 4/3 sensor it’s impressive, and I can get certain bright-day shots with the iPhone 12 Pro that wouldn’t be possible to easily capture with my Fujifilm X100F or Olympus EM10ii.

The other upgrade is definitely that, thanks to the sensor and computational power, you can get amazing low-light shots using the ultra-wide lens with Night Mode. Shots are sometimes a bit noisy or blotchy, but I can still get handheld photos that would otherwise be impossible with an APS-C sensor.

Relatedly, the ultra-wide’s correction for distortion is pretty great and it’s noticeably better than the ultra-wide lens correction on the 11 Pro. If you’re shooting wide angle a lot then this is likely one of the few software improvements you’ll actually benefit from with some regularity.

One of the most heralded features of the 12 Pro was the ability to shoot ProRAW. In bright conditions it’s not worth using; I rarely detect a noticeable improvement in quality, nor does it significantly enhance how I can edit a photo in those cases. However, in darker or more challenging low-light indoor situations it can be pretty helpful in retaining details that can later be recovered. That said, it hasn’t transformed how I shoot per se; it’s a nice-to-have, but not something that you’re necessarily going to use all the time.

You might ask how well portrait mode works but, given that I don’t use it that often, I can’t comment much beyond saying that it’s a neat feature that is sufficiently inconsistent that I don’t rely on it for much of anything. There are some exceptions, such as when shooting portraits at family events, but on the whole I remain impressed with it from a technology vantage point while being disappointed in it from a photographer’s point of view. If I want a shallow depth of field and need to get the shot, I’m going to grab one of my bigger cameras and not risk it with the 12 Pro.

Video

I don’t really shoot video, per se, and so don’t have a lot of experience with the 12 Pro’s video quality. Others have, however, discussed the capabilities of the cameras very positively, and I trust what they’ve said.

That said, I did a short video for a piece I wrote and it turned out pretty well. We shot using the ‘normal’ lens at 4K and my employer’s video editor subsequently graded the video. This was taken in low-light conditions and I used my Apple Watch as a screen so I could track what I was doing while speaking to camera.

I’ve also used my iPhone 12 Pro for pretty well all of the numerous video conferences, government presentations (starting at 19:45), classes I’ve taught, and media engagements I’ve had over the course of the pandemic. In those cases I’ve used the selfie camera, and in almost all situations people on the other side of the screen have commented on the high quality of my video. I take that as a recommendation of the selfie camera for video-related purposes.

Frustrations

I’ll be honest: what I most hoped would be better with the iPhone 12 Pro was that the default Photos app would play better with extensions. I use Darkroom as my primary editing application, and after editing 5-10 photos the extension reliably crashes and I need to totally close out Photos before I can edit using the extension again.1 It is frustrating and it sucks.

What else hasn’t improved? The 12 Pro still has green lens flares when I take photos at night. It is amazingly frustrating that, despite all the computing power in the 12 Pro, Apple’s software engineers can’t correct for an issue that its hardware engineers are currently unable to resolve. Is this a problem? Yes, it is, especially if you ever shoot at night. None of my other, less expensive, cameras suffer from this, and it’s maddening that the 12 Pro still does. It’s made worse by the fact that the Photos app doesn’t include a healing tool to remove these gross little flares and, thus, requires me to use another app (typically Snapseed) to get rid of them.

Finally, I find that shots from the 12 Pro are often over-sharpened for my taste, which means that I tend to turn down the clarity in Darkroom to soften a lot of the photos I take. It’s an easy fix, though (again) not one you can make in the default Photos application.

Conclusion

So what do I think of the iPhone 12 Pro? It’s the best camera, short of my Fuji X100F, that I typically have with me when I’m out and about, and the water resistance means I’m never worried about shooting with it in the elements.2

If I have a choice, do I shoot with the Fuji X100F or the iPhone 12 Pro? If a 35mm equivalent works, then I shoot with the Fuji. But if I want a wide angle shot it’s pretty common for me to pull the 12 Pro and use it, even while out with the Fuji. They’ve got very different colour profiles but I still like using them both. Sometimes I even go on photowalks with just the 12 Pro and come back with lots of keepers.

This is all to say that the X100F and 12 Pro are both pretty great tools. I’m a fan of them both.

So…is the 12 Pro a major upgrade from the 11 Pro? Not at all. A bigger upgrade from earlier iPhones? Yeah, probably more so. I like the 12 Pro and use it every day as a smartphone, and I like it as a camera. I liked the 11 Pro as a portable camera and phone as well.

Should you buy the 12 Pro? Only if you really want the telephoto and the ability to edit ProRAW files. If that’s not you, then you’re probably better off saving a chunk of change and getting the regular 12 instead.

(Note: All photos taken with an iPhone 12 Pro and edited to taste in Apple Photos and Darkroom.)


  1. Yes, I can edit right in Darkroom, and I do, but it’s not as convenient. ↩︎

  2. I admit to not treating the X100F with a lot of respect but I don’t use it when it’s pouring rain. The same isn’t true of the iPhone 12 Pro. ↩︎

Apple Services Subscriptions Confusion

Perhaps I should know better than to adopt any Apple product—or service offering—until the company has worked through the bugs and inconsistencies in whatever it’s selling. Nonetheless, I signed up for Apple One because it actually was a bit cheaper than the services I was already paying for, plus came with some additional storage.

However, the pricing/subscription rollover is incredibly weird. I signed up for Apple One, which was supposed to shift my individual subscriptions to the bundle offering, but that seemingly hasn’t happened. So now I have the pleasure of once again—twice in two months!—trying to resolve billing weirdness on Apple’s part. As lovely as Apple’s customer support representatives are, this is not the delightful experience I signed up for.

I get that Apple is historically bad at services. But this is a level of incompetence that I’d expect of a telecom company and not one of the largest and most customer-focused companies in the world. And, perhaps more problematically for Apple, it definitely means that I’m not about to recommend Apple One to anyone in the near future given that Apple can’t even get their payment processes reliably worked out.

Productivity and the iPad Pro: A Policy Wonk’s Review

(Photo: “Tools,” by Christopher Parsons)
Every time Apple announces a new iPad, a slew of technology reviewers and YouTube personalities ask whether the newest iPad can finally replace a laptop. And, in almost every situation, they argue that the device can mostly, but not quite, serve as a replacement. But reviewers’ workflows—often involving film production, audio editing, and other marginally esoteric requirements—tend to be pretty different from those of non-AV professionals.

I don’t make videos for a living, nor do I engage in audio engineering. I’m a professional policy wonk and amateur photographer, which means that I do a lot of national video and audio interviews, a lot of writing and text-based communication, some image editing, and depressing amounts of media consumption. I also read a crazy number of PDFs and have to annotate them. And for the past two weeks I was consigned to work off my iPad Pro (2018) and iPhone Pro because my MacBook Air was getting its keyboard repaired.

So how successfully did I continue to work just from my non-laptop devices? Spoiler: it was pretty great and mostly convinced me I can lead a (mostly) iPad Pro work life.

The Tools

As mentioned, the hardware that I principally relied on included my iPad Pro 11” (2018) and iPhone Pro.

For the iPad I also had a Logitech Bluetooth keyboard and a Magic Trackpad, as well as a cheap stand. For importing my photos, I have an old USB-C hub with an SD card reader. For the iPhone, I routinely used a knock-off GorillaPod tripod, a Manfrotto head, and AirPods.

On the software side of things, I used Mail, Pages, Wire, GoodNotes, Mendeley, Reeder, Photos and Darkroom, Safari, Google Drive and Docs, Tweetbot, and Apple Notes to get my daily work done on the iPad Pro.

For interviews I was at the mercy of whatever the interviewers wanted me to use on my iPhone Pro, which was usually either FaceTime, Skype, Signal, WhatsApp, or Zoom, and I used Google Meet for non-broadcast communications.

Successes

(Photo: “The Setup,” by Christopher Parsons)
On the whole I was able to do everything using my iPad Pro and iPhone Pro that I was doing when I was relying on my MacBook Air and iPhone Pro. My reading and writing were largely unimpaired, and my communications with colleagues were not noticeably affected.

Specifically, I was able to continue importing and editing photos, and worked in Google Docs and Drive to leave comments and contribute to documents that were in progress. Email continued to be dealt with using the native client, and I kept on working on Word documents using Pages. Apple’s cloud storage meant I had access to all my files on my iPad, just as on my MacBook Air.

Working with PDFs was simple and easy: I imported them to GoodNotes and shared them into Mendeley after I’d annotated them. I then deleted them from GoodNotes to avoid having multiple iterations of a document in different apps.

All of my communications were easy to maintain, though it was admittedly annoying to have to pick up my phone whenever I received or needed to send a message in WhatsApp. It’d be great if Facebook committed to the service, and made it available on all iOS devices like Signal has already done.

Minor Annoyances

There were one or two things that were annoying. I had to take a photo of government identification, and then strip away some of the more sensitive information. It took me a bit of time to figure out that I could move the photo into Notes, scratch out the offending information, and then output the edited photo to Files to then be uploaded. But it was annoying, not impossible.

I also continue to struggle with a good blogging process on iOS devices. I used Ulysses for years, but the lack of new updates for non-subscription users was grating. Other non-subscription-based apps, however, don’t really support images as well, nor do they upload as nicely to this blog. So I’ve actually started using the (mediocre) WordPress client. It’s not impressive, but neither are any of the other clients.

Major Pain Points

First, Google Docs is a terrible application that doesn’t work well. Period. In documents where there are a lot of tracked changes and comments it becomes basically non-functional. It got so bad that I’d write text in Apple Notes and then just copy it into Google Docs, or else I’d be stuck waiting minutes for a sentence to finally be input. Google Docs is generally a dumpster fire, though, and it’s a shame that Google hasn’t properly developed the app or service in all the years it has operated it. (On my MacBook Air, editing in Safari is only a marginally better experience. Google really needs to get its act together.)

Second, Slide Over is incredibly confusing to get working. I’ve owned an iPad for years and it was only in the last two weeks that I finally figured out how to control it, and doing so required watching an instructional video. It is bonkers that this feature is so unintuitive to use and yet so easy to trigger. That said, once I figured it out, it was a very positive and transformative productivity enhancement.

Third, I absolutely needed my iPhone for actual video conferencing. The iPad can do conferencing, but its form factor sucks for this kind of activity. That’s fine, and I’d be doing the same if I were doing interviews or video chats with a working MacBook Air in my possession. Still, you’re going to want another camera (and a headset with microphones) if you need to do high(ish) quality calls when you’re working purely from an iPad Pro.

And that’s really it. Beyond the Google Docs app being a trash fire (and, I would point out, only a slightly less-bad trash fire when accessed using Safari on a MacBook Air), the inane complexity of Slide Over, and the need for a separate device for video calls, the iPad Pro pretty nicely replaced my workflow on the Air. I missed the slightly larger screen, but not so much that it was a real issue.

Concluding Thoughts

I really appreciated and liked using my iPad Pro and iPhone Pro full time. It was easy to set up and tear down. It let me get my work done with fewer distractions than on my MacBook Air. And the screen is noticeably higher quality than the Air.

So if you have a relatively writing- and speaking-focused job, and aren’t doing a lot of video or audio editing (or, I suspect, spreadsheet work), then the iPad Pro could be a good fit for your workflow. Does that mean that it’s better than working off a laptop? Nope! But it does mean that what a lot of reviewers consider to be ‘normal’ and what authors and policy folks think is ‘normal’ are very different, with the latter category being pretty well supported on iPad Pros.

An Amateur Photographer’s Review of the iPhone 11 Pro Camera System

I bought my most expensive camera system last week: an iPhone 11 Pro. While the screen and battery life were things I was looking forward to, I was most excited about massively upgrading my smartphone camera. The potential to shoot portraits with a 52mm lens (as well as landscapes, street shots, and architecture…50mm is my preferred focal range), plus general shots with a 26mm and a 13mm equivalent, was exciting. I’ve printed iPhone photos in the past and been happy with them, but would the new camera system live up to the marketing hype?

My Background

To be clear, I am by definition a very amateur photographer. Which, I think, actually makes this review a bit more useful than most. I’m not reviewing the iPhone 11 Pro as a phone or the entirety of the underlying operating system. I’m just focused on how well this device helps me make photos.

For the past few years I’ve shot with a bunch of cameras, including: an iPhone 6 and 7, a Fuji x100,1 a Sony rx100ii, and an Olympus EM10ii. I’ve printed my work in a book and in photos of various sizes that are now hanging on my walls,2 and travelled all over the world with a camera in tow. I have historically tended towards street photography (broadly defined), some ‘travel’ photography (usually nature and landscape shots), abstracts, and admittedly relatively few portraits. If you want to get a rough assessment of the kinds, and quality, of photos that I take then I’d suggest you wander over to my Instagram profile.

I should be pretty clear, upfront: I make photos, not videos, and so have pretty well zero comments about the video camera functionalities on the iPhone 11 Pro. Also, if you’re looking for some raw technical stats on the iPhone cameras, I’d suggest you check out Halide’s assessment.

Body, Controls, and Handling

The iPhone 11 Pro is considerably larger in hand than the iPhone 7 that I came from. It’s also, with the Apple-branded clear case, quite slippery. This means that I’ve been super cautious about taking photos where dropping it might mean I’d lose it forever (e.g., shooting outstretched over rivers and major highways). The buttons are significantly more solid than those on my iPhone 7 and, as such, I’m disinclined to use them as a shutter button for fear of messing up my composition or introducing camera shake. Though if I’m being honest, it was pretty rare that I used anything other than the on-screen shutter button on my iPhone 7.

The screen of the iPhone 11 Pro, itself, is bright and beautiful. It’s night and day between it and the iPhone 7. To activate the camera from the lock screen you press and hold the camera icon; after a second or so, the camera app will open and you’re probably ready to shoot. Probably, you may ask? Yes: there’s a glitch in iOS 13 that means that sometimes the camera app launches but the image of what you’re trying to capture isn’t shown on the display. The solution is to take a shot; afterwards, the display should show what the camera is seeing. Usually. But not always.

If you used burst mode a lot to get the right shot, get used to a lot of missed shots. In iOS 13, you press the shutter button in the camera app and slide to the left to initiate a burst; holding down on the shutter button starts recording a short video (slide to the right if you want to keep recording without holding down the shutter button). In actual use, I’ve ended up accidentally taking a bunch of short videos instead of a burst of shots, which meant I missed capturing what I wanted to capture. A ‘Pro’ camera should let me set photo controls. The iPhone 11 Pro fails, seriously and significantly, in this regard.

When composing a shot, you’ll routinely see what is beyond the focal length you’re using. This means that, as an example, when you’re shooting with the 26mm lens, you’ll see what would be captured by the 13mm lens. On screen, the extended parts of the scene which would be captured by the wider camera are slightly desaturated and sit outside the grid you can enable in the Camera app settings. Some reviewers have said that this looks like what you might see when looking through a rangefinder-style camera, like a Fuji x100. I fundamentally disagree: those reviewers clearly have not used a rangefinder for extended periods of time. With a rangefinder you can see to the left and right of the frame when looking through the viewfinder, and it’s helpful to have that in a camera you’ve raised to your eye, because the rest of your vision may be obscured and so you may not realize what’s about to step into your frame. This is less of an issue when shooting with a smartphone. Much less of an issue.

If you rely on a tilted screen on a mirrorless or DSLR to get the shots you like, well, you’re going to be out of luck. It’s a camera phone without an articulating screen. Maybe Samsung’s folding phones will integrate this kind of feature into their camera app…

I haven’t shot using the flash, so I can’t comment on what it’ll be like to use.

If you’ve used the iPhone Camera app, you’ll find that few things have meaningfully changed. The ‘big’ changes include a notification along the top left corner if night mode is activated (along with how many seconds it’ll take to use the feature) and an arrow along the top of the app that, if tapped, will let you switch some of the default features (e.g., flash on/off/auto, live images on/off, timer, or filter). Despite being a ‘professional’ device—which has a pile of internal gyroscopes!—the camera app doesn’t include a horizon level, though if you’re taking flat shots you’ll get an indicator to show if you’re perfectly level.

I tend to see the stock Photos app as part of the control of an iPhone camera. Some of the additions are good—tilt shifts in particular!—but I loathe losing how iOS 12 ‘grouped’ features into categories like light, colour, and black and white. And I really miss being able to adjust neutrals and tones in the black and white setting. Why’d you take those away, Apple? WHY!?

The battery life when I’ve taken the iPhone 11 Pro for a day of shooting has been great; I was out for about 7 hours one day just to shoot and took about 250 photos, while listening to podcasts and reading news and such. I had 17% left after a full day’s normal use plus shooting, but I was shooting with a brand new battery in ideal temperatures for batteries (20-24 degrees). The real test will be when winter hits in countries like Canada or the northern USA and we see how well the batteries hold up in semi-hostile environmental conditions.

Image Quality

I’ve been super impressed with the camera system included in the iPhone 11 Pro. That said, there are definitely areas where computational photography is still very much a work in progress.

I’ve been taken aback by just how much dynamic range this camera captures when I’ve been making photos. This is especially the case when I’ve used the camera in low-light or sheer dark conditions. As is true of almost all cameras, it generally performs admirably in well-lit situations. What follows is a (broad) selection of shots taken over a three-day period in indoor, midday, and sunset conditions; they are all edited to my taste, using just the stock Photos app.

I also did a late evening photowalk. It was pitch black (for a major urban city…) and so the following images are good representations of what urban photographers can probably pull off without a tripod.4 In many of the images I was resting the camera either tightly against my body or something in the natural environment (e.g., a tree trunk) to reduce camera shake.

I did run into some computational…weirdness…in some of the shots. When shooting the Cinesphere, I sometimes got a weird yellow arc that stretched along the top. When shooting scenes with the Cinesphere and the Japanese Temple Bell, there were times when it looked like the upper right of the frame (proximate to the Cinesphere in the shot) had extremely severe vignetting. I also noticed that I got lens flare when shooting at night; while this could be corrected in post using something like Snapseed, I can’t ever recall dealing with flare on a regular basis on prior iPhones.

Also, don’t buy this camera and expect to get cool light trails using the default camera application. While night mode takes a lot of exposures to create the final shot, you’ll only get the slightest of blur from moving vehicles. Similarly, due to the fixed aperture of the cameras you’re not going to get any cool light flares or sun stars, nor can you seriously control the depth of field as you could with a camera that offers much more manual control.5

Conclusion

The iPhone 11 Pro is a marvel of a camera system. Seriously: it’s spectacular for the size of the sensor, though it damn well better be given its sheer cost!

I can see this camera fitting into the lives of a lot of creative amateurs. (Probably professionals, too, but with grumbles.) For me, and people at my skill level with photography, this is a major equipment investment that I think will be pretty great: it’s a supplement to, not a replacement for, the aging Sony rx100ii I carry with me on a day-to-day basis, and it’s genuinely fun to shoot on. The Photos app, while annoying in some of its reconfiguration, is generally more powerful than its last version. And the ability to easily and quickly shift between the 13-52mm focal ranges cannot be appreciated enough: it’s like having a permanent kit lens attached to your smartphone, and that’s just awesome.

Should you upgrade or buy this camera system? I dunno. I had an older phone and totally could have stuck with it for another year or so, and I’m happy with my upgrade. But for around $2,000 (CAD) you could get some really nice new glass, which might be a better investment if you’re always carrying your mirrorless camera or DSLR with you, or if having better control of aperture, camera levels, or other ‘niceties’ is the core thing you’re looking for. But if you’ve increasingly been leaving your ‘big’ camera and glass at home, but still want a lot of functionality when making photos on your smartphone, and have the disposable income, then you’ll probably be pretty happy with the iPhone 11 Pro.

  1. In honesty, it was too much camera for me at the time, but it taught me to really love and want to work on my photography. ↩︎
  2. My largest prints are 24×36, from my Sony rx100ii and Olympus EM10ii (using an Olympus 17mm 1.8 lens). ↩︎
  3. Why won’t Apple bring the camera filters in Messages straight into their camera app? Oh hey! Did you even know Apple had a pile of filters for fun stuff in Messages? I bet not, given how buried they are—open Messages, tap the star button in the lower left side, then tap the three concentric rings, fight with the stupid UI a bit, and tada!↩︎
  4. If using a tripod, the internal gyroscopes will detect this and let you take up to a 10s ‘exposure’. ↩︎
  5. Some of this might change as Halide and other competing camera app manufacturers update their applications. But the stock camera app is pretty limited in computational control of the aperture, especially for landscape or street photography. ↩︎

China’s Second Continent


Rating: ⭐️⭐️⭐️

Howard W. French’s book is, functionally, a travelogue of his most recent tour of Africa, where he asks the baseline question: “what, exactly, is happening with Chinese investment and emigration to African states?” He takes the reader across the continent and recounts his experiences today versus when he worked in the region in past decades. It’s this background experience — which enables him to conduct before/after assessments — combined with his experiences in both the populous and more rural areas of China, along with his linguistic fluency, that makes the book as compelling as it is.

The actual findings of the book are pretty common across all cases: Chinese efforts to shore up mineral and agricultural resources are widely disliked by the public. This dislike follows from Chinese companies predominantly bringing in skilled labourers from China and minimally employing locals, while also rarely providing sufficient training so that locals can take on more advanced tasks. Moreover, in many of the cases French recounts, the Chinese companies are either massively underpaying locals or engaging in bidding practices that result in poor quality infrastructure, with contracts often obtained in part through bribery or corrupt dealings.

Many of the Chinese persons interviewed in the course of the book hold, frankly, colonial values. They regard African employees as lazy, uneducated, and unwilling to adequately develop themselves. Similarly, Chinese companies and government consular staff are engaged in systematic efforts to, on the one hand, establish control of important resources that will enable China to prosper while, on the other, stripping Africa of its resources at a scale that Western colonial powers of decades and centuries past could only have dreamed of.

The repetition that emerges through the chapters ultimately makes the book a tad boring to read, especially towards the end, notwithstanding French’s efforts to inject local colour and humour throughout. However, it is that very repetitiveness that makes the book as striking as it is: Africa has become a space where China’s transactionalist foreign policy means that Chinese companies can thrive while aggressively stripping resources from the continent, even as China itself avoids projects focused on developing democratic norms, rule of law, or other governance systems. These latter activities, often associated with American and Western aid projects, are by and large set aside by China and, as a result, the supposed ‘progress’ of African states will only come if the states’ governance structures change on their own, in the face of exceptional bribes and other corrupt business practices. I remain dubious that a Chinese-facilitated model of “development,” which largely entails economic activities and exclusionary approaches to engaging in broader governance activities, will do any more for Africa than the French, British, and Belgians did when they focused their attentions on the continent.

Review of the Countdown to Zero Day: Stuxnet and the Launch of the World’s First Digital Weapon

Countdown to Zero Day: Stuxnet and the Launch of the World’s First Digital Weapon

Rating: ⭐️⭐️⭐️⭐️⭐️

Zetter’s book engages in a heroic effort to summarize, describe, and explain the significance of the NSA’s and Israel’s first ‘cyber weapon’, named Stuxnet. This piece of malware was used to disrupt the production of nuclear material in Iran as part of broader covert efforts to limit the country’s ability to construct a nuclear weapon.

Multiple versions of Stuxnet were created, as were a series of complementary or derivative malware strains with names such as Duqu and Flame. In all cases the malware was unusually sophisticated and relied on chains of exploits or novel techniques that advanced certain capabilities from academic theory to implementable practice. The reliance on zero-day vulnerabilities, or those for which no patches are available, combined with deliberate efforts to subvert the Windows Update system and use fraudulently signed digital certificates, bears the hallmarks of developers being willing to compromise global security for the sake of a specific American-Israeli malware campaign. In effect, the decision to leave the world’s computers vulnerable to the exploits used in the creation of Stuxnet demonstrates that offence was prioritized over defence by the respective governments and the signals intelligence agencies which authored the malware.

The book regales the reader with any number of politically sensitive tidbits of information, amongst them: the CIA was responsible for providing some information on Iran’s nuclear ambitions to the IAEA; Russian antivirus researchers were monitored by Israeli (and perhaps other nations’) spies; the CIA and renowned physicists have historically planted false stories in Nature; the formal recognition of cyberspace as the fifth domain of battle in 2010 merely acknowledged work that had been ongoing for a decade prior; and the shift to a wildly propagating version of Stuxnet likely followed once close access operations were no longer possible, with the flagrancy of the propagation likely being an error.

Zetter spends a significant amount of time unpacking the ways in which the United States government determines whether a vulnerability should be secretly retained for government use as part of a vulnerabilities equities process. Representatives from the Department of Homeland Security quoted in the book noted that they had never received information about a vulnerability from the National Security Agency and, moreover, that in cases where the Agency was already exploiting a reported vulnerability it was unlikely to be disclosed after entering the equities process. As noted by any number of people in the course of the book, the failure of the United States (and other Western governments) to clearly explain their vulnerability disclosure processes, or the manners in which they would respond to a cyber attack, leaves unsettled the norms of digital security as well as the policies concerning when (and how) a state will respond to cyber attacks. To date these issues remain as murky as when the book was published in 2014.

Countdown to Zero Day, in many respects, serves to collate a large volume of information that has otherwise existed in the public sphere. It draws on interviews, past technical and policy reports, and a vast quantity of news reports. But more than just collating materials, it also explains their meaning, draws links between them that had not previously been made in such clear or straightforward fashion, and explains the broader implications of the United States’ and Israel’s actions. Further, the details of the book render (more) transparent how anti-virus companies and malware researchers conduct their work, as well as the threats to that work in an era when a piece of malware could be used by a criminal enterprise or a major nation-state actor with a habit of proactively working to silence researchers. The book remains an important landmark in the history of security journalism, cybersecurity, and the politics of cybersecurity. I would heartily recommend it to layperson and expert alike.

Review of Happy City: Transforming Our Lives Through Urban Design

Rating: ⭐️⭐️⭐️⭐️⭐️

Montgomery’s book, Happy City: Transforming Our Lives Through Urban Design, explores how decades of urban design have been destructive to human happiness, human life, and the life of the planet itself. He tours the world — focused mostly on Vancouver, Portland, Bogotá, Atlanta, and Hong Kong — to understand the different choices that urban designers historically adopted and why communities are railing against those decisions now.

The book represents a tour de force, insofar as it carefully and clearly explains that urban sprawl — which presumed that we would all have cars and that we all wanted or needed isolated homes — is incredibly harmful. The focus of the book is, really, on how designing for cars leads to designing for things instead of people, and how efforts to facilitate car traffic have been antithetical to human life and flourishing. His call for happy cities really constitutes a call to, first and foremost, invest in urbanization and densification. Common social utilities, like transit and parks and community spaces, are essential for cities to become happy because these utilities reduce commutes and increase socialization, while the presence of nature relieves the human mind of urban stresses.

While the book is rife with proposals for how to make things better, Montgomery doesn’t go so far as to argue that such changes are easy or that they can be universally applied everywhere. The infrastructure that exists now cannot simply be torn up and replaced. As a result he identifies practical ways that even suburban areas can reinvigorate their community spaces: key, in almost all cases, is finding ways to facilitate human contact by re-thinking the structures of urban design itself. These changes depend not only on — indeed, they may barely depend at all upon! — city planners and, instead, demand that citizens advocate for their own interests. Such advocacy needn’t entail using the language of architects and urban designers and can, instead, focus on words or themes such as ‘community’ or ‘safe for children to bike’ or ‘closer to community resources’ or ‘slower streets’ or ‘more green space’. After robustly, and regularly, issuing such calls, the landscape may begin to change to facilitate both human happiness and smaller environmental footprints.

If there is a flaw to this book, it is that many of the examples presume that small-scale experiments are necessarily scalable to broad communities. I don’t know that these examples do not scale but, because of the relatively small sample set and the regularity with which Montgomery leverages them, it’s not clear how common or effective the interventions he proposes genuinely are. Nevertheless, this is a thought-provoking book that challenges the reader to reflect on how cities are, and should be, built to facilitate and enable the citizens who reside within and beyond their boundaries.