An Amateur Photographer’s Long-Term Review of the iPhone 12 Pro Camera System

I bought an iPhone 12 Pro mid-cycle in March 2021 and have been shooting with it for the past several months in a variety of weather conditions. I was very pleased with the iPhone 11 Pro, with the exception of the green lens flares that too frequently erupt when shooting with it at night. Consider this a longish-term review of the 12 Pro, with comparisons to the 11 Pro, interspersed with photos taken exclusively with the 12 Pro and edited in Apple Photos and Darkroom on iOS.

Background

I’m by definition an amateur photographer; I shoot using my iPhone as well as a Fuji X100F, and get out to take photos at least once or twice a week during photo walks that last a few hours. I don’t earn any money from making photos and shoot purely for my own enjoyment. Most of my photos are street or urban photography, with a smattering of landscape shots and photos of friends and family thrown in.

To be clear up front: this is not a review of the iPhone 12 Pro proper, but just the camera system. This said, it’s worth noting that the hardware differences between the iPhone 11 Pro and 12 Pro are pretty minor. The 26mm lens is now f/1.6 and the 13mm can be used with Night Mode. At a software level, the 12 Pro introduced the ability to shoot Apple ProRAW and added Smart HDR 3, along with improvements to Deep Fusion to better photographic results in low to middling light. Deep Fusion, in particular, has no discernible effect on the shots I take, but maybe I’m just not pixel peeping enough to see what it’s doing.

For the past few years I’ve shot with a number of cameras, including an iPhone 6, 7, and 11 Pro, a Fuji X100 and X100F, a Sony RX100ii, and an Olympus EM10ii. I’ve printed my work in a couple of personal books, printed photos from all of these systems at various sizes, and hung the results on my walls. When I travel it’s with a camera or two in tow. If you want a rough gauge of the kinds of photos I take you might want to take a gander at my Instagram.

Also, while I own a bunch of cameras, photos are my jam. I’ll be focusing mostly on how well the iPhone 12 Pro makes images with a small aside to talk about its video capabilities. For more in-depth technical reviews of the 12 Pro I’d suggest checking out Halide’s blog.

The Body

The iPhone 11 Pro had a great camera system, but it was always a bit awkward to hold when shooting because of its rounded edges. Don’t get me wrong: the rounded edges made the 11 Pro feel more inviting than the 12 Pro, but they were less ideal for actual daily photography, and I find it easier to get, and retain, a strong grip on the 12 Pro. Your mileage may vary.

I kept my 11 Pro in an Apple silicone case and I do the same for the 12 Pro. One of the things I do with some regularity is press my phone right against glass to reduce glare when I’m shooting through a window or other transparent surface. With the 12 Pro’s silicone case I can do this without the glass actually touching the lens, because there are a few millimetres between the edge of the case and the lens element. The same was true of my 11 Pro and the Apple silicone case I had on it.

I like the screen of the 12 Pro, though I liked the screen in the 11 Pro as well. Is there a difference? Yeah, a bit, insofar as blacks are blacker on the 12 Pro, but I wouldn’t notice the difference unless the 11 Pro and 12 Pro were right against one another. I can see both clearly enough to frame shots on sunny days, which is what I really care about.

While the phone doesn’t have a screen that tilts to help frame shots, you can use a tripod to angle your phone and then frame and shoot using an Apple Watch if you have one; the Watch acts as a paired screen even when you’re recording video with the rear cameras. It’s a neat function, but I tend to shoot handheld and so have only used the Apple Watch trick when shooting a video with the main cameras on the back of the 12 Pro.

I don’t ever really use the flash as a flash, so I can’t comment on it, though I do occasionally use it as a continuous light to illuminate subjects I’m shooting with another camera. It’s not amazing but it works in a pinch.

The battery is so-so in my experience. The 12 Pro’s battery is a tad smaller than the one in my 11 Pro, which means less capacity, though in the five months I’ve owned the 12 Pro the battery health hasn’t degraded at all, which wasn’t the case with the 11 Pro. This said, if I’m out shooting exclusively with the 12 Pro I’ll bring a battery pack with me, just as I did for a day of shooting with the 11 Pro. If it’s not a heavy day of shooting, however, I reliably end the day with 20% or more battery after the 12 Pro has been off the charger for about 14-17 hours of middling usage.

Probably the coolest feature of the new 12-series iPhones, and in my opinion the best hardware addition to the line, is their ability to use magnetic attachments. I’ve been using a Luma Cube Telepod Tripod stand paired with a Moment Tripod Mount with MagSafe, and it’s been pretty great for video conferences. It’s a shame that there isn’t a wider ecosystem supporting this hardware this many months after release.

Camera App

The default Apple camera app is fine, I guess. I like that you can now set the exposure and the app will remember it, which has helpfully meant that I can slightly under-expose my shots by default, as is my preference. However, the default app still lacks a spirit level, which is really, really, really stupid, especially in a so-called “Pro” camera that costs around $2,000 (CAD) after AppleCare, a case, and taxes. It’s particularly maddening given that the phone includes a gyroscope that is used for so many other things in the default camera app, like providing guidance when taking pano shots or top-down shots, and so forth.

It’s not coming back, but I’m still annoyed at how Apple changed burst mode in iOS. It used to be that you could hold the shutter button in the native camera app, or the volume rocker, to activate a burst; now you hold the shutter button and drag it to the left. It’s not a muscle memory I’ve developed, and it risks screwing up my compositions when I’m shooting on the street, so I don’t really use burst anymore, which is a shame.

As a note, I edit almost all my photos in the Darkroom extension for Photos. It crashes all the damn time and it is maddening. I’d hoped these crashes would go away when I upgraded from the 11 Pro to the 12 Pro, but they haven’t. It is very, very, very frustrating.

Image Quality

In a theoretical world, upgrading my camera would lead to huge differences in image quality, but in practice that’s rarely the case. It is especially not the case when moving from the 11 Pro to the 12 Pro, save for very particular situations. The biggest improvement that’s noticeable in daily use comes when you’re shooting scenes with a wide dynamic range, such as when you’re outside on a bright day; the sky and the rest of the scene are kept remarkably intact, without the highlights being blown out or the shadows crushed. Even compared to a camera with an APS-C or Micro 4/3 sensor it’s impressive, and I can get certain bright-day shots with the iPhone 12 Pro that wouldn’t be easy to capture with my Fujifilm X100F or Olympus EM10ii.

The other upgrade is that, thanks to the sensor and computational power, you can get amazing low-light shots with the ultra-wide lens using Night Mode. The results are sometimes a bit noisy or blotchy, but I can still get photos that would otherwise be impossible to capture handheld with an APS-C sensor.

Relatedly, the ultra-wide’s correction for distortion is pretty great and it’s noticeably better than the ultra-wide lens correction on the 11 Pro. If you’re shooting wide angle a lot then this is likely one of the few software improvements you’ll actually benefit from with some regularity.

One of the most heralded features of the 12 Pros was the ability to shoot ProRAW. In bright conditions it’s not worth using; I rarely detect a noticeable improvement in quality, nor does it significantly enhance how I can edit a photo in those cases. However, in darker or more challenging low-light indoor situations it can be pretty helpful in retaining details that can later be recovered. That said, it hasn’t transformed how I shoot per se; it’s a nice-to-have, but not something you’re necessarily going to use all the time.

You might ask how well portrait mode works but, given that I don’t use it that often, I can’t comment much beyond saying that it’s a neat feature that is sufficiently inconsistent that I don’t rely on it for much of anything. There are some exceptions, such as when shooting portraits at family events, but on the whole I remain impressed with it from a technology vantage point while being disappointed in it from a photographer’s point of view. If I want a shallow depth of field and need to get a shot, I’m going to grab one of my bigger cameras and not risk the shot with the 12 Pro.

Video

I don’t really shoot video, per se, and so don’t have a lot of experience with the quality of video the 12 Pro produces. Others have, however, spoken very positively about the capabilities of the cameras, and I trust what they’ve said.

That said, I did shoot a short video for a piece I wrote and it turned out pretty well. We shot using the ‘normal’ lens at 4K and my employer’s video editor subsequently graded the footage. It was taken in low-light conditions, and I used my Apple Watch as a screen so I could keep track of the framing while speaking to camera.

I’ve also used my iPhone 12 Pro for pretty well all of the numerous video conferences, government presentations, classes I’ve taught, and media engagements I’ve had over the course of the pandemic. In those cases I’ve used the selfie camera, and in almost all situations people on the other side of the screen have commented on the high quality of my video. I take that as a recommendation of the selfie camera for video-related purposes.

Frustrations

I’ll be honest: what I most hoped would be better with the iPhone 12 Pro was that the default Photos app would play better with extensions. I use Darkroom as my primary editing application, and after editing 5-10 photos the extension reliably crashes, forcing me to completely close Photos before I can edit with the extension again.1 It is frustrating and it sucks.

What else hasn’t improved? The 12 Pro still produces green lens flares when I take photos at night. It is amazingly frustrating that, despite all the computing power in the 12 Pro, Apple’s software engineers can’t compensate for something its hardware engineers have been unable to resolve. Is this a problem? Yes, it is, especially if you ever shoot at night. None of my other, less expensive, cameras suffer from this, and it’s maddening that the 12 Pro still does. It’s made worse by the fact that the Photos app doesn’t include a healing tool to remove these gross little flares and, thus, requires me to use another app (typically Snapseed) to get rid of them.

Finally, I find that shots from the 12 Pro are often over-sharpened for my taste, which means I tend to turn down the clarity in Darkroom to soften a lot of the photos I take. It’s an easy fix, though (again) not one you can make in the default Photos application.

Conclusion

So what do I think of the iPhone 12 Pro? It’s the best camera, short of my Fuji X100F, that I typically have with me when I’m out and about, and the water resistance means I’m never worried about shooting with it in the elements.2

If I have a choice, do I shoot with the Fuji X100F or the iPhone 12 Pro? If a 35mm equivalent works, then I shoot with the Fuji. But if I want a wide angle shot it’s pretty common for me to pull the 12 Pro and use it, even while out with the Fuji. They’ve got very different colour profiles but I still like using them both. Sometimes I even go on photowalks with just the 12 Pro and come back with lots of keepers.

This is all to say that the X100F and 12 Pro are both pretty great tools. I’m a fan of them both.

So…is the 12 Pro a major upgrade from the 11 Pro? Not at all. A bigger upgrade from earlier iPhones? Yeah, probably more so. I like the 12 Pro and use it every day as a smartphone, and I like it as a camera. But I liked the 11 Pro as a portable camera and phone as well.

Should you buy the 12 Pro? Only if you really want the telephoto and the ability to shoot and edit ProRAW files. If that’s not you, then you’re probably better off saving a chunk of change and getting the regular 12 instead.

(Note: All photos taken with an iPhone 12 Pro and edited to taste in Apple Photos and Darkroom.)


  1. Yes, I can edit right in Darkroom, and I do, but it’s not as convenient. ↩︎

  2. I admit to not treating the X100F with a lot of respect but I don’t use it when it’s pouring rain. The same isn’t true of the iPhone 12 Pro. ↩︎

Productivity and the iPad Pro: A Policy Wonk’s Review

(Photo: Tools, by Christopher Parsons)
Every time Apple announces a new iPad, a slew of technology reviewers and YouTube personalities ask whether the newest iPad can finally replace a laptop. And, in almost every situation, they argue that the device can mostly, but not quite, serve as a replacement. But reviewers’ workflows—often involving film production, audio editing, and other marginally esoteric requirements—tend to be pretty different from those of non-AV professionals.

I don’t make videos for a living, nor do I engage in audio engineering. I’m a professional policy wonk and amateur photographer, which means that I do a lot of national video and audio interviews, a lot of writing and text-based communication, some image editing, and depressing amounts of media consumption. I also read, and have to annotate, a crazy number of PDFs. And for the past two weeks I was consigned to working off my iPad Pro (2018) and iPhone Pro because my MacBook Air was getting its keyboard repaired.

So how successfully did I continue to work just from my non-laptop devices? Spoiler: it was pretty great and mostly convinced me I can lead a (mostly) iPad Pro work life.

The Tools

As mentioned, the hardware that I principally relied on included my iPad Pro 11” (2018) and iPhone Pro.

For the iPad I also had a Logitech Bluetooth keyboard and a Magic Trackpad, as well as a cheap stand. For importing my photos, I have an old USB-C hub with an SD card reader. For the iPhone, I routinely used a knock-off GorillaPod tripod, a Manfrotto head, and AirPods.

On the software side of things, I used Mail, Pages, Wire, GoodNotes, Mendeley, Reeder, Photos and Darkroom, Safari, Google Drive and Docs, Tweetbot, and Apple Notes to get my daily work done on the iPad Pro.

For interviews I was at the mercy of whatever the interviewers wanted me to use on my iPhone Pro, which was usually either FaceTime, Skype, Signal, WhatsApp, or Zoom, and I used Google Meet for non-broadcast communications.

Successes

(Photo: The Setup, by Christopher Parsons)
On the whole I was able to do everything using my iPad Pro and iPhone Pro that I was doing when I was relying on my MacBook Air and iPhone Pro. My reading and writing were largely unimpaired, and my communications with colleagues were not noticeably affected.

Specifically, I was able to continue importing and editing photos, and worked in Google Docs and Drive to leave comments and contribute to documents that were in progress. Email continued to be dealt with using the native client, and I kept on working on Word documents using Pages. Apple’s cloud storage meant I had access to all my files on my iPad, just as on my MacBook Air.

Working with PDFs was simple and easy: I imported them to GoodNotes and shared them into Mendeley after I’d annotated them. I then deleted them from GoodNotes to avoid having multiple iterations of a document in different apps.

All of my communications were easy to maintain, though it was admittedly annoying to have to pick up my phone whenever I received, or needed to send, a message in WhatsApp. It’d be great if Facebook committed to supporting the service on all iOS devices, including the iPad, as Signal has already done.

Minor Annoyances

There were one or two things that were annoying. I had to take a photo of some government identification and then strip away the more sensitive information. It took me a bit of time to figure out that I could move the photo into Notes, scratch out the offending information, and then export the edited photo to Files so it could be uploaded. It was annoying, but not impossible.

I also continue to struggle to find a good blogging process on iOS devices. I used Ulysses for years, but the lack of new updates for non-subscription users was grating. Other non-subscription apps, however, don’t really handle images as well, nor do they upload as nicely to this blog. So I’ve actually started using the (mediocre) WordPress client. It’s not impressive, but neither are any of the other clients.

Major Pain Points

First, Google Docs is a terrible application that doesn’t work well. Period. In documents with a lot of tracked changes and comments it becomes basically non-functional. It got so bad that I’d write text in Apple Notes and then copy it into Google Docs, or else I’d be stuck waiting minutes for a sentence to finally appear. Google Docs is generally a dumpster fire, though, and it’s a shame that Google hasn’t properly developed the app or the service in all the years it has operated it. (On my MacBook Air, editing in Safari is only a marginally better experience. Google really needs to get its act together.)

Second, Slide Over is incredibly confusing to get working. I’ve owned an iPad for years and it was only in the last two weeks that I finally figured out how to control it, and doing so required watching an instructional video. It is bonkers that this feature is so unintuitive to use and yet so easy to trigger. That said, once I figured it out, it was a very positive and transformative productivity enhancement.

Third, I absolutely needed my iPhone for actual video conferencing. The iPad can do conferencing, but its form factor sucks for this kind of activity. That’s fine, and I’d be doing the same if I were doing interviews or video chats with a working MacBook Air in my possession. Still, you’re going to want another camera (and a headset with microphones) if you need to do high(ish) quality calls when you’re working purely from an iPad Pro.

And that’s really it. Beyond the Google Docs app being a trash fire (and, I would point out, it is only a slightly less-bad trash fire when accessed using Safari on a MacBook Air), the inane complexity of Slide Over, and the need for a separate device for video calls, the iPad Pro pretty nicely replaced my workflow on the Air. I missed the slightly larger screen, but not so much that it was a real issue.

Concluding Thoughts

I really appreciated and liked using my iPad Pro and iPhone Pro full time. It was easy to set up and tear down. It let me get my work done with fewer distractions than on my MacBook Air. And the screen is noticeably higher quality than the Air.

So if you have a relatively writing- and speaking-focused job, and you aren’t doing a lot of video or audio editing (or, I suspect, spreadsheet work), then the iPad Pro could be a good fit for your workflow. Does that mean it’s better than working off a laptop? Nope! But it does mean that what a lot of reviewers consider ‘normal’ and what writers and policy folks consider ‘normal’ are very different, and the latter category is pretty well supported on an iPad Pro.

An Amateur Photographer’s Review of the iPhone 11 Pro Camera System

I bought my most expensive camera system last week: an iPhone 11 Pro. While the screen and battery life were things I was looking forward to, I was most excited about massively upgrading my smartphone camera. The potential to shoot portraits with a 52mm-equivalent lens (as well as landscapes, street shots, and architecture…50mm is my preferred focal length), plus general shots with a 26mm and a 13mm equivalent, was exciting. I’ve printed iPhone photos in the past and been happy with them, but would the new camera system live up to the marketing hype?

My Background

To be clear, I am by definition a very amateur photographer. Which, I think, actually makes this review a bit more useful than most. I’m not reviewing the iPhone 11 Pro as a phone or the entirety of the underlying operating system. I’m just focused on how well this device helps me make photos.

For the past few years I’ve shot with a bunch of cameras, including: an iPhone 6 and 7, Fuji x100,1 Sony rx100ii, and Olympus EM10ii. I’ve printed my work in a book, in photos of various sizes that are now hanging on my walls,2 and travelled all over the world with a camera in tow. I have historically tended towards street photography (broadly defined), some ‘travel’ photography (usually nature and landscape shots), abstracts, and admittedly relatively few portraits. If you want to get a rough assessment of the kinds, and quality, of photos that I take then I’d suggest you wander over to my Instagram profile.

I should be pretty clear, upfront: I make photos, not videos, and so have pretty well zero comments about the video camera functionalities on the iPhone 11 Pro. Also, if you’re looking for some raw technical stats on the iPhone cameras, I’d suggest you check out Halide’s assessment.

Body, Controls, and Handling

The iPhone 11 Pro is considerably larger in hand than the iPhone 7 that I came from. It’s also, with the Apple-branded clear case, quite slippery. This means that I’ve been super cautious about taking photos where dropping the phone might mean losing it forever (e.g., shooting outstretched over rivers and major highways). The buttons are significantly more solid than those on my iPhone 7 and, as such, I’m disinclined to use them as a shutter button for fear of messing up my composition or introducing camera shake. Though, if I’m being honest, it was pretty rare that I used anything other than the on-screen shutter button on my iPhone 7.

The screen of the iPhone 11 Pro, itself, is bright and beautiful. It’s night and day between it and the iPhone 7. To activate the camera from the lock screen you press and hold the camera icon; after a second or so, the camera app will open and you’re probably ready to shoot. Probably, you may ask? Yes: there’s a glitch in iOS 13 that means the camera app sometimes launches without showing the image you’re trying to capture on the display. The solution is to take a shot; afterwards, the display should show what the camera is seeing. Usually. But not always.

If you used burst mode a lot to get the right shot, get used to a lot of missed shots. In iOS 13, you press the shutter button in the camera app and slide to the left to initiate a burst; holding down on the shutter button starts recording a short video (slide to the right if you want to keep recording video without holding down the shutter button). In actual use, I’ve ended up accidentally taking a bunch of short videos instead of a burst of shots, which meant I missed capturing what I wanted to capture. A ‘Pro’ camera should let me set photo controls. The iPhone 11 Pro fails, seriously and significantly, in this regard.

When composing a shot, you’ll routinely see what is beyond the field of view of the lens you’re using. This means that, for example, when you’re shooting with the 26mm lens, you’ll also see what would be captured by the 13mm lens. On screen, the extended parts of the scene that would be captured by the wider camera are slightly desaturated and sit outside the grid you can enable in the Camera app settings. Some reviewers have said that this looks like what you might see when looking through a rangefinder-style camera, like a Fuji x100. I fundamentally disagree: those reviewers have clearly not used a rangefinder for extended periods of time. With a rangefinder you can see to the left and right of the frame when looking through the viewfinder, which is helpful in a camera you’ve raised to your eye because the rest of your vision may be obscured and you may not realize what’s about to step into your frame. That’s less of an issue when shooting on a smartphone. Much less of an issue.

If you rely on a tilting screen on a mirrorless camera or DSLR to get the shots you like, well, you’re going to be out of luck. It’s a camera phone without an articulating screen. Maybe Samsung’s folding phones will integrate this kind of feature into their camera app…

I haven’t shot using the flash, so I can’t comment on what it’ll be like to use.

If you’ve used the iPhone Camera app before, you’ll find that few things have meaningfully changed. The ‘big’ changes include a notification in the top left corner when night mode is activated (along with how many seconds the exposure will take) and an arrow along the top of the app that, if tapped, lets you toggle some of the default features (e.g., flash on/off/auto, live images on/off, timer, or filters). Despite being a ‘professional’ device—which has a pile of internal gyroscopes!—the camera app doesn’t include a horizon level, though if you’re taking flat, top-down shots you’ll get an indicator to show whether you’re perfectly level.

I tend to see the stock Photos app as part of the control scheme of an iPhone camera. Some of the additions are good—tilt shifts in particular!—but I loathe losing how iOS 12 ‘grouped’ editing features into categories like light, colour, and black and white. And I really miss being able to adjust neutrals and tones in the black and white setting. Why’d you take those away, Apple? WHY!?

The battery life when I’ve taken the iPhone 11 Pro out for a day of shooting has been great; I was out for about 7 hours one day just to shoot and took about 250 photos, while listening to podcasts and reading news and such. I had 17% left after a full day’s normal use plus shooting, but I was shooting with a brand new battery in ideal temperatures for batteries (20-24 degrees Celsius). The real test will be when winter hits in countries like Canada or the northern USA and we see how well the batteries hold up in semi-hostile environmental conditions.

Image Quality

I’ve been super impressed with the camera system in the iPhone 11 Pro. That said, there are definitely areas where computational photography is still very much a work in progress.

I’ve been taken aback by just how much dynamic range this camera captures when I’ve been making photos. This is especially the case when I’ve used it in low-light or outright dark conditions. As is true of almost all cameras, it performs admirably in well-lit situations. What follows is a (broad) selection of shots taken over a three-day period in indoor, bright daylight, and sunset conditions; they are all edited to my taste, using just the stock Photos app.

I also did a late evening photowalk. It was pitch black (for a major city, at least…) and so the following images are good representations of what urban photographers can probably pull off without a tripod.4 In many of the images I was bracing the camera either tightly against my body or against something in the environment (e.g., a tree trunk) to reduce camera shake.

I did run into some computational…weirdness…in some of the shots. When shooting the Cinesphere, I sometimes got a weird yellow arc stretching along the top of the frame. When shooting scenes with the Cinesphere and the Japanese Temple Bell, there were also times when it looked like the upper right of the frame (proximate to the Cinesphere in the shot) had extremely severe vignetting. And I noticed that I got lens flare when shooting at night; while this could be corrected in post using something like Snapseed, I can’t recall ever dealing with flare on a regular basis on prior iPhones.

Also, don’t buy this camera expecting to get cool light trails using the default camera application. While night mode takes a lot of exposures to create the final shot, you’ll only get the slightest blur from moving vehicles. Similarly, due to the fixed aperture of the cameras you’re not going to get any cool light flares or sun stars, nor can you seriously control the depth of field as you could with a camera offering much more manual control.5

Conclusion

The iPhone 11 Pro is a marvel of a camera system. Seriously: it’s spectacular for the size of the sensor, though it damn well better be given its sheer cost!

I can see this camera fitting into the lives of a lot of creative amateurs. (Probably professionals, too, but with grumbles.) For me, and for people at my skill level with photography, this is a major equipment investment that I think will be pretty great: it’s a supplement to, not a replacement for, the aging Sony rx100ii I carry with me on a day-to-day basis, and it’s genuinely fun to shoot on. The Photos app, while annoying in some of its reconfiguration, is generally more powerful than its last version. And the ability to easily and quickly shift between 13mm and 52mm focal lengths cannot be appreciated enough: it’s like having a permanent kit lens attached to your smartphone, and that’s just awesome.

Should you upgrade or buy this camera system? I dunno. I had an older phone and totally could have stuck with it for another year or so, and I’m happy with my upgrade. But for around $2,000 (CAD) you could get some really nice new glass, which might be a better investment if you’re always carrying your mirrorless camera or DSLR with you, or if better control of aperture, camera levels, or other ‘niceties’ is the core thing you’re looking for. But if you’ve increasingly been leaving your ‘big’ camera and glass at home, still want a lot of functionality when making photos on your smartphone, and have the disposable income, then you’ll probably be pretty happy with the iPhone 11 Pro.

  1. In honesty, it was too much camera for me at the time, but it taught me to really love and want to work on my photography. ↩︎
  2. My largest prints are 24×36, from my Sony rx100ii and Olympus EM10ii (using an Olympus 17mm 1.8 lens). ↩︎
  3. Why won’t Apple bring the camera filters in Messages straight into their camera app? Oh hey! Did you even know Apple had a pile of filters for fun stuff in Messages? I bet not, given how buried they are—open Messages, tap the star button on the lower left side, then tap the three concentric rings, fight with the stupid UI a bit, and tada!↩︎
  4. If using a tripod, the internal gyroscopes will detect this and let you take up to a 10s ‘exposure’. ↩︎
  5. Some of this might change as Halide and other competing camera app developers update their applications. But the stock camera app is pretty limited in computational control of the aperture, especially for landscape or street photography. ↩︎
Link

The Best Coffee Roasters in Toronto

Only helpful for those local to Toronto, but it’s great for those of us who are. I particularly enjoy Pilot and Propeller, though I admit that my favorite place to get coffee these days is Ideal Coffee (the Red Sea beans are absolutely terrific). Still, I look forward to trying the whole list and determining whether there is a company that can unseat Ideal Coffee or Pilot and Propeller!


Link

CSIS’s New Powers Demand New Accountability Mechanisms

CSIS’s New Powers Demand New Accountability Mechanisms:

It is imperative that the Canadian public trust that CSIS is not acting in a lawless manner. And while improving how SIRC functions, or adding Parliamentary review, could regain or maintain that trust, a more cost-sensitive approach could involve statutory reporting. Regardless, something must be done to ensure that CSIS’ actions remain fully accountable to the public, especially given the new powers the Service may soon enjoy. Doing anything less would irresponsibly expand the state’s surveillance capabilities and threaten to dilute the public’s trust in its intelligence and security service.

 

Quote

…nowhere does he raise the possibility that feedback loops produced by digital technologies might also be harming governance. Consider a 2011 survey by a British insurance company in which 11 percent of respondents claimed to have seen an incident but chose not to report it, worried that higher crime statistics for their neighborhood would significantly reduce the value of their properties. In this case, the quality of future data is intricately dependent on how much of the current data is disclosed; unconditional “openness” is the wrong move here—precisely because of feedback loops.

I would note that the social implications of novel monitoring technologies remain drastically underappreciated by public policy planners.

Quote

The totalizers would happily follow Johnson in seeking answers to questions such as “So what does the Internet want?”—as if the Internet were a living thing with its own agenda and its own rights. Cue a recent Al Jazeera column: “The internet is not territory to be conquered, but life to be preserved and allowed to evolve freely. … From understanding the internet as a life form that is in part human, it follows that the internet itself has rights.”13 That is the kind of crazy talk to be avoided. The particularizers would not invoke “the Internet” to embark on a quixotic attempt to re-make democratic politics; but the totalizers, in their quasi-religious belief, would do so gladly.

A good account of the Internet would never need to mention that dreadful word at all. This stringent requirement might uproot most of our Internet thinkers from the plateau of banal and erroneous generalizations where they have resided for the last two decades; after all, it is the very notion of “the Internet” that has allowed them to stay there for so long. Now that Internet-centrism is not just a style of thought but also an excuse for a naïve and damaging political ideology, the costs of letting its corrosive influence go unnoticed have become too high.