Aside

2022.4.9

I’ve been doing my own IT for a long while, as well as handling small tasks for others. But I hadn’t had to do an email migration, while ensuring pretty well no downtime, in ages.

Fortunately the shift from Google Mail (due to the deprecation of grandfathered accounts that offered free custom domain integration) to Apple’s iCloud+ was remarkably smooth and easy. Apple’s instructions were helpful, as were those of the host I was dealing with. Downtime was a couple of seconds at most, though there was definitely a brief moment of holding my breath in fear that the transition hadn’t quite taken.

Link

‘Glass Time’ Shortcut

Photo by Ron Lach on Pexels.com

Like most photographers, I edit my images with my screen’s brightness set to its maximum. Outside of specialized activities, however, most of us don’t tend to set the brightness this high, so as to conserve battery power.

The result is that when we, as photographers and as members of the viewing public, look at images on photography platforms, we often aren’t seeing them as their creators envisioned. The images are, quite starkly, darker on our screens than on those of the photographers who made them.1

For the past few months, whenever I’ve opened Glass or looked at photos on other platforms, I’ve made an effort to ensure that I’ve maximized the brightness on my devices as I open the app. This said, I still sometimes forget and only realize halfway through a viewing session. So I set about ensuring this ‘mistake’ didn’t happen anymore by creating a Shortcut called ‘Glass Time’!

The Shortcut is pretty simple: when I run it, it maximizes the brightness of my iOS device and opens the Glass app. If you download the Shortcut it’s pretty easy to modify it to instead open a different application (e.g., Instagram, 500px, Flickr, etc.). It’s definitely improved my experience using the app and helped me to better appreciate the images that individuals share on the platform.
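For those curious about the mechanics, the Shortcut is just two actions: ‘Set Brightness’ (to 100%) followed by ‘Open App’. A rough Swift sketch of the equivalent behaviour is below; the glass:// URL scheme is my assumption for illustration, not a documented scheme.

```swift
import UIKit

// A rough sketch of what the 'Glass Time' Shortcut does, expressed in
// Swift. The Shortcut itself is just two actions: Set Brightness, Open App.
func glassTime() {
    // Action 1: maximize the screen's brightness.
    UIScreen.main.brightness = 1.0

    // Action 2: open the target app. "glass://" is a hypothetical URL
    // scheme used for illustration; substitute another app's scheme
    // (e.g., "instagram://") to point the Shortcut elsewhere.
    if let url = URL(string: "glass://") {
        UIApplication.shared.open(url)
    }
}
```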

Download ‘Glass Time’ Shortcut


  1. Of course there are also issues associated with different devices having variable maximum brightness and colour profiles. These kinds of differences are largely intractable in the current technical milieu. ↩︎

Policing the Location Industry

Photo by Ingo Joseph on Pexels.com

The Markup has a comprehensive and disturbing article on how location information is acquired by third parties despite efforts by Apple and Google to restrict the availability of this information. In the past, it was common for third parties to provide SDKs to application developers. The SDKs would inconspicuously transfer location information to those third parties while also enabling functionality for application developers. With restrictions being put in place by platforms such as Apple and Google, however, it’s now becoming common for application developers to initiate requests for location information themselves and then share it directly with third-party data collectors.

While such activities often violate the terms of service and policy agreements between platforms and application developers, it can be challenging for the platforms to actually detect these violations and subsequently enforce their rules.

Broadly, the issues at play represent significant governmental regulatory failures. The fact that government agencies often benefit from the secretive collection of individuals’ location information makes it that much harder for governments to muster the will to discipline the secretive collection of personal data by third parties: if the government cuts off the flow of location information, it will impede the ability of governments themselves to obtain this information.

In some cases intelligence and security services obtain location information from third parties. This sometimes occurs in situations where the services themselves are legally barred from directly collecting this information. Companies selling mobility information can thus let government agencies do an end-run around the law.

One of the results is that efforts to limit data collectors’ ability to capture personal information often see parts of government push for carve-outs for collecting, selling, and using location information. In Canada, as an example, the government has adopted a legal position that it can collect locational information so long as it is de-identified or anonymized,1 and for the security and intelligence services there are laws on the books that permit the collection of commercially available open source information. This open source information does not need to be anonymized prior to acquisition.2 Lest you think it sounds paranoid that intelligence services might be interested in location information, consider that American agencies collected bulk location information pertaining to Muslims from third-party location data brokers, and that the Five Eyes historically targeted popular applications such as Google Maps and Angry Birds to obtain location information as well as other metadata and content. As the former head of the NSA announced several years ago, “We kill people based on metadata.”

Any argument made by either private or public organizations that anonymization or de-identification of location information makes it acceptable to collect, use, or disclose generally relies on tricking customers and citizens. Why is this? Because even when location information is aggregated and ‘anonymized’ it might subsequently be re-identified. And even in situations where that reversal doesn’t occur, policy decisions can still be made based on the aggregated information. The process of deriving these insights and applying them showcases that while privacy is an important right to protect, it is not the only right that is implicated in the collection and use of locational information. Indeed, it is important to assess the proportionality and necessity of the collection and use, as well as how the associated activities affect individuals’ and communities’ equity and autonomy in society. Doing anything less is merely privacy-washing.

Throughout discussions about data collection, including as it pertains to location information, public agencies and companies alike tend to offer a pair of arguments against changing the status quo. First, they assert that consent isn’t really possible anymore given the volumes of data collected from individuals on a daily basis: individuals would be overwhelmed with consent requests, so we can’t make the requests in the first place! Second, they claim that we can’t regulate the collection of this data because doing so risks impeding innovation in the data economy.

If those arguments sound familiar, they should. They’re very similar to the plays made by industry groups whose activities have historically had negative environmental consequences. These groups regularly assert that, after decades of poor or middling environmental regulation, any new, stronger regulations would unduly impede the existing dirty economy for power, services, goods, and so forth. Moreover, they insist, the dirty way of creating power, services, and goods is just how things are, and thus should remain the same.

In both the privacy and environmental worlds, corporate actors (and those whom they sell data/goods to) have benefitted from not paying the full cost of their activities: acquiring data without meaningful consent, or leaving the environmental costs unaccounted for. But, just as we demand enhanced environmental regulations to address the harms industry causes to the environment, we should demand and expect the same when it comes to the personal data economy.

If a business is predicated on sneaking personal information away from individuals, then it is clearly not particularly interested or invested in behaving ethically towards consumers. It’s imperative to continue pushing legislators not just to recognize that such practices are unethical, but to make them illegal as well. Doing so will require being heard over the cries of government agencies that have vested interests in obtaining location information in ways that skirt the laws that might normally discipline such collection, as well as of companies that have grown as a result of their unethical data collection practices. While this will not be an easy task, it’s increasingly important given the limited ability of platforms to police the sneaky collection of this information and the increasingly problematic ways our personal data can be weaponized against us.


  1. “PHAC advised that since the information had been de-identified and aggregated, it believed the activity did not engage the Privacy Act as it was not collecting or using ‘personal information’.” ↩︎
  2. See, as an example, Section 23 of the CSE Act. ↩︎

Solved: Set A Default Email Address in Apple Contacts

I figured out how to set a default email address for a contact in Apple Contacts when the contact has multiple email addresses associated with them.

The Problem

Apple support claims that Siri is capable of learning which email address to use when someone you are contacting has multiple email addresses associated with them in your contact book. In my experience this is hit and miss. The result is that you need to check, each time, to ensure that an email is being sent to the correct email address.

The Solution

For the contact in question, you must ensure that the address at which you most regularly want to contact them is the first email in the list of emails. Thus, if you had a set of emails ordered as such:

  • example1@email.me
  • example2@email.me
  • example3@email.me

and wanted ‘example3@email.me’ to be the default email that you send messages to, you would:

  1. Open Contacts and the individual’s card, and then click ‘Edit’
  2. Copy the email address that you want to remove as the current default (e.g., example1@email.me)
  3. Create a new email record by clicking the field beside ‘Other’ at the bottom of the list, and paste the address you copied in step 2
  4. Replace the address in the top email field (i.e., example1@email.me) with the preferred default (e.g., example3@email.me)
  5. Delete the now-duplicated example3@email.me from its original position in the list
  6. Click ‘Done’

At the conclusion of this reordering, your email order list would appear as:

  • example3@email.me
  • example2@email.me
  • example1@email.me

The result of the reordering is that you should, by default, now send email to the contact’s example3@email.me. I hope this helps anyone else who’s running into this problem!
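If you’d rather script the reordering than shuffle fields by hand, the same effect can be achieved with Apple’s Contacts framework. Below is a minimal Swift sketch, under the assumption that your app has been granted Contacts access; the function name and parameters are mine for illustration, not anything official.

```swift
import Contacts

// Move a preferred address to the front of a contact's email list,
// i.e. the same reordering as the manual steps above.
func makeDefaultEmail(_ preferred: String, forContactNamed name: String) throws {
    let store = CNContactStore()
    let keys = [CNContactEmailAddressesKey as CNKeyDescriptor]
    let predicate = CNContact.predicateForContacts(matchingName: name)

    // Fetch the first matching contact and make it mutable.
    guard let contact = try store.unifiedContacts(matching: predicate,
                                                  keysToFetch: keys).first,
          let mutable = contact.mutableCopy() as? CNMutableContact else { return }

    var emails = mutable.emailAddresses
    guard let index = emails.firstIndex(where: { ($0.value as String) == preferred }),
          index > 0 else { return }  // already first, or not found

    // Reorder: pull the preferred address out and reinsert it at the top.
    emails.insert(emails.remove(at: index), at: 0)
    mutable.emailAddresses = emails

    // Persist the change back to the contact store.
    let saveRequest = CNSaveRequest()
    saveRequest.update(mutable)
    try store.execute(saveRequest)
}

// Example: try makeDefaultEmail("example3@email.me", forContactNamed: "Jane Doe")
```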

Vacation Street Photography Challenge

(Come Towards the Light by Christopher Parsons)

This year I took a very late vacation while Toronto was returning to its new normal. I’ve been capturing the city throughout the COVID-19 pandemic and I wanted to focus on how the streets felt.

During the pandemic we’ve all been attached to our devices, and our phones in particular, so I decided to document the city through the lens of our ever-present screen: the smartphone. I exclusively shot with my iPhone 12 Pro using the Noir filter. This filter creates a strong black-and-white contrast, deepening shadows and blacks and lifting highlights and whites. I chose this over a monotone because I wanted to emphasize that, while the city was waking up, there were still stark divides between the lived experiences of the pandemic and a continuation of strong social distancing from one another.

95% of my photos were captured using ProRaw; the exceptions were shots where I wanted to utilize Apple’s long exposure functionality in the Photos application.

Darkroom Settings

On top of the default Noir filter, I also created a secondary filter in Darkroom that adjusted what came off the iPhone just a bit, to establish tones that were to my liking. My intent was to make the Noir that much punchier, while also trying to reduce a bit of the sharpness/clarity that I associate with Apple’s smartphone cameras. This adjustment reflected, I think, that digital communications themselves are often blurrier or more confused than our face-to-face interactions. Even that which seems clear, when communicated over digital systems, often carries with it a misrepresentation of meaning or intent.


Apple Music Voice Plan: The New iPod Shuffle?

A lot of tech commentators are scratching their heads over Apple’s new Apple Music Voice Plan. The plan is half the price of a ‘normal’ Apple Music subscription. Subscribers can ask Siri to play songs or playlists but will not have access to a text-based or icon-based way to search for or play music.

I am dubious that this will be a particularly successful music plan. Siri is the definition of a not-good (and very bad) voice assistant.

Nevertheless, Apple has released this music plan into the world. I think it’s probably most like the old iPod Shuffle, which lacked any ability to really select or manage an individual’s music. The Shuffle was a cult favourite.

I have a hard time imagining a Siri-based interface developing a cult following like the iPods of yore, but the same thing was thought about the old Shuffle, too.

Aside

2021.10.14

I’m liking the incremental update to the Photos application on iOS and iPadOS in the newest release of the operating systems. The ability to easily add titles to my photos and also access the EXIF metadata helps to maintain a (slightly) more organized photo library. Access to this information also makes it easier to share out photos straight from the Photos app, since I can copy the title of an image as part of sharing it.

However, I’m still missing the ability to create Smart Folders. Specifically, I want folders that are accessible on iOS devices and that sort based on the camera that took a given set of images. The feature has been in macOS for a very, very long time, and it’s nuts that this kind of feature parity hasn’t been reached between operating systems.

I haven’t seen evidence that the newest version of iOS has fixed the green flare issue (which I first encountered when reviewing my iPhone 11 Pro). I know a fix was in an earlier beta, but I haven’t yet seen it implemented in a production release.

Aside

2021.8.12

If iOS 15 automatically removes the green lens flares that appear when shooting with the device at night that’d go a long way to improving the quality of night photos taken with the device (and fix one of the annoyances I raised in my reviews of the iPhone 11 Pro and 12 Pro). Here’s hoping that the software-side corrections make their way into the final release.

I do wonder, however, whether there are any photographers who have leaned into this lens flare and thus will have their photography negatively affected by Apple’s decision?

An Amateur Photographer’s Long-Term Review of the iPhone 12 Pro Camera System

I bought an iPhone 12 Pro mid-cycle in March 2021 and have been shooting with it for the past several months in a variety of weather conditions. I was very pleased with the iPhone 11 Pro, with the exception of the green lens flares that too frequently erupt when shooting at night. Consider this a longish-term review of the 12 Pro, with comparisons to the 11 Pro, scattered with photos taken exclusively with the 12 Pro and edited in Apple Photos and Darkroom on iOS.

Background

I’m by definition an amateur photographer: I shoot using my iPhone as well as a Fuji X100F, and get out to take photos at least once or twice a week during photo walks that last a few hours. I don’t earn any money from making photos; I shoot for my own personal enjoyment. Most of my photos are street or urban photography, with a smattering of landscape shots and photos of friends and family thrown in.

To be clear up front: this is not a review of the iPhone 12 Pro, proper, but just of its camera system. This said, it’s worth noting that the hardware differences between the iPhone 11 Pro and 12 Pro are pretty minor. The 26mm lens is now f/1.6 and the 13mm can be used with night mode. At a software level, the 12 Pro introduced the ability to shoot Apple ProRAW along with Smart HDR 3, and uses Deep Fusion to improve photographic results in low to middling light. Deep Fusion, in particular, has no discernible effect on the shots I take, but maybe I’m just not pixel peeping enough to see what it’s doing.

For the past few years I’ve shot with a number of cameras, including an iPhone 6, 7, and 11 Pro, a Fuji X100 and X100F, a Sony RX100ii, and an Olympus EM10ii. I’ve printed my work in a couple of personal books, printed photos from all these systems at various sizes, and hung the results on my walls. When I travel it’s with a camera or two in tow. If you want a rough gauge of the kinds of photos I take, you might want to take a gander at my Instagram.

Also, while I own a bunch of cameras, photos are my jam. I’ll be focusing mostly on how well the iPhone 12 Pro makes images with a small aside to talk about its video capabilities. For more in-depth technical reviews of the 12 Pro I’d suggest checking out Halide’s blog.

The Body

The iPhone 11 Pro had a great camera system, but it was always a bit awkward to hold the phone when shooting because of its rounded edges. Don’t get me wrong: the rounded edges helped the phone feel more inviting than the 12 Pro, but they were less ideal for actual daily photography, and I find it easier to get, and retain, a strong grip on the 12 Pro. Your mileage may vary.

I kept my 11 Pro in an Apple silicone case and I do the same for the 12 Pro. One of the things I do with some regularity is press my phone right against glass to reduce glare when I’m shooting through a window or other transparent surface. With the 12 Pro’s silicone case I can do this without the glass I’m pressed against actually touching the lens, because there’s a few millimetres between the case and the lens element. The same was also true of my 11 Pro and the Apple silicone case I had attached to it.

I like the screen of the 12 Pro, though I liked the screen in the 11 Pro as well. Is there a difference? Yeah, a bit, insofar as blacks are deeper on the 12 Pro, but I wouldn’t notice the difference unless the 11 Pro and 12 Pro were right against one another. I can see both clearly enough to frame shots on sunny days while shooting, which is what I really care about.

While the phone doesn’t have any ability to tilt the screen to frame shots, you can use a tripod to angle your phone and then frame and shoot using an Apple Watch if you have one. It’s a neat function, and the Watch can also act as a paired screen when you’re taking video with the main lenses. I tend to shoot handheld, however, and so have only used the Apple Watch trick when shooting video using the main cameras on the back of the 12 Pro.

I don’t ever really use the flash so I can’t comment on it, though I do occasionally use the flash as a light to illuminate subjects I’m shooting with another camera. It’s not amazing but it works in a pinch.

The battery is so-so based on my experience. The 12 Pro’s battery is a tad smaller than the one in my 11 Pro, which means less capacity, though in the five months I’ve owned the 12 Pro the battery health hasn’t degraded at all, which wasn’t the case with the 11 Pro. This said, if I’m out shooting exclusively with the 12 Pro I’ll bring a battery pack with me, just as I did for a day of shooting with the 11 Pro. If it’s not a heavy day of shooting, however, I reliably end the day with 20% or more battery after the 12 Pro has been off the charger for about 14-17 hours of middling usage.

Probably the coolest feature of the new 12 series iPhones is their ability to use magnetic attachments. I’ve been using a Lume Cube Telepod tripod stand paired with a Moment Tripod Mount with MagSafe. It’s been pretty great for video conferences and is, in my opinion, the best hardware addition to the 12 line of phones. It’s a shame that there isn’t a wider ecosystem supporting this hardware feature this many months after release.

Camera App

The default Apple camera app is fine, I guess. I like that you can now set the exposure and the app will remember it, which has helpfully meant that I can slightly under-expose my shots by default, as is my preference. However, the default app still lacks a spirit level, which is really, really, really stupid, and especially so in a so-called “Pro” camera that costs around $2,000 (CAD) after AppleCare, a case, and taxes. It’s particularly maddening given that the phone includes a gyroscope that is used for so many other things in the default camera app, like providing guidance when taking pano shots or top-down shots, and so forth.

It’s not coming back, but I’m still annoyed at how Apple changed burst mode in iOS. It used to be that you could hold the shutter button in the native camera app, or the volume rocker, to activate a burst; now you hold the shutter button and pull it to the left. It’s not a muscle memory I’ve developed, and it risks screwing up my compositions when I’m shooting on the street, so I don’t really use burst anymore, which is a shame.

As a note, I edit almost all my photos in the Darkroom extension for Photos. It crashes all the damn time, and it is maddening. I’d hoped these crashes would go away when I upgraded from the 11 Pro to the 12 Pro, but they haven’t. It is very, very, very frustrating.

Image Quality

In a theoretical world upgrading my camera would lead to huge differences in image quality, but in practice that’s rarely the case. It is especially not the case when shifting from the 11 Pro to the 12 Pro, save for in very particular situations. The biggest improvement that is noticeable in daily situations comes when you’re shooting scenes with significant dynamic range, such as outside on a bright day; the sky and the rest of the scene are kept remarkably intact, without your highlights or shadows being blown out. Even when compared to a camera with an APS-C or Micro 4/3 sensor it’s impressive, and I can get certain bright-day shots with the iPhone 12 Pro that wouldn’t be possible to easily capture with my Fujifilm X100F or Olympus EM10ii.

The other definite upgrade is that, due to sensor and computational improvements, you can get amazing low-light shots using the ultra-wide lens with Night Mode. Shots are sometimes a bit noisy or blotchy, but I can still get photos that are impossible to otherwise get handheld with an APS-C sensor.

Relatedly, the ultra-wide’s correction for distortion is pretty great and it’s noticeably better than the ultra-wide lens correction on the 11 Pro. If you’re shooting wide angle a lot then this is likely one of the few software improvements you’ll actually benefit from with some regularity.

One of the most heralded features of the 12 Pros was the ability to shoot ProRaw. In bright conditions it’s not worth using; I rarely detect a noticeable improvement in quality nor does it significantly enhance how I can edit a photo in those cases. However, in darker situations or more challenging low-light indoor situations it can be pretty helpful in retaining details that can be later recovered. That said, it hasn’t transformed how I shoot per se; it’s a nice-to-have, but not something that you’re necessarily going to use all the time.

You might ask how well portrait mode works but, given that I don’t use it that often, I can’t comment much beyond that it’s a neat feature that is sufficiently inconsistent that I don’t use it for much of anything. There are some exceptions, such as when shooting portraits at family events, but on the whole I remain impressed with it from a technology vantage point while being disappointed in it from a photographer’s point of view. If I want a shallow depth of field and need to get a shot I’m going to get one of my bigger cameras and not risk the shot with the 12 Pro.

Video

I don’t really shoot video, per se, and so don’t have a lot of experience with the quality of the video the 12 Pro produces. Others have, however, very positively discussed the capabilities of the cameras, and I trust what they’ve said.

That said, I did a short video for a piece I wrote and it turned out pretty well. We shot using the ‘normal’ lens at 4K and my employer’s video editor subsequently graded the video. This was taken in low-light conditions and I used my Apple Watch as a screen so I could track what I was doing while speaking to camera.

I’ve also used my iPhone 12 Pro for pretty well all the numerous video conferences, government presentations (starting at 19:45), classes I’ve taught, and media engagements I’ve had over the course of the pandemic. In those cases I’ve used the selfie camera and in almost all situations persons on the other side of the screen have commented on the high quality of my video. I take that as a recommendation of the quality of the selfie cameras for video-related purposes.

Frustrations

I’ll be honest: what I most hoped for with the iPhone 12 Pro was that the default Photos app would play better with extensions. I use Darkroom as my primary editing application, and after editing 5-10 photos the extension reliably crashes; I then need to totally close out Photos before I can edit using the extension again.1 It is frustrating and it sucks.

What else hasn’t improved? The 12 Pro still has green lens flares when I take photos at night. It is amazingly frustrating that, despite all the computing power in the 12 Pro, Apple’s software engineers can’t correct for an issue that their hardware engineers have so far been unable to resolve. Is this a problem? Yes, it is, especially if you ever shoot at night. None of my other, less expensive, cameras suffer from this, and it’s maddening that the 12 Pro still does. It’s made worse by the fact that the Photos app doesn’t include a healing tool to remove these gross little flares and, thus, requires me to use another app (typically Snapseed) to get rid of them.

Finally, I find that shots from the 12 Pro are often too sharpened for my preference, which means that I tend to turn down the clarity in Darkroom to soften a lot of the photos I take. It’s an easy fix, though (again) not one you can make in the default Photos application.

Conclusion

So what do I think of the iPhone 12 Pro? It’s the best camera, aside from my Fuji X100F, that I typically have with me when I’m out and about, and the water resistance means I’m never worried about shooting with it in the elements.2

If I have a choice, do I shoot with the Fuji X100F or the iPhone 12 Pro? If a 35mm equivalent works, then I shoot with the Fuji. But if I want a wide angle shot it’s pretty common for me to pull the 12 Pro and use it, even while out with the Fuji. They’ve got very different colour profiles but I still like using them both. Sometimes I even go on photowalks with just the 12 Pro and come back with lots of keepers.

This is all to say that the X100F and 12 Pro are both pretty great tools. I’m a fan of them both.

So…is the 12 Pro a major upgrade from the 11 Pro? Not at all. A bigger upgrade from earlier iPhones? Yeah, probably more so. I like the 12 Pro and use it every day as a smartphone, and I like it as a camera. I liked the 11 Pro as a portable camera and phone as well.

Should you buy the 12 Pro? Only if you really want the telephoto and the ability to edit ProRaw files. If that’s not you, then you’re probably better off saving a chunk of change and getting the regular 12 instead.

(Note: All photos taken with an iPhone 12 Pro and edited to taste in Apple Photos and Darkroom.)


  1. Yes, I can edit right in Darkroom, and I do, but it’s not as convenient. ↩︎

  2. I admit to not treating the X100F with a lot of respect but I don’t use it when it’s pouring rain. The same isn’t true of the iPhone 12 Pro. ↩︎