Link

Medical Photography is Failing Patients With Darker Skin

Georgina Gonzalez, reporting for the Verge:

Most clinical photos are taken by well-intentioned doctors who haven’t been trained in the nuances of photographing patients of different races. There are fundamental differences in the physics of how light interacts with different skin tones that can make documenting conditions on skin of color more difficult, says Chrystye Sisson, associate professor and chair of the photographic science program at Rochester Institute of Technology, the only such program in the nation. 

Interactions between light, objects, and our eyes allow us to perceive color. For instance, a red object absorbs every wavelength of light except red, which it reflects back into our eyes. The more melanin there is in the skin, the more light it absorbs, and the less light it reflects back.

But standard photographic setups don’t account for those differences.

One of the things that I routinely experience shooting street photography in a multicultural city is just how badly camera defaults handle individuals of different racial backgrounds. And I’ve yet to find a single default that captures darker skin accurately, despite shooting for many years.
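
To put rough numbers on the physics the Verge piece describes (the reflectance figures below are illustrative assumptions on my part, not measurements): if lighter skin diffusely reflects something on the order of 40% of the light falling on it, and deeply pigmented skin closer to 10%, the difference in light returned to the camera works out to roughly two stops:

\[
\Delta \text{EV} = \log_2\!\left(\frac{R_{\text{lighter}}}{R_{\text{darker}}}\right) \approx \log_2\!\left(\frac{0.40}{0.10}\right) = 2\ \text{stops}
\]

A default exposure or preset tuned around the brighter end of that range will, without compensation, render the darker end about two stops under-exposed, which matches my experience that no single default handles every skin tone well.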

My Glass Public Profile

I’ve recently written about my concerns with Instagram, and my assessment of whether I wanted to port my online photo sharing to either Flickr or Glass. As of October 27, Glass has enabled public profiles so non-members can view the work that photographers have published on the service. You can check mine out!

I…really like how the profiles look on Glass at the moment. I’ve been posting with some frequency (all black and whites, with a focus on street photography) and the workflow of capturing and then posting photographs has been simple and seamless.

I also really like the experience of having to comment on other photographs instead of ‘liking’ them. This engagement strategy means that when I interact with other photographers’ pieces I need to leave at least some kind of meaningful comment. As a result, I have to slow down and think a bit more about a photograph, and I think that’s a good thing both for me, the viewer, and for the photographer, who hopefully gets more meaningful (if less frequent) engagement.

I like Glass enough that I’ve ponied up for a one-year subscription. The developers are pushing out significant quality-of-life updates to the application and, on the whole, it’s currently pretty fun to use. It’s clearly intended for photographers, as well as for people who are interested in photography but don’t want to deal with the grossness of Instagram and want something a little fresher than Flickr.

Based on my experiences thus far I’d heartily recommend that you check out the service, as well as my public profile!

Vacation Street Photography Challenge

(Come Towards the Light by Christopher Parsons)

This year I took a very late vacation while Toronto was returning to its new normal. I’ve been capturing the city throughout the COVID-19 pandemic and I wanted to focus on how the streets felt.

During the pandemic we’ve all been attached to our devices, and our phones in particular, so I decided to document the city through the lens of our ever-present screen: the smartphone. I shot exclusively with my iPhone 12 Pro using the Noir filter. This filter created a strong black-and-white contrast, deepening shadows and blacks while lifting highlights and whites. I chose it over a flatter monochrome because I wanted to emphasize that, while the city was waking up, there were still stark divides in how people were living through the pandemic, and that we continued to keep a strong social distance from one another.

Roughly 95% of my photos were captured in ProRaw; the exceptions were shots where I wanted to use Apple’s long exposure functionality in the Photos application.

Darkroom Settings

On top of the default Noir filter, I also created a secondary filter in Darkroom that adjusted what came off the iPhone just a bit to establish tones that were to my liking. My intent was to make the Noir that much punchier, while also reducing a bit of the sharpness/clarity that I associate with Apple’s smartphone cameras. This adjustment reflected, I think, that digital communications themselves are often blurrier or more confused than our face-to-face interactions. Even that which seems clear, when communicated over digital systems, often carries with it a misrepresentation of meaning or intent.

Gallery

Over Flow by John Notten

Climate change is a reality of contemporary life and is leading to an increasing number of weather-related catastrophes. One of the many threats now facing humanity is severe flooding. Such threats have been, and continue to be, driven by harmful and destructive human activities that impair and change the climate. They are amplified by housing councils that permit developers to build homes on floodplains, along with other development pressures linked to humans moving in ever greater numbers into urban environments.

With the climate emergency in mind, Toronto artist John Notten has created a series of styrofoam installations presently located at Ontario Place. On one side they show the image of an iceberg; on the other, homes, vehicles, and other urban architecture. As discussed in the artist statement, the installation is intended to offer:

… an opportunity for the viewer to consider connections between this provocative material, the image of floating icebergs, and those of half-submerged iconic institutions.

It was particularly special to have a pair of kayakers visit the exhibit at the same time that I was there. Their presence, and my effort to present them as blurred subjects, helps to give a sense that climate change affects all people and isn’t linked to any one person in particular. In essence, I wanted to convey that all humans are threatened by climate change and that focusing on individuals and their efforts does not adequately appreciate the structural and collective drivers that endanger all life on Earth.

Over Flow will be at Ontario Place until October 31, 2021, and will then be moved to other locations in the spring of 2022.

All images were made using an iPhone 12 Pro and the Noir filter, and then slightly edited using a filter in Darkroom.

Aside

2021.10.14

I’m liking the incremental update to the Photos application on iOS and iPadOS in the newest release of the operating systems. The ability to easily add titles to my photos and also access the EXIF metadata helps to maintain a (slightly) more organized photo library. Access to this information also makes it easier to share out photos straight from the Photos app, since I can copy the title of an image as part of sharing it.

However, I’m still missing the ability to create Smart Folders. Specifically, I want folders that are accessible on iOS devices and that sort based on the camera that took a given set of images. This capability has been in macOS for a very, very long time, and it’s nuts that this kind of feature parity hasn’t been reached between operating systems.

I haven’t seen evidence that the newest version of iOS has fixed the green flare issue (which I first encountered when reviewing my iPhone 11 Pro). I know a fix was in an earlier beta but I haven’t yet seen it in a production release.

Gallery

Canadian Genocide

The history of Canada is linked to settler colonialism and white supremacy. Only recently have elements of Canada come to truly think through what this means: Canada, and settler Canadians, owe their existence to the forceful removal of indigenous populations from their territories.

Toronto is currently hosting an art exhibit, “Built on Genocide.” It was created by the indigenous artist Jay Soule | CHIPPEWAR,1 and provides a visual record of the link between the deliberate decimation of the buffalo and the genocide of indigenous populations. From the description of the exhibit:

Built on Genocide is a powerful visual record of the 19th-century buffalo genocide that accompanied John A. MacDonald’s colonial expansion west with the railroad. In the mid-19th century, an estimated 30 to 60 million buffalo roamed the prairies, by the late 1880s, fewer than 300 remained. As the buffalo were slaughtered and the prairie ecosystem decimated, Indigenous peoples were robbed of their foods, lands, and cultures. The buffalo genocide became a genocide of the people.

Working from archival records, Soule combines installation and paintings to connect the past with the present, demanding the uncomfortable acknowledgement that Canada is a nation built on genocide.

What follows is a series of photographs that I made while visiting the exhibit on October 13, 2021. All images were made with an iPhone 12 Pro using the ‘Noir’ filter in Apple Photos, and subsequently edited with a filter in the Darkroom app.

Canada is, and needs to be, going through a reckoning concerning its past. This process is challenging for settlers, who must both appreciate their actual histories and account for how they arrived at their current life situations. There are, obviously, settlers who are in challenging life situations—some experience poverty and are otherwise disadvantaged in society—but their challenges routinely pale in comparison to what is sadly normal and typical in Canada’s indigenous societies. As just one example, while poverty is a real issue for some white and immigrant Canadians, few lack routine access to safe and clean drinking water, and none have gone without it for over 26 years. That, however, is the lived reality of some indigenous communities in Canada.


  1. Jay creates art under the name CHIPPEWAR, which represents the hostile relationship that Canada’s Indigenous peoples have with the government of the land they have resided in since their creation. CHIPPEWAR is also a reminder of the importance of the traditional warrior role that exists in Indigenous cultures across North America that survives into the present day. ↩︎

Photography and Social Media

(Passer By by Christopher Parsons)

Why do we want to share our photos online? What platforms are better or worse to use in sharing images? These are some of the questions I’ve been pondering for the past few weeks.

Backstory

About a month ago a colleague stated that she would be leaving Instagram given the nature of Facebook’s activities and the company’s seeming lack of remorse. Her decision has stuck with me and left me wondering whether I want to follow her lead.

I deleted my Facebook accounts some time ago, and have almost entirely migrated my community away from WhatsApp. But as an amateur photographer I’ve hesitated to leave an app that was, at least initially, designed with photographers in mind. I’ve used the application over the years to develop and improve my photographic abilities and so there’s an element of ‘sunk cost’ that has historically factored into my decision to stay or leave.

But Instagram isn’t really for photographers anymore. The app is increasingly stuffed with either videos or ads, and is meant to create a soft landing point for when/if Facebook truly pivots away from its main Facebook app.1 The company’s pivot makes it a lot easier to justify leaving the application though, at the same time, it leaves me wondering what application or platform, if any, I want to move my photos over to.

The Competition(?)

Over the past week or two I’ve tried Flickr.2 While it’s the OG of photo-sharing sites, its mobile apps are just broken. I can’t create albums unless I use the web app. Sharing straight from the Apple Photos app is janky. I worry (for no good reason, really) about the cost of the professional version (do I even need that as an amateur?) as well as the annoyance of tagging photos in order to ‘find my tribe.’

It’s also not apparent to me how much community truly exists on Flickr: the whole platform seems a bit like a graveyard with only a small handful of active photographers still inhabiting the space.

I’m also trying Glass at the moment. It’s not perfect: search is non-existent, you can’t share your gallery of photos with non-Glass users at the moment, discovery is a bit rough, there’s no web version, and it’s currently iPhone-only. However, I do like that the app (and its creators) is focused on sharing images and that it has a clear monetization scheme in the form of a yearly subscription. The company’s formal roadmap also indicates that some of these rough edges may be filed away in the coming months.

I also like that Glass doesn’t require me to develop a tagging system (that’s all done in the app using presets), lets me share quickly and easily from the Photos app, looks modern, and has a relatively low yearly subscription cost. And, at least so far, most of the comments are better than on the other platforms, which I think is important for developing my own photography.

Finally, there’s my blog here! And while I like to host photo series here, this site isn’t really designed as a photo blog first and foremost. Part of the problem is that WordPress continues to suck for posting media in my experience but, more substantively, this blog hosts a lot more text than images. I don’t foresee changing this focus anytime in the near or even distant future.

The Necessity of Photo Sharing?

It’s an entirely fair question to ask why even bother sharing photos with strangers. Why not just keep my images on my devices and engage in my own self-critique?

I do engage in such critique, but I’ve personally learned more from putting my images into the public eye than I would just by keeping them on my own devices.3 Some of that has come from comments but also from what people have ‘liked’ or left emoji reactions on. These kinds of signals have helped me better understand what makes one photograph stronger than another.

However, at this point I don’t think that likes and emojis are the source of my future photography development: I want actual feedback, even if it’s limited to just a sentence or so. I’m hoping that Glass might provide that kind of feedback though I guess only time will tell.


  1. For a good take on Facebook and why it’s functionally ‘over’ as a positive brand, check out M.G. Siegler’s article, “Facebook is Too Big, Fail.” ↩︎
  2. This is my second time with Flickr, as I closed a very old account several years ago given that I just wasn’t using it. ↩︎
  3. If I’m entirely honest, I bet I’ve learned as much or more from reading photography teaching/course books, but that’s a different kind of learning entirely. ↩︎
Aside

2021.8.12

If iOS 15 automatically removes the green lens flares that appear when shooting with the device at night that’d go a long way to improving the quality of night photos taken with the device (and fix one of the annoyances I raised in my reviews of the iPhone 11 Pro and 12 Pro). Here’s hoping that the software-side corrections make their way into the final release.

I do wonder, however, whether there are any photographers who have leaned into this lens flare and thus will have their photography negatively affected by Apple’s decision?

An Amateur Photographer’s Long-Term Review of the iPhone 12 Pro Camera System

I bought an iPhone 12 Pro mid-cycle in March 2021 and have been shooting with it for the past several months in a variety of weather conditions. I was very pleased with the iPhone 11 Pro, with the exception of the green lens flares that too frequently erupt when shooting with it at night. Consider this a longish-term review of the 12 Pro, with comparisons to the 11 Pro, interspersed with photos taken exclusively with the 12 Pro and edited in Apple Photos and Darkroom on iOS.

Background

I’m by definition an amateur photographer; I shoot using my iPhone as well as a Fuji X100F, and get out to take photos at least once or twice a week during photo walks that last a few hours. I don’t earn any money from making photos and shoot purely for my own personal enjoyment. Most of my photos are street or urban photography, with a smattering of landscape shots and photos of friends and family thrown in.

To be clear up front: this is not a review of the iPhone 12 Pro, proper, but just its camera system. This said, it’s worth noting that the hardware differences between the iPhone 11 Pro and 12 Pro are pretty minor. The 26mm lens is now f/1.6 and the 13mm can be used with night mode. At a software level, the 12 Pro added the ability to shoot Apple ProRAW, introduced Smart HDR 3, and extended Deep Fusion across its cameras to improve photographic results in low to middling light. Deep Fusion, in particular, has no discernible effect on the shots I take, but maybe I’m just not pixel peeping enough to see what it’s doing.

For the past few years I’ve shot with a number of cameras, including an iPhone 6, 7, and 11 Pro, a Fuji X100 and X100F, a Sony RX100 II, and an Olympus EM10ii. I’ve printed my work in a couple of personal books, and have also printed photos from all these systems at various sizes and hung the results on my walls. When I travel it’s with a camera or two in tow. If you want a rough gauge of the kinds of photos I take you might want to take a gander at my Instagram.

Also, while I own a bunch of cameras, photos are my jam. I’ll be focusing mostly on how well the iPhone 12 Pro makes images with a small aside to talk about its video capabilities. For more in-depth technical reviews of the 12 Pro I’d suggest checking out Halide’s blog.

The Body

The iPhone 11 Pro had a great camera system, but it was always a bit awkward to hold the phone when shooting because of its rounded edges. Don’t get me wrong: the rounded edges helped the phone feel more inviting than the 12 Pro, but they were less ideal for actual daily photography, and I find it easier to get, and retain, a strong grip on the 12 Pro. Your mileage may vary.

I kept my 11 Pro in an Apple silicone case and I do the same for the 12 Pro. One of the things I do with some regularity is press my phone right against glass to reduce glare when I’m shooting through a window or other transparent surface. With the 12 Pro’s silicone case I can do this without the glass I’m pressed against actually touching the lens, because there are a few millimetres between the case and the lens element. The same was also true of my 11 Pro and the Apple silicone case I had attached to it.

I like the screen of the 12 Pro, though I liked the screen in the 11 Pro as well. Is there a difference? Yeah, a bit, insofar as blacks are deeper on the 12 Pro, but I wouldn’t notice the difference unless the 11 Pro and 12 Pro were right next to one another. I can see both clearly enough to frame shots on sunny days while shooting, which is what I really care about.

While the phone doesn’t have any ability to tilt the screen to frame shots, you can use a tripod to angle your phone and then frame and shoot using an Apple Watch if you have one. It’s a neat function, and the Watch can also act as a paired screen when you’re taking video with the main lenses. I tend to shoot handheld, however, and so have only used the Apple Watch trick when shooting a video with the main cameras on the back of the 12 Pro.

I don’t ever really use the flash so I can’t comment on it, though I do occasionally use the flash as a light to illuminate subjects I’m shooting with another camera. It’s not amazing but it works in a pinch.

The battery is so-so based on my experience. The 12 Pro’s battery is a tad smaller than the one in my 11 Pro, which means less capacity, though in the five months I’ve owned the 12 Pro the battery health hasn’t degraded at all, which wasn’t the case with the 11 Pro. This said, if I’m out shooting exclusively with the 12 Pro I’m going to bring a battery pack with me, just like when I went out for a day of shooting with the 11 Pro. If it’s not a heavy day of shooting, however, I reliably end the day with 20% or more battery after the 12 Pro has been off the charger for about 14-17 hours with middling usage.

Probably the coolest feature of the new 12-series iPhones is their ability to use magnetic attachments; to my mind it’s the best hardware addition to the 12 line of phones. I’ve been using a Luma Cube Telepod Tripod stand paired with a Moment Tripod Mount with MagSafe, and it’s been pretty great for video conferences. It’s a shame that there isn’t a wider ecosystem supporting this hardware feature this many months after release.

Camera App

The default Apple camera app is fine, I guess. I like that you can now set the exposure and the app will remember it, which has helpfully meant that I can slightly under-expose my shots by default, as is my preference. However, the default app still lacks a spirit level, which is really, really, really stupid, and especially so in a so-called “Pro” camera that costs around $2,000 (CAD) after Apple Care, a case, and taxes. It’s particularly maddening given that the phone includes a gyroscope that is used for so many other things in the default camera app, like providing guidance when taking pano shots or top-down shots, and so forth.

It’s not coming back, but I’m still annoyed at how Apple changed burst mode in iOS. It used to be that you could hold the shutter button in the native camera app, or the volume rocker, to activate a burst; now you hold the shutter button and pull it to the left. It’s not a muscle memory I’ve developed, and it risks screwing up my compositions when I’m shooting on the street, so I don’t really use burst anymore, which is a shame.

As a note, I edit almost all my photos in the Darkroom extension for Photos. It crashes all the damn time and it is maddening. I’d hoped these crashes would go away when I upgraded from the 11 Pro to the 12 Pro, but they haven’t. It is very, very, very frustrating.

Image Quality

In a theoretical world, upgrading my camera would lead to huge differences in image quality, but in practice that’s rarely the case. It is especially not the case when shifting from the 11 Pro to the 12 Pro, save for in very particular situations. The biggest improvement that is noticeable in daily shooting comes in scenes with significant dynamic range, such as when you’re outside on a bright day; the sky and the rest of the scene are kept remarkably intact, without your highlights being blown out or your shadows crushed. Even when compared to a camera with an APS-C or Micro 4/3 sensor it’s impressive, and I can get certain bright-day shots with the iPhone 12 Pro that wouldn’t be possible to easily capture with my Fujifilm X100F or Olympus EM10ii.

The other upgrade is definitely that, thanks to the sensor and computational power, you can get amazing low-light shots using the ultra-wide lens with Night Mode. Shots are sometimes a bit noisy or blotchy, but I can still get photos that would otherwise be impossible to capture handheld with an APS-C sensor.

Relatedly, the ultra-wide’s correction for distortion is pretty great and it’s noticeably better than the ultra-wide lens correction on the 11 Pro. If you’re shooting wide angle a lot then this is likely one of the few software improvements you’ll actually benefit from with some regularity.

One of the most heralded features of the 12 Pros was the ability to shoot ProRaw. In bright conditions it’s not worth using; I rarely detect a noticeable improvement in quality nor does it significantly enhance how I can edit a photo in those cases. However, in darker situations or more challenging low-light indoor situations it can be pretty helpful in retaining details that can be later recovered. That said, it hasn’t transformed how I shoot per se; it’s a nice-to-have, but not something that you’re necessarily going to use all the time.

You might ask how well portrait mode works but, given that I don’t use it that often, I can’t comment much beyond that it’s a neat feature that is sufficiently inconsistent that I don’t use it for much of anything. There are some exceptions, such as when shooting portraits at family events, but on the whole I remain impressed with it from a technology vantage point while being disappointed in it from a photographer’s point of view. If I want a shallow depth of field and need to get a shot I’m going to get one of my bigger cameras and not risk the shot with the 12 Pro.

Video

I don’t really shoot video, per se, and so don’t have a lot of experience with the quality of the 12 Pro’s video output. Others have, however, discussed the capabilities of the cameras very positively, and I trust what they’ve said.

That said, I did a short video for a piece I wrote and it turned out pretty well. We shot using the ‘normal’ lens at 4K and my employer’s video editor subsequently graded the video. This was taken in low-light conditions and I used my Apple Watch as a screen so I could track what I was doing while speaking to camera.

I’ve also used my iPhone 12 Pro for pretty well all of the numerous video conferences, government presentations (starting at 19:45), classes I’ve taught, and media engagements I’ve had over the course of the pandemic. In those cases I’ve used the selfie camera, and in almost all situations people on the other side of the screen have commented on the high quality of my video. I take that as a recommendation of the selfie camera for video-related purposes.

Frustrations

I’ll be honest: what I most hoped would be better with the iPhone 12 Pro was that the default Photos app would play better with extensions. I use Darkroom as my primary editing application and after editing 5-10 photos the extension reliably crashes and I need to totally close out Photos before I can edit using the extension again.1 It is frustrating and it sucks.

What else hasn’t improved? The 12 Pro still has green lens flares when I take photos at night. It is amazingly frustrating that, despite all the computing power in the 12 Pro, Apple’s software engineers can’t correct an issue that its hardware engineers have so far been unable to resolve. Is this a problem? Yes, it is, especially if you ever shoot at night. None of my other, less expensive, cameras suffer from this, and it’s maddening that the 12 Pro still does. It’s made worse by the fact that the Photos app doesn’t include a healing tool to remove these gross little flares and, thus, requires me to use another app (typically Snapseed) to get rid of them.

Finally, I find that shots from the 12 Pro are often too sharpened for my preference, which means that I tend to turn down the clarity in Darkroom to soften a lot of the photos I take. It’s an easy fix, though (again) not one you can make in the default Photos application.

Conclusion

So what do I think of the iPhone 12 Pro? It’s the best camera, other than my Fuji X100F, that I typically have with me when I’m out and about, and the water resistance means I’m never worried about shooting with it in the elements.2

If I have a choice, do I shoot with the Fuji X100F or the iPhone 12 Pro? If a 35mm equivalent works, then I shoot with the Fuji. But if I want a wide angle shot it’s pretty common for me to pull the 12 Pro and use it, even while out with the Fuji. They’ve got very different colour profiles but I still like using them both. Sometimes I even go on photowalks with just the 12 Pro and come back with lots of keepers.

This is all to say that the X100F and 12 Pro are both pretty great tools. I’m a fan of them both.

So…is the 12 Pro a major upgrade from the 11 Pro? Not at all. A bigger upgrade from earlier iPhones? Yeah, probably more so. I like the 12 Pro and use it every day as a smartphone, and I like it as a camera. I also liked the 11 Pro as a portable camera and phone.

Should you buy the 12 Pro? Only if you really want the telephoto and the ability to shoot and edit ProRaw files. If that’s not you, then you’re probably better off saving a chunk of change and getting the regular 12 instead.

(Note: All photos taken with an iPhone 12 Pro and edited to taste in Apple Photos and Darkroom.)


  1. Yes, I can edit right in Darkroom, and I do, but it’s not as convenient. ↩︎

  2. I admit to not treating the X100F with a lot of respect but I don’t use it when it’s pouring rain. The same isn’t true of the iPhone 12 Pro. ↩︎