Policing the Location Industry


The Markup has a comprehensive and disturbing article on how third parties acquire location information despite efforts by Apple and Google to restrict its availability. In the past, it was common for third parties to provide SDKs to application developers: the SDKs inconspicuously transferred location information to those third parties while also providing useful functionality to developers. Now that the platforms have put restrictions in place, however, it is becoming common for application developers to request location information themselves and then share it directly with third-party data collectors.

While such activities often violate the terms of service and policy agreements between platforms and application developers, it can be challenging for the platforms to actually detect these violations and subsequently enforce their rules.

Broadly, the issues at play represent significant governmental regulatory failures. The fact that government agencies often benefit from the secretive collection of individuals’ location information makes it that much harder for governments to muster the will to discipline the secretive collection of personal data by third parties: if a government cuts off the flow of location information, it will impede the ability of governments themselves to obtain that information.

In some cases intelligence and security services obtain location information from third-parties. This sometimes occurs in situations where the services themselves are legally barred from directly collecting this information. Companies selling mobility information can let government agencies do an end-run around the law.

One result is that efforts to limit data collectors’ ability to capture personal information often see parts of government push for carve-outs for collecting, selling, and using location information. In Canada, as an example, the government has adopted the legal position that it can collect location information so long as it is de-identified or anonymized,¹ and for the security and intelligence services there are laws on the books that permit the collection of commercially available open source information. This open source information does not need to be anonymized prior to acquisition.² Lest you think it sounds paranoid that intelligence services might be interested in location information, consider that American agencies collected bulk location information pertaining to Muslims from third-party location data brokers, and that the Five Eyes historically targeted popular applications such as Google Maps and Angry Birds to obtain location information as well as other metadata and content. As the former head of the NSA announced several years ago, “We kill people based on metadata.”

Any argument made by private or public organizations that anonymization or de-identification makes location information acceptable to collect, use, or disclose generally relies on tricking customers and citizens. Why is this? Because even when location information is aggregated and ‘anonymized’ it can subsequently be re-identified. And even in situations where that reversal doesn’t occur, policy decisions can still be made on the basis of the aggregated information. The process of deriving these insights and applying them shows that while privacy is an important right to protect, it is not the only right implicated in the collection and use of location information. Indeed, it is important to assess the proportionality and necessity of the collection and use, as well as how the associated activities affect individuals’ and communities’ equity and autonomy in society. Doing anything less is merely privacy-washing.
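To see why re-identification is so feasible, consider a minimal sketch of a linkage attack. All of the records, names, and grid cells below are invented for illustration; the point is simply that a “pseudonymized” trace can be matched against auxiliary data an attacker plausibly holds:

```python
# A toy linkage attack: "anonymized" location traces are re-identified by
# joining them against auxiliary data. All records below are invented.

# Pseudonymized mobility data: each trace keeps only a random ID plus the
# grid cells where the device rests at night (home) and during the day (work).
anonymized_traces = {
    "a91f": {"night_cell": (48, 12), "day_cell": (48, 15)},
    "7c02": {"night_cell": (48, 12), "day_cell": (49, 11)},
    "d3b8": {"night_cell": (50, 20), "day_cell": (50, 21)},
}

# Auxiliary data an attacker can plausibly obtain (property records,
# employer addresses, social media check-ins).
known_people = {
    "Alice": {"home_cell": (48, 12), "work_cell": (48, 15)},
    "Bob":   {"home_cell": (50, 20), "work_cell": (50, 21)},
}

def reidentify(traces, people):
    """Match each pseudonymous trace to a named person whenever the
    (home, work) cell pair links the two datasets."""
    matches = {}
    for trace_id, trace in traces.items():
        for name, person in people.items():
            if (trace["night_cell"], trace["day_cell"]) == (
                person["home_cell"], person["work_cell"]
            ):
                matches[trace_id] = name
    return matches

print(reidentify(anonymized_traces, known_people))
# -> {'a91f': 'Alice', 'd3b8': 'Bob'}
```

The home/work location pair alone is frequently unique enough to single out an individual, which is why stripping names from location traces provides far weaker protection than the word “anonymized” suggests.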

Throughout discussions about data collection, including as it pertains to location information, public agencies and companies alike tend to offer a pair of arguments against changing the status quo. First, they assert that consent isn’t really possible anymore given the volumes of data collected from individuals on a daily basis; individuals would be overwhelmed with consent requests, so we can’t make the requests in the first place! Second, that we can’t regulate the collection of this data because doing so risks impeding innovation in the data economy.

If those arguments sound familiar, they should. They’re very similar to the plays made by industry groups whose activities have historically had negative environmental consequences. These groups regularly assert that, after decades of poor or middling environmental regulation, any new, stronger regulations would unduly impede the existing dirty economy for power, services, goods, and so forth. Moreover, the dirty way of creating power, services, and goods is just how things are and thus should remain the same.

In both the privacy and environmental worlds, corporate actors (and those to whom they sell data or goods) have benefitted from not paying the full cost of their activities: acquiring data without meaningful consent, or failing to account for the environmental cost of production. But just as we demand stronger environmental regulations to address the harms industry causes to the environment, we should demand and expect the same when it comes to the personal data economy.

If a business is predicated on sneaking personal information away from individuals, then it is clearly not particularly interested or invested in behaving ethically towards consumers. It’s imperative to keep pushing legislators not just to recognize that such practices are unethical, but to make them illegal as well. Doing so will require being heard over the cries of government agencies that have vested interests in obtaining location information in ways that skirt the laws that might normally discipline such collection, as well as companies that have grown on the back of unethical data collection practices. While this will not be an easy task, it is increasingly important given the limited ability of platforms to police the sneaky collection of this information and the increasingly problematic ways our personal data can be weaponized against us.


  1. “PHAC advised that since the information had been de-identified and aggregated, it believed the activity did not engage the Privacy Act as it was not collecting or using “personal information”. ↩︎
  2. See, as example, Section 23 of the CSE Act ↩︎

The Roundup November 19-24, 2017 Edition

It’s another week closer to the end of the year, and another where high profile men have been identified as having engaged in absolutely horrible and inappropriate behaviours towards women. And rather than the most powerful man in the world — himself having confessed to engaging in these kinds of behaviour — exhibiting an ounce of shame, he’s instead supporting an accused man and failing to account for his own past activities.


I keep going back and forth as to whether I want to buy a new Apple Watch; I have zero need for one with cellular functionality and, really, just want an upgrade to take advantage of some more advanced heart monitoring features. The initial reviews of the Apple Watch Series 3 were…not inspiring. But Dan Seifert’s review of the Apple Watch Series 3 (non-LTE) is more heartening: on the whole, it’s fast, and if you already have a very old Apple Watch and like it, it’s an obviously good purchase. I just keep struggling, though, to spend $600 on a device that I know would be useful but isn’t self-evidently necessary. Maybe I’ll just wait until Apple Canada starts selling some of the refurbished Series 3 models…


While photographers deal with Gear Acquisition Syndrome (GAS), which is usually fuelled by the prayer that better gear will mean better photos, I think that writers deal with the related Software Acquisition Syndrome (SAS). SAS entails buying new authoring programs, finding new places to write, or trying new apps that promise to make writing easier, faster, and more enjoyable. But the truth is that time spent learning new software, finding a voice in a new writing space, or trying out new apps tends to take away from time that would otherwise be spent writing. Still, if you’re feeling a SAS-driven urge to purchase either Ulysses or iA Writer, you should check out Marius Masalar’s comprehensive review of the two writing tools. (As a small disclosure, I paid for Ulysses and use it personally to update this website.)


New Apps and Great App Updates from this Week

Great Photography Shots

If tapeworms are your thing, then there are some terrific shots of them included as part of an interview with tapeworm experts. A few gems include:

Music I’m Digging

Neat Podcast Episodes

Good Reads for the Week

Link

iMessage apps offer more layers of encryption, but do you need one?

Macworld:

Adding encryption you control inside an iMessage transmission can provide more assurances that your messages remain unreadable to others, but there are a whole lot of provisos you need to consider before accepting this as a higher level of security.

It’s nice to see reviewers of applications present the concerns first, before what might be nice about new ‘security’ apps: namely, that crypto is hard to do well, that not all crypto is the same, and that there are basic questions concerning the reliability of the companies providing the security assurances.

More broadly, the fact that applications can route double-encrypted messages through Apple Messages will not necessarily enhance security; instead, it means that communications are only as secure as the application applying the second layer of encryption. Apple is a great big target that everyone wants to penetrate, and so Apple hires terrific technical and legal staff to keep governments and others at bay. Can we expect app developers selling encryption apps for a dollar or two to possess an equivalent commitment and competency?

Aside

Bit9 has released a report that outlines a host of fairly serious concerns around Android devices and app permissions. To be upfront: Android isn’t special in this regard; if you have a Blackberry, iPhone, or Windows Phone device you’ll also find a pile of apps with very, very strange permission requests (e.g. why can a wallpaper application access your GPS and contact book?). The video (above) is a quick overview of some findings; the executive summary can be found here and the full report here (.pdf).

Link

App Developers Face Fines for Lacking Privacy Policies

To be clear and up front: privacy policies suck. I’m currently analyzing the policies of major social networks and if the policies were merely horrific then they’d be massively better than they actually are today.

That said, a privacy policy at least indicates that an organization took the time to copy someone else’s policy. For the briefest of moments there was some (however marginal) contemplation of how the organization’s actions relate to privacy. While most companies will just hire a lawyer to slap legalese on their websites, a few will actually think about their data collection and its implications for individuals’ privacy. That’s really all you can generally hope privacy policies accomplish, unless the company out-and-out lies in its policy. If it does lie, then you can get the FTC involved.

The potential for ‘enjoying’ a $2,500 fine per download if a company lacks a privacy policy is a massive stick and, hopefully, will get developers to at least consider how their collection of data implicates users’ privacy. The California approach is not the solution to the problem of people’s data being collected without their genuine consent but at least it’s a start.