Policing the Location Industry

Photo by Ingo Joseph on Pexels.com

The Markup has a comprehensive and disturbing article on how location information is acquired by third parties despite efforts by Apple and Google to restrict the availability of this information. In the past, it was common for third parties to provide SDKs to application developers: the SDKs would inconspicuously transfer location information to those third parties while also enabling functionality for the developers. With restrictions being put in place by platforms such as Apple and Google, however, it’s now becoming common for application developers to initiate requests for location information themselves and then share it directly with third-party data collectors.
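
To make the mechanics concrete, here is a minimal, entirely hypothetical sketch of the forwarding step described above. The endpoint, identifiers, and function names are invented for illustration; this is not the API of any real SDK or broker, just the general shape of the data flow, in which the app (or a bundled SDK) packages a device identifier with a GPS fix and ships it off alongside whatever legitimate use the app makes of the coordinates.

```python
# Hypothetical illustration only: names, endpoint, and payload fields are invented.
import json
import urllib.request

BROKER_ENDPOINT = "https://collector.example-broker.invalid/v1/locations"  # placeholder


def build_forwarding_request(device_id: str, lat: float, lon: float, ts: int) -> urllib.request.Request:
    """Build the kind of request a location-harvesting SDK (or the app's own code)
    might send to a data broker while also using the coordinates for app features."""
    payload = {
        "device_id": device_id,           # often an advertising identifier
        "lat": lat,
        "lon": lon,
        "ts": ts,
        "app": "com.example.flashlight",  # hypothetical host app
    }
    return urllib.request.Request(
        BROKER_ENDPOINT,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


if __name__ == "__main__":
    req = build_forwarding_request("ad-id-1234", 43.6532, -79.3832, 1650000000)
    # urllib.request.urlopen(req) would actually transmit it; here we only inspect it.
    print(req.full_url)
    print(req.data.decode())
```

From the platform’s perspective there is nothing distinctive about such a request, which is part of why detecting this behaviour at scale is so difficult.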

While such activities often violate the terms of service and policy agreements between platforms and application developers, it can be challenging for the platforms to actually detect these violations and subsequently enforce their rules.

Broadly, the issues at play represent significant governmental regulatory failures. The fact that government agencies often benefit from the secretive collection of individuals’ location information makes it that much harder for governments to muster the will to discipline the covert collection of personal data by third parties: if the government cuts off the flow of location information, it will impede the ability of governments themselves to obtain that information.

In some cases, intelligence and security services obtain location information from third parties. This sometimes occurs in situations where the services themselves are legally barred from directly collecting the information. Companies selling mobility information can thus let government agencies do an end-run around the law.

One of the results is that efforts to limit data collectors’ ability to capture personal information often see parts of government push for carve-outs that preserve the collection, sale, and use of location information. In Canada, as an example, the government has adopted a legal position that it can collect location information so long as it is de-identified or anonymized,1 and for the security and intelligence services there are laws on the books that permit the collection of commercially available open source information. This open source information does not need to be anonymized prior to acquisition.2 Lest you think it sounds paranoid that intelligence services might be interested in location information, consider that American agencies collected bulk location information pertaining to Muslims from third-party location data brokers, and that the Five Eyes historically targeted popular applications such as Google Maps and Angry Birds to obtain location information as well as other metadata and content. As the former head of the NSA announced several years ago, “We kill people based on metadata.”

Any argument made by private or public organizations that anonymization or de-identification of location information makes it acceptable to collect, use, or disclose generally relies on tricking customers and citizens. Why is this? Because even when location information is aggregated and ‘anonymized’ it can subsequently be re-identified. And in situations where that reversal doesn’t occur, policy decisions can still be made on the basis of the aggregated information. The process of deriving these insights and applying them showcases that while privacy is an important right to protect, it is not the only right implicated in the collection and use of location information. Indeed, it is important to assess the proportionality and necessity of the collection and use, as well as how the associated activities affect individuals’ and communities’ equity and autonomy in society. Doing anything less is merely privacy-washing.
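
The re-identification point is not hypothetical: research on mobility data has repeatedly found that a handful of spatio-temporal points is enough to single out most individuals in a dataset. The toy sketch below (the data and numbers are invented purely for illustration) shows the basic intuition: even two coarse location cells per person, roughly a ‘home’ and a ‘work’ location, are often unique within a population, so anyone who learns those two facts about a target can pick their ‘anonymous’ trace out of the pile.

```python
# Toy illustration with invented data: why a couple of coarse location points
# are often enough to re-identify an "anonymized" mobility trace.
from collections import Counter

# Each pseudonymous trace is reduced to the pair of coarse cells visited most
# often (roughly "home" and "work").
traces = {
    "user-001": frozenset({"cell-A12", "cell-B07"}),
    "user-002": frozenset({"cell-A12", "cell-C33"}),
    "user-003": frozenset({"cell-D91", "cell-B07"}),
    "user-004": frozenset({"cell-D91", "cell-C33"}),
}

# Count how many traces share each pair of cells. A count of 1 means that
# knowing a person's home and work neighbourhoods uniquely identifies their
# supposedly anonymous trace.
pair_counts = Counter(traces.values())
unique = [uid for uid, pair in traces.items() if pair_counts[pair] == 1]

print(f"{len(unique)} of {len(traces)} pseudonymous traces are unique "
      f"given just two coarse location cells: {unique}")
```

Real mobility datasets are far sparser and higher-resolution than this toy example, which is why published studies find that the large majority of traces can be singled out with only a few points.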

Throughout discussions about data collection, including as it pertains to location information, public agencies and companies alike tend to offer a pair of arguments against changing the status quo. First, they assert that consent isn’t really possible anymore given the volumes of data which are collected on a daily basis from individuals; individuals would be overwhelmed with consent requests! Thus we can’t make the requests in the first place! Second, they argue that we can’t regulate the collection of this data because doing so risks impeding innovation in the data economy.

If those arguments sound familiar, they should. They’re very similar to the plays made by industry groups whose activities have historically had negative environmental consequences. These groups regularly assert that, after decades of poor or middling environmental regulation, any new, stronger regulations would unduly impede the existing dirty economy for power, services, goods, and so forth. Moreover, they suggest, the dirty way of creating power, services, and goods is just how things are and thus should remain the same.

In both the privacy and environmental worlds, corporate actors (and those to whom they sell data or goods) have benefitted from not having to pay the full cost of their activities, whether that means acquiring data without meaningful consent or declining to account for the environmental cost of what they do. But, just as we demand enhanced environmental regulations to address the harms industry causes to the environment, we should demand and expect the same when it comes to the personal data economy.

If a business is predicated on sneaking personal information away from individuals, then it is clearly not particularly interested or invested in being ethical towards consumers. It’s imperative to keep pushing legislators not just to recognize that such practices are unethical, but to make them illegal as well. Doing so will require being heard over the cries of government agencies that have vested interests in obtaining location information in ways that skirt the laws which would normally discipline such collection, as well as companies that have grown as a result of their unethical data collection practices. While this will not be an easy task, it’s increasingly important given the limited ability of platforms to police the sneaky collection of this information and the increasingly problematic ways our personal data can be weaponized against us.


  1. “PHAC advised that since the information had been de-identified and aggregated, it believed the activity did not engage the Privacy Act as it was not collecting or using ‘personal information’.” ↩︎
  2. See, as an example, Section 23 of the CSE Act. ↩︎
Link

Ontario’s Path Towards Legitimizing Employee Surveillance

Earlier this week, the Ontario government announced that it would be introducing a series of labour reforms. As part of these reforms, employers will be required to inform their employees of how they are being electronically monitored. These requirements will apply to all employers with 25 or more employees.

Employers already undertake workplace surveillance, though it has become more common and extensive as a result of the pandemic. Where surveillance is undertaken, however, businesses must seek out specialized counsel or services to craft appropriate labour policies or contracting language. This imposes costs and also means that different firms may provide slightly different advice. The effect is that employers may be more cautious in the surveillance they adopt and must expend funds to obtain semi-boutique legal opinions.

While introducing legislation would seem to extend privacy protections to employees, as understood at the moment the reforms will only require that employees be notified of the relevant surveillance. They will not bar the surveillance itself. Further, with a law on the books it will likely be easier for Ontario consulting firms to provide pretty rote advice based on the legislative language. The result, I expect, will be to drive down the transaction costs of developing workplace surveillance policies at the same time that workplace surveillance technologies become more affordable and more extensively deployed.

While I suspect that many will herald this law reform as positive for employees, on the basis that at least now they will know how they are being monitored, I am far less optimistic. The specificity of the notice will matter, a lot, and unless great care is taken in drafting the legislation, employers will obtain a significant degree of latitude in the kinds of intrusive surveillance they can actually use. Moreover, unless required by the legislative language, we can expect employers to conceal the specific modes of surveillance on the grounds that the methods must be protected for operational business reasons. This latter element is of particular concern given that major companies, including office productivity companies like Microsoft, are baking extensive workplace surveillance functionality into their core offerings. Ontario’s reforms are not, in fact, good for employees but are almost certain to be a major boon for their employers.

Link

When the Government Decides to Waylay Parliament

Steven Chaplin has a really great explanation of whether the Canadian government can rely on national security and evidentiary laws to lawfully justify refusing to provide documents to the House of Commons and to House committees. His analysis arose after the Canadian government did everything it could to avoid disclosure: first refusing to provide documents to the parliamentary committee studying Canadian-Chinese relations and, subsequently, refusing to provide them when compelled to do so by the House of Commons itself.

Rather than releasing the requested documents, the government turned to the courts to adjudicate whether the documents in question, which were asserted to contain sensitive national security information, must in fact be released to the House, or whether they could instead be sent to an executive committee of Members of Parliament and Senators to assess their contents. As Chaplin notes,

Having the courts intervene, as proposed by the government’s application in the Federal Court, is not an option. The application is clearly precluded by Article 9 of the Bill of Rights, 1689, which provides that a proceeding in Parliament ought not to be impeached or questioned in court. Article 9 not only allows for free speech; it is also a constitutional limit on the jurisdiction of the courts to preclude judicial interference in the business of the House.

The House ordered that the documents be tabled without redaction. Any decision of the court that found to the contrary would impeach or question the proceeding that led to the Order. And any attempt by the courts to balance the interests involved would constitute the courts becoming involved in ascertaining, and thereby questioning, the needs of the House and why the House wants the documents.

Beyond the courts intruding into the territory of Parliament, there could be serious and long-term implications to letting the court become a space wherein the government and the House fight over information that has been demanded. Specifically,

It may be that at the end of the day the government will continue to refuse to produce documents. In the same way that the government cannot use the courts to withhold documents, the House cannot go to court to compel the government to produce them, or to order witnesses to attend proceedings. It could also invite disobedience of witnesses, requiring the House to either drop inquiries or involve the courts to compel attendance or evidence. Allowing, or requiring, the government and the House to resolve their differences in the courts would not only be contrary to the constitutional principles of Article 9, but “would inevitably create delays, disruption, uncertainties and costs which would hold up the nation’s business and on that account would be unacceptable even if, in the end, the Speaker’s rulings were vindicated as entirely proper” (Canada (House of Commons) v. Vaid [2005]). In short, the courts have no business intervening one way or the other.

Throughout the discussions that have taken place about this issue in Canada, what has been most striking is that national security commentators and elites have envisioned that the National Security and Intelligence Committee of Parliamentarians (NSICOP) could (and should) be tasked with resolving any particularly sensitive national security issues that might be of interest to Parliament. None, however, seems to have contemplated that Parliament itself might take issue with the government trying to exclude it from assessing the government’s national security decisions, or that objections would arise when topics of interest to Parliamentarians were punted into an executive body whose members, their fellow Members of Parliament, were sworn to the strictest secrecy. Instead, elites have hand-waved at the importance of preserving secrecy so that Canada can continue to receive intelligence from allies, and asserted that the government would never mislead Parliament on national security matters (matters which, these same experts explain, Members of Parliament are not prepared to receive, process, or understand, given the sophistication of the intelligence and the apparent simplicity of most Parliamentarians themselves).

This was the topic of a recent episode of the Intrepid Podcast, where Philippe Lagassé noted that the exclusion of parliamentary experts when creating NSICOP meant that these entirely predictable showdowns were functionally baked into how the executive body was composed. As someone who raised the issue of adopting an executive, rather than a standing House, committee and was rebuffed as being ignorant of the realities of national security, I watch with more than a little satisfaction as the very concerns raised when NSICOP was being created now arise on the political agenda.

With regard to the documents the House committee was seeking, I don’t know or particularly care what they contain. From my own experience I’m all too aware that ‘national security’ is often stamped on things because they could be politically damaging to the government, because of a general culture of non-transparency and refusal of accountability, or (less often) because there are bona fide national security interests at stake. I do, however, care that the Government of Canada has (again) acted counter to Parliament’s wishes and has deliberately worked to impede the House from doing its work.

Successive governments seem to genuinely believe that they get to ‘rule’ Canada absolutely and with little accountability. While this is, in practice, largely true given how cowed Members of Parliament are by their party leaders, it’s incredibly serious and depressing to see the government further erode Parliament’s powers and its ability to fulfil its duties. A healthy democracy is filled with bumps for the government as it is held to account but, sadly, the Government of Canada, regardless of the party in power, is incredibly active in keeping itself and its behaviours from the public eye and thus from being held to account.

If only a committee might be struck to solve this problem…

Quote

We have come a long way in routing the taboos that stand in the way of justice for victims of sexual assault. But there is still a distance to go. The problems are complex and rooted in centuries of culture and myth. The law, imperfect as it may be, is a powerful tool in achieving lasting change. But real justice will come only when we change attitudes—when respect for the autonomy of every person replaces old myths grounded in ownership, control, and power.

– Beverley McLachlin, Truth Be Told: My Journey Through Life and the Law

Two Thoughts on China’s Draft Privacy Law

Alexa Lee, Samm Sacks, Rogier Creemers, Mingli Shi, and Graham Webster have collectively written a helpful summary of the new Chinese Data Privacy Law over at Stanford’s DigiChina.

Two features jump out at me most.

First, the proposed legislation will compel Chinese companies “to police the personal data practices across their platforms” as part of Article 57. As noted by the team at Stanford,

“the three responsibilities identified for big platform companies here resonate with the “gatekeeper” concept for online intermediaries in Europe, and a requirement for public social responsibility reports echoes the DMA/DSA mandate to provide access to platform data by academic researchers and others. The new groups could also be compared with Facebook’s nominally independent Oversight Board, which the company established to review content moderation decisions.”

I’ll be particularly curious to see the kinds of transparency reporting that emerge out of these companies. I doubt the reports will parallel those in the West, which tend to focus on the processes and number of disclosures from private companies to government; instead, I expect Chinese companies’ reports will focus on how companies are being ‘socially responsible’ in how they collect, process, and disclose data to other Chinese businesses. Still, if we do see this more consumer-focused approach, it will represent yet another transparency reporting tradition that will be useful to assess in academic and public policy writing.

Second, the Stanford team notes that,

“new drafts of both the PIPL and the DSL added language toughening requirements for Chinese government approval before data holders in China cooperate with foreign judicial or law enforcement requests for data, making failure to gain permission a clear violation punishable by financial penalties up to 1 million RMB.”

While not surprising, this kind of restriction will continue to raise data sovereignty borders around personal information held in China. The effect? Western states will still need to push for Mutual Legal Assistance Treaty (MLAT) reform to successfully extract information from Chinese companies (and, in all likelihood, fail to conclude these reforms).1

It’s perhaps noteworthy that while China is moving to build up walls, there is a simultaneous attempt by the Council of Europe to address issues of law enforcement access to information held by cloud providers (amongst other things). The United States passed the CLOUD Act in 2018 to begin alleviating the problem of states gaining access to information held by cloud providers operating in foreign jurisdictions (though it did not address the human rights concerns that were mitigated through traditional MLAT processes). Based on the proposed Chinese law, it’s unlikely that the CLOUD Act will gain substantial traction with the Chinese government, though admittedly this wasn’t the aim of the CLOUD Act or an expected outcome of its passage.

Nevertheless, as competing legal frameworks are established that place the West on one side, and China and Russia on the other, the effect will be to further entrench distinct legal cultures of the Internet across different economic, political, and security regimes. At the same time, criminal actors that routinely behave with technical and legal savvy will be able to store data anywhere in the world, including out of reach of the relevant law enforcement agencies.

Ultimately, the raising of regional and national digital borders is a topic to watch, both to keep an eye on what the forthcoming legal regimes will look like and to assess the extent to which we see the language of ‘strong sovereignty’ or nationalism creep functionally into legislation around the world.


  1. For more on MLAT reform, see these pieces from Lawfare. ↩︎
Link

Privacy and Contemporary Motor Vehicles

Writing for NBC News, Olivia Solon provides a useful overview of just how much data is collected by motor vehicles, both through sensors embedded in the vehicles and through infotainment systems when a smartphone is linked, as well as how law enforcement agencies are using that information.

Law enforcement agencies have been focusing their investigative efforts on two main information sources: the telematics system — which is like the “black box” — and the infotainment system. The telematics system stores a vehicle’s turn-by-turn navigation, speed, acceleration and deceleration information, as well as more granular clues, such as when and where the lights were switched on, the doors were opened, seat belts were put on and airbags were deployed.

The infotainment system records recent destinations, call logs, contact lists, text messages, emails, pictures, videos, web histories, voice commands and social media feeds. It can also keep track of the phones that have been connected to the vehicle via USB cable or Bluetooth, as well as all the apps installed on the device.

Together, the data allows investigators to reconstruct a vehicle’s journey and paint a picture of driver and passenger behavior. In a criminal case, the sequence of doors opening and seat belts being inserted could help show that a suspect had an accomplice.

Of note, rental cars and second-hand vehicles also retain all of this information, and it can then be accessed by third parties. It’s pretty easy to envision a situation where rental companies become obligated to assess retained data to determine whether a certain class or classes of offences have been committed, and then overshare information collected by rental vehicles to avoid the liability that could follow from failing to fully meet whatever obligations are placed upon them.

Of course, outright nefarious actors can also take advantage of the digital connectivity built into contemporary vehicles.

Just as the trove of data can be helpful for solving crimes, it can also be used to commit them, Amico said. He pointed to a case in Australia, where a man stalked his ex-girlfriend using an app that connected to her high-tech Land Rover and sent him live information about her movements. The app also allowed him to remotely start and stop her vehicle and open and close the windows.

As in so many other areas, connectivity is being built into vehicles without real or sufficient assessment of how to secure the new technologies or to mitigate harmful or undesirable secondary uses of data. Engineers rarely worry about these outcomes, corporate lawyers aren’t attentive to these classes of issues, and the security of contemporary vehicles is generally garbage. Combined, this means that government bodies are almost certainly going to expand the range of data they can access without first having to go through a public debate about the appropriateness of doing so, or the creation of specialized warrants that would limit data mining. Moreover, in countries with weak policing accountability structures, it will be impossible even to assess how regularly government officials obtain access to information from cars, how such data lets them overcome other issues they state they are encountering (e.g., encryption), or the utility of this data in investigating crimes and introducing it as evidence in court cases.

Quote

If those responsible for security believe that the law does not give them enough power to protect security effectively, they must try to persuade the law-makers, Parliament and the provincial legislatures, to change the law. They must not take the law into their own hands. This is a requirement of a liberal society.

– Canada, Commission of Inquiry Concerning Certain Activities of the Royal Canadian Mounted Police, Second Report: Freedom and Security Under the Law, vol 1, Part II (Ottawa: Privy Council Office, 1981) at 45.

Aside

2018.2.15

As I return from an event I was invited to, I have to reflect on, and admit, how profoundly…weird…it is that the stuff I write about and the activities in which I’m engaged increasingly influence the course of justice in my country. How weird it is that the leader of my country is briefed on the work that my colleagues and I produce. How epically strange it feels that things which seem to have no impact on public debate whatsoever reverberate behind closed doors. It’s just really, really weird to know that people who are intrinsically involved with law, security, and justice — to say nothing of policy and politics — closely watch what I do, with the intent of using it when making decisions that may affect the lives of people across Canada, and around the world.

When I was doing my PhD I laughed out loud at colleagues who spoke of how the work of political scientists can lead to exceptional impacts in the world. As a philosopher I thought such conversations were born of a group of people who took themselves too seriously in their (ongoing) moments of hubris. But I get it now: that which we say, when we’re deliberately involved in public debate with an eye to informing (if not influencing) policy, can have unexpected and exciting and unintended impacts on the lives of millions of people. And in living this reality I have remarkably more sympathy for those whose work isn’t just read and taken up, but misread and subsequently misappropriated to justify governmental activities that the political scientists in question might not have anticipated or endorsed.

Link

The Insanity of ‘Terrorism’ Offences

The Fool by Christopher Parsons, All Rights Reserved

Via The Intercept:

At the end of a quick one-day trial, Judge Emma Arbuthnot at Westminster Magistrates Court ruled that Rabbani had willfully obstructed police when he declined to hand over his passwords. Rabbani avoided a possible three-month jail term and was instead handed a 12-month conditional discharge and told he must pay court costs of £620 ($835). This means a Terrorism Act offense will be recorded on his criminal record. But as long as he does not re-offend within the 12-month period, no further action will be taken against him.

Rabbani had argued his electronic devices should have been protected under the latter category, as they contained confidential information related to his work. The judge said that Rabbani did not make this clear to the officers who initially interrogated him, but did say so later in a prepared statement following his arrest. She described Rabbani as “of good character,” acknowledged he was “trying to protect confidential material on his devices,” and noted that “the importance of passwords and PIN numbers in the 21st century cannot be overstated.” However, she still concluded that his “decision not to provide the information when requested by the examining officers” amounted to “a wilful obstruction of the lawful examination in the circumstances.”

A lawyer was charged and found guilty of a terrorism offence for refusing to decrypt a device containing sensitive client information. A baseline part of the criminal justice system is that what is said between a client and their lawyer is protected speech, but this protection is under threat in the UK: solicitors who do their duty and uphold their oaths to their clients risk serious convictions that may permanently reshape their lives and liberties. This dismantling of baseline aspects of our legal systems to fight ‘terrorism’ is ludicrous and does more harm to our societies than can be inflicted upon us by violent extremists and criminals.

Link

How Canada’s Anti-Cyberbullying Law Is Being Used to Spy on Journalists

From Motherboard:

According to Citizen Lab researcher Christopher Parsons, these same powers that target journalists can be used against non-journalists under C-13. And the only reason we know about the aforementioned cases is that the press has a platform to speak out.

“This is an area where transparency and accountability are essential,” Parsons said in an interview. “We’ve given piles and piles of new powers to law enforcement and security agencies alike. What’s happened to this journalist shows we desperately need to know how the government uses its powers to ensure they’re not abused in any way.”

“I expect that the use of these particular powers will become more common as the police get more used to using it and more savvy in using them,” Parsons said.

These were powers that were ultimately sold to the public (and passed into law) as necessary to combat ‘child pornography’. And now they’re being used to snoop on journalists to figure out who their sources are, without any mandate to report on how regularly the powers are used or the efficacy of such uses. For some reason, this process doesn’t inspire a lot of confidence in me.