If those responsible for security believe that the law does not give them enough power to protect security effectively, they must try to persuade the law-makers, Parliament and the provincial legislatures, to change the law. They must not take the law into their own hands. This is a requirement of a liberal society.
Canada, Commission of Inquiry Concerning Certain Activities of the Royal Canadian Mounted Police, Second Report: Freedom and Security Under the Law, vol 1, Part II (Ottawa: Privy Council Office, 1981) at 45.
As I return from an event I was invited to, I have to reflect on, and admit, how profoundly…weird…it is that the stuff I write about and the activities in which I’m engaged increasingly influence the course of justice in my country. How weird it is that the leader of my country is briefed on the work that I and my colleagues write about. How epically strange it feels that things which seem to have no impact on public debate whatsoever reverberate behind closed doors. It’s just really, really weird to know that people who are intrinsically involved with law, security, and justice — to say nothing of policy and politics — closely watch what I do, with the intent of using it when making decisions that may affect the lives of people across Canada, and around the world.
When I was doing my PhD I laughed out loud at my colleagues who spoke of how the work of political scientists can lead to exceptional impacts in the world. As a philosopher I thought such conversations were born of a group of people who took themselves too seriously in their (ongoing) moments of hubris. But I get it now: that which we say, when we’re deliberately involved in public debate with an eye to informing (if not influencing) policy, can have unexpected, exciting, and unintended impacts on the lives of millions of people. And in living this reality I have remarkably more sympathy for those whose work isn’t just read and taken up, but misread and subsequently misappropriated to justify governmental activities that the political scientists in question might not have anticipated or endorsed.
Via The Intercept:
At the end of a quick one-day trial, Judge Emma Arbuthnot at Westminster Magistrates Court ruled that Rabbani had willfully obstructed police when he declined to hand over his passwords. Rabbani avoided a possible three-month jail term and was instead handed a 12-month conditional discharge and told he must pay court costs of £620 ($835). This means a Terrorism Act offense will be recorded on his criminal record. But as long as he does not re-offend within the 12-month period, no further action will be taken against him.
Rabbani had argued his electronic devices should have been protected under the latter category, as they contained confidential information related to his work. The judge said that Rabbani did not make this clear to the officers who initially interrogated him, but did say so later in a prepared statement following his arrest. She described Rabbani as “of good character,” acknowledged he was “trying to protect confidential material on his devices,” and noted that “the importance of passwords and PIN numbers in the 21st century cannot be overstated.” However, she still concluded that his “decision not to provide the information when requested by the examining officers” amounted to “a wilful obstruction of the lawful examination in the circumstances.”
A lawyer was charged and found guilty of a terrorism offence for refusing to decrypt a device containing sensitive client information. A baseline part of the criminal justice system is that what is said between a client and their lawyer is protected speech, but this protection is under threat in the UK: solicitors who do their duty and uphold their oaths to their clients risk serious convictions that may permanently refigure their lives and liberties. This dismantling of baseline aspects of our legal systems to fight ‘terrorism’ is ludicrous and does more harm to our societies than can be inflicted upon us by violent extremists and criminals.
According to Citizen Lab researcher Christopher Parsons, these same powers that target journalists can be used against non-journalists under C-13. And the only reason we know about the aforementioned cases is that the press has a platform to speak out.
“This is an area where transparency and accountability are essential,” Parsons said in an interview. “We’ve given piles and piles of new powers to law enforcement and security agencies alike. What’s happened to this journalist shows we desperately need to know how the government uses its powers to ensure they’re not abused in any way.”
“I expect that the use of these particular powers will become more common as the police get more used to using it and more savvy in using them,” Parsons said.
These were powers that were ultimately sold to the public (and passed into law) as needed to combat ‘child pornography’. And now they’re being used to snoop on journalists to figure out who their sources are, without any mandate to report on the regularity with which the powers are used or the efficacy of such uses. For some reason, this process doesn’t inspire a lot of confidence in me.
Last month, Public Safety Canada followed through on commitments to review and consult on Canada’s national security framework. The process reviews powers that were passed into law following the passage of Bill C-51, Canada’s recent controversial anti-terrorism overhaul, and invites a broader debate about Canada’s security apparatus. While many consultation processes have explored expansions of Canada’s national security framework, the current consultation constitutes the first modern-day attempt to explore Canada’s national security excesses and deficiencies. Unfortunately, the framing of the consultation demonstrates minimal direct regard for privacy and civil liberties because it is primarily preoccupied with defending the existing security framework while introducing a range of additional intrusive powers. Such powers include some that have been soundly rejected by the Canadian public as drawing the wrong balance between digital privacy and law enforcement objectives, and that have been heavily criticized by legal experts as well as by all of Canada’s federal and provincial privacy commissioners.
The government has framed the discussion in two constituent documents, a National Security Green Paper and an accompanying Background Document. The government’s framings of the issues are highly deficient. Specifically, the consultation documents make little attempt to explain the privacy and civil liberties implications that can result from the contemplated powers. And while the government is open to suggestions on privacy- and civil liberties-enhancing measures, few such proposals are explored in the documents themselves. Moreover, key commitments, such as the need to impose judicial control over Canada’s foreign intelligence agency (CSE) and to regulate the agency’s expansive metadata surveillance activities, are neither presented nor discussed (although the government has mentioned independently that it still hopes to introduce such reforms). The consultation documents also fail to provide detailed suggestions for improving government accountability and transparency surrounding state agencies’ use of existing surveillance and investigative tools.
In light of these deficiencies, we will be discussing a number of the consultation document’s problematic elements in a series of posts, beginning with the government’s reincarnation of a highly controversial telecommunication subscriber identification power.
I wrote the first of what will be many analyses of the Canadian government’s national security consultation with a good friend and colleague, Tamir Israel.
The subscriber identification powers we write about are not really intended for national security but will, instead, be adopted more broadly by law enforcement so they can access the data indiscriminately. Past legislative efforts have rejected equivalent powers: it remains to be seen if the proposal will (once more) be successfully rejected, or whether this parliament will actually establish some process or law that lets government agencies get access to subscriber identification information absent a warrant.
The new, six-month extension of emergency powers creates France’s longest state of emergency since the Algerian War in the 1950s. The new law restores or extends previous emergency provisions, such as empowering police to carry out raids and local authorities to place suspects under house arrest without prior judicial approval. It also expands those powers, for example allowing the police to search luggage and vehicles without judicial warrants. In addition it reinstates warrantless seizures of computer and cellphone data that France’s highest legal authority had struck down as unconstitutional, adding a few restrictions that still fall short of judicial oversight.
In separate reports in February, Human Rights Watch and Amnesty International documented more than three dozen cases in which the use of these emergency powers violated universal rights to liberty, privacy, or freedoms of movement, association and expression. The two groups also found that the emergency acts lost suspects jobs, traumatized children, and damaged homes. The vast majority of those targeted were Muslims. Those interviewed said the actions left them feeling stigmatized and eroded their trust in the French authorities. The latest version of the emergency law risks compounding these effects.
The decisions to advance unconstitutional and discriminatory ‘security’ laws and policies following serious crimes threaten to undermine democracies while potentially strengthening states. But worryingly, there are fewer and fewer loud voices defending the rough-and-tumble consequences of maintaining a democratic form of governance, as opposed to those who assert that a powerful state apparatus is needed if normalcy is to exist. The result may be a sleepwalk from governments for and by the people to governments that protect citizen-serfs and harshly discriminate against difference.
Even so, the effort is raising concern from safety experts who say the technology has major limitations that can be very dangerous. Self-driving cars have trouble seeing in bad weather. Sudden downpours, snow and especially puddles make it difficult for autonomous vehicles to detect lines on pavement and thereby stay in one lane.
Walker Smith added that self-driving cars have sometimes confused bridges for other obstacles. “People need to understand both the potential and the limitations of these systems, and inviting them inside is part of that education,” he said.
The vehicles also have difficulty understanding human gestures — for example, a crosswalk guard in front of a local elementary school may not be understood, said Mary Cummings, director of Duke University’s Humans and Autonomy Lab, at a Senate hearing in March. She recommended that the vehicles not be allowed to operate near schools.
Then there’s the human factor: Researchers have shown that people like to test and prank robots. Today, a GPS jammer, which some people keep in their trunks to block police from tracking them, will easily throw off a self-driving car’s ability to sense where it is, Cummings said.
Current self-driving cars often cannot see which lane they’re in when it’s raining. They don’t understand the difference between a bridge and other road terrain. They don’t understand what a crosswalk guard is. And they are reliant on a notoriously brittle location technology.
What can go wrong with testing them in urban centres then, exactly?
In the past year, the Australian Federal Police has been asked to investigate a piece in The Australian about the Government’s leaked Draft Defence White Paper, and a Fairfax Media story on a proposal to reform citizenship laws.
Just last week, police raided Parliament House in an attempt to track down the source of an embarrassing leak about the National Broadband Network. It’s feared that these investigations, along with increased penalties for whistleblowers, are hindering the ability of journalists to hold policymakers to account.
It was with this in mind that the Opposition eventually voted for the amendments that created the Journalist Information Warrant scheme, and allowed the Data Retention laws to pass last year. In a last minute effort to shore up support for the legislation, the Government agreed to add provisions for ‘safeguards’ that would, in theory, prevent the scheme being used to target journalists’ sources. However, a closer look at the scheme reveals its flaws.
When a democracy creates warranting schemes solely to determine who is willing to speak with journalists, the democracy is demonstrably in danger of slipping free of the grasp of the citizenry.
Litt’s article focuses on finding new ways of conceptualizing privacy such that the current activities of intelligence agencies and law enforcement organizations are made legal, thus shifting the means by which their activities are legally and constitutionally evaluated. While his proposal to overturn much of the third-party doctrine coheres with the positions of many contemporary scholars, his suggested replacement — that we should no longer focus on the collection of data, but on the use of collected data — would eviscerate basic privacy protections. In particular, I think it’s important that we not simply ignore the ‘search’ aspect of Fourth Amendment law: we need to recalibrate what a search is within the context of today’s reality. And that doesn’t mean just letting the government collect with fewer baseline restrictions, but instead modifying what a ‘search’ is itself.
The core aspects of the article that give a flavour of the entire argument are:
I suggest that—at least in the context of government acquisition of digital data—we should think about eliminating the separate inquiry into whether there was a “reasonable expectation of privacy” as a gatekeeper for Fourth Amendment analysis. In an era in which huge amounts of data are flowing across the Internet; in which people expose previously unimagined quantities and kinds of information through social media; in which private companies monetize information derived from search requests and GPS location; and in which our cars, dishwashers, and even light bulbs are connected to the Internet, trying to parse out the information in which we do and do not have a reasonable expectation of privacy strikes me as a difficult and sterile task of line-drawing. Rather, we should simply accept that any acquisition of digital information by the Government implicates Fourth Amendment interests.
After all, the concept of a “reasonable expectation of privacy” as a talisman of Fourth Amendment protection is not found in the text of the Fourth Amendment itself, which says merely that “[t]he right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated.” It was only in 1967, in Katz, that the Supreme Court defined a search as the invasion of a “reasonable expectation of privacy.” Katz revisited Olmstead v. United States after 40 years; the accelerating pace of modern technological change suggests to me that fifty years is not too soon to revisit Katz. My proposal is that the law should focus on determining what is unreasonable rather than on what is a search.
What I have suggested, however, is that—at least in the area of government collection of digital data—we eliminate the preliminary analysis of whether someone has a reasonable expectation of privacy in the data and proceed directly to the issue of whether the collection is reasonable; that the privacy side of that analysis should be focused on concrete rather than theoretical invasions of privacy; and that courts in evaluating reasonableness should look at the entirety of the government’s activity, including the “back end” use, retention restrictions, and the degree of transparency, not just the “front end” activity of collection.
Crocker’s article is a defining summary of the legal problems associated with the U.S. Government’s attempts to use malware to conduct lawful surveillance of persons suspected of breaking the law. He explores how, even after the law is shifted to authorize magistrates to issue warrants pertaining to persons outside of their jurisdictions, broader precedent concerning wiretaps may prevent the FBI or other actors from using currently drafted warrants to deploy malware en masse. Specifically, the framework as currently adopted might violate basic constitutional guarantees that have been defined in caselaw over the past century, to the effect of rendering the mass issuance of malware an unlawful means of surveillance.