
Amendments in Bill C-2 Would Establish an Intelligence Role for the Canadian Coast Guard

While much of the attention around Canada’s Bill C-2: An Act respecting certain measures relating to the security of the border between Canada and the United States and respecting other related security measures has focused on its lawful access and interception aspects, one notable change has flown under the radar: amendments to the Oceans Act that quietly expand the Canadian Coast Guard’s mandate to include intelligence functions.

Specifically, the bill proposes updating the Coast Guard’s responsibilities to include:

security, including security patrols and the collection, analysis and disclosure of information or intelligence.1

This language, paired with provisions granting the Minister explicit authority to collect, analyze, and disclose intelligence,2 marks a meaningful shift. The update would echo the U.S. model, where the Coast Guard is both a maritime safety organization and an intelligence actor. U.S. Coast Guard Intelligence (CG-2) has long played a dual role in maritime domain awareness and national security operations.

Why does this matter?

There are a few strategic implications:
1. NATO and National Security Alignment: The expanded role may help Canada meet NATO funding expectations, especially where the Coast Guard is deployed to conduct maritime surveillance and to maintain an Arctic presence.
2. Statutory Authority: These changes might establish a legal basis for intelligence collection practices that are already occurring, but until now may have lacked clear legislative grounding.
3. Redundancy and Resilience: With global intelligence sharing under strain, having a domestic maritime intelligence function could serve as a backstop if access to allied intelligence is reduced.
4. Northern Operations: Coast Guard vessels, which are not militarized like Royal Canadian Navy warships, are well-positioned to operate in the Arctic and northern waters, offering intelligence capabilities without the geopolitical weight of a military presence.

To be clear, this wouldn’t transform the Canadian Coast Guard into an intelligence agency. But it would give the institution statutory authorities that, until now, have not explicitly been within its official purview.

It’s a small clause in a big bill, but one worth watching. As researchers, journalists, and civil society take a closer look at Bill C-2, this expansion of maritime intelligence authority could (and should) draw more attention.


  1. Section 30(2) of Bill C-2, amending section 41(1)(f) of the Oceans Act ↩︎
  2. Section 30(2) of Bill C-2, amending section 41(2) of the Oceans Act ↩︎

Japan’s New Active Cyberdefence Law

Japan has passed legislation that will significantly reshape the range of cyber operations that its government agencies can undertake. As reported by The Record, the law will enable the following.

  1. Japan’s Self-Defence Forces will be able to provide material support to allies under the justification that failing to do so could endanger the whole of the country.
  2. Japanese law enforcement agencies will be able to infiltrate and neutralize hostile servers before any malicious activity has taken place, operating below the level of an armed attack against Japan.
  3. The Self-Defence Forces will be authorized to undertake offensive cyber operations in response to particularly sophisticated incidents.
  4. The government will be empowered to analyze foreign internet traffic entering the country or just transiting through it. (The government has claimed it won’t collect or analyze the contents of this traffic.) Of note: the new law will not authorize the government to collect or analyze domestically generated internet traffic.
  5. Japan will establish an independent oversight panel that will give prior authorization to all acts of data collection and analysis, as well as to offensive operations intended to target attackers’ servers. This has some relationship to Ministerial oversight of the CSE in Canada, though perhaps (?) with a greater degree of control over the activities undertaken by Japanese agencies.

The broader result of this legislative update will be to further align the Japanese government, and its agencies, with its Five Eyes friends and allies.

It will be interesting to learn over time whether these activities are impaired by the historical stovepiping of Japan’s defence and SIGINT competencies. Historically, the strong division between these organizations impeded cyber operations and was an issue that the USA (and the NSA in particular) had sought to have remedied over a decade ago. If these issues persist, the new law may not be taken up as effectively as would otherwise be possible.


Details from the DNI’s Annual VEP Report

For a long time, external observers wondered how many vulnerabilities were retained versus disclosed by FVEY SIGINT agencies. Following years of policy advocacy, there is now some limited visibility into this by way of Section 6270 of Public Law 116-92. This law requires the U.S. Director of National Intelligence (DNI) to disclose certain annual data about the vulnerabilities disclosed and retained by US government agencies.

The Fiscal Year 2023 VEP Annual Report Unclassified Appendix reveals “the aggregate number of vulnerabilities disclosed to vendors or the public pursuant to the [VEP] was 39. Of those disclosed, 29 of them were initial submissions, and 10 of them were reconsiderations that originated in prior years.”1

There can be many reasons to reassess vulnerability equities. Some include:

  1. The utility of given vulnerabilities decreases, either due to changes in the environment or because research shows a vulnerability would not (or would no longer) have the desired effect(s) or possess the desired operational characteristics.
  2. Adversaries have identified the vulnerabilities themselves, or through 4th party collection, and disclosure is a defensive action to protect US or allied assets.
  3. Independent researchers / organizations are pursuing lines of research that would likely result in finding the vulnerabilities.
  4. By disclosing the vulnerabilities the U.S. agencies hope or expect adversaries to develop similar attacks on still-vulnerable systems, with the effect of masking future U.S. actions on similarly vulnerable systems.
  5. Organizations responsible for the affected software (e.g., open source projects) are now perceived as competent / resourced to remediate vulnerabilities.
  6. Vulnerabilities are identified as having greater possible effects than initially perceived, which rebalances disclosure equities.
  7. Presidential orders to secure certain systems result in a rebalancing of equities around retaining the vulnerabilities in question.
  8. Newly discovered vulnerabilities are seen as more effective in mission tasks, thus deprecating the need for the vulnerabilities which were previously retained.
  9. Disclosure of vulnerabilities may enable adversaries to better target one another and thus enable new (deniable) 4th party collection opportunities.
  10. Vulnerabilities were in fact long used by adversaries (and not the U.S. / FVEY) and this disclosure burns some of their infrastructure or operational capacity.
  11. Vulnerabilities are associated with long-terminated programs and their release has no effect on current, recent, or deprecated activities.

This is just a very small subset of possible reasons to disclose previously withheld vulnerabilities. While we don’t have a strong sense of how many vulnerabilities are retained each year, we do at least have a sense that equities are being rebalanced year over year. Without a sense of scale, though, the disclosed information is of middling value at best.


What is the Role of Cyber Operators in Assessing Effectiveness or Shaping Cyber Policy?

An anonymous European intelligence official wrote an op-ed in July entitled, “Can lawyers lose wars by stifling cyber capabilities?” The article does a good job of laying out why a cyber operator — that is, someone who is presumably relatively close to either planning or undertaking cyber operations — is deeply frustrated by the way in which decision-making is undertaken.

While I admit to having some sympathy for the author’s plight, I fundamentally disagree with much of their argument and think that the positions they hold should be taken up and scrutinised. In this post, I’m really just pulling out quotations from the article and then providing some rebuttal or analysis — you’re best off reading it first if you want to follow along more fully and assess whether I’m being fair to the author and the points they are making.

With that out of the way, here we go….

Law is no longer seen as a system of checks and balances but as a way to shape state behaviour in cyberspace

Yes, this is one of the things that laws are actually supposed to do. You may (reasonably in some cases) disagree with the nature of the laws and their effects, but law isn’t a mere “check and balance.” And, especially where there is no real ability to contest interpretations of law (because they are administered by government agencies largely behind closed doors) it is particularly important for law to have a stronger guiding function in order to maintain democratic legitimacy and social trust in government operations.

Idealistic legalism causes legal debates on cyber capabilities to miss a crucial discussion point: what operational constraints are we willing to accept and what consequences does that have for our national security?

Sure, but some of this is because the U.S. government is so closed-mouthed about its capacities. What if there were a more robust effort to explain practice, as in the case of some European agencies? I would note that the Dutch, as an example, are sometimes pretty explicit about their operations, which is then helpful for considering their activities with respect to authorising laws and associated national and international norms.

Laws attempt to capture as many activities in cyberspace as possible. To do so, legal frameworks must oversimplify. This is ill-suited to such a complex domain

This seems not to appreciate how law tends, at least in some jurisdictions, to be broader in scope and then supplemented by regulations or policies. However, where regulations or policies have been found to be regularly insufficient, there may be a decision that more detailed laws are now necessary. To an extent this is the case post-Snowden, with very good reason, as demonstrated by the various findings of non-compliance associated with certain NSA (and other American intelligence community) operations over time.

The influence of practitioners slowly diminishes as lawyers increasingly take the lead in shaping senior leadership opinions on proposed cyber operations rather than merely advising.

I can appreciate the frustration of seeing leadership move from operations practitioners to policy/legal practitioners.1 But the shift between organisations being led by operations practitioners and those focused on law/policy can be a normal back and forth.

And to be entirely honest, the key thing — and the implicit critique throughout this whole piece — is whether the decision makers understand what the ops folks are saying.2 Those in decision-making roles have a lot of responsibilities and, often, a bigger or different picture of the implications of operations.

I’m in no way saying that lawyers should be the folks to always call the shots,3 but just because you’re in operations doesn’t mean that you’re necessarily making the right calls broadly; instead, you may be seeing the right calls through your particular lens and mission. That lens and mission may not always be sufficient for coming to a conclusion that aligns more broadly with agency, national, or international policy intents/goals.

… a law might stipulate that a (foreign) intelligence agency cannot collect information from systems owned by the citizens of its country. But what if, as Chinese and Russian cyber threat actors do, a system belonging to a citizen is being abused to route attack traffic through? Such an operational development is not foreseen, and thus not prescribed, by law. To collect information would then be illegal and require judicial overhaul – a process that can take years in a domain that can see modus operandi shift in a matter of days.

There may be cases where you have particularly risk-averse decision makers or, alternately, particularly strong legal limitations that preclude certain kinds of operations.

I would note that it is against the law to simply target civilians in conflict scenarios, on grounds that doing so runs counter to the agreed-upon laws of war (recognising they are often not adhered to). Does this have the effect of impeding certain kinds of military activities? Yes. And that may still be the right decision, notwithstanding the consequences it may have on the ability to conduct some operations and/or reduce their efficacy.

In the cyber context, the complaint is that certain activities are precluded on the basis that the law doesn’t explicitly recognise and authorise them. Law routinely leaves wiggle room, and part of the popular (and sometimes private…) problem has been how intelligence lawyers are perceived as abusing that wiggle room — again, see the NSA and other agencies as they were denuded in some of the Snowden revelations, and the openly contrary interpretations of legislation that were adopted to authorise actions legislators had deliberately sought to preclude.4 For further reasons why mistrust may exist between operators and legislators, in Canada you can turn to the ongoing historical issues between CSIS and the Federal Court, which suggest that the “secret law and practices” adopted by Canada’s IC may run counter to the actual law and legal processes, and then combine that with some NSIRA findings that CSE activities may have taken place in contravention of Canadian privacy law.

In the above context, I would say that lots of legislators (and publics) have good grounds to doubt the good will or decision-making capacity of the various parties within national ICs. You don’t get to undertake the kinds of activities that happened previously and then just pretend that “it was all in the recent past, everything’s changed, trust us guys.”

I would also note: the quoted material makes an assumption that policy makers have not, in fact, considered the scenario the author is proposing and then rejected it as a legitimate way of operating. The fact that a decision may not have gone your way is not the same as your concerns not being evaluated in the process of reaching a conclusion.

When effectiveness is seen as secondary, cyber activities may be compliant, but they are not winning the fight.

As I have been writing in various (frustrating) peer reviews I’ve been doing: evidence of this, please, as opposed to opinion and supposition. Also, “the fight” will be understood and perceived by different people in different positions in different agencies: a universal definition should not be presumed.

…constraints also incur costs due to increased bureaucratic complexity. This hampers operational flexibility and innovation – a trade-off often not adequately weighed by, or even visible to, law- and decision-makers. When appointing ex-ante oversight boards or judicial approval, preparation time for conducting cyber operations inevitably increases, even for those perfectly legal from the beginning.

So, in this case the stated problem is that legislators and decision makers aren’t getting the discrete kinds of operational detail that this particular writer thinks are needed to make the “right” trade off decisions.

In some cases….yeah. That’ll be the case. Welcome to the hell of people not briefing up properly, or people not understanding because briefing materials weren’t scoped or prepared right, and so forth. That is: welcome to the government (or any sufficiently large bureaucracy)!

But more broadly, the complaint is that the operator in question knows better than the other parties, without, again, specific and clear evidence that the trade-offs are incorrect. I get that spooky things can’t be spoken aloud without them becoming de-spookified, but picture a similar kind of argument in any other sector of government and you’ll get the same kind of complaint. Ops people will regularly complain about legislators or decision makers when they don’t get their way, their sandcastles get crushed, or they have to do things in less-efficient ways in their busy days. Sometimes they’re right to complain; in other cases, there is a lot more at stake than what they see operationally.

This is a losing game because, as Calder Walton noted, ‘Chinese and Russian services are limited only by operational effectiveness’.

I don’t want to suggest I disagree! But, at the same time, this is along the lines of “autocracies are great because they move faster than democracies and we have to recognise their efficiency” arguments that float around periodically.5

All of which is to say: autocracies and dictatorships have different internal logics to their bureaucracies that can have corresponding effects on their operations.

While it may be “the law” that impedes some Five Eyes/Western agencies’ activities, you can picture the need to advance the interests of kleptocrats or dictators’ kids, gin up enough ransomware dollars to put food on the team’s table, and so forth, as establishing some limits on the operational effectiveness of autocratic governments’ intelligence agencies.

It’s also worth noting that “effectiveness” can be a contested concept. If you’re OK blundering around, burning your tools, and being identified pretty often, then you may have a different approach to cyber operations generally than in situations where being invisible is a key part of operational development. I’m not trying to suggest that the Russians, Chinese, and other adversaries just blunder about, nor that the FVEY are magical ghosts whom no one ever sees on boxes or undertaking operations. However, how you perceive or define “effective” will have corresponding consequences for the nature and types of operations you undertake and for which operations are perceived as achieving the mission’s goals.

Are agencies going to publicly admit they were unable to collect intelligence on certain adversary cyber actors because of legal boundaries?

This speaks to the “everything is secret and thus trust us” posture that is generally antithetical to democratic governance. To reverse things on the author: should there be more revelation of operations that don’t work, so that they can more broadly be learned from? The complaint seems to be that the lawyers et al. don’t know what they’re doing because they aren’t necessarily exposed to the important spooky stuff, or don’t understand its significance and importance. To what extent, then, do the curtains need to open somewhat so that these issues, and the ways in which successes have previously happened, can be communicated in effective ways?

I know: if anything is shown then it blows the whole premise of secret operations. But it’s hard to complain that people don’t get the issues if no facts are brought to the table, whereas the lawyers and such can point to the laws and at least talk to them. If you can’t talk about ops, then don’t be surprised that people will talk about what is publicly discussable…and your ops arguments won’t have weight because they don’t even really exist in the room where the substantive discussions about guardrails may be taking place.


In summary: while I tend not to agree with the author — and disagree as someone who has always been more on the policy and/or law side of the analytic space — their article was at least thought-provoking. And for that alone I think it’s worth taking the time to read it and consider the arguments within.


  1. I would, however, hasten to note that the head of NSA/Cyber Command tends to be a hell of a lot closer to “ops” by merit of their military leadership role. ↩︎
  2. And, also, what the legal and policy teams are saying… ↩︎
  3. Believe me on this point… ↩︎
  4. See, as example: “In 2006, after Congress added the requirement that Section 215 orders be “relevant to” an investigation, the DOJ acknowledged that language was intended to impose new protections. A fact sheet about the new law published by the DOJ stated: “The reauthorizing legislation’s amendments provide significant additional safeguards of Americans’ civil liberties and privacy,” in part by clarifying, “that a section 215 order cannot be issued unless the information sought is relevant to an authorized national security investigation.” Yet just months later, the DOJ convinced the FISC that “relevant to” meant “all” in the first Section 215 bulk dragnet order. In other words, the language inserted by Congress to ​limit ​the scope of what information could be gathered was used by the government to say that there were ​no limits​.” From: Section 215: A Brief History of Violations. ↩︎
  5. See, as an example, the past 2-4 years, when there was a perception that the Chinese response to Covid-19 and the economy was superior to that of everyone else grappling with the global pandemic. ↩︎

Publicly Normalizing Significant Espionage Operations is a Good Thing

The U.S. government recently took a bad beat when it came to light that alleged Chinese threat actors undertook a pretty sophisticated espionage operation that got them access to sensitive email communications of members of the US government. As the details come out, it seems as though the Secretary of State and his inner circle weren’t breached but that other senior officials managing the USA-China relationship were.

Still, the actual language the US government is using to describe the espionage operation is really good to read. As an example, the cybersecurity director of the NSA, Rob Joyce, has stated that:

“It is China doing espionage […] That is what nation-states do. We need to defend against it, we need to push back on it, but that is something that happens.”

Why is this good? Because the USA was successfully targeted by an advanced espionage operation that likely has serious effects, but this is normal, and Joyce is saying so publicly. Adopting the right language in this space is all too rare; espionage and other activities are too often cast as serious ‘attacks’ or described using other inappropriate or bombastic language.

The US government’s language helps to clarify what are, and are not, norms-violating actions. Major and successful espionage operations don’t violate acceptable international norms. Moreover, not only does this make clear what is a fair operation to take against the USA; it, also, makes clear what the USA/FVEY think are appropriate actions to take towards other international actors. The language must be read as also justifying the allies’ own actions and effectively preempts any arguments from China or other nations that successful USA or FVEY espionage operations are anything other than another day on the international stage.

Clearly this is not new language. Former DNI Clapper, when describing the Office of Personnel Management hack in 2015, said,

You have to kind of salute the Chinese for what they did. If we had the opportunity to do that, I don’t think we’d hesitate for a minute.

But it bears regularly repeating to establish what remains ‘appropriate’ in terms of signalling ongoing international norms. This signalling is not just to adversary nations or friendly allies, however, but also to more regular laypersons, national security practitioners, or other operators who might someday work on the national or international stage. Signalling has a broader educational value for them (and for new reporters who end up picking up the national security beat in the future).

At an operational level, it’s also worth noting that this is intelligence gathering that can potentially lower temperatures. Knowing what the other side is thinking or how they’re interpreting things is super handy if you want to defrost some of your diplomatic relations. Though it can obviously hurt by losing advantages in your diplomatic positions, too, of course! And especially if it lets the other side outflank you.

Still, I have faith in the Equation Group’s ongoing collection against even hard targets in China and elsewhere to help balance the information asymmetry equation. While the US suffered a now publicly reported loss of information security, the NSA is actively working to achieve similar (if less public) successes on a daily basis. And I’m sure they’re racking up wins of their own!


The Roundup for December 1-31, 2019 Edition

Alone Amongst Ghosts by Christopher Parsons

Welcome to this edition of The Roundup! Enjoy the collection of interesting, informative, and entertaining links. Brew a fresh cup of coffee or grab yourself a drink, find a comfortable place, and relax.


This month’s update is late, on account of the holidays and my general re-thinking of how to move forward (or not) with these kinds of posts. I find them really valuable, but the actual interface of using my current client (Ulysses) to draft elements of them is less than optimal. So expect some sort of change as I muddle through how to improve my workflow and/or consider the kinds of content that make the most sense to post.


Inspiring Quotation

Be intensely yourself. Don’t try to be outstanding; don’t try to be a success; don’t try to do pictures for others to look at—just please yourself.

  • Ralph Steiner

Great Photography Shots

Natalia Elena Massi’s photographs of Venice, flooded, are exquisite insofar as they are objectively well shot while, simultaneously, reminding us of the consequences of climate change. I dream of going to Venice to shoot photos at some point and her work only further inspires those dreams.

Music I’m Digging

I spent a lot of the month listening to my ‘Best of 2019’ playlist, and so my Songs I Liked in December playlist is a tad threadbare. That said, it’s more diverse in genre and styles than most monthly lists, though not a lot of the tracks made the grade to get onto my best of 2019 list.

  • Beck-Guero // I spent a lot of time re-listening to Beck’s corpus throughout December. I discovered that I really like his music: it’s moody, excitable, and catchy, and always evolving from album to album.
  • Little V.-Spoiler (Cyberpunk 2077) (Single) // Cyberpunk 2077 is one of the most hyped video games for 2020, and if all of the music is as solid and genre-fitting as this track, then the ambiance for the game is going to be absolutely stellar.

Neat Podcast Episodes

  • 99% Invisible-Racoon Resistance // As a Torontonian I’m legally obligated to share this. Racoons are a big part of the city’s identity, and in recent years new organic garbage containers were (literally) rolled out that were designed such that racoons couldn’t get into them. Except that some racoons could! The good news is that racoons are not ‘social learners’ and, thus, those who can open the bins are unlikely to teach all the others. But with the sheer number of trash pandas in the city it’s almost a certainty that a number of them will naturally be smart enough and, thus, garbage will continue to litter our sidewalks and laneways.

Good Reads

  • America’s Dark History of Killing Its Own Troops With Cluster Munitions // Ismay’s longform piece on cluster munitions is not a happy article, nor does the reader leave with a sense that this deadly weapon is likely to see less use. His writing–and especially the tragedies associated with the use of these weapons–is poignant and painful. And yet it’s also critically important to read given the barbarity of cluster munitions and their deadly consequences for friends, foes, and civilians alike. No civilized nation should use these weapons, and those which do cannot claim to respect the lives of civilians stuck in conflict situations.
  • Project DREAD: White House Veterans Helped Gulf Monarchy Build Secret Surveillance Unit // The failure or unwillingness of the principals, their deputies, or staff to acknowledge they created a surveillance system that has systematically been used to hunt down illegitimate targets—human rights defenders, civil society advocates, and the like—is disgusting. What’s worse is that democratizing these surveillance capabilities and justifying the means by which the program was orchestrated almost guarantees that American signals intelligence employees will continue to spread American surveillance know-how to the detriment of the world for a pay check, the consequences be damned (if even ever considered in the first place).
  • The War That Continues to Shape Russia, 25 Years Later // The combination of the (re)telling of the first Russia-Chechen War and photographs from the conflict serve as reminders of what it looks like when well-armed nation-states engage in fullscale destruction, the human costs, and the lingering political consequences of wars-now-past.
  • A New Kind of Spy: How China obtains American technological secrets // Bhattacharjee’s 2014 article on Chinese spying continues to strike me as memorable, and helpful in understanding how the Chinese government recruits agents to facilitate its technological objectives. Reading the piece helps to humanize why Chinese-Americans may spy for the Chinese government and, also, the breadth and significance of such activities for advancing China’s interests to the detriment of America’s own.
  • Below the Asphalt Lies the Beach: There is still much to learn from the radical legacy of critical theory // Benhabib’s essay, showcasing how the history of European political philosophy over the past 60 years or so has been in the common service of critique, and the role(s) of Habermasian political theory both in taking account of such critique and in offering thoughts on how to proceed in a world of imperfect praxis, is an exciting consideration of political philosophy today. She mounts a considered defense of Habermas against, in particular, the claims that his work is overly Eurocentric. Her drawing of a line between the need to seek emancipation and the need to confront and overcome the xenophobia, authoritarianism, and racism sweeping the world writ large is deeply grounded in the need for subjects like human rights to orient and ground critique. While some may oppose such universalism on the same grounds as they would reject the Habermasian project, there is a danger in doing so: not only might we do a disservice to the intellectual depth that undergirds the concept of human rights but, also, we run the risk of losing the core means by which we can (re)orient the world towards enabling the conditions of freedom itself.
  • Ghost ships, crop circles, and soft gold: A GPS mystery in Shanghai // This very curious article explores the recent problem of ships’ GPS transponders being significantly affected while transiting the Yangtze in China. Specifically, transponders are routinely misreporting the locations of ships, sometimes with dangerous and serious implications. The cause, however, remains unknown: it could be a major step up in the (effective) electronic warfare capabilities of sand thieves who illegally dredge the river and seek to escape undetected, or it could be the Chinese government itself testing electronic warfare capabilities on the shipping lane in preparation for potentially deploying them elsewhere in the region. Either way, threats such as this to critical infrastructure pose serious risks to safe navigation and raise the prospect of largely civilian infrastructures being targeted by nation-state adversaries.
  • A Date I Still Think About // These beautiful stories of memorable and special dates speak to just how much joy exists in the world, and how it unexpectedly erupts into our lives. In an increasingly dark time, stories like this are a kind of nourishment for the soul.

Cool Things

  • The Deep Sea // This interactive website that showcases the sea life we know exists, and the depths at which it lives, is simple and spectacular.
  • 100 Great Works Of Dystopian Fiction // A pretty terrific listing of books that have defined the genre.

A Deep Dive Into Russian Surveillance In The Silicon Valley Area

Via Foreign Policy:

This focus on signals and technical intelligence persisted until much more recently, multiple former U.S. intelligence officials told me. “It was almost like everyone they had there was a technical guy, as opposed to a human-intelligence guy,” one former official recalled. “The way they protected those people — they were rarely out in the community. It was work, home, work, home. When they’d go out and about, to play hockey or to drink, they’d be in a group. It was hard to penetrate.” The same official also noted that San Francisco was integral to the discovery by U.S. intelligence of a new class of Russian “technical-type” intelligence officer, working for the rough Russian equivalent of the National Security Agency, before this organization was eventually folded by Putin back into the FSB. This group, which was not based at the consulate itself, was identified via its members’ travel patterns — they would visit the Bay Area frequently — and the types of individuals, all in high-tech development, with whom they sought contact. According to this former U.S. official, these Russian intelligence officers were particularly interested in discussing cryptology and the Next Generation Internet program.

But it was the consulate’s location — perched high atop that hill in Pacific Heights, with a direct line of sight out to the ocean — that likely determined the concentration of signals activity. Certain types of highly encrypted communications cannot be transmitted over long distances, and multiple sources told me that U.S. officials believed that Russian intelligence potentially took advantage of the consulate’s location to communicate with submarines, trawlers, or listening posts located in international waters off the Northern California coast. (Russian intelligence officers may also have been remotely transmitting data to spy stations offshore, multiple former intelligence officials told me, explaining the odd behaviors on Stinson Beach.) It is also “very possible,” said one former intelligence official, that the Russians were using the San Francisco consulate to monitor the movements, and perhaps communications, of the dozen or so U.S. nuclear-armed submarines that routinely patrol the Pacific from their base in Washington state.

All in all, said this same official, it was “very likely” that the consulate functioned for Russia as a classified communications hub for the entire western United States — and, perhaps, the entire western part of the hemisphere.

There is a lot to this very long form piece, including descriptions of Russian intelligence operations and communications patterns, how lawful Russian overflights of American territory might be used for a variety of intelligence purposes, and the Trump administration’s likely cluelessness about why closing the Russian consulate in San Francisco was so significant. But most interesting, for me, was how the consulate likely functioned as an outpost for Russian signals intelligence operations, both because of the depth of analysis in the article and for what it tells us about how Western-allied consulates and diplomatic facilities are likely used.1 In effect, the concerns raised by former FBI and other American counter-intelligence officers speak to how America and her allies may conduct their own forms of surveillance.

  1. In a provincial sense, the concerns and opinions espoused by American counter-intelligence officers also raise questions as to the role of Canada’s significant number of diplomatic facilities scattered throughout China and other regions where the United States is more challenged in building out State Department facilities.

2017.11.29

Most fundamentally, is it in Canada’s interest to further normalize the growing use of CNA (Computer Network Attack) activities by states? Should CNA be classified as just another tool of statecraft? Should such capabilities be restricted to a deterrent role? Is cyber deterrence, whether through CNA capabilities or more conventional responses, even a practical goal, given difficulties of attribution and the inevitable overlap between CNE (Computer Network Exploitation) and CNA? Would improved defence and resilience be a preferable, or at least sufficient, response or are all three required?

Bill Robinson, “CSE to get foreign cyber operations mandate”

2017.11.28

As effective encryption spreads, it may well be that the future of SIGINT lies increasingly in “end point” operations and other activities designed to cripple or bypass that encryption, and some of those activities could certainly benefit from HUMINT assistance. But there are also pitfalls to that approach. Using on-the-scene people in foreign jurisdictions can mean putting individuals at extreme risk, and such operations also have increased potential to go wrong in ways that could expose Canada to extreme embarrassment and even retaliation. If the government is contemplating going down that road, it should probably be open with parliament and the public about its intentions.

Informed consent. Because it’s 2017.

Bill Robinson, “CSE and Bill C-59 overview”

Metadata in Context – An Ontological and Normative Analysis of the NSA’s Bulk Telephony Metadata Collection Program

Abstract:

In the aftermath of the Snowden revelations, the National Security Agency (NSA) responded to fears about warrantless domestic surveillance programs by emphasizing that it was collecting only the metadata, and not the content, of communications. When justifying its activities, the NSA offered the following rationale: because data involves content and metadata does not, a reasonable expectation of privacy extends only to the former but not the latter. Our paper questions the soundness of this argument. More specifically, we argue that privacy is defined not only by the types of information at hand, but also by the context in which the information is collected. This context has changed dramatically. Defining privacy as contextual integrity we are able, in the first place, to explain why the bulk telephony metadata collection program violated expectations of privacy and, in the second, to evaluate whether the benefits to national security provided by the program can be justified in light of the program’s material costs, on the one hand, and its infringements on civil liberties, on the other hand.

A terrific paper from Paula Kift and Helen Nissenbaum.