- An (app-agnostic) iOS shortcut for link-blogging. // I’ve been trying this and, on the whole, I’m pretty happy with it so far. It took me a bit to realize that I had to copy the text from the article that I wanted the shortcut to paste in automatically, but beyond that it has been working really well!
- Woman ordered to stop smoking at home in Ontario ruling. “If you smoke and you live in a condominium in Ontario, a little-noticed ruling may have stubbed out your ability to light up inside your own home. At the very least, it has given new legal heft to a condominium corporation’s ability to ban all smoking indoors if it so chooses … In what is seen as a first in Ontario, Justice Jana Steele ruled in the Ontario Superior Court of Justice on Oct. 15 against Ms. Linhart and ordered her to stop smoking in her own home.” // Not going to lie: as someone who lives in a shared building this is pretty exciting news, though it also reveals just how much power condo rules have over how individuals can enjoy the space they rent or own.
- To report on tech, journalists must also learn to report on china. “Two years ago, Sean McDonald, cofounder of Digital Public, and I described a global internet landscape fractured by what we called digitalpolitik, or the political, regulatory, military, and commercial strategies employed by governments to project influence in global markets. Now technology stories are just as much about policy, diplomacy, and power as they are about society, engineering, and business.” // This is definitely one of the most succinct, and well-sourced, pieces I’ve come across recently on how technology journalists need to cover China. I would just hasten to affirm that similar warnings should apply to scholars and policy makers as well.
- Chinese-style censorship is no fix for the covid-19 infodemic. “Rather than creating an efficient information curation model, regulator and company wars against ‘rumours’ and ‘harmful content’ have allowed misinformation and extreme content to thrive on the Chinese internet.”
- The Huawei war. “Whatever happens to Huawei in the near future, China, Russia and other countries have received the message loud and clear: achieving technological sovereignty is imperative. China had grasped the importance of this even before Trump launched his attack, which only strengthened the sense of urgency. It would be ironic if the ultimate effect of the US’s war on Huawei was a much more technologically advanced and independent China, with a completely different supply chain that included no American companies.” // Definitely one of the better summations of where things are with Huawei as it stands today.
It’s time to admit that mere transparency isn’t enough, and that every decision to censor content is a political decision. Companies should act accordingly. They must carefully consider the long-term effects of complying with requests, and take a stand when those requests run counter to human rights principles. The more we accept everyday censorship, the more of it there seems to be, and before we know it, the window of acceptable information will only be open a crack.
Facebook, Microsoft, Twitter, and YouTube are coming together to help curb the spread of terrorist content online. There is no place for content that promotes terrorism on our hosted consumer services. When alerted, we take swift action against this kind of content in accordance with our respective policies.
Starting today, we commit to the creation of a shared industry database of “hashes” — unique digital “fingerprints” — for violent terrorist imagery or terrorist recruitment videos or images that we have removed from our services. By sharing this information with each other, we may use the shared hashes to help identify potential terrorist content on our respective hosted consumer platforms. We hope this collaboration will lead to greater efficiency as we continue to enforce our policies to help curb the pressing global issue of terrorist content online.
The creation of the industry database of hashes shows the world that these companies are ‘doing something’ without that something being particularly onerous: any change to a file will result in its having a different hash, rendering it undetectable by the filtering system being rolled out by these companies. But that technical deficiency is actually the least interesting aspect of what these companies are doing. Rather than being compelled to inhibit speech – by way of a law that might not hold up to a First Amendment challenge in the United States – the companies are voluntarily adopting this process.
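The evasion is easy to demonstrate. The joint statement doesn’t specify the hashing scheme, so assuming an ordinary cryptographic hash such as SHA-256 (a perceptual hash would behave differently), a minimal Python sketch of the deficiency:

```python
import hashlib

# Two "files" that differ by a single appended byte. To a human they are
# the same content; to a hash-matching blocklist they are unrelated.
original = b"hypothetical prohibited video bytes"
modified = original + b"\x00"  # trivial one-byte change

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(modified).hexdigest()

print(h1)
print(h2)
print(h1 == h2)  # prints False: the modified file slips past the filter
```

Perceptual hashing schemes (such as Microsoft’s PhotoDNA) tolerate small modifications precisely to close this gap, but they too can be defeated by sufficiently large transformations; the arms race simply moves.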
The result is that some files will be more challenging to find without someone putting in the effort to seek them out. But it also means that the governments of the world cannot say that the companies aren’t doing anything, and most people aren’t going to be interested in the nuances of the technical deficits of this mode of censorship. So what we’re witnessing is (another) privatized method of censorship that is arguably more designed to rebut political barbs about the discoverability of horrible material on these companies’ services than intended to ‘solve’ the actual problem of the content’s creation and baseline availability.
While a realist might argue that anything is better than nothing, I think that the very existence of these kinds of filtering and censoring programs is inherently dangerous. While it’s all well and good for ‘bad content’ to be blocked, who will define what is ‘bad’? And how likely is it that, at some point, ‘good’ content will be either intentionally or accidentally blocked? These are systems that can be used in a multitude of ways once established, and which are often incredibly challenging to retire once in operation.
Netsweeper is a small Canadian company with a disarmingly boring name and an office nestled among the squat buildings of Waterloo, Ontario. But its services—namely, online censorship—are offered in countries as far-flung as Bahrain and Yemen.
In 2015, University of Toronto-based research hub Citizen Lab reported that Netsweeper was providing Yemeni rebels with censorship technology. In response, Netsweeper sued the university and Deibert for defamation, Citizen Lab director Ron Deibert revealed in a blog post on Tuesday. Netsweeper discontinued its lawsuit in its entirety in April.
The lesson here isn’t that Hollywood executives, producers, agents and stars must watch themselves. It isn’t to beware of totalitarian states. It’s to beware, period. If it isn’t a foreign nemesis monitoring and meddling with you, then it’s potentially a merchant examining your buying patterns, an employer trawling for signs of disloyalty or indolence, an acquaintance turned enemy, a random hacker with an amorphous grudge — or of course the federal government.
And while this spooky realization prompts better behavior in certain circumstances that call for it and is only a minor inconvenience in other instances, make no mistake: It’s a major loss. Those moments and nooks in life that permit you to be your messiest, stupidest, most heedless self? They’re quickly disappearing if not already gone.
Though I find various aspects of Bruni’s article insulting (e.g. “…the flesh that Jennifer Lawrence flashed to more people than she ever intended…”) the discussion of the most common threat actors that people actually have to worry about is a fair one. It’s also important to discuss, and discuss regularly, that the ‘defences’ which are commonly preached to protect our privacy are fraught with risk. While being silent, not associating with one another, or not reading certain things online might keep one ‘safe’, engaging in such censorious activities runs counter to the freedoms that we ought to cherish.
Such responses ignore the costs — often paid in blood or years of people’s lives — that have gone into fighting for the freedoms that we now enjoy and that are ingrained in our constitutions, our laws, and our social norms. They forget the men and women who fight and die on battlefields to protect the freedoms of citizens of other nations. And, perhaps most significantly, such responses demonstrate how larger social movements directed at enshrining our freedoms through collective action are set aside, often cynically, so that we can try and resolve the problems we all face as individuals instead of as collective political actors. Self-censorship isn’t just a means of ensuring self-protection; it’s an exhibition of citizens’ unwillingness to even try to utilize our political processes to resolve common social ills.
And then there’s the sheer randomness of it all. Some services you can’t access for no apparent reason, others are so slow that you can’t figure out if they’re blocked or just snail-paced. And as I experience this, I wish some of our politicians and media people, those who see net neutrality as the enemy, would come here and experience what a radical version of non-neutrality is. Again, I have a VPN service to overcome most of this (at the cost of speed) but most people don’t and/or can’t afford one.
Don’t get me wrong, I’m not suggesting that not enshrining net neutrality is the equivalent of doing what the Chinese (or Iranian, or Indian) government does. But I look at the UK’s blocking mechanisms, supposedly there to protect children but really targeting just about any kind of site for arcane reasons that no one can figure out, and I think that what I have here is an extreme version of the same thing.
VICE Exclusive: Canadian political staffers are using their work time to delete large sections of Wikipedia articles in order to remove controversies, misinformation, and unpleasant truths.
Surprising? No. Sad? Kinda. Reason to ban House of Commons IP addresses from editing Wikipedia? Almost certainly.
Basically, the Russian approach is all about instigating self-censorship. To do this, you need to draft the legislation as broadly as possible and have the restrictions constantly expanded – like the recent law which requires bloggers with more than 3,000 followers to be registered – and companies, internet service providers, NGOs and media will rush to you to be consulted and told what’s allowed. You should also show that you don’t hesitate to block entire services like YouTube – and companies will come to you suggesting technical solutions, as happened with DPI (deep packet inspection). This helps the government shift the task of developing technical solutions, as well as the costs, onto business.
You also need to encourage pro-government activists to attack the most vocal critics, to launch websites with lists of so-called national traitors, and then to have Vladimir Putin himself use this very term in a speech.
All that sends a very strong message. And as a result, journalists will be fired for critical reporting from Ukraine by media owners, not by the government; the largest internet companies will seek private meetings with Putin, and users of social networks will become more cautious in their comments.
Billion-dollar project will complement Google’s balloons and drones.
It would be particularly interesting to see if Google tried to marry its satellites with its Loon project, to the effect of not having to integrate Loon balloon networks with known censorious ISPs in various countries around the world. If Google could overcome technical and regulatory hurdles it could, by routing through space, try to proxy data access via ‘open’ Internet nations. Of course, this would mean that Google would become the ‘real’ pipe to the Internet itself…
Some have suggested that the [Nova Scotia cyberbullying] law has to be so broad to capture all the harmful conduct and we should leave it to the courts and the cybercops to use their judgement in how it is applied. I’m sorry, but as soon as an employee of the government of Nova Scotia picks up the phone and tells a citizen to remove Charter protected speech from the internet, that crosses the line. That goes waaaaay over the line. Canadians have an absolute right to speak truth to power. Canadians have an obligation to call out politicians on hypocrisy and idiocy. An elected official like Lenore Zann, before publicly admonishing a minor, should educate herself about “copyrwite (sic) law”, fair dealing and the criminal code. (A bit of free advice: Bill C-12 isn’t the law yet and an image taken on a sound stage surrounded by a filming crew for the purpose of international broadcast on cable television likely does not qualify as an intimate image “in respect of which, at the time of the recording, there were circumstances that gave rise to a reasonable expectation of privacy”.)