Categories: Aside

Consolidation of Writing

One of the longer-term goals of this site is to host my personal thoughts, which has meant consolidating material from medium-personal (as opposed to journal-like) sites. Today I migrated and published more than 600 items. Only a few hundred left!

Categories: Aside, Links

Covernames Versus Code / Strategy Versus Tactics

From the New York Times:

Mr. Snowden’s cascade of disclosures to journalists and his defiant public stance drew far more media coverage than this new breach. But Mr. Snowden released code words, while the Shadow Brokers have released the actual code; if he shared what might be described as battle plans, they have loosed the weapons themselves. Created at huge expense to American taxpayers, those cyberweapons have now been picked up by hackers from North Korea to Russia and shot back at the United States and its allies.

While the revelation of code facilitates a more immediate kind of repurposing and attack, I think that the Shadow Brokers have tended to reveal tactical information, versus the strategic information released by Snowden. Few have done the requisite work to pull together the comprehensive narratives that emerge in the Snowden documents; most have focused, instead, on specific programs or tools. Those few of us who have comprehensively analyzed his documents, however, now possess insights into the strategic thinking, decision making, and resource allocation of the Five Eyes intelligence agencies. Over the long term, such information is just as valuable as the code drops, if not more so.

Categories: Aside, Links

Exploited for Advertising

As part of a long-feature for The Guardian:

The techniques these companies use are not always generic: they can be algorithmically tailored to each person. An internal Facebook report leaked this year, for example, revealed that the company can identify when teens feel “insecure”, “worthless” and “need a confidence boost”. Such granular information, Harris adds, is “a perfect model of what buttons you can push in a particular person”.

Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. “There’s no ethics,” he says. A company paying Facebook to use its levers of persuasion could be a car business targeting tailored advertisements to different types of users who want a new vehicle. Or it could be a Moscow-based troll farm seeking to turn voters in a swing county in Wisconsin.

Harris believes that tech companies never deliberately set out to make their products addictive. They were responding to the incentives of an advertising economy, experimenting with techniques that might capture people’s attention, even stumbling across highly effective design by accident.

The problems facing many Internet users today are predicated on how companies' services are paid for: companies do everything they can to capture and hold your attention, regardless of your own interests. If there were alternate models of financing social media companies, such as paying small monthly or yearly fees, imagine how different online communications would be: communities would likely be smaller, yes, but developers would be motivated to do whatever they could to support those communities, rather than the advertisers targeting them. Silicon Valley has absorbed many of the best minds of the past decade and a half in order to make advertisements better. Imagine what would be different if all that energy had been channeled towards less socially destructive outputs.

Categories: Aside, Links

The Dangers of Political ‘Marketing’

‘Politics’ by Samuel Thorne (CC BY-NC-ND 2.0) at https://flic.kr/p/kAgBCR

From n+1:

Given that some of the major players involved in Trump’s campaign effort have obsessions with war tactics and strategy, it’s easy to imagine that weaponized targeting may not only be a pre-election phenomenon. Such efforts could be employed as part of an ongoing campaign to weaken any resistance to the Trump Administration and thwart political opposition through ratcheting up in-fighting and splintering. It’s not an overstatement to suggest that the infrastructure of mass consumer surveillance enables new kinds of actors to take up the work of COINTELPRO on a mass scale. Former Cambridge Analytica employees have said the company internally discusses their operations as psychological warfare.

Cambridge Analytica may not be alone in pursuing these types of psychological warfare tactics. In response to the recent revelations of Russian-bought Facebook ads, Senator Mark Warner told the Washington Post that the aim of the ads was “to sow chaos.” Yet, rather than promoting general chaos, some ads may have been specifically designed to fuel infighting among the Trump opposition. Earlier this year, The Intercept showed that TigerSwan, a shady mercenary firm hired by Energy Transfer Partners to combat communities opposing the Dakota Access Pipeline, used knowledge gleaned from surveillance as part of their own strategy to splinter their opponents. A leaked TigerSwan document declared, “Exploitation of ongoing native versus non-native rifts, and tribal rifts between peaceful and violent elements is critical in our effort to delegitimize the anti-DAPL movement.”

What our current digital environment affords are opportunities for efficient, large-scale use of such tactics, which can be refined by data-rich feedback loops. Manipulation campaigns can plug into the commercial surveillance infrastructure and draw on lessons of behavioral science. They can use testing to refine strategies that take account of the personal traits of targets and identify interventions that may be most potent. This might mean identifying marginal participants, let’s say for joining a march or boycott, and zeroing in on interventions to dissuade them from taking action. Even more worrisomely, such targeting could try to push potential allies in different directions. Targets predicted to have more radical inklings could be pushed toward radical tactics and fed stories deriding compromise with liberal allies. Simultaneously, those predicted to have more liberal sympathies may be fed stories that hype fears about radical takeover of the resistance. Such campaigns would likely play off divisions along race, gender, issue-specific priorities, and other lines of identity and affinity.

We’re reaching the pinnacle of what online advertising can do: identify persons of interest, separate specific persons from others in order to discreetly target them, and motivate targets to change their emotional states and act on those states. It’s bad enough that this is done to push products but, now, the same activities are seeping into political systems and damaging democratic undertakings in the process. Such activity has to be regulated, if not stopped entirely.

Categories: Aside

Betatesting iOS 11

As one of the many people on iOS 11 who didn’t enrol in the beta testing, I was very surprised that the Twitter and Facebook share integrations were removed as system settings. As it stands, it’s not entirely clear how such sharing is supposed to take place in many apps, where the share sheet still points to the settings in iOS 10. I can only hope that app developers update quickly to return this functionality to their applications.

Categories: Aside, Links, Photography

Manufacturing Gear Acquisition Syndrome

Nasim Mansurov at Photography Life:

Don’t be a victim of The Hype. Don’t be a cameraholic and a brainless consumer. Stop yourself from the Internet hysteria that surrounds cameras, lenses and other gear. Instead, spend time learning about photography techniques and improving your skills. Travel more, see more, shoot more. And when I review a piece of camera gear, don’t buy it because I praised it. Only buy what you truly need, not what you want. That’s all I have to say for today.

Mansurov’s article spends a lot of time explaining the economics that drive individual ‘influencers’ and websites to get people excited about buying the newest ‘best’ camera equipment. Drawing on Photography Life’s website analytics and the marketing material that he receives, he lays bare the economic incentives to focus on gear instead of techniques, skills, and interesting locations to visit. In the process he also makes clear how the commercial side of selling equipment works, in a way that most people may suspect is happening but lack the evidence or data to substantiate. It’s not a shocking read, but it does serve as a reminder that companies are actively attempting to manipulate consumers into buying the newest lens or body with the hope or dream that it will turn us all into master photographers.

Categories: Aside, Quotations

Hyper-Regulated Mass Surveillance

The difficult project of establishing meaningful oversight would be aided by a deeper appreciation by all sides of the surveillance debates that their adversaries are generally acting in good faith. Too often it seems that we occupy parallel universes. In the first, the U.S. intelligence community operates in a framework so regulated and constrained that it should be the envy of the world, not the target of its scorn. No intelligence agency in the world can match our respect for rules and laws. In the second, the U.S. surveillance state has outgrown legal restraints and allowed its surveillance activities to be driven by technological capabilities. It developed and deployed a global system of mass surveillance without the knowledge or consent of the public, and it is sitting on massive databases of private information that constitute a genuine threat to free societies.

We should acknowledge the possibility that both of these pictures are largely accurate. The intelligence community is staffed by honorable public servants who have an abiding respect for the Constitution. And history gives us reason to be concerned that information collected for one purpose will likely be put to other purposes, particularly in the aftermath of a terrorist attack or other national trauma. We might even elect a president who has no regard for the rule of law.

Ben Wizner, ACLU

The question of how to draft a system of secret rules while simultaneously ensuring that actors operate solely within those rules continues to vex policymakers, academics, politicians, and lawyers. What definitely seems not to work is maintaining a veil of secrecy over the baseline rules themselves, to say nothing of cloaking the interpretations of those rules in their own layers of secrecy.

Categories: Aside, Writing

Limits of Data Access Requests

Last week I wrote about the limits of data access requests as they relate to ride-sharing applications like Uber. A data access request involves contacting a private company and requesting a copy of your personal information, along with the ways in which that data is processed and disclosed, and the periods of time for which it is retained.

Research has repeatedly shown that companies are very poor at comprehensively responding to data access requests. Sometimes this is because of divides between technical teams that collect and use the data, policy teams that determine what is and isn’t appropriate to do with data, and legal teams that ascertain whether collections and uses of data comport with the law. In other situations companies simply refuse to respond because they adopt a confused-nationalist understanding of law: if the company doesn’t have an office somewhere then that jurisdiction’s laws aren’t seen as applying to the company, even if the company does business in the jurisdiction.

Automated Data Export As Solution?

Some companies, such as Facebook and Google, have developed automated data download services. Ostensibly these services are designed so that you can download the data you’ve input into the companies’ systems, thus revealing precisely what is collected about you. In reality, these services don’t let you export all of the information that these companies collect. As a result, people who use these download services tend to end up with a false impression of just what information the companies collect and how it’s used.

A shining example of the kinds of information that are not revealed to users of these services has come to light. A recently leaked document from Facebook Australia revealed that:

Facebook’s algorithms can determine, and allow advertisers to pinpoint, "moments when young people need a confidence boost." If that phrase isn’t clear enough, Facebook’s document offers a litany of teen emotional states that the company claims it can estimate based on how teens use the service, including "worthless," "insecure," "defeated," "anxious," "silly," "useless," "stupid," "overwhelmed," "stressed," and "a failure."

This targeting of emotions isn’t necessarily surprising: in a past exposé we learned that Facebook conducted experiments during an American presidential election to see if it could sway voters. Indeed, the company’s raison d’être is to figure out how to pitch ads to customers, and figuring out when Facebook users are more or less likely to be affected by advertisements is just good business. If you use the self-download service provided by Facebook, or any other data broker, you will not receive data on how and why your data is exploited: without understanding how their algorithms act on the data they collect from you, you can never really understand how your personal information is processed.

But that raison d’être of pitching ads to people — which is why Facebook could internally justify the deliberate targeting of vulnerable youth — ignores baseline ethics of whether it is appropriate to exploit our psychology to sell us products. To be clear, this isn’t a company stalking you around the Internet with ads for a car or couch or jewelry that you were browsing about. This is a deliberate effort to mine your communications to sell products at times of psychological vulnerability. The difference is between somewhat stupid tracking versus deliberate exploitation of our emotional state.1

Solving for Bad Actors

There are laws governing what you can do with information provided by children. Whether Facebook’s actions run afoul of such laws may never actually be tested in a court or privacy commissioner’s decision. In part, this is because mounting legal challenges is extremely difficult, expensive, and time consuming. These hurdles automatically tilt the balance towards activities such as this continuing, even if Facebook stops this particular activity. But another part of the problem is Australia’s historically weak privacy commissioner, as well as the limitations of such offices around the world: privacy commissioners’ offices are often understaffed, under-resourced, and unable to chase every legally and ethically questionable practice undertaken by private companies. Companies know about these limitations and, as such, know they can get away with unethical and frankly illegal activities unless someone talks to the press about the activities in question.

So what’s the solution? The rote advice is to stop using Facebook. While that might be good advice for some, for a lot of other people leaving Facebook is very, very challenging. You might use it to sign into a lot of other services and so don’t think you can easily abandon Facebook. You might have stored years of photos or conversations and Facebook doesn’t give you a nice way to pull them out. It might be a place where all of your friends and family congregate to share information and so leaving would amount to being excised from your core communities. And depending on where you live you might rely on Facebook for finding jobs, community events, or other activities that are essential to your life.

In essence, solving for Facebook, Google, Uber, and all the other large data brokers is a collective action problem. It’s not a problem best solved on an individual basis.

A more realistic kind of advice would be this: file complaints to your local politicians. File complaints to your domestic privacy commissioners. File complaints to every conference, academic association, and industry event that takes Facebook money.2 Make it very public and very clear that you and groups you are associated with are offended by the company in question that is profiting off the psychological exploitation of children and adults alike.3 Now, will your efforts to raise attention to the issue and draw negative attention to companies and groups profiting from Facebook and other data brokers stop unethical data exploitation tomorrow? No. But by consistently raising our concerns about how large data brokers collect and use personal information, and attributing some degree of negative publicity to all those who benefit from such practices, we can decrease the public stock of a company.

History is dotted with individuals who are seen as standing up to end bad practices by governments and private companies alike. But behind them tends to be a mass of citizens supporting those individuals: standing up en masse may mean that we don’t each get individual praise for stopping some tasteless and unethical practice, but our collective action makes it more likely that such practices will actually be stopped. By each working a little, we can accomplish something that we’d be hard pressed to change as individuals.

NOTE: This blog was first published on Medium on May 1, 2017.


  1. Other advertising companies adopt the same practices as Facebook, so I’m not suggesting that Facebook is worst-in-class and letting the others off the hook. ↩︎
  2. Replace ‘Facebook’ with whatever company you think is behaving inappropriately, unethically, or perhaps illegally. ↩︎
  3. Surely you don’t think that Facebook is only targeting kids, right? ↩︎
Categories: Aside, Links

The Subtle Ways Your Digital Assistant Might Manipulate You

From Wired:

Amazon’s Echo and Alphabet’s Home cost less than $200 today, and that price will likely drop. So who will pay our butler’s salary, especially as it offers additional services? Advertisers, most likely. Our butler may recommend services and products that further the super-platform’s financial interests, rather than our own interests. By serving its true masters—the platforms—it may distort our view of the market and lead us to services and products that its masters wish to promote.

But the potential harm transcends the search bias issue, which Google is currently defending in Europe. The increase in the super-platform’s economic power can translate into political power. As we increasingly rely on one or two head butlers, the super-platform will learn about our political beliefs and have the power to affect our views and the public debate.

The discussions about algorithmic bias often have an almost science-fiction feel to them. But as personal assistants are monetized by platforms inking deals with advertisers and adopting secretive business practices designed to extract value from users, the threat of attitude shaping will become even more important. Why did your assistant recommend a particular route? (Answer: because it took you past businesses the platform owner believes you are predisposed to spend money at.) Why did your assistant present a particular piece of news? (Answer: because the piece in question conformed with your existing views and thus increased the time you spent on the site, during which you were exposed to the platform’s advertising partners’ content.)

We are shifting to a world where algorithms are functionally what we call magic. A type of magic that can be used to exploit us while we think that algorithmically-designed digital assistants are markedly changing our lives for the better.

Categories: Aside, Links

Twenty-four pedestrians were hit on Toronto’s roads on Tuesday — including an 87-year-old who died

“Do we recognize that weather plays a part in it? Yes, that’s a contributing factor. But what do you do when you can’t see where you’re going? You slow down, you look around. Unfortunately, drivers, let’s be quite frank, are somewhat lazy. They don’t adjust for the driving conditions they face. They’re still trying to push the envelope.”

It’s always a bit shocking to have the Toronto police holding drivers to account for, you know, killing people with their vehicles. It’s a nice change from just blaming pedestrians.

But, at the same time, I don’t think that drivers being “somewhat lazy” is a legitimate way to describe people being killed. People get lazy and don’t wash the dishes, or don’t take the dog out. When someone gets lazy and kills a person, we tend to use another word for it, at least when we’re not referring to drivers killing pedestrians.

That word? Manslaughter.