… surely there is no automatic, positive link between knowledge and power, especially if that means power in a social or political sense. At times knowledge brings merely an enlightened impotence or paralysis. One may know exactly what to do but lack the wherewithal to act. Of the many conditions that affect the phenomenon of power, knowledge is but one and by no means the most important.
- Langdon Winner, The Whale and the Reactor: A Search for Limits in an Age of High Technology
Speaking at Davos, Uber CEO Dara Khosrowshahi pointed out that consumers face a challenge in trying to understand tech’s influence in the age of big data. He called this an “information asymmetry.” In his previous job as CEO of Expedia, Khosrowshahi said, customers were shown a tropical island while they waited for their purchase page to load. As a test, engineers replaced the placid image with a stressful one showing a person missing a train. Purchases shot up. The company subbed in an even more stressful image of a person looking at a non-working credit card, and purchases rose again. One enterprising engineer decided to use an image of a cobra. Purchases went higher still.
What’s good for a business isn’t always good for that business’s users. Yet Khosrowshahi stopped the testing because he decided the experiment wasn’t in line with Expedia’s values. “A company starts having so much data and information about the user that if you describe it as a fight, it’s just not a fair fight,” said Khosrowshahi.
The tech industry often responds to these concerns with a promise to be more transparent—to better show how its products and services are created and how they affect us. But transparency, explained Rachel Botsman in the same Davos conversation, is not synonymous with trust. A visiting professor at the University of Oxford’s Saïd Business School, Botsman authored a book on technology and trust entitled “Who Can You Trust?” “You’ve actually given up on trust if you need for things to be transparent,” she said. “We need to trust the intention of these companies.”
I think it is precisely the way that small design flourishes are used to imperceptibly influence consumers that justifies more intensive ethics and legal education for designers and engineers. Engineers of physical structures belong to formal associations that can evaluate the appropriateness of their members’ creations and conduct. Maybe it’s time for equivalent professional networks to be built for the engineers and developers who are building the current era’s equivalents to bridges, roads, and motor vehicles.
An agency like TfL could also use uber-accurate tracking data to send out real-time service updates. “If no passengers are using a particular stairway, it could alert TfL that there’s something wrong with the stairway—a missing step or a scary person,” Kaufman says. (Send emergency services stat.)
The Underground won’t know exactly what it can do with this data until it starts crunching the numbers. That will take a few months. Meanwhile, TfL has set about quelling a mini-privacy panic—riders who don’t want to share data with the agency, Sager Weinstein recommends, should shut off their mobile device’s Wi-Fi.
So, on the one hand, they’ll apply norms and biases to ascertain why their data ‘says’ certain things. But to draw these conclusions the London transit authority will collect information from customers, and the only way to disable this collection is to reduce the functionality of your device when you’re in a public space. Sounds like a recipe for genuinely consensual collection of data and subsequent data ‘analysis’.
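To make the collection concrete: systems like this typically log the MAC addresses that devices broadcast near each access point and then count distinct devices per location. TfL’s actual pipeline is not public, so the following is only an illustrative sketch—the salted-hash pseudonymization, the log format, and all identifiers here are my assumptions, not TfL’s implementation.

```python
import hashlib

def hash_mac(mac: str, salt: str) -> str:
    """Pseudonymize a device MAC address with a salted hash.

    Purely illustrative: real deployments vary, and salted hashes of
    MACs are only weakly anonymous given the small address space.
    """
    return hashlib.sha256((salt + mac.lower()).encode()).hexdigest()[:16]

def count_presence(probe_log, salt="daily-rotating-salt"):
    """Count distinct (pseudonymous) devices seen at each access point."""
    seen = {}
    for ap, mac in probe_log:
        seen.setdefault(ap, set()).add(hash_mac(mac, salt))
    return {ap: len(devices) for ap, devices in seen.items()}

# Hypothetical probe-request log: (access point, broadcast MAC) pairs.
log = [
    ("stairway-3", "AA:BB:CC:00:11:22"),
    ("stairway-3", "AA:BB:CC:00:11:22"),  # same device, counted once
    ("stairway-3", "DD:EE:FF:33:44:55"),
    ("platform-1", "AA:BB:CC:00:11:22"),
]
print(count_presence(log))  # {'stairway-3': 2, 'platform-1': 1}
```

Note that the only opt-out in this model is exactly the one described above: stop broadcasting, i.e., turn Wi-Fi off.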
A senior Turkish official said Turkish intelligence cracked the app earlier this year and was able to use it to trace tens of thousands of members of a religious movement the government blames for last month’s failed coup.
Members of the group stopped using the app several months ago after realising it had been compromised, but it still made it easier to swiftly purge tens of thousands of teachers, police, soldiers and justice officials in the wake of the coup.
Starting in May 2015, Turkey’s intelligence agency was able to identify close to 40,000 undercover Gülenist operatives, including 600 ranking military personnel, by mapping connections between ByLock users, the Turkish official said.
However, the Turkish official said that while ByLock helped the intelligence agency identify Gülen’s wider network, it was not used for planning the coup itself. Once Gülen network members realised ByLock had been compromised they stopped using it, the official said.
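The “mapping connections between ByLock users” described above is, in essence, graph analysis: start from known accounts and walk outward along communication links. A minimal sketch of that idea, assuming nothing more than a list of who-messaged-whom pairs and a few seed accounts (all names and the hop limit here are hypothetical):

```python
from collections import deque

def expand_network(edges, seeds, max_hops=2):
    """Breadth-first expansion: flag every account within max_hops
    message-links of a known seed account, recording its distance."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    flagged = {s: 0 for s in seeds}
    queue = deque(seeds)
    while queue:
        user = queue.popleft()
        if flagged[user] == max_hops:
            continue  # don't expand past the hop limit
        for peer in graph.get(user, ()):
            if peer not in flagged:
                flagged[peer] = flagged[user] + 1
                queue.append(peer)
    return flagged

# Hypothetical message graph: "c" is three hops out, "x"/"y" unconnected.
edges = [("seed1", "a"), ("a", "b"), ("b", "c"), ("x", "y")]
print(expand_network(edges, {"seed1"}))  # {'seed1': 0, 'a': 1, 'b': 2}
```

The unsettling point is how little this requires: no message content at all, just metadata about who talked to whom.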
But intelligence services and policing agencies are still ‘Going Dark’…
Both groups had significant improvements in body composition, fitness, physical activity and diet, with no significant difference between groups, they said.
In total, 75 per cent of participants completed the study.
Estimated average weights for the group wearing trackers were 212 pounds at study entry and 205 pounds at 24 months, resulting in an average weight loss of about 7.7 pounds.
In comparison, those in the website group started out at 210 pounds when the study began and weighed in at 197 pounds at 24 months, for an average loss of 13 pounds.
Still, Jakicic said in an email: “We should not send the message that these wearable technologies do not help with weight loss — there were some in our study for whom it made a difference.”
I would argue that the ‘advantage’ the trackers offer is to motivate people who might otherwise be less mindful of their daily activity to increase it. The headline of the article directly contradicts the point made by the study’s author: that the message should not be that wearables do not help with weight loss.
Perhaps one of the broader issues is that weight loss is predominantly associated with dietary changes, whereas fitness trackers focus on activity. As such, meeting fitness tracker goals (absent food monitoring) can lead to less weight loss than more comprehensive health and diet tracking.
Right now, there are probably many journalists, human rights organizations and democracy activists walking around oblivious to the invisible tracking that is going on behind their backs. It’s time to wake up to the silent epidemic of targeted digital attacks on civil society and do something about it.
The protections built into our technologies are flimsy and routinely subverted. The merits of the ‘first to market’ ethos that dominates technical innovation must be contrasted, and weighed, against the mortal risk these same technologies pose to some users.
Many hard problems require you to step back and consider whether you’re solving the right problem. If your solution only mitigates the symptoms of a deeper problem, you may be calcifying that problem and making it harder to change.
Ethan’s essay is a long response to Shane Snow’s proposals for prison reform. In short, Snow is aiming to adjust conditions inside prisons without considering whether a broader series of social issues is responsible for leading to incarceration in the first place. And, worse, he’s making his proposals without lived experience of what prison itself is like.
The crux of Ethan’s argument doesn’t really concern which kinds of prison reform are(n’t) appropriate so much as this: is it appropriate for a given person, or group, to solve the problem(s) in the first place? Are they even capable of identifying what the problem(s) are?
I think that this kind of attitude – of humility and appreciation for one’s limited perspective on the world – is something that should be taken up by more technologists, policy makers, and law makers. Too often we assume we know how to help without even knowing whether, and if so why and under what conditions, help is needed in the first place.