Link

Transparency Follows After Trust Is Lost

Via Wired:

Speaking at Davos, Uber CEO Dara Khosrowshahi pointed out that consumers face a challenge in trying to understand tech’s influence in the age of big data. He called this an “information asymmetry.” In his previous job, as CEO of Expedia, Khosrowshahi said, customers were shown a tropical island while they waited for their purchase page to show up. As a test, engineers replaced the placid image with a stressful one that showed a person missing a train. Purchases shot up. The company subbed in an even more stressful image of a person looking at a non-working credit card, and purchases rose again. One enterprising engineer decided to use an image of a cobra snake. Purchases went higher.

What’s good for a business isn’t always good for that business’s users. Yet Khosrowshahi stopped testing because he decided the experiment wasn’t in line with Expedia’s values. “A company starts having so much data and information about the user that if you describe it as a fight, it’s just not a fair fight,” said Khosrowshahi.
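As an aside, tests like Expedia’s are ordinarily scored with simple conversion-rate statistics. The sketch below shows one conventional way of asking whether a swapped-in image “really” lifted purchases, a two-proportion z-test; every count in it is invented for illustration, since neither Wired nor Khosrowshahi reports any figures.

```python
# Hypothetical scoring of an image A/B test with a two-proportion z-test.
# All counts are invented; the Wired piece reports no actual figures.
from math import sqrt

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# A: placid tropical island; B: stressful "missed train" image.
z = two_proportion_z(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000)
print(f"z = {z:.2f}")  # z is about 2.55; |z| > 1.96 is significant at the 5% level
```

The unsettling part, of course, is that the same ten lines of statistics serve equally well whether the tested image is a beach or a cobra.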

The tech industry often responds to these concerns with a promise to be more transparent—to better show how its products and services are created and how they impact us. But transparency, explained Rachel Botsman in the same Davos conversation, is not synonymous with trust. A visiting professor at the University of Oxford’s Saïd Business School, Botsman authored a book on technology and trust entitled “Who Can You Trust?” “You’ve actually given up on trust if you need for things to be transparent,” she said. “We need to trust the intention of these companies.”

It is precisely these small design flourishes, used to imperceptibly influence consumers, that justify more intensive ethics and legal education for designers and engineers. Engineers of physical structures belong to formal associations that can evaluate the appropriateness of their members’ creations and conduct. Maybe it’s time for equivalent professional networks to be built for the engineers and developers who are building the current era’s equivalents to bridges, roads, and motor vehicles.

Link

How foreign governments spy using PowerPoint and Twitter

How foreign governments spy using PowerPoint and Twitter:

Right now, there are probably many journalists, human rights organizations and democracy activists walking around oblivious to the invisible tracking that is going on behind their backs. It’s time to wake up to the silent epidemic of targeted digital attacks on civil society and do something about it.

The protections built into our technologies are flimsy and routinely subverted. The merits of the ‘first to market’ ethos that dominates technical innovation must be contrasted, and weighed, against the mortal risks these same technologies pose to some users.

Link

Ethical hackers say government regulations put information at risk

Ethical hackers say government regulations put information at risk:

The chilling effect on vulnerability disclosure stems from the potential legal liability of reporting vulnerabilities to software vendors. While it’s often (though not always) the case that technical staff understand the problems and may work to mitigate them, things can go to hell pretty quickly once non-technical staff such as legal or public relations get involved.

In effect, the incentive for white hats to come forward to help the commons of software users breaks down incredibly quickly in the face of harsh penalties for ‘breaking digital locks’ or violating terms of service, penalties that corporate vendors can (and do) leverage to maintain their public reputations.

Quote

We agree that Cloud Computing, the Internet of Things, and Big Data analytics are all trends that may yield remarkable new correlations, insights, and benefits for society at large. While we have no intention of standing in the way of progress, it is essential that privacy practitioners participate in these efforts to shape trends in a way that is truly constructive, enabling both privacy and Big Data analytics to develop, in tandem.

There is a growing understanding that innovation and competitiveness must be approached from a “design-thinking” perspective — namely, viewing the world to overcome constraints in a way that is holistic, interdisciplinary, integrative, creative and innovative. Privacy must also be approached from the same design-thinking perspective. Privacy and data protection should be incorporated into networked data systems and technologies by default, and become integral to organizational priorities, project objectives, design processes, and planning operations. Ideally, privacy and data protection should be embedded into every standard, protocol, and data practice that touches our lives. This will require skilled privacy engineers, computer scientists, software designers and common methodologies that are now being developed, hopefully to usher in an era of Big Privacy.

We must be careful not to naively trust data users, or unnecessarily expose individuals to new harms, unintended consequences, power imbalances and data paternalism. A “trust me” model will simply not suffice. Trust but verify — embed privacy as the default, thereby growing trust and enabling confirmation of trusted practices.

I’m generally sympathetic to the arguments made in this article, though I have a series of concerns that are (I hope) largely the result of the authors trying to write an inoffensive article that large organizations could act on. To begin, while I understand that Commissioner Cavoukian has built her reputation on working with partners rather than radically opposing corporations’ behaviours, I’m left asking: what constitutes ‘progress’ for her and her German counterpart, Dr. Dix?

Specifically, Commissioners Cavoukian and Dix assert that they have no intention of standing in the way of progress and (generally) that a more privacy-protective approach means we can enjoy progress and privacy at the same time. But how do the Commissioners ‘spot’ progress? How do they know what to oppose and not oppose? When must, and mustn’t, they stand in the way of a corporation’s practices?

The question of defining progress is tightly linked to my other concern about this quoted part of their article. Specifically, the Commissioners acknowledge that a ‘positive-sum’ approach to privacy and progress requires “skilled privacy engineers, computer scientists, software designers and common methodologies that are now being developed, hopefully to usher in an era of Big Privacy.” That these groups are important is true. But where are the non-engineers, non-software designers, and (presumably) non-lawyers? Social scientists, along with arts and humanities scholars and graduates, can also help sensitize organizations to privacy, to user interests, and to the history behind certain decisions.

Privacy isn’t something that is only understandable by lawyers or engineers. And, really, it would be better understood and protected if more people were involved in the discussion. Potential contributors to these debates shouldn’t be excluded simply because they contest or demand definitions of ‘progress’, or because they come from outside law and computer development. Rather, they should be welcomed for expanding the debate beyond the contemporary echo chamber of the usual disciplinary actors.

Quote

CryptDB, a project out of MIT’s Computer Science and Artificial Intelligence Lab (CSAIL), may be a solution for this problem. In theory, it would let you glean insights from your data without letting even your own personnel “see” that data at all, said Dr. Sam Madden, CSAIL director, on Friday.

“The goal is to run SQL on encrypted data, you don’t even allow your admin to decrypt any of that data and that’s important in cloud storage,” Madden said at an SAP-sponsored event at Hack/reduce in Cambridge, Mass.

This is super interesting work that, if successful, could open a lot of sensitive data to mining. However, it needs to be extensively tested.
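To make the idea concrete, here is a toy sketch of the equality-query case only. It uses HMAC-based ‘blind index’ tokens rather than CryptDB’s actual onion of deterministic, order-preserving, and homomorphic encryption, and the table, column names, and key handling are all invented for illustration; treat it as a gesture at the technique, not anything resembling the real system.

```python
# Toy sketch of equality queries over data a server cannot read, in the
# spirit of CryptDB's deterministic ("DET") layer. This is NOT CryptDB:
# HMAC blind indexes aren't reversible encryption, and a real deployment
# would need far more careful key management.
import hashlib
import hmac
import sqlite3

KEY = b"client-side secret"  # hypothetical key; never leaves the client

def token(value: str) -> str:
    """Deterministic token: equal plaintexts yield equal tokens."""
    return hmac.new(KEY, value.encode(), hashlib.sha256).hexdigest()

# The "untrusted" server stores and queries tokens only.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE patients (name_tok TEXT, diagnosis_tok TEXT)")
for name, diagnosis in [("alice", "flu"), ("bob", "flu"), ("carol", "cold")]:
    db.execute("INSERT INTO patients VALUES (?, ?)",
               (token(name), token(diagnosis)))

# The client rewrites WHERE diagnosis = 'flu' into a token comparison;
# the server runs ordinary SQL without ever seeing a plaintext value.
count = db.execute("SELECT COUNT(*) FROM patients WHERE diagnosis_tok = ?",
                   (token("flu"),)).fetchone()[0]
print(count)  # -> 2
```

The real system also answers range and aggregate queries by layering order-preserving and homomorphic encryption on top, which a sketch like this cannot capture.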

One thing that is baked into this project, however, is the assumption that large-scale data mining is good or appropriate. I’m not taking the position that it’s wrong, but I note that there isn’t any discussion – that I can find – where journalists think through whether such sensitive information should be mined in the first place. We are (seemingly) foreclosing this basic and very important question and, in the process, eliding a whole series of important social and normative questions.

On Publicness and the Academy

Alex Reid has written a short piece about his position concerning the question: if an academic speaks in public, is it right for members of the audience to record, write about, or discuss what was said?

While I can’t say that I agree with one of the positions he assumes – that as an academic you should exclusively publish close-to-complete work (i.e. drafts or early works in progress that you don’t want talked about need not apply!) – it’s worth the read, especially given that many academics are loath to have ‘early’ work broadcast beyond tightly controlled confines and populations.

Alex has a great punchline, emphasizing that academics are, for the first time, widely seeing their work become public and thus critiqued and engaged with. It’s scary for a lot of people, but it’s definitely the new reality of academe. The post is well worth the few minutes it’ll take you to read!

Link

Surprise: American Equipment Spies on Iranians

Steve Stecklow, for Reuters, has a special report discussing how Chinese vendor ZTE was able to resell American network infrastructure and surveillance products to the Iranian government. The equipment sold is significant:

Mahmoud Tadjallimehr, a former telecommunications project manager in Iran who has worked for major European and Chinese equipment makers, said the ZTE system supplied to TCI was “country-wide” and was “far more capable of monitoring citizens than I have ever seen in other equipment” sold by other companies to Iran. He said its capabilities included being able “to locate users, intercept their voice, text messaging … emails, chat conversations or web access.”

The ZTE-TCI documents also disclose a backdoor way Iran apparently obtains U.S. technology despite a longtime American ban on non-humanitarian sales to Iran – by purchasing them through a Chinese company.

ZTE’s 907-page “Packing List,” dated July 24, 2011, includes hardware and software products from some of America’s best-known tech companies, including Microsoft Corp, Hewlett-Packard Co, Oracle Corp, Cisco Systems Inc, Dell Inc, Juniper Networks Inc and Symantec Corp.

ZTE has partnerships with some of the U.S. firms. In interviews, all of the companies said they had no knowledge of the TCI deal. Several – including HP, Dell, Cisco and Juniper – said in statements they were launching internal investigations after learning about the contract from Reuters.

The sale of Western networking and surveillance equipment and software to the Iranian government isn’t new. In the past, corporate agents for major networking firms have explained to me how Iran successfully imports the equipment; while firms may not positively know that this is going on, that is typically because of an intentional willingness to ignore what they strongly suspect is happening. Regardless, the actual sale of this specific equipment – while significant – isn’t a story that Western citizens can do much to change at this point.

Really, we should be asking: do we, as citizens of Western nations, believe that manufacturing this kind of equipment is permissible? While some degree of surveillance capacity is arguably needed for lawful purposes within a democracy, it is theoretically possible to design devices so that they have limited intercept and analysis capability out of the box. In essence, we could demand that certain degrees of friction be baked into the surveillance equipment that is developed, and actively work to prevent companies from producing highly scalable and multifunctional surveillance equipment and software. Going forward, this could prevent the next sale of significant surveillance equipment to Iran on the grounds that the West simply doesn’t have any for (legal) sale.

In the case of government surveillance, inefficiency and a lack of scalability are advantageous insofar as they hinder governmental surveillance capabilities. Limited equipment would add time and resources to surveillance-driven operations, and thus demand a greater general intent to conduct surveillance than when authorities have access to easy-to-use, advanced, and scalable surveillance systems.

Legal frameworks are insufficient to protect citizens’ rights and privacy, as governments’ extensions and exploitations of those frameworks have demonstrated time and time again. We need normatively informed limits on surveillance equipment, built into the equipment at the vendor level. Anything less will only legitimize, rather than truly work towards stopping, the spread of surveillance equipment that is used to monitor citizens across the globe.

parislemon: This Is Why We Can’t Have Nice Things

I agree with parislemon’s general take on the targeting of Apple and labour: Apple isn’t alone, and we can’t ignore the role of local government in (not) regulating the state of affairs at Foxconn (or other large manufacturing) plants. This said, language like the following is unacceptable and intentionally uncritical:

While this report brings such an issue to the forefront, similar pieces and stories surface quite frequently, actually. Guess what changes? Nothing. It’s shitty to say, but it’s the truth. And we all know it.

The fact of the matter is that we live in a world that demands amazing technology delivered to us at low costs and at great speed. That world leads to Foxconn.

We say we care about the means by which the results are reached when we read stories such as this one. But then we forget. Or we choose not to remember. We buy things and we’re happy that they’re affordable. And then we buy more things. And more. With huge smiles on our faces. Without a care in the world.

In the above quotation, Siegler obfuscates the real role that our governments could have in shaping the supply chain. Imagine if certain imported products (e.g. electronics) had to be certified as meeting standardized ethical and human rights requirements. Would that increase the price of goods and prevent some from coming to market, initially? Certainly. But as a result, Chinese (and other foreign) companies would dramatically raise labor standards, because incredibly low standards would no longer be a competitive advantage. Prices would stabilize and we could buy iPhones, BlackBerry devices, and the rest without sleepless nights.

What must happen, however, is for the West to see beyond itself. Citizens must recognize that they can shape the world, and refuse to simply give up on the basis that change would threaten the existing, ethically bankrupt, neo-liberal economic practices that surround our lives. If the EU and North America refused to import ethically suspect electronics and gave significant preferential advantage to companies that are ethical in the production and disposal of goods, then significant change could occur.

It is our choice to enforce, or to refuse to enforce, basic human rights in the economic supply chain. Technology – its production, usage, and disposal – is rife with ethical quandaries. We have to seriously address them if we are to remedy the intolerable behaviours that companies like Foxconn perpetuate.

Link

Flexibility and Low Working Standards

The New York Times has a piece arguing – though the narrative is highly forgiving – that the flexibility ‘demanded’ by contemporary technology firms (amongst others) can only occur if they’re allowed to outsource labor. The reason? In countries like China you can rouse 8,000 people out of their dorms in their walled factory-city and put them to work almost instantly. In China, the government will subsidize the costs of massive factory development. And in China, you can find thousands of engineers – not ones with bachelor’s degrees, but with training that falls between high school and university – within two weeks.

In part, Asia was attractive because the semiskilled workers there were cheaper. But that wasn’t driving Apple. For technology companies, the cost of labor is minimal compared with the expense of buying parts and managing supply chains that bring together components and services from hundreds of companies.

For Mr. Cook, the focus on Asia “came down to two things,” said one former high-ranking Apple executive. Factories in Asia “can scale up and down faster” and “Asian supply chains have surpassed what’s in the U.S.” The result is that “we can’t compete at this point,” the executive said.

Never forget that language like ‘scale up and down’ really means ‘add and shed labor’, which further translates to ‘pay people so they can live and work, then rapidly fire them without cause.’ Moreover, the reason supply chains are so effective in Asia is that most of the bits and pieces of today’s gadgets are manufactured in dense techno-factory domains. These locations are incredibly hazardous to the individuals who work there and to the environments they are located within.

The ‘common sense’ of locating these factories in China shouldn’t obscure the fact that the West is benefiting from the hard labor of foreign citizens – labor that costs those citizens now, in their health and lives, and may poison them in the future, both as their factories destroy the local environment and as disposed products return as toxic e-waste.

There is an ethics to technology. We need to start actively thinking about it.