Rina Steenkamp - Privacy and technology
[The privacy challenges of big data - a view from the lifeguard's chair | Judged by the tin man - individual rights in the age of big data | Protecting their own - fundamental rights implications for EU data sovereignty in the cloud | Information privacy in the cloud | How much will PRISM cost the U.S. cloud computing industry? | Rangzen - Circumventing government-imposed communication blackouts | Press suckered by anti-Google group's bogus claim that Gmail users can't expect privacy | Piaw@tch - the Privacy Impact Assessment observatory | Subject access code of practice | [Updated guidance on network attacks and malware] | Necessary and proportionate - International principles on the application of human rights to communications surveillance | Case of Youth Initiative for Human Rights v. Serbia | Biometric ID cybersurveillance | Privacy and safety on Facebook - a guide for survivors of abuse | Mobile health and fitness applications and information privacy | Report to the President - MIT and the prosecution of Aaron Swartz | Conducting privacy impact assessments code of practice | A case of collusion - a study of the interface between ad libraries and their apps | Browser security comparative analysis - Privacy settings | VERIS Community Database (VCDB) | Breach Watch | Global corporate IT security risks - 2013 | The economic impact of cybercrime and cyber espionage | Bradley Manning Espionage Act conviction a blow to both whistleblowers and journalists]
A keynote address by FTC Chairwoman Edith Ramirez.
From the text:
"My topic today is 'big data' and the privacy challenges it may pose to consumers. I want to explore how we can reap the benefits of big data without falling prey to possible pitfalls."
Full text (PDF):
See also:
An essay by Omer Tene and Jules Polonetsky.
From the Abstract:
"In this essay, we present some of the privacy and non-privacy risks of big data as well as directions for potential solutions. In a previous paper, we argued that the central tenets of the current privacy framework, the principles of data minimization and purpose limitation, are severely strained by the big data technological and business reality. Here, we assess some of the other problems raised by pervasive big data analysis. [...] In this article we argue that the focus on the machine is a distraction from the debate surrounding data driven ethical dilemmas, such as privacy, fairness and discrimination. The machine may exacerbate, enable, or simply draw attention to the ethical challenges, but it is humans who must be held accountable. Instead of vilifying machine-based data analysis and imposing heavy-handed regulation, which in the process will undoubtedly curtail highly beneficial activities, policymakers should seek to devise agreed-upon guidelines for ethical data analysis and profiling. Such guidelines would address the use of legal and technical mechanisms to obfuscate data; criteria for calling out unethical, if not illegal, behavior; categories of privacy and non-privacy harms; and strategies for empowering individuals through access to data in intelligible form."
Full text (SSRN):
See also:
A paper by Judith Rauhofer and Caspar Bowden.
Abstract:
"The recent PRISM scandal has illustrated the privacy risks that EU citizens take when their personal information is stored or processed in the cloud. Although EU data protection laws are designed to restrict the private actors handling that data from processing it in a way and for purposes that are unlawful, those laws have no effect on public bodies, including law enforcement and security agencies in third countries whose access to that data may be authorized by the laws of their own countries. This is the case even if such access would violate the individual's fundamental human rights had it occurred within the EU. This article examines the means by which the existing EU data protection framework restricts the transfer of personal data from the EU to third countries particularly in a cloud context. It analyses whether the European Commission's proposal for a new Data Protection Regulation in its current form is likely to increase or reduce the protection provided to EU citizens in this regard, and it looks at the potential threat that the laws of third countries may pose to EU citizens' right to privacy with respect to data uploaded to the cloud. The article assesses, in particular, the laws authorising the US government's access to personal data held or processed by US cloud providers, focusing specifically on the US Foreign Intelligence Surveillance Act of 1978 (FISA) . It also highlights the lack of equivalent protections currently granted to EU citizens by the US constitution. The article argues that in the light of the clear and present danger that provisions like §1881a of FISA represent to EU citizens' right to privacy, the EU institutions - as part of their own obligation under the Charter of Fundamental Rights and, in the future, the European Convention on Human Rights must take the appropriate steps to protect their citizens from this kind of interference."
Full text (SSRN):
See also:
An article by Paul M. Schwartz.
From the Abstract:
"This Article examines three areas of change in personal data processing due to the cloud. The first area of change concerns the nature of information processing at companies. For many organizations, data transmissions are no longer point-to-point transactions within one country; they are now increasingly international in nature. As a result of this development, the legal distinction between national and international data processing is less meaningful than in the past. Computing activities now shift from country to country depending on load capacity, time of day, and a variety of other concerns. The jurisdictional concepts of EU law do not fit well with these changes in the scale and nature of international data processing. A second legal issue concerns the multi-directional nature of modern data flows, which occur today as a networked series of processes made to deliver a business result. Due to this development, established concepts of privacy law, such as the definition of 'personal information' and the meaning of 'automated processing' have become problematic. There is also no international harmonization of these concepts. As a result, European Union and U.S. officials may differ on whether certain activities in the cloud implicate privacy law. A final change relates to a shift to a process-oriented management approach. Users no longer need to own technology, whether software or hardware, that is placed in the cloud. Rather, different parties in the cloud can contribute inputs and outputs and execute other kinds of actions. In short, technology has provided new answers to a question that Ronald Coase first posed in 'The Nature of the Firm.' New technologies and accompanying business models now allow firms to approach 'make or buy' decisions in innovative ways. Yet, privacy law's approach to liability for privacy violations and data losses in the new 'make or buy' world of the cloud may not create adequate incentives for the multiple parties who handle personal data."
Full text (SSRN):
See also:
An Information Technology & Innovation Foundation report by Daniel Castro.
From 'Findings: the impact on U.S. cloud service providers':
"Just how much do U.S. cloud computing providers stand to lose from PRISM? At this stage it is unclear how much damage will be done, in part because it is still not certain how the U.S. government will respond. But it is possible to make some reasonable estimates about the potential impact. On the low end, U.S. cloud computing providers might lose $21.5 billion over the next three years. This estimate assumes the U.S. eventually loses about 10 percent of foreign market to European or Asian competitors and retains its currently projected market share for the domestic market. On the high end, U.S. cloud computing providers might lose $35.0 billion by 2016. This assumes the U.S. eventually loses 20 percent of the foreign market to competitors and retains its current domestic market share."
Full text (PDF linked from this page):
See also:
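The estimates above reduce to simple arithmetic: projected foreign revenue of U.S. cloud providers multiplied by the share of that business assumed lost. A minimal sketch of that calculation in Python, using placeholder revenue figures for illustration rather than the report's own projections (so the totals below will not match the report's $21.5B/$35.0B figures):

    # Hedged sketch: loss = projected foreign revenue of U.S. providers * share lost.
    # The revenue numbers below are placeholders, NOT the report's own projections.

    def projected_loss(foreign_revenue_by_year, share_lost):
        """Sum the revenue U.S. cloud providers would forgo if they lose
        `share_lost` of their foreign business in each projected year."""
        return sum(revenue * share_lost for revenue in foreign_revenue_by_year)

    # Hypothetical foreign revenue projections for 2014-2016, in billions of USD.
    foreign_revenue = [60.0, 75.0, 90.0]

    low_end = projected_loss(foreign_revenue, 0.10)   # assume ~10% of foreign business lost
    high_end = projected_loss(foreign_revenue, 0.20)  # assume ~20% of foreign business lost
    print(f"Low-end loss estimate:  ${low_end:.1f}B")
    print(f"High-end loss estimate: ${high_end:.1f}B")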
A paper by Giulia Fanti, Yahel Ben David, Sebastian Benthall, Eric Brewer and Scott Shenker.
Abstract:
"A challenging problem in dissent networking is that of circumventing large-scale communication blackouts imposed by oppressive governments. Although prior work has not focused on the need for user anonymity, we contend that it is essential. Without anonymity, governments can use communication networks to track and persecute users. A key challenge for decentralized networks is that of resource allocation and control. Network resources must be shared in a manner that deprioritizes unwanted traffic and abusive users. This task is typically addressed through reputation systems that conflict with anonymity. Our work addresses this paradox: We prioritize resources in a privacy-preserving manner to create an attack-resilient, anonymity-preserving, mobile ad-hoc network. Our prioritization mechanism exploits the properties of a social trust graph to promote messages relayed via trusted nodes. We present Rangzen, a microblogging solution that uses smartphones to opportunistically relay messages among citizens in a delay-tolerant network (DTN) that is independent of government or corporate-controlled infrastructure."
Full text (PDF linked from this page):
See also:
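The prioritization mechanism described in the abstract can be illustrated with a rough sketch: messages queued on a delay-tolerant relay are ordered by a score derived from the carrier's social trust graph, so traffic vouched for by trusted nodes is forwarded first and unknown or abusive traffic is deprioritized rather than dropped. The scoring rule and data shapes below are illustrative assumptions, not Rangzen's actual protocol:

    # Illustrative sketch only: rank queued DTN messages by a trust score derived
    # from a social graph. The scoring rule is an assumption, not Rangzen's mechanism.
    from dataclasses import dataclass

    @dataclass
    class Message:
        payload: str
        relayed_by: frozenset  # ids of the nodes that have carried this message

    def trust_score(message, friends):
        """Fraction of a message's relay path that overlaps with our own friend
        set -- a crude proxy for 'relayed via trusted nodes'."""
        if not message.relayed_by:
            return 0.0
        return len(message.relayed_by & friends) / len(message.relayed_by)

    def prioritize(queue, friends):
        """Order the outgoing queue so highly trusted messages are sent first."""
        return sorted(queue, key=lambda m: trust_score(m, friends), reverse=True)

    # Example: our friends are nodes A and B; a message relayed only by strangers
    # ends up at the back of the queue.
    friends = frozenset({"A", "B"})
    queue = [Message("protest at noon", frozenset({"A", "C"})),
             Message("spam", frozenset({"X", "Y"}))]
    for msg in prioritize(queue, friends):
        print(msg.payload)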
A blog post by Mike Masnick (Techdirt).
From the blog post:
"Okay, so as a bunch of folks have been sending over today, there's been a bit of a furor over a press release pushed out by Consumer Watchdog [...] The 'story' claims that Google has admitted in court that there is no expectation of privacy over Gmail. This is not actually true - but we'll get to that. This story is a bit complex because the claims in most of the news coverage about this are simply wrong - but I still think Google made a big mistake in making this particular filing. So, first, let's explain why the coverage is completely bogus trumped up bullshit from Consumer Watchdog, and then we'll explain why Google still shouldn't have made this filing."
Full text:
A website by Eric Charikane.
From the 'About' page:
"'PIAw@tch the Privacy Impact Assessment observatory' is online since May 18, 2011. Its first aim is to be a database all about 'Privacy Impact Assessment' (PIA). It has been founded and built by Eric Charikane, a french privacy professional, as an helping tool while he was carrying out a worldwide study (during 2010/2011) to understand every aspects of the PIA."
Full text:
A publication by the ICO.
From 'About this code of practice':
"This code of practice explains the rights of individuals to access their personal data. It also clarifies what you must do in this regard to comply with your duties as a data controller."
Full text (PDF):
See also:
Updated publications by NIST.
From the accompanying web page:
"Detecting and stopping malicious attacks on computer networks is a central focus of computer security these days. The National Institute of Standards and Technology (NIST) is asking for comments on two updated guides on malicious computer attacks: one on preventing, detecting, and responding to attacks and one on preventing and mitigating the effects of malware, a potent tool in an attacker's arsenal. The publications are being revised to reflect the changes in threats and incidents."
Full text (PDF files linked from this page):
See also:
An initiative supported by 100+ signatories.
From the text:
"As technologies that facilitate State surveillance of communications advance, States are failing to ensure that laws and regulations related to communications surveillance adhere to international human rights and adequately protect the rights to privacy and freedom of expression. This document attempts to explain how international human rights law applies in the current digital environment, particularly in light of the increase in and changes to communications surveillance technologies and techniques. These principles can provide civil society groups, industry, States and others with a framework to evaluate whether current or proposed surveillance laws and practices are consistent with human rights. These principles are the outcome of a global consultation with civil society groups, industry and international experts in communications surveillance law, policy and technology."
Full text:
See also:
Judgment by the European Court of Human Rights.
From the text:
"5. The applicant is a non-governmental organisation set up in 2003 and based in Belgrade. It monitors the implementation of transitional laws with a view to ensuring respect for human rights, democracy and the rule of law. 6. On 31 October 2005 the applicant requested the intelligence agency of Serbia (Bezbednosno-informativna agencija) to inform it how many people had been subjected to electronic surveillance by that agency in 2005. 7. On 4 November 2005 the agency refused the request, relying thereby on section 9(5) of the Freedom of Information Act 2004. [...] the Court unanimously [...] 4. Holds that the respondent State must ensure, within three months from the date on which the judgment becomes final in accordance with Article 44 § 2 of the Convention, that the intelligence agency of Serbia provide the applicant with the information requested [...]"
Full text:
See also:
A paper by Margaret Hu.
Abstract:
"The implementation of a universal digitalized biometric ID system risks normalizing and integrating mass cybersurveillance into the daily lives of ordinary citizens. ID documents such as driver's licenses in some states and all U.S. passports are now implanted with radio frequency identification (RFID) technology. In recent proposals, Congress has considered implementing a digitalized biometric identification card — such as a biometric-based, 'high-tech' Social Security Card — which may eventually lead to the development of a universal multimodal biometric database (e.g., the collection of the digital photos, fingerprints, iris scans, and/or DNA of all citizens and noncitizens). Such 'hightech' IDs, once merged with GPS-RFID tracking technology, would facilitate exponentially a convergence of cybersurveillance-body tracking and data surveillance, or dataveillance-biographical tracking. Yet, the existing Fourth Amendment jurisprudence is tethered to a 'reasonable expectation of privacy' test that does not appear to restrain the comprehensive, suspicionless amassing of databases that concern the biometric data, movements, activities, and other personally identifiable information of individuals. In this Article, I initiate a project to explore the constitutional and other legal consequences of big data cybersurveillance generally and mass biometric dataveillance in particular. This Article focuses on how biometric data is increasingly incorporated into identity management systems through bureaucratized cybersurveillance or the normalization of cybersurveillance through the daily course of business and integrated forms of governance."
Full text (SSRN):
See also:
A guide by NNEDV.
From the guide:
"The National Network to End Domestic Violence and Facebook have teamed up to offer tips for survivors of abuse so that you can still use Facebook but maintain safety and control over your information. This guide is aimed at helping survivors of domestic violence, sexual assault and stalking know how to use Facebook in a way that ensures that they stay connected with friends and family, but control their safety and privacy to help prevent misuse by abusers, stalkers, and perpetrators to stalk and harass."
Full text (PDF):
See also:
A study by Privacy Rights Clearinghouse.
From the accompanying web page:
"Many individuals use mobile apps to monitor their health, learn about specific medical conditions, and help them achieve personal fitness goals. Apps in the 'wellness' space include those that support diet and exercise programs; pregnancy trackers; behavioral and mental health coaches; symptom checkers that can link users to local health services; sleep and relaxation aids; and personal disease or chronic condition managers. After studying 43 popular health and fitness apps (both free and paid) from both a consumer and technical perspective, it is clear that there are considerable privacy risks for users – and that the privacy policies for those apps that have policies do not describe those risks."
Full text (PDF files linked from this page, including a consumer-level report and a technical analysis):
See also:
A report by Harold Abelson, Peter A. Diamond, Andrew Grosso and Douglas W. Pfeiffer (support).
From 'Conclusion':
"As the length of this report demonstrates, the narrative of MIT's involvement in the events around Aaron Swartz's arrest and prosecution is extensive and intricate. [...] In concluding this review, we recognize the desire for a simple take-away, a conclusion that 'if MIT had only done this rather than that, things would have turned out OK.' We can't offer one. There were too many choices, too many might-have-beens, too great an emotional shock, and a public response that has been supercharged by the power of the Internet, the same power that Aaron Swartz epitomized and that he helped to create. Even today, with the benefit of hindsight, we have not found a silver bullet with which MIT could have simply prevented the tragedy. If the Review Panel is forced to highlight just one issue for reflection, we would choose to look to the MIT administration’s maintenance of a 'neutral' hands-off attitude that regarded the prosecution as a legal dispute to which it was not a party. This attitude was complemented by the MIT community's apparent lack of attention to the ruinous collision of hacker ethics, open-source ideals, questionable laws, and aggressive prosecutions that was playing out in its midst. As a case study, this is a textbook example of the very controversies where the world seeks MIT's insight and leadership. [...] In closing, our review can suggest this lesson: MIT is respected for world-class work in information technology, for promoting open access to online information, and for dealing wisely with the risks of computer abuse. The world looks to MIT to be at the forefront of these areas. Looking back on the Aaron Swartz case, the world didn't see leadership. As one person involved in the decisions put it: 'MIT didn't do anything wrong; but we didn't do ourselves proud.'"
Full text (PDF linked from this page):
See also:
A consultation document by the ICO.
From 'About this code':
"Privacy impact assessments (PIAs) are a tool which can help organisations identify the most effective way to comply with their data protection obligations and meet individuals’ expectations of privacy. An effective PIA will allow organisations to identify and fix problems at an early stage, reducing the associated costs and damage to reputation which might otherwise occur. This code explains the principles which form the basis for a PIA. It sets out the basic steps which an organisation should carry out during the assessment process. The code also contains a set of screening questions to help organisations identify when a PIA is necessary, and a template which can be used to help produce a PIA report."
Full text (PDF of the code of practice and other documentation linked from this page - note: this is a generic consultations page, check under 'closed consultations' if the links are no longer here):
See also:
A paper by Theodore Book and Dan S. Wallach.
Abstract:
"A growing concern with advertisement libraries on Android is their ability to exfiltrate personal information from their host applications. While previous work has looked at the libraries' abilities to extract private information from the system, advertising libraries also include APIs through which a host application can deliberately leak private information about the user. This study considers a corpus of 114,000 apps. We reconstruct the APIs for 103 ad libraries used in the corpus, and study how the privacy leaking APIs from the top 20 ad libraries are used by the 64,000 applications in which they are included. Notably, we have found that app popularity correlates with privacy leakage; the marginal increase in advertising revenue, multiplied over a larger user base, seems to incentivize these app vendors to violate their users’ privacy."
Full text (PDF, Arxiv):
See also:
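The paper's headline observation is a correlation between app popularity and privacy leakage. A toy sketch of such a check, with invented per-app figures standing in for the paper's measurements over its 64,000-app corpus:

    # Illustrative sketch: correlate app popularity with the number of
    # privacy-leaking ad-library API calls per app. The data is invented.
    from statistics import correlation  # Python 3.10+

    # (downloads, privacy-leaking API calls) for a handful of hypothetical apps
    apps = [(1_000, 0), (50_000, 1), (500_000, 2), (5_000_000, 4), (20_000_000, 5)]

    downloads = [d for d, _ in apps]
    leaks = [l for _, l in apps]

    # Pearson correlation coefficient; a value near +1 would echo the paper's
    # observation that more popular apps tend to leak more user data to ad libraries.
    print(f"correlation(popularity, leakage) = {correlation(downloads, leaks):.2f}")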
An NSS Labs report by Randy Abrams, Orlando Barrera and Jayendra Pathak.
From 'Overview':
"Privacy is an issue on the front lines of the browser wars. Both Apple and Microsoft have taken steps to improve privacy, with the most notable action being Microsoft's effective enabling of Do Not Track by default in Internet Explorer 10. Third-party cookies have been disabled by default in Apple's Safari for some time now. Google and Mozilla, which is heavily subsidized by Google, have actively avoided providing privacy protections to consumers, with Google going so far as to bypass Safari's cookie blocking mechanism, an action that led to a $22.5 million USD fine. In this comparative analysis, NSS Labs examines the privacy mechanisms built into the browsers and assesses their implications for user privacy. "
Full text (PDF linked from this page):
See also:
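Do Not Track, the setting at the centre of the dispute described above, is simply an HTTP request header (DNT: 1) that a browser sends and a site may choose to respect. A minimal sketch of a server respecting it, using Flask purely as an example framework:

    # Minimal sketch: check the browser's Do Not Track header before setting a
    # tracking cookie. Flask is used only as a convenient example framework.
    from flask import Flask, request, make_response

    app = Flask(__name__)

    @app.route("/")
    def index():
        response = make_response("hello")
        # Browsers with DNT enabled (e.g. IE10's default, discussed above)
        # send "DNT: 1"; only set a tracking cookie when the header is absent.
        if request.headers.get("DNT") != "1":
            response.set_cookie("tracking_id", "example", max_age=3600)
        return response

    if __name__ == "__main__":
        app.run()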
A project by Verizon.
From the accompanying blog post:
"While there are a handful of efforts to capture security incidents that are publicly disclosed, there is no unrestricted, comprehensive raw dataset available for download on security incidents that is sufficiently rich to support both community research and corporate decision-making. There are organizations that collect—and in some form—disseminate aggregated collections, but they are either not in a format that lends itself to ease of data manipulation and transformation required for research, or the underlying data are not freely and publicly available for use. [...] To address this problem that has plagued the community, we are pleased to announce the VERIS Community Database (VCDB), which aims to collect and disseminate data breach information for all publicly disclosed data breaches. The data are coded into VERIS format and we also provided the dataset in an interactive visualization available for public use."
Full text (Public.tableausoftware.com):
See also:
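Because VCDB publishes raw VERIS-coded incident records as JSON, the dataset lends itself to direct analysis. A rough sketch that tallies incidents by VERIS action category (hacking, malware, misuse, error, ...), assuming one JSON file per incident in a local directory; the file layout and field names should be checked against the actual repository:

    # Rough sketch: tally VCDB incidents by top-level VERIS action category.
    # Assumes one JSON file per incident in ./vcdb/ -- verify against the real
    # VCDB repository layout before relying on this.
    import json
    from collections import Counter
    from pathlib import Path

    counts = Counter()
    for path in Path("vcdb").glob("*.json"):
        incident = json.loads(path.read_text())
        # In the VERIS schema the 'action' field is an object whose keys are
        # the action categories observed in the incident.
        counts.update(incident.get("action", {}).keys())

    for category, n in counts.most_common():
        print(f"{category:15s} {n}")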
A website by John Elliott.
From the front page:
"Breach Watch aims to be a useful repository of information about regulatory action taken as a result of data breaches. It provides a comprehensive archive of of ICO and FCA/FSA enforcement, helpful categorisation and occasional analysis."
Full text:
See also:
A report by Kaspersky Lab and B2B International.
From 'Main findings':
"In the past 12 months, 91% of the companies surveyed had at least one external IT security incident and 85% reported internal incidents. A serious incident can cost a large company an average of $649,000; for small and medium-sized companies the bill averages at about $50,000."
Full text (PDF):
See also:
A report by the Center for Strategic and International Studies and McAfee.
From the Introduction:
"Is cybercrime, cyber espionage, and other malicious cyber activities what some call 'the greatest transfer of wealth in human history,' or is it what others say is a 'rounding error in a fourteen trillion dollar economy?' The wide range of existing estimates of the annual loss—from a few billion dollars to hundreds of billions—reflects several difficulties. Companies conceal their losses and some are not aware of what has been taken. Intellectual property is hard to value. Some estimates relied on surveys, which provide very imprecise results unless carefully constructed. One common problem with cybersecurity surveys is that those who answer the questions 'self-select,' introducing a possible source of distortion into the results. Given the data collection problems, loss estimates are based on assumptions about scale and effect - change the assumption and you get very different results. These problems leave many estimates open to question. In this initial report we start by asking what we should count in estimating losses from cybercrime and cyber espionage"
Full text (PDF):
See also:
A statement by Freedom of the Press Foundation.
From the page:
"In the most important trial affecting whistleblower rights in years, Bradley Manning—the admitted source to the WikiLeaks disclosures—has been convicted on nineteen counts, including multiple Espionage Act and Computer Fraud and Abuse Act charges. He faces over 100 years in jail. While the most pernicious charge, 'aiding the enemy,' was thankfully rejected by the military judge, this decision is a terrible blow to both investigative journalists and the sources they rely on to inform the public."
Full text:
See also: