Rina Steenkamp - Privacy and technology
[Amici Curiae brief [ACLU v. Clapper] | Brief of Amicus Curiae [Riley v. California] | MetaPhone - the sensitivity of telephone metadata | Building privacy into mobile location analytics (MLA) through privacy by design | DeepFace - Closing the gap to human-level performance in face verification | Sometimes in class action settlements plaintiffs gain nothing, but risk everything | How do EU and US privacy regimes compare? | Privacy papers for policy makers 2013 | Are credit monitoring services worth it? | Operation Windigo - The vivisection of a large Linux server-side credential stealing malware campaign | Sophos mobile security threat report | Digital life in 2025 | Drones and targeted killing - defining a European position | [On the use of drones] | Metadata - Piecing together a privacy solution | Constitutional limits on surveillance - Association freedom in the age of data hoarding | I know why you went to the clinic - Risks and realization of HTTPS traffic analysis | Neural signatures of user-centered security - An fMRI study of phishing, and malware warnings | Robotics and the new cyberlaw | Handbook on European data protection law | The legal position and societal effects of security breach notification laws | Comparison of five data-breach bills currently pending in the Senate | The state of privacy 2014 | Meet Jack. Or, what the government could do with all that location data | 2013/2014 Data recovery project for National Association for Information Destruction - Data recovery & security report | Patient identification and matching - Final report | Regulating mass surveillance as privacy pollution - Learning from environmental impact statements]
An Amici Curiae brief submitted by Cindy Cohn, Mark Rumold and Andrew Crocker (EFF, on behalf of the Amici Curiae).
From the Introduction:
"It is not just metadata. Telephony metadata reveals private and sensitive information about people. It can reveal political affiliation, religious practices, and people's most intimate associations. It reveals who calls a suicide prevention hotline and who calls their elected official; who calls the local Tea Party office and who calls Planned Parenthood. The aggregation of telephony metadata—about a single person over time, about groups of people, or with other datasets—only intensifies the sensitivity of the information. Aggregated metadata 'generates a precise, comprehensive record' of people's habits, which in turn 'reflects a wealth of detail about [their] familial, political, professional, religious, and sexual associations.' United States v. Jones, 565 U.S. __, 132 S. Ct. 945, 955 (2012) (Sotomayor, J., concurring). The call records collected by the government are not just metadata - they are intimate portraits of the lives of millions of Americans."
Read more:
An Amicus Curiae brief submitted by Marc Rotenberg, Ginger McCall, Alan Butler and David Husband (EPIC).
Summary of the argument:
"Modern cell phone technology provides access to an extraordinary amount of personal data. Cell phone users routinely store sensitive and intimate information on a device that they keep close to their body. Misplacing a cellphone is an immediate cause for concern. Allowing police officers to search a person's cell phone without a warrant following an arrest would be a substantial infringement on privacy, is unnecessary, and unreasonable under the Fourth Amendment. First, the warrantless search of a cell phone provides access to personal information and private files, stored both on the phone and on remote servers that are accessible from the phone. Second, there is no need to allow warrantless searches when currently available techniques allow law enforcement to secure the cell phone data pending a judicial determination of probable cause. Neither of the interests recognized by this Court underlying the search incident to arrest exception would justify the warrantless search of an individual's cell phone."
Read more:
See also:
A blog post by Jonathan Mayer (Web Policy).
From the blog post:
"This is, at base, a factual dispute. Is it easy to draw sensitive inferences from phone metadata? How often do people conduct sensitive matters by phone, in a manner reflected by metadata? We used crowdsourced data to arrive at empirical answers. Since November, we have been conducting a study of phone metadata privacy. Participants run the MetaPhone app on their Android smartphone; it submits device logs and social network information for analysis. In previous posts, we have used the MetaPhone dataset to spot relationships, understand call graph interconnectivity, and estimate the identifiability of phone numbers. At the outset of this study, we shared the same hypothesis as our computer science colleagues—we thought phone metadata could be very sensitive. We did not anticipate finding much evidence one way or the other, however, since the MetaPhone participant population is small and participants only provide a few months of phone activity on average. We were wrong. We found that phone metadata is unambiguously sensitive, even in a small population and over a short time window. We were able to infer medical conditions, firearm ownership, and more, using solely phone metadata."
Read more:
See also:
A paper by Ann Cavoukian, Ph.D., Nilesh Bansal, Ph.D., and Nick Koudas, Ph.D.
A paper by Ann Cavoukian, Ph.D., Nilesh Bansal, Ph.D., and Nick Koudas, Ph.D.
From '1. Introduction':
"In this paper, we examine the application of Privacy by Design to the design and architecture of MLA systems through the work of Toronto-based MLA company Aislelabs. [...] This paper has in total four sections. It begins with a background discussion of MLA and how it works technologically (section 2). Next the paper discusses the unique privacy risks associated with MLA (section 3). Finally, it introduces Privacy by Design, discusses Aislelabs' MLA implementation, and shows how it designs in privacy from the outset (section 4)."
Read more:
A paper by Yaniv Taigman, Ming Yang, Marc'Aurelio Ranzato and Lior Wolf.
Abstract:
"In modern face recognition, the conventional pipeline consists of four stages: detect => align => represent => classify. We revisit both the alignment step and the representation step by employing explicit 3D face modeling in order to apply a piecewise affine transformation, and derive a face representation from a nine-layer deep neural network. This deep network involves more than 120 million parameters using several locally connected layers without weight sharing, rather than the standard convolutional layers. Thus we trained it on the largest facial dataset to-date, an identity labeled dataset of four million facial images belonging to more than 4,000 identities, where each identity has an average of over a thousand samples. The learned representations coupling the accurate model-based alignment with the large facial database generalize remarkably well to faces in unconstrained environments, even with a simple classifier. Our method reaches an accuracy of 97.25% on the Labeled Faces in the Wild (LFW) dataset, reducing the error of the current state of the art by more than 25%, closely approaching human-level performance."
Read more:
See also:
A blog post by Julia Horwitz (Privacy Rights Blog @ Epic.org).
From the blog post:
"In Fraley [v. Facebook], the defendant Facebook had used the images of Facebook users (including minor children) to advertise products. A group of parents filed a class action lawsuit against Facebook to vindicate the rights of children who had been subject to this advertising scheme. As a result of the lawsuit, Facebook and the parents agreed to a settlement, wherein Facebook would pay money to organizations that advocate for children's privacy. But the settlement agreement did not prevent Facebook from continuing to use children's images in advertisements, and the organizations selected to receive funds were not the groups that have objected to Facebook's use of images in advertising since the scheme began. The settlement agreement was so bad that one of the groups who had been selected to receive funds chose to turn the money down. The settlement agreement, said the group, left the class members worse off than they would have been without any settlement at all. If the settlement agreement was that bad (and, personally, I think it was), is it possible that none of the plaintiffs' rights were vindicated as a result of the lawsuit? Is there an argument to be made that the settlement agreement both allowed Facebook to continue its injurious behavior and also prevented the plaintiffs from ever challenging that behavior again? Are the organizations whose interests actually do align with those of the class members (for example, the group who refused the funds) barred from litigating the same issue? Or did the deficient settlement agreement reach back in time and opt everyone out of a class that would not reap the benefits of a settlement agreement?"
Read more:
A blog post by Phil Lee (Privacy and Information Law Blog).
From the blog post:
"As an EU privacy professional working in the US, one of the things that regularly fascinates me is each continent's misperception of the other's privacy rules. Far too often have I heard EU privacy professionals (who really should know better) mutter something like 'The US doesn't have a privacy law' in conversation; equally, I've heard US colleagues talk about the EU's rules as being 'nuts' without understanding the cultural sensitivities that drive European laws. So I thought it would be worth dedicating a few lines to compare and contrast the different regimes, principally to highlight that, yes, they are indeed different, but, no, you cannot draw a conclusion from these differences that one regime is 'better' (whatever that means) than the other."
Read more:
See also:
A selection of papers by various authors (Future of Privacy Forum).
From the digest:
"The featured papers analyze current and emerging privacy issues and propose solutions or offer free analysis that could lead to new approaches in privacy law. Academics, privacy advocates and Chief Privacy Officers on FPF’s Advisory Board reviewed all submitted papers, emphasizing clarity, practicality and overall utility as the most important criteria for selection."
Read more:
A blog post by Brian Krebs (Krebs on Security).
From the blog post:
"In the wake of one data breach after another, millions of Americans each year are offered credit monitoring services that promise to shield them from identity thieves. Although these services can help true victims step out from beneath the shadow of ID theft, the sad truth is that most services offer little in the way of real preventative protection against the fastest-growing crime in America."
Read more:
A report by Olivier Bilodeau, Pierre-Marc Bureau, Joan Calvet, Alexis Dorais-Joncas, Marc-Étienne M. Léveillé and Benjamin Vanheuverzwijn (WeLiveSecurity).
From the Executive Summary:
"This document details a large and sophisticated operation, code named 'Windigo', in which a malicious group has compromised thousands of Linux and Unix servers. The compromised servers are used to steal SSH credentials, redirect web visitors to malicious content and send spam. This operation has been ongoing since at least 2011 and has affected high profile servers and companies [...] This report contains a detailed description of our ongoing investigation of the Windigo operation. We provide details on the number of users that have been victimized and the exact type of resources that are now in control of the gang. Furthermore, we provide a detailed analysis for the three main malicious components of this operation [...]"
Read more:
See also:
A report by Vanja Svajcer (SophosLabs).
Read more:
A report by Pew Research Center.
From 'About this report':
"This report is part of an effort by the Pew Research Center’s Internet Project in association with Elon University's Imagining the Internet Center to look at the future of the Internet, the Web, and other digital activities. This is the first of eight reports based on a canvassing of hundreds of experts about the future of such things as privacy, cybersecurity, the “Internet of things,” and net neutrality. In this case we asked experts to make their own predictions about the state of digital life by the year 2025. We will also explore some of the economic change driven by the spectacular progress that made digital tools faster and cheaper. And we will report on whether Americans feel the explosion of digital information coursing through their lives has helped them be better informed and make better decisions."
Read more:
A policy brief by Anthony Dworkin (European Council on Foreign Relations).
From the introduction to the document:
"This policy brief sketches the outline of a common European position, rooted in the idea that outside zones of conventional hostilities, the deliberate taking of human life must be justified on an individual basis according to the imperative necessity of acting in order to prevent either the loss of other lives or serious harm to the life of the nation. It argues that such a position would now offer a basis for renewed engagement with the Obama administration, which has endorsed a similar standard as a matter of policy, even if its interpretation of many key terms remains unclear and its underlying legal arguments remain different. Finally, it suggests that European states will need to clarify their own understanding and reach agreement among themselves on some parts of the relevant legal framework as they refine their position and pursue discussions with the United States. None of these efforts will necessarily be easy. But unless the EU defines a position on remotely piloted aircraft and targeted killing, it risks neglecting its own interests and missing an opportunity to help shape global standards in an area that is vital to international peace and security."
Read more:
See also:
A report to the Human Rights Council by the Special Rapporteur on the promotion and protection of human rights and fundamental freedoms while countering terrorism, Ben Emmerson.
From the Summary:
"In the main report, contained in chapter III, the Special Rapporteur examines the use of remotely piloted aircraft, or drones, in extraterritorial lethal counter-terrorism operations, including in the context of asymmetrical armed conflict, and allegations that the increasing use of remotely piloted aircraft, or drones, has caused disproportionate civilian casualties, and makes recommendations to States."
Read more:
See also:
A report by the ACLU of California.
From the introduction:
"Limited privacy protections for metadata may have made sense decades ago when technology to collect and analyze data was virtually nonexistent. But in today's 'big data' world, non-content does not mean non-sensitive. In fact, new technology is demonstrating just how sensitive metadata can be: how friend lists can reveal a person's sexual orientation, purchase histories can identify a pregnancy before any visible signs appear, and location information can expose individuals to harassment for unpopular political views or even theft and physical harm. Two separate committees assembled by the executive branch — the President's Review Group on Intelligence and Communications Technology and the Privacy and Civil Liberties Oversight Board —have joined lawmakers, academics, and judges in calling for a reevaluation of the distinction between content and metadata. This paper examines how new technologies and outdated laws have combined to make metadata more important and more vulnerable than ever, and proposes a way forward to ensure that all of our sensitive information gets the privacy protection it deserves."
Read more:
See also:
An article by Deven R. Desai.
From the Abstract:
"Protecting associational freedom is a core, independent yet unappreciated part of the Fourth Amendment. New surveillance techniques threaten that freedom. Surveillance is no longer forward looking. Law enforcement can obtain the same, if not more, information about all of us by looking backward. Forward-looking surveillance has limits. Some limits are practical such as the cost to place a person in a car to follow a suspect. There are also procedural limits, such as the requirement that surveillance relate to criminal activity. In addition, surveillance such as wiretapping and using a GPS tracker often requires a warrant. Warrants involve review by a neutral magistrate. The warrant sets limits on what information may be collected, how it is collected, and how it can be used. The surveillance is also time limited and requires continual justification to a judge, or the surveillance will be shut down. With backward-looking surveillance all of these protections are gone. Law enforcement can now use low-cost technology to track us or need only ask a business for the record of where we went, whom we called, what we read, and more. Revelation of the NSA’s vast Prism surveillance project is but the most recent example of law enforcement engaging in this sort of over-reaching surveillance. The FBI has previously deployed similar programs to read mail, obtain lists of books read, demand member lists, and generate watch lists of people to round up in case of national emergency. The efforts vary; the harm is the same. Law enforcement has a perfect picture of our activities and associations regardless of whether they are criminal. With digital records these harms are more acute. Once the data about our activities is gathered, law enforcement may keep that data indefinitely. They have a data hoard. That hoard grows with each new data request. Once created, the hoard can be continually rifled to investigate us but without any oversight."
Read more:
A paper by Brad Miller, Ling Huang, A.D. Joseph, and J.D. Tygar.
Abstract:
"Revelations of large scale electronic surveillance and data mining by governments and corporations have fueled increased adoption of HTTPS. We present a traffic analysis attack against over 6000 webpages spanning the HTTPS deployments of 10 widely used, industry-leading websites in areas such as healthcare, finance, legal services and streaming video. Our attack identifies individual pages in the same website with 89% accuracy, exposing personal details including medical conditions, financial and legal affairs and sexual orientation. We examine evaluation methodology and reveal accuracy variations as large as 18% caused by assumptions affecting caching and cookies. We present a novel defense reducing attack accuracy to 27% with a 9% traffic increase, and demonstrate significantly increased effectiveness of prior defenses in our evaluation context, inclusive of enabled caching, user-specific cookies and pages within the same website."
Read more:
A paper by Ajaya Neupane, Nitesh Saxena, Keya Kuruvilla, Michael Georgescu, and Rajesh Kana.
From the Abstract:
"The security of computer systems often relies upon decisions and actions of end users. In this paper, we set out to investigate user-centered security by concentrating at the most fundamental component governing user behavior – the human brain. We introduce a novel neuroscience-based study methodology to inform the design of user-centered security systems. Specifically, we report on an fMRI study measuring users' security performance and the underlying neural activity with respect to two critical security tasks: (1) distinguishing between a legitimate and a phishing website, and (2) heeding security (malware) warnings. At a higher level, we identify neural markers that might be controlling users' performance in these tasks, and establish relationships between brain activity and behavioral performance as well as between users' personality traits and security behavior."
Read more:
See also:
A paper by Ryan Calo.
From the Abstract:
"Two decades of analysis have produced a rich set of insights as to how the law should apply to the Internet's peculiar characteristics. But, in the meantime, technology has not stood still. The same public and private institutions that developed the Internet, from the armed forces to search engines, have initiated a significant shift toward robotics and artificial intelligence. This article is the first to examine what the introduction of a new, equally transformative technology means for cyberlaw (and law in general). Robotics has a different set of essential qualities than the Internet and, accordingly, will raise distinct issues of law and policy. Robotics combines, for the first time, the promiscuity of data with the capacity to do physical harm; robotic systems accomplish tasks in ways that cannot be anticipated in advance; and robots increasingly blur the line between person and instrument."
Read more:
A publication by the European Union Agency for Fundamental Rights, the European Court of Human Rights, and the Council of Europe.
From the Foreword:
"This handbook on European data protection law is jointly prepared by the European Union Agency for Fundamental Rights and the Council of Europe together with the Registry of the European Court of Human Rights. It is the third in a series of legal handbooks jointly prepared by the EU Agency for Fundamental Rights and the Council of Europe. [...] The aim of this handbook is to raise awareness and improve knowledge of data protection rules in European Union and Council of Europe member states by serving as the main point of reference to which readers can turn. It is designed for non-specialist legal professionals, judges, national data protection authorities and other persons working in the field of data protection."
Read more:
See also:
A master's thesis by Bernold Nieuwesteeg.
From the Executive Summary:
"This thesis scrutinizes the proportionality and describes the subsidiarity of proposals for security breach notification laws (hereafter: SBNLs) in the European Union. […] The laws that have been assessed are Article 31 of the proposed Data Protection Regulation (hereafter: PDPR) and Article 14 of the proposed Cybersecurity Directive (hereafter: PCD). Article 31 PDPR concerns a single uniform personal data breach notification obligation. A personal data breach entails the unauthorized access to and/or theft of personal data. Article 14 PCD concerns the harmonization of national (significant) loss of integrity breach notification obligations. […] This thesis challenges the aforementioned assumption that determination of causality is straightforward. This is done by a more substantive assessment of the proportionality test. This thesis contributes an empirical study from a security economics perspective, in order to substantively review (the complexity of) effects of SBNLs. Do the (expected) effects of SBNLs match the aims it should attain according to the European proposals? And are these effects desirable?"
Read more:
A blog post by Meena Harris (Inside Privacy).
From the text:
"Democratic and Republican senators have been busy drafting legislation that would establish national requirements for data security and breach notice. The following bills have been introduced over the last year: Data Security and Breach Notification Act, Toomey (R-PA); Personal Data Privacy and Security Act, Leahy (D-VT); Data Security Act, Carper (D-DE) and Blunt (R-MO); Data Security and Breach Notification Act, Rockefeller (D-WV); and Personal Data Protection and Breach Accountability Act, Blumenthal (D-CT). This post provides a side-by-side comparison of these five data-breach bills, which would impose varying standards and penalties. The comparison focuses on the breach-notification requirements of each bill; it does not discuss the standards that some bills would establish for internal security protocols to safeguard stored data."
Read more:
A publication by Privacy International.
From the Editorial:
"In June 2013 the entire discourse changed dramatically. The catalyst for the right's recent rise to the top of international political and human rights agendas was last year's series of revelations by Edward Snowden, the former NSA contractor. The importance of the Snowden revelations cannot be overstated, as they finally gave us the evidence of what we had most feared: that governments acting with scant attention to legal protections, are using invasive techniques to collect as much as they can, while compromising the systems that we all rely upon. Equally, these revelations accelerated, in leaps and bounds, the process of building public knowledge about global surveillance arrangements and capabilities. Awareness of, and interest in, the right to privacy is now unprecedented. And so it was that 2013 became the year that privacy advocates finally gained traction in the halls of national parliaments and the United Nations General Assembly; that strong civil society coalitions were formed across borders and regions; that the world's 101st data protection law was adopted (by South Africa). Privacy became, in the words of Human Rights Watch, 'the right whose time has come'."
Read more:
A blog post by Jay Stanley (ACLU).
From the text:
"We now know that the NSA is collecting location information en masse. As we’ve long said, location data is an extremely powerful set of information about people. To flesh out why that is true, here is the kind of future memo that we fear may someday soon be uncovered: [...]"
Read more:
A report by Insight Intelligence (National Association for Information Destruction).
From the press release:
"The NAID-ANZ Secondhand Hard Drive Study, completed in January 2014 and published 19 Feb., showed that 15 of 52 hard drives randomly purchased, approximately 30 percent, contained highly confidential personal information. While seven of the 15 devices were recycled by individuals, eight were recycled by law firms, a government medical facility, and a community centre. These study results come just before the new Privacy Act reforms will be effective 12 March, requiring organisations to safeguard people's personal information."
Read more:
A report by Genevieve Morris, Greg Farnum, Scott Afzal, Carol Robinson, Jan Greene and Chris Coughlin (Office of the National Coordinator for Health Information Technology).
From the Executive Summary:
"The United States healthcare system is marching diligently toward a more connected system of care through the use of electronic health record systems (EHRs) and electronic exchange of patient information between organizations and with patients and caregivers. The Patient Identification and Matching Initiative, sponsored by the Office of the National Coordinator for Health Information Technology (ONC), focused on identifying incremental steps to help ensure the accuracy of every patient’s identity, and the availability of their information wherever and whenever care is needed. Matching records to the correct person becomes increasingly complicated as organizations share records electronically using different systems, and in a mobile society where patients seek care in many healthcare settings. Many healthcare organizations use multiple systems for clinical, administrative, and specialty services, which leads to an increased chance of identity errors when matching patient records. Additionally, many regions experience a high number of individuals who share the exact name and birthdate, leading to the need for additional identifying attributes to be used when matching patient records. [...] Driven by concerns for patient safety in the event of mismatched or unmatched records and the national imperative to improve population health and lower costs through care coordination, this initiative studied both technical and human processes, seeking improvements to patient identification and matching that could be quickly implemented and lead to near-term improvements in matching rates."
Read more:
See also:
A paper by A. Michael Froomkin.
From the Abstract:
"Modeling mass surveillance disclosure regulations on an updated form of environmental impact statement will help protect everyone's privacy: Mandating disclosure and impact analysis by those proposing to watch us in and through public spaces will enable an informed conversation about privacy in public. Additionally, the need to build consideration of the consequences of surveillance into project planning, as well as the danger of bad publicity arising from excessive surveillance proposals, will act as a counterweight to the adoption of mass data collection projects, just as it did in the environmental context. In the long run, well-crafted disclosure and analysis rules could pave the way for more systematic protection for privacy -- as it did in the environmental context. Effective US regulation of mass surveillance will require that we know a great deal about who and what is being recorded and about the costs and benefits of personal information acquisition and uses. At present we know relatively little about how to measure these; a privacy equivalent of environmental impact statements will not only provide case studies, but occasions to grow expertise."
Read more: