Rina Steenkamp - Privacy and technology
[Syllabus, Riley v. California | Trickle down surveillance | Researchers find and decode the spy tools governments use to hijack phones | [Irish High Court refers Schrems Facebook privacy case to ECJ] | Cognitive disconnect - Understanding Facebook Connect login permissions | Addressing the right to privacy at the United Nations | Data doppelgängers and the uncanny valley of personalization | Big data and innovation, setting the record straight - De-identification does work | EMC privacy index | It's all about the Benjamins - An empirical study on incentivizing users to ignore security advice | Tor is for everyone - why you should use Tor | Open wireless movement | A measurement study of Google Play | Factsheet on the "Right to be Forgotten" ruling (C-131/12) | Judgment of the Court (Fourth Chamber) [...] In Case C-360/13 | Law enforcement disclosure report | Consumers' location data - Companies take steps to protect privacy, but practices are inconsistent, and risks may not be clear to consumers | Ars tests Internet surveillance - by spying on an NPR reporter | A crisis of accountability - A global analysis of the impact of the Snowden revelations | Why King George III can encrypt | Alan Westin's privacy homo economicus | Necessary & proportionate - International principles on the application of human rights to communications surveillance - Background and supporting international legal analysis | When enough is enough - location tracking, mosaic theory, and machine learning | The top 5 claims that defenders of the NSA have to stop making to remain credible | Data controllers and data processors - what's the difference? 
| Business without borders - The importance of cross-border data transfers to global prosperity | Data brokers - A call for transparency and accountability | Privacy advocates warn of 'nightmare' scenario as tech giants consider fitness tracking | Your secret Stingray's no secret anymore - the vanishing government monopoly over cell phone surveillance and its impact on national security and consumer privacy | Judgement of the Court of 8 April 2014 in joined Cases C-293/12 and C-594/12 | U.S. mines personal health data to find the vulnerable in emergencies | Ask Ars - Can I see what information the feds have on my travel? | The Internet with a human face | Privacy under attack - the NSA files revealed new threats to democracy | Freedom and control - Engineering a new paradigm for the digital world | What's the gist? Privacy-preserving aggregation of user profiles | How to protect the most privacy with the least effort - change search engines | Twenty-fifth annual report of the Data Protection Commissioner 2013 | Privacy versus government surveillance - where network effects meet public choice | Care Data - the cons | 20 years of "online government" 101. Part 1 - progress towards a single online presence]
A publication by the Supreme Court of the United States.
From 'Opinion of the Court':
"Modern cell phones are not just another technological convenience. With all they contain and all they may reveal, they hold for many Americans 'the privacies of life,' [...]. The fact that technology now allows an individual to carry such information in his hand does not make the information any less worthy of the protection for which the Founders fought. Our answer to the question of what police must do before searching a cell phone seized incident to an arrest is accordingly simple - get a warrant."
Read more:
See also:
An article by Nathan Freed Wessler (Al Jazeera America).
From the article:
"Cell site simulators, also known as 'stingrays,' are devices that trick cellphones into reporting their locations and identifying information. They do so by mimicking cellphone towers and sending out electronic cues that allow the police to enlist cellphones as tracking devices, thus revealing people’s movements with great precision. The equipment also sends intrusive electronic signals through the walls of private homes and offices, learning information about the locations and identities of phones inside."
Read more:
See also:
A blog post by Kim Zetter (Wired).
From the blog post:
"Newly uncovered components of a digital surveillance tool used by more than 60 governments worldwide provide a rare glimpse at the extensive ways law enforcement and intelligence agencies use the tool to surreptitiously record and steal data from mobile phones."
Read more:
See also:
Judgment of Mr. Justice Hogan.
From the 'Summary of overall conclusions':
"75. [...] It is irrelevant that Mr. Schrems cannot show that his own personal data was accessed in this fashion by the NSA, since what matters is the essential inviolability of the personal data itself. The essence of that right would be compromised if the data subject had reason to believe that it could be routinely accessed by security authorities on a mass and undifferentiated basis. 76. Third, the evidence suggests that personal data of data subjects is routinely accessed on a mass and undifferentiated basis by the US security authorities."
Read more:
See also:
A working draft by Nicky Robinson and Joseph Bonneau.
Abstract:
"We study Facebook Connect's permissions system using crawling, experimentation, and surveys and determine that it works differently than both users and developers expect in several ways. We show that more permissions can be granted than the developer intended. In particular, permissions that allow a site to post to the user's profile are granted on an all-or-nothing basis. We evaluate how the requested permissions are presented to the user and find that, while users generally understand what data sites can read from their profile, they generally do not understand the many different things the sites can post. In the case of write permissions, we show that user expectations are influenced by the identity of the requesting site which in reality has no impact on what is enforced. We also find that users generally do not understand the way Facebook Connect permissions interact with Facebook's privacy settings. Our results suggest that users understand detailed, granular messages better than those that are broad and vague."
Read more:
A blog post by Alexandrine Pirlot (Privacy International).
From the blog post:
"What do Egypt, Kenya, Turkey, Guinea, and Sweden have in common? Despite having a Constitutional right to privacy, they are adopting and enforcing policies that directly challenge this human right. These states are also up for a Universal Periodic Review this year before the United Nations Human Rights Council. UPRs are a mechanism within the Council aimed at improving the human rights situation in all countries and address human rights violations wherever they occur. [...] This year, we submitted reports on Egypt, Kenya, Guinea, Sweden, and Turkey, and will make a submission on the US and Belgium later in the year. We hope that the Human Rights Council within the UPR process will address the privacy concerns raised by Privacy International and its partners of the need to protect privacy rights in these countries."
Read more:
An article by Sara M. Watson (Nextgov.com).
From the article:
"Google thinks I'm interested in parenting, superhero movies, and shooter games. The data broker Acxiom thinks I like driving trucks. My data doppelgänger is made up of my browsing history, my status updates, my GPS locations, my responses to marketing mail, my credit card transactions, and my public records.Still, it constantly gets me wrong, often to hilarious effect. I take some comfort that the system doesn't know me too well, yet it is unnerving when something is misdirected at me. Why do I take it so personally when personalization gets it wrong?"
Read more:
A paper by Ann Cavoukian, Ph.D. and Daniel Castro.
From the Introduction:
"In this paper, we will discuss a select group of academic articles often referenced in support of the myth that de-identification is an ineffective tool to protect the privacy of individuals. While these articles raise important issues concerning the use of proper de-identification techniques, reported findings do not suggest that de-identification is impossible or that de-identified data should be classified as personally identifiable information. We then provide a concrete example of how data may be effectively de-identified — the case of the U.S. Heritage Health Prize. This example shows that in some cases, de-identification can maximize both privacy and data quality, thereby enabling a shift from zero-sum to positive-sum thinking — a key principle of Privacy by design."
Read more:
See also:
A survey by EMC.
From the overview page:
"The 2014 EMC Privacy Index surveyed 15,000 people in 15 countries to produce a ranking of nations based on consumer perceptions and attitudes about data privacy, and their willingness to trade privacy for greater convenience and benefits online."
Read more:
See also:
A paper by Nicolas Christin, Serge Egelman, Timothy Vidas, and Jens Grossklags.
From the Abstract:
"We examine the cost for an attacker to pay users to execute arbitrary code—potentially malware. We asked users at home to download and run an executable we wrote without being told what it did and without any way of knowing it was harmless. Each week, we increased the payment amount. Our goal was to examine whether users would ignore common security advice — not to run untrusted executables — if there was a direct incentive, and how much this incentive would need to be."
Read more:
A blog post by Cooper Quintin (EFF).
From the blog post:
"EFF recently kicked off our second Tor Challenge, an initiative to strengthen the Tor network for online anonymity and improve one of the best free privacy tools in existence. The campaign - which we've launched with partners at the Freedom of the Press Foundation, the Tor Project, and the Free Software Foundation - is already off to a great start. In just the first few days, we've seen over 600 new or expanded Tor nodes—more than during the entire first Tor Challenge. This is great news, but how does it affect you? To understand that, we have to dig into what Tor actually is, and what people can do to support it. Support can come in many forms, too. Even just using Tor is one of the best and easiest things a person can do to preserve privacy and anonymity on the Internet."
Read more:
A project by the EFF and others.
From 'It is crucial to user privacy':
"[...] smartphones are actually spy phones. But they don't need to be. If we had enough open wireless networks available, we could change that. Startup companies—and open source projects—could make devices that used the open networks without reporting your location and communications to phone companies. Devices that skip smoothly from one open wireless network to another don't provide the kind of granular information about your intimate activities that the current single-carrier systems do. We have two choices: let mobile privacy stay dead forever, or build an alternative open wireless future."
Read more:
See also:
A paper by Nicolas Viennot, Edward Garcia and Jason Nieh.
From '10. Conclusions':
"We have built PlayDrone, a system that uses various hacking techniques to circumvent Google security to successfully crawl Google Play. [...] We further show that [...] Android applications contain thousands of leaked secret authentication keys which can be used by malicious users to gain unauthorized access to server resources through Amazon Web Services and compromise user accounts on Facebook. We worked with service providers, including Amazon, Facebook, and Google, to identify and notify customers at risk, and make the Google Play store a safer place."
Read more:
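The paper does not publish PlayDrone's scanner, but the kind of leaked-credential check it describes can be sketched in a few lines: AWS access key IDs follow a well-known format (the literal prefix "AKIA" followed by 16 uppercase letters or digits), so a simple regular expression over decompiled application source flags candidates. This is an illustrative sketch, not the authors' code, and the key in the example is made up.

```python
import re

# AWS access key IDs conventionally start with "AKIA" followed by
# 16 characters drawn from uppercase letters and digits.
AWS_ACCESS_KEY_RE = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

def find_leaked_keys(source_text):
    """Return all candidate AWS access key IDs embedded in the text."""
    return AWS_ACCESS_KEY_RE.findall(source_text)

sample = 'String key = "AKIAABCDEFGHIJKLMNOP"; // oops, hard-coded'
print(find_leaked_keys(sample))  # ['AKIAABCDEFGHIJKLMNOP']
```

A real scanner would also check for the paired secret key and for other providers' token formats, but even this crude pattern is enough to show why shipping credentials inside an APK is unsafe: anyone who downloads the app can run the same search.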
A publication by the European Commission.
From '1) What is the case about and what did the Court rule?':
"In 2010 a Spanish citizen lodged a complaint against a Spanish newspaper with the national Data Protection Agency and against Google Spain and Google Inc. The man complained that an auction notice of his repossessed home on Google's search results infringed his privacy rights because the proceedings concerning him had been fully resolved for a number of years and hence the reference to these was entirely irrelevant. He requested, first, that the newspaper be required either to remove or alter the pages in question so that the personal data relating to him no longer appeared; and second, that Google Spain or Google Inc. be required to remove the personal data relating to him, so that it no longer appeared in the search results. The Spanish court referred the case to the Court of Justice of the European Union asking: (a) whether the EU's 1995 Data Protection Directive applied to search engines such as Google; (b) whether EU law (the Directive) applied to Google Spain, given that the company's data processing server was in the United States; (c) whether an individual has the right to request that his or her personal data be removed from accessibility via a search engine (the 'right to be forgotten')."
Read more:
See also:
Judgment of the EU Court of Justice.
From the judgment:
"63. [...] the answer to the question referred is that Article 5 of Directive 2001/29 must be interpreted as meaning that the on-screen copies and the cached copies made by an end-user in the course of viewing a website satisfy the conditions that those copies must be temporary, that they must be transient or incidental in nature and that they must constitute an integral and essential part of a technological process, as well as the conditions laid down in Article 5(5) of that directive, and that they may therefore be made without the authorisation of the copyright holders."
Read more:
See also:
A report by Vodafone.
From 'What we are publishing, and why':
"This is our inaugural Law Enforcement Disclosure Report. We are also one of the first communications operators in the world to provide a country-by-country analysis of law enforcement demands received based on data gathered from local licensed communications operators. We will update the information disclosed in this report annually. We also expect the contents and focus to evolve over time and would welcome stakeholders’ suggestions as to how they should do so."
Read more:
See also:
Statement of Mark L. Goldstein (United States Government Accountability Office).
From 'What GAO found':
"Fourteen mobile industry companies and 10 in-car navigation providers that GAO examined in its 2012 and 2013 reports—including mobile carriers and auto manufacturers with the largest market share and popular application developers—collect location data and use or share them to provide consumers with location-based services and improve consumer services. For example, mobile carriers and application developers use location data to provide social networking services that are linked to consumers’ locations. In-car navigation services use location data to provide services such as turn-by-turn directions or roadside assistance. Location data can also be used and shared to enhance the functionality of other services, such as search engines, to make search results more relevant by, for example, returning results of nearby businesses. While consumers can benefit from location-based services, their privacy may be at risk when companies collect and share location data. For example, in both reports, GAO found that when consumers are unaware their location data are shared and for what purpose data might be shared, they may be unable to judge whether location data are shared with trustworthy third parties. Furthermore, when location data are amassed over time, they can create a detailed profile of individual behavior, including habits, preferences, and routes traveled—private information that could be exploited. Additionally, consumers could be at higher risk of identity theft or threats to personal safety when companies retain location data for long periods or in a way that links the data to individual consumers. Companies can anonymize location data that they use or share, in part, by removing personally identifying information; however, in its 2013 report, GAO found that in-car navigation providers that GAO examined use different de-identification methods that may lead to varying levels of protection for consumers."
Read more:
See also:
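GAO notes that providers "use different de-identification methods that may lead to varying levels of protection for consumers." As a rough illustration of one such method (not any particular provider's actual practice), the sketch below pseudonymizes the identifier with a salted hash and coarsens the coordinates by rounding; research on re-identification suggests these steps alone are often insufficient, which is exactly the inconsistency GAO flags.

```python
import hashlib

def deidentify_point(user_id, lat, lon, salt="example-salt", digits=2):
    """Replace the identifier with a salted hash and round the
    coordinates, so the point can still support aggregate analysis
    (e.g. traffic density) without naming the person or pinpointing
    a street address. Two decimal degrees is roughly 1 km."""
    pseudonym = hashlib.sha256((salt + user_id).encode()).hexdigest()[:12]
    return pseudonym, round(lat, digits), round(lon, digits)

print(deidentify_point("alice@example.com", 38.889484, -77.035278))
```

Because the pseudonym is deterministic, repeated points from one person still link together over time, which is precisely how the "detailed profile of individual behavior" GAO warns about can re-emerge from nominally anonymized data.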
A blog post by Sean Gallagher (Ars Technica).
From the blog post:
"On a bright April morning in Menlo Park, California, I became an Internet spy. This was easier than it sounds because I had a willing target. I had partnered with National Public Radio (NPR) tech correspondent Steve Henn for an experiment in Internet surveillance. For one week, while Henn researched a story, he allowed himself to be watched—acting as a stand-in, in effect, for everyone who uses Internet-connected devices. How much of our lives do we really reveal simply by going online?"
Read more:
Compiled and edited by Simon Davies.
From the Executive summary:
"A significant number of corporations have responded to the disclosures by introducing a range of accountability and security measures (transparency reports, end-to-end encryption etc). Nonetheless, while acknowledging that these reforms are 'a promising start' nearly sixty percent of legal and IT professionals surveyed for this report believe that they do not go far enough, with more than a third of respondents reporting that they felt the measures were 'little more than window dressing” or are of 'little value' outside the US. Civil society and the tech community have not adequately adapted to the challenges raised by the Snowden revelations. For example, the interface and the communications between policy reform (e.g. efforts to create greater accountability measures, privacy regulations) and technical privacy solutions (e.g. designing stronger embedded security) is worryingly inconsistent and patchy. Few channels of communication and information exchange exist between these disparate communities."
Read more:
A paper by Wenley Tong, Sebastian Gold, Samuel Gichohi, Mihai Roman, and Jonathan Frankle.
Abstract:
"We sought to re-examine the conclusions of the classic paper Why Johnny Can't Encrypt, which portrayed a usability crisis in security software by documenting the inability of average users to correctly send secure email through Pretty Good Privacy (PGP). While the paper's authors primarily focused on user-interface concerns, we turned our attention to the terminology underlying the protocol. We developed a new set of metaphors with the goal of representing cryptographic actions (sign, encrypt, etc.) rather than primitives (public and private keys). Our objects were chosen such that their real-world analogs would correctly represent the security properties of PGP. Since these metaphors now corresponded to physical actions, we also introduced new forms of documentation that explored narrative techniques for explaining secure email to non-technical users. In quiz-based testing, we found that, while our new metaphors did not dramatically outperform traditional PGP, we were able to convey equivalent levels of understanding with far shorter documentation. Subsequent lab testing confirmed that metaphors with physical analogs and the accompanying briefer instructions greatly eased the process of using secure email. Our results indicate that crafting new metaphors to facilitate these alternative forms of documentation is a fruitful avenue for explaining otherwise challenging security concepts to nontechnical users."
Read more:
A paper by Chris Jay Hoofnagle and Jennifer M. Urban.
From the Abstract:
"Homo economicus reliably makes an appearance in regulatory debates concerning information privacy. Under the still-dominant U.S. 'notice and choice' approach to consumer information privacy, the rational consumer is expected to negotiate for privacy protection by reading privacy policies and selecting services consistent with her preferences. A longstanding model for predicting these preferences is Professor Alan Westin's well-known segmentation of consumers into 'privacy pragmatists,' 'privacy fundamentalists,' and 'privacy unconcerned.' [...] This Article contributes to the ongoing debate about notice and choice in two main ways. First, we consider the legacy Westin's privacy segmentation model itself, which as greatly influenced the development of the notice-and-choice regime. Second, we report on original survey research, collected over four years, exploring Americans’ knowledge, preferences, and attitudes about a wide variety of data practices in online and mobile markets. Using these methods, we engage in considered textual analysis, empirical testing, and critique of Westin’s segmentation model."
Read more:
A publication by EFF and Article 19.
From the Introduction:
"These questions and ongoing concerns arising from surveillance techniques were the jumping off point for the drafting of the International Principles on the Application of Human Rights to Communication Surveillance that explain how international human rights law applies in the context of communication surveillance. [...] In this document, the Electronic Frontier Foundation and ARTICLE 19 explain the legal or conceptual basis for the specific Principles. Our paper is divided into three parts. Part one addresses questions relating to the Principles’ scope of application. Part two introduces key definitions and concepts, namely the concept of 'protected information' in contrast with traditional categorical approaches to data protection and privacy and a definition of 'communications surveillance.' Part three explains the legal and conceptual basis of each Principle. It begins by setting out the basic human rights framework underpinning the rights to privacy, freedom of expression, and freedom of association. It then elaborates on the legal underpinning for each of the Principles with reference to the case law and views of a range of international human rights bodies and experts, such as UN special rapporteurs. We try to be clear about when our conclusions are based on firmly established law, and when we are suggesting new specific practices based on principles fundamental to human rights."
Read more:
A paper by Steven M. Bellovin, Renée M. Hutchins, Tony Jebara and Sebastian Zimmeck.
From the Abstract:
"Since 1967, when it decided Katz v. United States, the Supreme Court has tied the right to be free of unwanted govern-ment scrutiny to the concept of reasonable expectations of privacy. An evaluation of reasonable expectations depends, among other factors, upon an assessment of the intrusiveness of government ac-tion. When making such assessment historically the Court consid-ered police conduct with clear temporal, geographic, or substantive limits. However, in an era where new technologies permit the storage and compilation of vast amounts of personal data, things are becoming more complicated. A school of thought known as 'mosaic theory' has stepped into the void, ringing the alarm that our old tools for assessing the intrusiveness of government conduct potentially undervalue privacy rights. Mosaic theorists advocate a cumulative approach to the evaluation of data collection. Under the theory, searches are 'analyzed as a collective sequence of steps rather than as individual steps.' The approach is based on the observation that comprehensive aggregation of even seemingly innocuous data reveals greater insight than consideration of each piece of information in isolation. Over time, discrete units of surveillance data can be processed to create a mosaic of habits, relationships, and much more. Consequently, a Fourth Amendment analysis that focuses only on the government's collection of discrete units of data fails to appreciate the true harm of long-term surveillance—the composite."
Read more:
See also:
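The mosaic effect the authors describe is easy to demonstrate: each location ping in isolation says little, but a trivial aggregation over time already yields sensitive inferences. The sketch below uses hypothetical data and deliberately simplistic hour thresholds; real analyses are far more sophisticated, which is the paper's point.

```python
from collections import Counter

def infer_home_and_work(pings):
    """pings: list of (hour_of_day, cell_id) observations.
    The most common night-time cell is likely 'home'; the most
    common business-hours cell is likely 'work'."""
    night = Counter(cell for hour, cell in pings if hour < 6 or hour >= 22)
    day = Counter(cell for hour, cell in pings if 9 <= hour < 17)
    home = night.most_common(1)[0][0] if night else None
    work = day.most_common(1)[0][0] if day else None
    return home, work

pings = [(23, "tower-A"), (2, "tower-A"), (5, "tower-A"),
         (10, "tower-B"), (14, "tower-B"), (15, "tower-B"), (12, "tower-C")]
print(infer_home_and_work(pings))  # ('tower-A', 'tower-B')
```

No single ping here is revealing; the inference emerges only from the collection as a whole — which is exactly why mosaic theorists argue the Fourth Amendment analysis should evaluate the sequence cumulatively rather than step by step.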
A blog post by Cindy Cohn and Nadia Kayyali (EFF).
From the blog post:
"Over the past year, as the Snowden revelations have rolled out, the government and its apologists have developed a set of talking points about mass spying that the public has now heard over and over again. From the President, to Hilary Clinton to Rep. Mike Rogers, Sen. Dianne Feinstein and many others, the arguments are often eerily similar. But as we approach the one year anniversary, it's time to call out the key claims that have been thoroughly debunked and insist that the NSA apologists retire them."
Read more:
A quiz (and guidance) by the ICO.
From the introduction page:
"The definitions of data controller and data processor are central to understanding the Data Protection Act – particularly in terms of working out where data protection responsibility lies. But the distinction between the two can be misunderstood, and we’re being asked to tell the difference between them on more and more occasions. That's prompted us to update our guidance. So how much do you know about which is which? Take our test and find out, before heading to the guidance for an explanation of each example."
Read more:
A report by the United States Chamber of Commerce and Hunton & Williams LLP.
From 'I. Introduction':
"Privacy safeguards are critical, and businesses play a key role in protecting the information under their control But privacy need not be the enemy of prosperity - we can embrace strong, innovative privacy regimes that also promote trade and growth. This report offers recommendations for a path forward by highlighting existing privacy rules that can be implemented on a more global scale, and proposes new mechanisms to facilitate cross-border data transfers."
Read more:
A report by the Federal Trade Commission.
From 'VIII. Findings and recommendations':
"Data brokers collect and store a vast amount of data on almost every U.S. household and commercial transaction. Of the nine data brokers, one data broker's database has information on 1.4 billion consumer transactions and over 700 billion aggregated data elements; another data broker's database covers one trillion dollars in consumer transactions; and yet another data broker adds three billion new records each month to its databases. Most importantly, data brokers hold a vast array of information on individual consumers. For example, one of the nine data brokers has 3000 data segments for nearly every U.S. consumer."
Read more:
See also:
An article by Andrea Peterson (Washington Post).
From the article:
"'This is really, really a privacy nightmare,' says Deborah Peel, the executive director of Patient Privacy Rights, who claims that the vast majority, if not all, of the health data collected by these types of apps have effectively 'zero' protections, but is increasingly prized by online data mining and advertising firms. Both the Food and Drug Administration and the FTC regulate some aspects of the fitness tracking device and app market, but not everyone thinks the government has kept pace with the rapidly changing fitness tracking market. 'The FTC and even the FDA have not done enough,' says Jeffrey Chester, the executive director of the Center for Digital Democracy, who says the lack of concrete safeguards to protect data in this new space leaves consumers at risk. 'Health information is sensitive information and it should be tightly regulated.'"
Read more:
An article by Stephanie K. Pell and Christopher Soghoian.
From the Abstract:
"This Article illustrates how cellular interception capabilities and technology have become, for better or worse, globalized and democratized, placing Americans' cellular communications at risk of interception from foreign governments, criminals, the tabloid press and virtually anyone else with sufficient motive to capture cellular content in transmission. Notwithstanding this risk, US government agencies continue to treat practically everything about this cellular interception technology, as a closely guarded, necessarily secret 'source and method,' shrouding the technical capabilities and limitations of the equipment from public discussion, even keeping its very name from public disclosure. This 'source and method' argument, although questionable in its efficacy, is invoked to protect law enforcement agencies' own use of this technology while allegedly preventing criminal suspects from learning how to evade surveillance. This Article argues that current policy makers should not follow the worn path of attempting to outlaw technology while ignoring, and thus perpetuating, the significant vulnerabilities in cellular communications networks on which it depends. Moreover, lawmakers must resist the reflexive temptation to elevate the sustainability of a particular surveillance technology over the need to curtail the general threat that technology poses to the security of cellular networks. Instead, with regard to this destabilizing, unmediated technology and its increasing general availability at decreasing prices, Congress and appropriate regulators should address these network vulnerabilities directly and thoroughly as part of the larger cyber security policy debates and solutions now under consideration. This Article concludes by offering the beginnings of a way forward for legislators to address digital cellular network vulnerabilities with a new sense of urgency appropriate to the current communications security environment."
Read more:
An information note by the General Secretariat of the Council of the European Union.
From 'III. Consequences of the Judgement for the Council':
"This judgment of the Court delivered by its Grand Chamber, is clearly of crucial importance in view of further action of the Union in the field of privacy and data protection. It confirms that the Court of Justice will not satisfy itself with anything less than a strict assessment of the proportionality and necessity of measures that constitute serious restrictions to fundamental rights, however legitimate the objectives pursued by the EU legislature. It also indicates that such measures do not stand a serious chance of passing the legality test unless they are accompanied by adequate safeguards in order to ensure that any serious restriction of fundamental rights is circumscribed to what is strictly necessary and is decided in the framework of guarantees forming part of Union legislation instead of being left to the legislation of Member States."
Read more:
See also:
An article by Sheri Fink (NYT).
From the article:
"When a rare ice storm threatened New Orleans in January, some residents heard from a city official who had gained access to their private medical information. Kidney dialysis patients were advised to seek early treatment because clinics would be closing. Others who rely on breathing machines at home were told how to find help if the power went out. Those warnings resulted from vast volumes of government data. For the first time, federal officials scoured Medicare health insurance claims to identify potentially vulnerable people and share their names with local public health authorities for outreach during emergencies and disaster drills."
Read more:
A blog post by Cyrus Farivar (Ars Technica).
From the blog post:
"Lately I've been on something of a public records binge. I asked for records about my license plate reader data from local law enforcement agencies. I asked for complaint records from the Federal Trade Commission about a sketchy Bitcoin mining hardware maker. A few more requests are still pending. And last summer, I asked United States Customs and Border Protection (CBP) agency for my travel records under the Freedom of Information Act (FOIA). Recently, I got an answer back - sort of."
Read more:
Text version of a talk by Maciej Ceglowski (Idle Words).
From 'The Internet remembers too much':
"The offline world works like it always has. I saw many of you talking yesterday between sessions; I bet none of you has a verbatim transcript of those conversations. If you do, then I bet the people you were talking to would find that extremely creepy. I saw people taking pictures, but there's a nice set of gestures and conventions in place for that. You lift your camera or phone when you want to record, and people around you can see that. All in all, it works pretty smoothly. The online world is very different. Online, everything is recorded by default, and you may not know where or by whom. If you've ever wondered why Facebook is such a joyless place, even though we've theoretically surrounded ourselves with friends and loved ones, it's because of this need to constantly be wearing our public face. Facebook is about as much fun as a zoning board hearing."
Read more:
An essay by Eben Moglen (The Guardian).
From the essay:
"We need to decentralise the data. If we keep it all in one great big pile – if there's one guy who keeps all the email and another guy who manages all the social sharing – then there isn't really any way to be any safer than the weakest link in the fence around those piles. But if everyone is keeping her and his own, then the weak links on the outside of any fence get the attacker exactly one person's stuff. Which, in a world governed by the rule of law, might be optimal: one person is the person you can spy on because you've got probable cause. Email scales beautifully without anybody at the centre keeping all of it. We need to make a mail server for people that costs five bucks and sits on the kitchen counter where the telephone answering machine used to be. If it breaks, you throw it away. Decentralised social sharing is harder, but not so hard that we can't do it. For the technologically gifted and engaged around the world this is the big moment, because if we do our work correctly freedom will survive and our grandkids will say: 'So what did you do back then?' The answer could be: 'I made SSL better.' Snowden has nobly advanced our effort to save democracy. In doing so he stood on the shoulders of others. The honour will be his and theirs, but the responsibility is ours. It is for us to finish the work that they have begun."
Read more:
See also:
A report by Ann Cavoukian, Ph.D. (Information and Privacy Commissioner, Ontario, Canada) and Dan Kruger (Absio Corporation).
Preface:
"Privacy and cybersecurity professionals, creators of digital property, and countless policy-makers have spent decades fighting to civilize the digital world, but they have lacked the most fundamental tool they need to succeed, namely - information systems engineering that enables true control of digital data. It is now possible to change the paradigm of the digital world from 'Use At Your Own Risk,' to 'My Data, My Rules.' Imagine such a world, if you can! Too many individuals and organizations are resigned to large-scale computer-based surveillance, invasion, and expropriation. The purpose of this paper is to explain, in plain language, why we believe that resignation to be unwarranted."
Read more:
A paper by Igor Bilogrevic, Julien Freudiger, Emiliano De Cristofaro, and Ersin Uzun.
From the Abstract:
"Online service providers gather increasingly large amounts of personal data into user profiles and monetize them with advertisers and data brokers. Users have little control of what information is processed and face an all-or-nothing decision between receiving free services or refusing to be profiled. This paper explores an alternative approach where users only disclose an aggregate model – the 'gist' – of their data. The goal is to preserve data utility and simultaneously provide user privacy."
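The abstract's core idea is that a user disclose only a coarse aggregate of their data rather than the raw records. A minimal sketch of that idea, with invented site names, categories, and a made-up coarseness threshold (the paper's actual aggregation method may differ):

```python
# Hypothetical illustration of the "gist" idea: instead of disclosing
# raw browsing history, the client reveals only a coarse, normalized
# category histogram. All names and categories below are invented.
from collections import Counter

# Raw per-user data (never disclosed under this model).
visited_sites = [
    ("news.example", "news"),
    ("sports.example", "sports"),
    ("news2.example", "news"),
    ("shop.example", "shopping"),
    ("news3.example", "news"),
]

def gist(history, min_share=0.2):
    """Return only categories above a coarseness threshold, as shares."""
    counts = Counter(category for _site, category in history)
    total = sum(counts.values())
    return {cat: round(n / total, 2)
            for cat, n in counts.items()
            if n / total >= min_share}

profile_gist = gist(visited_sites)
# Individual sites are dropped; only dominant interests are revealed.
print(profile_gist)  # {'news': 0.6, 'sports': 0.2, 'shopping': 0.2}
```

The threshold trades utility for privacy: a higher `min_share` hides minority interests entirely, at the cost of a less informative profile for the advertiser.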
Read more:
A blog post by Jay Stanley (ACLU).
From the text:
"If you use Google, Yahoo, Bing, or any other service that tracks your search terms, there is no reason not to change search engines today. When you do a search with these companies, they log your IP address and search terms, and store that data for varying periods of time. If you are logged in with them, they know much more about your identity, and can combine your search history with other sensitive information they have about you. And of course, these records, once stored, can be (among other things) obtained by government agencies very easily under the Patriot Act and other laws."
Read more:
A report by An Coimisinéir Cosanta Sonraí (the Data Protection Commissioner, Ireland).
From the Foreword:
"Our audits of State organisations have, in too many cases, shown scant regard by senior management to their duty to safeguard the personal data entrusted to them – a duty that is all the greater because of the legal obligation to provide such personal data to the State. Laudable objectives such as fraud prevention and greater efficiency must meet a test of proportionality in the manner in which personal data is used. Failure to treat personal data with respect can only lessen the trust that should exist between the individual and the State. It will also lead inevitably to more formal enforcement action by my Office unless system-wide action is taken to improve current practice."
Read more:
A paper by Ross Anderson.
From the Abstract:
"The forces that lead to pervasive monopolies in the information industries – network effects, technical lock-in and low marginal costs - are pervasive in the affairs of states too, once we look for them; they are just not yet recognised as such. There are many significant implications, from international relations through energy policy to privacy. Network effects make regulation hard; the USA failed to protect US attorney-client communications from Australian intelligence, just as Australia failed to protect its own citizens' personal health information from the NSA. There are some upsides too; but to identify and exploit them, we need to start thinking in a more grown-up way about what it means to live in a networked world. So, for that matter, must the international relations community."
Read more:
An article by Jon Baines (SCL).
From the article:
"The Health and Social Care Act 2012 (HSCA) gave NHS England the power to direct the Health and Social Care Information Centre (formerly the NHS Information Centre) to collect electronic patient records from GP practices. This was to be the first part of the 'care.data' initiative, the stated purpose of which is that using 'information about the care you have received, enables those involved in providing care and health services to improve the quality of care and health services for all'. [...] It is undeniable that the rising cost of health and social care provision is a huge societal problem, and it is also undeniable that health and social care services possess enormous quantities of hugely valuable patient data (whose value lies both in its potential benefits for future service provision, and in potential commercial benefits to the private sector) but I am by no means certain that the people whose data is involved understand what is proposed, or what the potential implications are. The suspicion – fair or not – that care.data is merely a front for the monetization of that valuable patient data, the suspicion that attempts were being made to implement it 'under the radar' (remember that, initially, no national publicity campaign, or opt-out procedure was proposed) and the apparent reluctance of its proponents to engage with the complex questions of what 'anonymisation' and 'pseudonymisation' mean in our increasingly technical world, lead me to doubt that care.data is, currently, proportionate to the problem it seeks to address."
Read more:
A series of blog posts by Jerry Fishenden (new tech observations from the UK).
From part 1:
"I’m going to bring together in a variety of posts (in no particular order and at random times) a very succinct summary of various aspects of the move towards online public services over the last couple of decades. This draws upon research we did at CTPR, along with personal engagement with some of these efforts, and discussions and debates with a whole host of people and organisations who have grappled with the problems and opportunities over the years."
Read more: