Rina Steenkamp - Privacy and technology
[Judgment in Case C-131/12 | Protecting and promoting the open internet | Privacy International challenges GCHQ's unlawful hacking of computers, mobile phones | Protecting personal data in online services - learning from the mistakes of others | Chip and skim - cloning EMV cards with the pre-play attack | Independent report on e-voting in Estonia | Can we trust anyone with our personal info? | [Complaint against Snapchat, Inc.] | Big data - Seizing opportunities, preserving values | Big data and privacy - A technological perspective | Over one hundred Internet companies call on FCC to protect network neutrality | Privacy Badger | Analyzing forged SSL certificates in the wild | Microsoft Security Intelligence Report - Volume 16 | What a toilet hoax can tell us about the future of surveillance | How urban anonymity disappears when all data is tracked | Policing by numbers - Big Data and the Fourth Amendment | The scored society - Due process for automated predictions | AccelPrint - Imperfections of accelerometers make smartphones trackable | Heartbleed's impact | Tax fraud gang targeted healthcare firms | The FTC and privacy and security duties for the cloud]
A judgment by the Court of Justice of the European Union.
From the press release:
"An internet search engine operator is responsible for the processing that it carries out of personal data which appear on web pages published by third parties. Thus, if, following a search made on the basis of a person's name, the list of results displays a link to a web page which contains information on the person in question, that data subject may approach the operator directly and, where the operator does not grant his request, bring the matter before the competent authorities in order to obtain, under certain conditions, the removal of that link from the list of results."
Read more:
See also:
Notice of proposed rulemaking by the Federal Communications Commission.
From the Introduction:
"We start with a fundamental question: What is the right public policy to ensure that the Internet remains open? This Notice of Proposed Rulemaking (Notice), and the comment process that follows, will turn on this fundamental question. Today, there are no legally enforceable rules by which the Commission can stop broadband providers from limiting Internet openness. This Notice begins the process of closing that gap, by proposing to reinstitute the no-blocking rule adopted in 2010 and creating a new rule that would bar commercially unreasonable actions from threatening Internet openness (as well as enhancing the transparency rule that is currently in effect)."
Read more:
See also:
A legal complaint by Privacy International.
From the press release:
"Privacy International today filed a legal complaint demanding an end to the unlawful hacking being carried out by GCHQ which, in partnership with the NSA, is infecting potentially millions of computer and mobile devices around the world with malicious software that gives them the ability to sweep up reams of content, switch on users' microphones or cameras, listen to their phone calls and track their locations. The complaint, filed in the UK's Investigatory Powers Tribunal, is the first UK legal challenge to the use of hacking tools by intelligence services. It contends that the infection of devices with malicious software, which enables covert intrusion into the devices and lives of ordinary people, is so invasive that it is incompatible with democratic principles and human rights standards. Moreover, given that GCHQ and the NSA have no clear lawful authority to conduct hacking, which if performed by a private individual would involve the commission of criminal offences, their conduct is unlawful and must be halted immediately."
Read more:
See also:
A report by the Information Commissioner's Office (ICO).
From the Introduction:
"This report describes eight frequently-arising computer security issues in an online environment that relate to data protection, together with a summary of good practice for how to guard against each issue. In many ICO data breach cases, the measures which could have prevented the breach or reduced the level of harm to individuals would have been simple to implement."
Read more:
A paper by Mike Bond, Omar Choudary, Steven J. Murdoch, Sergei Skorobogatov and Ross Anderson.
From the Abstract:
"EMV, also known as 'Chip and PIN', is the leading system for card payments worldwide. [...] We have discovered two serious problems: a widespread implementation flaw and a deeper, more difficult to fix flaw with the EMV protocol itself."
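The protocol flaw the authors describe hinges on the terminal's "unpredictable number" (UN), a nonce the card's transaction cryptogram depends on. Where a terminal generates the UN predictably (e.g., from a counter), an attacker with brief access to a card can harvest a cryptogram in advance and replay it later. A minimal sketch of that logic, with hypothetical names and HMAC standing in for the card's real DES-based MAC:

```python
import hashlib
import hmac

def arqc(card_key: bytes, un: int, amount: int) -> bytes:
    """Hypothetical stand-in for the card's transaction cryptogram.

    A real EMV ARQC is a 3DES-based MAC over many transaction fields;
    HMAC-SHA256 over (UN, amount) is used here only to illustrate the
    cryptogram's dependence on the 'unpredictable' number.
    """
    msg = un.to_bytes(4, "big") + amount.to_bytes(4, "big")
    return hmac.new(card_key, msg, hashlib.sha256).digest()

def weak_terminal_un(counter: int) -> int:
    """Flawed terminal: the 'unpredictable number' is just a counter."""
    return counter & 0xFFFFFFFF

card_key = b"victim-card-secret"

# Attacker predicts the terminal's next UN and harvests a matching
# cryptogram from the real card ahead of time (the "pre-play").
predicted_un = weak_terminal_un(counter=1001)
harvested = arqc(card_key, predicted_un, amount=10000)

# Later, the terminal issues that same UN, and the replayed cryptogram
# verifies as if the genuine card were present.
actual_un = weak_terminal_un(counter=1001)
assert harvested == arqc(card_key, actual_un, amount=10000)
```

A truly random UN would make the harvested cryptogram useless, since the replayed value would not match the nonce the terminal actually issues.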
Read more:
See also:
A security analysis by J. Alex Halderman, Harri Hursti, Jason Kitcat, Margaret MacAlpine, Drew Springall and Travis Finkenauer.
From the 'Our findings' page:
"After studying other e-voting systems around the world, the team was particularly alarmed by the Estonian I-voting system. It has serious design weaknesses that are exacerbated by weak operational management. It has been built on assumptions which are outdated and do not reflect the contemporary reality of state-level attacks and sophisticated cybercrime. These problems stem from fundamental architectural problems that cannot be resolved with quick fixes or interim steps. While we believe e-government has many promising uses, the Estonian I-voting system carries grave risks — elections could be stolen, disrupted, or cast into disrepute. In light of these problems, our urgent recommendation is that to maintain the integrity of the Estonian electoral process, use of the Estonian I-voting system should be immediately discontinued."
Read more:
A blog post by John Hawes (Naked Security).
From the blog post:
"In the last few weeks, two very different criminal cases have concluded on opposite sides of the Atlantic, each of them showing how vulnerable our personal information is to those eager to exploit it."
Read more:
A publication by the FTC.
From the press release:
"Snapchat, the developer of a popular mobile messaging app, has agreed to settle Federal Trade Commission charges that it deceived consumers with promises about the disappearing nature of messages sent through the service. The FTC case also alleged that the company deceived consumers over the amount of personal data it collected and the security measures taken to protect that data from misuse and unauthorized disclosure. In fact, the case alleges, Snapchat's failure to secure its Find Friends feature resulted in a security breach that enabled attackers to compile a database of 4.6 million Snapchat usernames and phone numbers."
Read more:
See also:
A report by John Podesta, Penny Pritzker, Ernest J. Moniz, John Holdren and Jeffrey Zients.
From the accompanying letter to the President:
"Big data technologies will be transformative in every sphere of life. The knowledge discovery they make possible raises considerable questions about how our framework for privacy protection applies in a big data ecosystem. Big data also raises other concerns. A significant finding of this report is that big data analytics have the potential to eclipse longstanding civil rights protections in how personal information is used in housing, credit, employment, health, education, and the marketplace. Americans’ relationship with data should expand, not diminish, their opportunities and potential."
Read more:
See also:
A report by the President's Council of Advisors on Science and Technology.
From the accompanying letter to the President:
"Big data drives big benefits, from innovative businesses to new ways to treat diseases. The challenges to privacy arise because technologies collect so much data (e.g., from sensors in everything from phones to parking lots) and analyze them so efficiently (e.g., through data mining and other kinds of analytics) that it is possible to learn far more than most people had anticipated or can anticipate given continuing progress. These challenges are compounded by limitations on traditional technologies used to protect privacy (such as de-identification). PCAST concludes that technology alone cannot protect privacy, and policy intended to protect privacy needs to reflect what is (and is not) technologically feasible. In light of the continuing proliferation of ways to collect and use information about people, PCAST recommends that policy focus primarily on whether specific uses of information about people affect privacy adversely. It also recommends that policy focus on outcomes, on the 'what' rather than the 'how,' to avoid becoming obsolete as technology advances."
Read more:
See also:
An initiative by, among others, the New America Foundation.
From the press release:
"Next week, on May 15, the Federal Communications Commission (FCC) will officially propose rules regarding the Open Internet. According to press reports and FCC briefings, the rules would authorize phone and cable ISPs to create a two-tiered internet of slow lanes and fast lanes and to charge web companies for access to the 'fast lanes,' transforming the internet’s current level playing field and threatening innovation and entrepreneurship. Today, a broad cross-section of over a hundred Internet companies and innovators filed a letter calling on the FCC to abandon its apparent path and instead to protect and preserve an open, equal internet. The signers -- a diverse group including tiny start-ups, household names, and industry giants -- called for Open Internet Rules that afford companies and entrepreneurs strong protections against online discrimination and individualized bargaining."
Read more:
A browser add-on by EFF.
From 'Frequently asked questions':
"Privacy Badger is a browser add-on that stops advertisers and other third-party trackers from secretly tracking where you go and what pages you look at on the web. If an advertiser seems to be tracking you across multiple websites without your permission, Privacy Badger automatically blocks that advertiser from loading any more content in your browser. To the advertiser, it's like you suddenly disappeared."
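The behavior the FAQ describes is heuristic rather than list-based: a third-party domain gets blocked once it has been observed tracking across enough distinct sites. A minimal model of that idea (the real extension's tracking detection is far richer, and the three-site threshold here is its documented default):

```python
from collections import defaultdict

BLOCK_THRESHOLD = 3  # block a domain seen tracking on this many sites

seen_on = defaultdict(set)  # tracker domain -> first-party sites it tracked on
blocked = set()

def observe(first_party: str, third_party: str, sets_tracking_state: bool) -> None:
    """Record a third-party request; block the domain if it tracks widely."""
    if third_party == first_party or not sets_tracking_state:
        return
    seen_on[third_party].add(first_party)
    if len(seen_on[third_party]) >= BLOCK_THRESHOLD:
        blocked.add(third_party)

# The same tracker appears (and sets tracking state) on three sites...
for site in ["news.example", "shop.example", "blog.example"]:
    observe(site, "tracker.example", sets_tracking_state=True)

# ...so from now on its content is no longer loaded.
assert "tracker.example" in blocked
```

The advantage over a static blocklist is that the heuristic needs no curation: any domain that behaves like a cross-site tracker ends up blocked, whether or not anyone has catalogued it.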
Read more:
A paper by Lin-Shung Huang, Alex Rice, Erling Ellingsen and Collin Jackson.
Abstract:
"The SSL man-in-the-middle attack uses forged SSL certificates to intercept encrypted connections between clients and servers. However, due to a lack of reliable indicators, it is still unclear how commonplace these attacks occur in the wild. In this work, we have designed and implemented a method to detect the occurrence of SSL man-in-the-middle attack on a top global website, Facebook. Over 3 million real-world SSL connections to this website were analyzed. Our results indicate that 0.2% of the SSL connections analyzed were tampered with forged SSL certificates, most of them related to antivirus software and corporate-scale content filters. We have also identified some SSL connections intercepted by malware. Limitations of the method and possible defenses to such attacks are also discussed."
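The detection idea is that a client-side component reports the certificate it actually received back to the site, which compares it against the site's legitimate certificates; any mismatch indicates interception. A minimal sketch of that comparison, using hypothetical fingerprint values (the paper's implementation used a Flash applet to capture the observed chain):

```python
import hashlib

# Pinned SHA-256 fingerprints of the site's legitimate leaf certificates
# (hypothetical DER bytes; a real deployment would pin actual cert hashes).
KNOWN_GOOD = {
    hashlib.sha256(b"legitimate-site-leaf-cert-der").hexdigest(),
}

def classify_connection(observed_cert_der: bytes) -> str:
    """Flag a connection whose observed leaf cert doesn't match the pins."""
    fingerprint = hashlib.sha256(observed_cert_der).hexdigest()
    return "genuine" if fingerprint in KNOWN_GOOD else "possibly forged"

# A connection intercepted by, e.g., an antivirus proxy presents the
# proxy's own certificate, which fails the comparison.
assert classify_connection(b"legitimate-site-leaf-cert-der") == "genuine"
assert classify_connection(b"antivirus-proxy-cert-der") == "possibly forged"
```

Measuring from the server side like this is what lets the authors estimate the in-the-wild interception rate (their 0.2%) rather than relying on client complaints.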
Read more:
See also:
A report by Microsoft.
From the related blog post:
"Foremost among the tactics many attackers are using is 'deceptive downloads.' In more than 95% of the 110 countries/regions we studied, deceptive downloads were a top threat. Cybercriminals are secretly bundling malicious items with legitimate content such as software, games or music. Taking advantage of people's desire to get a good deal, cybercriminals are bundling malware with free programs and free software packages that can be downloaded online. For example, a typical scenario is someone that has a file they downloaded from a website that they can't open because they don't appear to have the right software installed to open it. As a result, they search online and come across a free software download that might help them open the file. The free download also comes with other add-ons. In addition to what the person thought they were getting, the download also installs malware. The malware may be installed immediately or at a later date as it assesses the victim's computer's profile. It could be months or even years before the victim notices the infection, as often these malicious items operate behind the scenes with the only visible effect being slower performance on the system that was infected."
Read more:
An article by Jennifer Golbeck (The Atlantic).
From the article:
"Quantified Toilet got me thinking about issues that are integral to the way we assess the interplay between technology and privacy. The concerns these hypothetical toilets raise will only continue to come up as smart technologies become more integrated with daily life. While the project may have been a hoax, surveillance in public spaces is rising—and it isn't restricted to cameras. There are sophisticated sensors for monitoring all kinds of activities. [...] Sensors of all types are easily connected to the Internet. They can collect vast amounts of data, which can then be shared widely. As citizens, we don't always know what data is being collected, who can access it, or how it will be used. Even seemingly secure networks can be compromised. We should be leading conversations about the legal privacy protections we need to establish for what once seemed to be private activities. In a data-rich connected world, even the most intimate spaces are becoming public."
Read more:
A blog post by Quentin Hardy (NYT Bits).
From the blog post:
"Cities are our paradises of anonymity, a place for both self-erasure and self-reinvention. But soon, cities may fall first in the disappearance, or at least a radical remaking, of privacy. Information about our innocuous public acts is denser in urban areas, and can now be cheaply aggregated. Cameras and sensors, increasingly common in the urban landscape, pick up all sorts of behaviors. These are stored and categorized to draw personal conclusions - all of it, thanks to cheap electronics and cloud computing, for affordable sums."
Read more:
An article by Elizabeth E. Joh (Washington Law Review).
From the Introduction:
"This article identifies three uses of big data that hint at the future of policing and the questions these tools raise about conventional Fourth Amendment analysis. Two of these examples, predictive policing and mass surveillance systems, have already been adopted by a small number of police departments around the country. A third example—the potential use of DNA databank samples—presents an untapped source of big data analysis. Whether any of these three examples of big data policing attract more widespread adoption by the police is yet unknown, but it is likely that the prospect of being able to analyze large amounts of information quickly and cheaply will prove to be attractive. While seemingly quite distinct, these three uses of big data suggest the need to draw new Fourth Amendment lines now that the government has the capability and desire to collect and manipulate large amounts of digitized information."
Read more:
An article by Danielle Keats Citron and Frank Pasquale (Washington Law Review).
Abstract:
"Big Data is increasingly mined to rank and rate individuals. Predictive algorithms assess whether we are good credit risks, desirable employees, reliable tenants, valuable customers—or deadbeats, shirkers, menaces, and 'wastes of time.' Crucial opportunities are on the line, including the ability to obtain loans, work, housing, and insurance. Though automated scoring is pervasive and consequential, it is also opaque and lacking oversight. In one area where regulation does prevail—credit—the law focuses on credit history, not the derivation of scores from data. Procedural regularity is essential for those stigmatized by 'artificially intelligent' scoring systems. The American due process tradition should inform basic safeguards. Regulators should be able to test scoring systems to ensure their fairness and accuracy. Individuals should be granted meaningful opportunities to challenge adverse decisions based on scores miscategorizing them. Without such protections in place, systems could launder biased and arbitrary data into powerfully stigmatizing scores."
Read more:
A paper by Sanorita Dey, Nirupam Roy, Wenyuan Xu, Romit Roy Choudhury and Srihari Nelakuditi.
From the Abstract:
"As mobile begins to overtake the fixed Internet access, ad networks have aggressively sought methods to track users on their mobile devices. While existing countermeasures and regulation focus on thwarting cookies and various device IDs, this paper submits a hypothesis that smartphone/tablet accelerometers possess unique fingerprints, which can be exploited for tracking users. We believe that the fingerprints arise from hardware imperfections during the sensor manufacturing process, causing every sensor chip to respond differently to the same motion stimulus. The differences in responses are subtle enough that they do not affect most of the higher level functions computed on them. Nonetheless, upon close inspection, these fingerprints emerge with consistency, and can even be somewhat independent of the stimulus that generates them."
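The fingerprint arises because each chip's manufacturing bias and gain error shift its response to an identical stimulus by a small, stable amount. A toy illustration with hypothetical readings (the paper extracts a much larger set of time- and frequency-domain features; mean and standard deviation alone suffice to show the effect):

```python
import statistics

def fingerprint(samples: list[float]) -> tuple[float, float]:
    """Reduce an accelerometer trace to a tiny feature vector.

    A fixed hardware offset shifts the mean, and a gain error scales the
    standard deviation, so the pair is a stable per-chip signature.
    """
    return (round(statistics.mean(samples), 3),
            round(statistics.stdev(samples), 3))

# Two hypothetical chips responding to the same vibration stimulus:
# chip B has a slight offset (+0.013) and gain error (x1.02).
stimulus = [0.0, 1.0, 0.0, -1.0] * 50
chip_a = list(stimulus)
chip_b = [1.02 * x + 0.013 for x in stimulus]

# The higher-level motion signal is nearly identical, but the feature
# vectors are consistently distinguishable across the two chips.
assert fingerprint(chip_a) != fingerprint(chip_b)
```

This is also why the countermeasures the abstract mentions (blocking cookies and device IDs) don't help: the identifier is a physical property of the sensor, not a stored value that can be cleared.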
Read more:
A report by Lee Rainie and Maeve Duggan (Pew Research Center).
From 'About this report':
"In early April, a major security flaw affecting perhaps 500,000 or more websites was announced and fixed. But the patch to the “secure socket” program that is supposed to encrypt and protect user information on secure websites was only made after more than two years of vulnerability on some of the most heavily trafficked sites, including Facebook, Google, YouTube, Yahoo and Wikipedia. Analysts warned that untold numbers of internet users might have had key personal information compromised either in their use of those websites, or their use of email, instant messaging, and even supposedly secure virtual private networks. This report covers public response to the revelation of the security code flaw."
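The flaw behind the report (Heartbleed, in OpenSSL's heartbeat extension) came down to a missing bounds check: the server echoed back as many bytes as the request *claimed* to contain, without checking that against the bytes actually received, so adjacent process memory leaked to the requester. A simplified, hypothetical model of the bug and its fix:

```python
# Process memory as one contiguous buffer: the received payload sits
# next to secrets (keys, session data) belonging to other connections.
PROCESS_MEMORY = b"PAYLOADsecret-session-key-material"

def heartbeat_vulnerable(payload: bytes, claimed_len: int) -> bytes:
    # BUG: trusts claimed_len and reads past the real payload,
    # returning whatever happens to sit in adjacent memory.
    return PROCESS_MEMORY[:claimed_len]

def heartbeat_fixed(payload: bytes, claimed_len: int) -> bytes:
    # FIX: silently discard requests whose claimed length exceeds
    # what was actually sent (as the OpenSSL patch does).
    if claimed_len > len(payload):
        raise ValueError("heartbeat length exceeds received payload")
    return payload[:claimed_len]

# The attacker sends 7 bytes but claims 30, and gets secrets back.
leak = heartbeat_vulnerable(b"PAYLOAD", claimed_len=30)
assert b"secret-session-key" in leak
```

Because the leaked bytes are whatever neighbors the payload in memory, repeated requests could sweep out passwords, cookies, and even the server's private key, with nothing logged.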
Read more:
A blog post by Brian Krebs (Krebs on Security).
From the blog post:
"Earlier this month, I wrote about an organized cybercrime gang that has been hacking into HR departments at organizations across the country and filing fraudulent tax refund requests with the IRS on employees of those victim firms. Today, we'll look a bit closer at the activities of this crime gang, which appears to have targeted a large number of healthcare and senior living organizations that were all using the same third-party payroll and HR services provider."
Read more:
An essay by Daniel J. Solove and Woodrow Hartzog.
Abstract:
"Increasingly, companies, hospitals, schools, and other organizations are using cloud service providers (and also other third party data service providers) to store and process the personal data of their customers, patients, clients, and others. When an entity shares people’s personal data with a cloud service provider, this data is protected in large part through a contract between the organization and the cloud service provider. In many cases, however, these contracts fail to contain key protections of data. Because the consumer is not a direct party to these contracts and often cannot even have access to these contracts, the consumer is often powerless, and the consumer's interests are often not adequately represented. In this short essay, we argue that there is a remedy in Section 5 of the Federal Trade Commission (FTC) Act that prohibits unfair and deceptive trade practices. Certain key cases from the emerging body of FTC enforcement actions on data protection issues can be read together to create a double-edged set of duties – both on the organizations contracting with cloud service providers and on the cloud service providers themselves. Not only does an organization owe a duty to consumers to appropriately represent their privacy and data security interests in the negotiation, but cloud service providers have an obligation to the consumer as well, and cannot enter into contracts that lack adequate protections and controls."
Read more: