What’s new in research: full-access papers on machine learning with personal data, the ethics of Big Data as a public good

Today pdpecho inaugurates a weekly post curating recently published research articles, papers, studies, and dissertations in the field of data protection and privacy that are available under an open access regime.

This week there are three recommended pieces for your weekend reading. The first article, published by researchers from Queen Mary University of London and Cambridge University, provides an analysis of the impact of using machine learning to conduct profiling of individuals in the context of the EU General Data Protection Regulation.

The second article presents the view of a researcher in International Development at the University of Amsterdam on the new trend in humanitarian work of treating data as a public good, regardless of whether the data are personal.

The last paper is a draft authored by a law student at Yale (published on SSRN), which explores an interesting phenomenon: how data brokers have begun to sell data products to individual consumers interested in tracking the activities of love interests, professional contacts, and other people of interest. The paper underlines that the US privacy law system lacks protection for individuals whose data are sold in this scenario and proposes a solution.

1) Machine Learning with Personal Data (by Dimitra Kamarinou, Christopher Millard, Jatinder Singh)

“This paper provides an analysis of the impact of using machine learning to conduct profiling of individuals in the context of the EU General Data Protection Regulation.

We look at what profiling means and at the right that data subjects have not to be subject to decisions based solely on automated processing, including profiling, which produce legal effects concerning them or significantly affect them. We also look at data subjects’ right to be informed about the existence of automated decision-making, including profiling, and their right to receive meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing.

The purpose of this paper is to explore the application of relevant data protection rights and obligations to machine learning, including implications for the development and deployment of machine learning systems and the ways in which personal data are collected and used. In particular, we consider what compliance with the first data protection principle of lawful, fair, and transparent processing means in the context of using machine learning for profiling purposes. We ask whether automated processing utilising machine learning, including for profiling purposes, might in fact offer benefits and not merely present challenges in relation to fair and lawful processing.”

The paper was published as “Queen Mary School of Law Legal Studies Research Paper No. 247/2016”.

2) The Ethics of Big Data as a Public Good: Which Public? Whose Good? (by Linnet Taylor)

“International development and humanitarian organizations are increasingly calling for digital data to be treated as a public good because of its value in supplementing scarce national statistics and informing interventions, including in emergencies. In response to this claim, a ‘responsible data’ movement has evolved to discuss guidelines and frameworks that will establish ethical principles for data sharing. However, this movement is not gaining traction with those who hold the highest-value data, particularly mobile network operators who are proving reluctant to make data collected in low- and middle-income countries accessible through intermediaries.

This paper evaluates how the argument for ‘data as a public good’ fits with the corporate reality of big data, exploring existing models for data sharing. I draw on the idea of corporate data as an ecosystem involving often conflicting rights, duties and claims, in comparison to the utilitarian claim that data’s humanitarian value makes it imperative to share them. I assess the power dynamics implied by the idea of data as a public good, and how differing incentives lead actors to adopt particular ethical positions with regard to the use of data.”

This article is part of the themed issue ‘The ethical impact of data science’ in “Philosophical Transactions of the Royal Society A”.

3) What Happens When an Acquaintance Buys Your Data?: A New Privacy Harm in the Age of Data Brokers (by Theodore Rostow)

“Privacy scholarship to date has failed to consider a new development in the commercial privacy landscape. Data brokers have begun to sell data products to individual consumers interested in tracking the activities of love interests, professional contacts, and other people of interest. This practice creates an avenue for a new type of privacy harm — “insider control” — which privacy scholarship has yet to recognize.

U.S. privacy laws fail to protect consumers from the possibility of insider control. Apart from two noteworthy frameworks that might offer paths forward, none of the viable reforms offered by privacy scholars would meaningfully limit consumers’ vulnerability. This Note proposes changes to existing privacy doctrines in order to reduce consumers’ exposure to this new harm.”

This paper was published as a draft on SSRN. According to SSRN, the final version will appear in volume 34 of the Yale Journal on Regulation.

***

Find what you’re reading useful? Please consider supporting pdpecho.
