Tag Archives: GDPR

What’s new in research: full-access papers on machine learning with personal data, the ethics of Big Data as a public good

Today pdpecho inaugurates a weekly post curating recently published research articles, papers, studies and dissertations in the field of data protection and privacy that are available under an open access regime.

This week there are three recommended pieces for your weekend read. The first article, published by researchers from Queen Mary University of London and Cambridge University, provides an analysis of the impact of using machine learning to conduct profiling of individuals in the context of the EU General Data Protection Regulation.

The second article is the view of a researcher specialised in International Development, from the University of Amsterdam, on the new trend in humanitarian work to consider data as a public good, regardless of whether it is personal or not.

The last paper is a draft authored by a law student at Yale (published on SSRN), which explores an interesting phenomenon: how data brokers have begun to sell data products to individual consumers interested in tracking the activities of love interests, professional contacts, and other people of interest. The paper underlines that the US privacy law system lacks protection for individuals whose data are sold in this scenario and proposes a solution.

1) Machine Learning with Personal Data (by Dimitra Kamarinou, Christopher Millard, Jatinder Singh)

“This paper provides an analysis of the impact of using machine learning to conduct profiling of individuals in the context of the EU General Data Protection Regulation.

We look at what profiling means and at the right that data subjects have not to be subject to decisions based solely on automated processing, including profiling, which produce legal effects concerning them or significantly affect them. We also look at data subjects’ right to be informed about the existence of automated decision-making, including profiling, and their right to receive meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing.

The purpose of this paper is to explore the application of relevant data protection rights and obligations to machine learning, including implications for the development and deployment of machine learning systems and the ways in which personal data are collected and used. In particular, we consider what compliance with the first data protection principle of lawful, fair, and transparent processing means in the context of using machine learning for profiling purposes. We ask whether automated processing utilising machine learning, including for profiling purposes, might in fact offer benefits and not merely present challenges in relation to fair and lawful processing.”

The paper was published as “Queen Mary School of Law Legal Studies Research Paper No. 247/2016”.

2) The ethics of big data as a public good: which public? Whose good? (by Linnet Taylor)

“International development and humanitarian organizations are increasingly calling for digital data to be treated as a public good because of its value in supplementing scarce national statistics and informing interventions, including in emergencies. In response to this claim, a ‘responsible data’ movement has evolved to discuss guidelines and frameworks that will establish ethical principles for data sharing. However, this movement is not gaining traction with those who hold the highest-value data, particularly mobile network operators who are proving reluctant to make data collected in low- and middle-income countries accessible through intermediaries.

This paper evaluates how the argument for ‘data as a public good’ fits with the corporate reality of big data, exploring existing models for data sharing. I draw on the idea of corporate data as an ecosystem involving often conflicting rights, duties and claims, in comparison to the utilitarian claim that data’s humanitarian value makes it imperative to share them. I assess the power dynamics implied by the idea of data as a public good, and how differing incentives lead actors to adopt particular ethical positions with regard to the use of data.”

This article is part of the themed issue ‘The ethical impact of data science’ in “Philosophical Transactions of the Royal Society A”.

3) What Happens When an Acquaintance Buys Your Data?: A New Privacy Harm in the Age of Data Brokers (by Theodore Rostow)

“Privacy scholarship to date has failed to consider a new development in the commercial privacy landscape. Data brokers have begun to sell data products to individual consumers interested in tracking the activities of love interests, professional contacts, and other people of interest. This practice creates an avenue for a new type of privacy harm — “insider control” — which privacy scholarship has yet to recognize.

U.S. privacy laws fail to protect consumers from the possibility of insider control. Apart from two noteworthy frameworks that might offer paths forward, none of the viable reforms offered by privacy scholars would meaningfully limit consumers’ vulnerability. This Note proposes changes to existing privacy doctrines in order to reduce consumers’ exposure to this new harm.”

This paper was published as a draft on SSRN. According to SSRN, the final version will be published in the 34th volume of the Yale Journal on Regulation.

***

Find what you’re reading useful? Please consider supporting pdpecho.

Even if post-Brexit UK adopts the GDPR, it will be left without its “heart”

Gabriela Zanfir Fortuna


There has lately been a wave of optimism among those looking for legal certainty that the GDPR will be adopted by the UK even after the country leaves the European Union. This wave was prompted by a declaration of the British Secretary of State, Karen Bradley, at the end of October, when she stated before a Committee of Parliament that “We will be members of the EU in 2018 and therefore it would be expected and quite normal for us to opt into the GDPR and then look later at how best we might be able to help British business with data protection while maintaining high levels of protection for members of the public”. The Information Commissioner of the UK, Elizabeth Denham, welcomed the news. On the other hand, as Amberhawk explained in detail, this will not mean that the UK will automatically be considered as ensuring an adequate level of protection.

The truth is that as long as the UK is still a Member State of the EU, it cannot opt in to or out of regulations (other than those subject to the exemptions negotiated by the UK when it entered the Union – which is not the case for the GDPR). Regulations are “binding in their entirety” and “directly applicable”, according to Article 288 of the Treaty on the Functioning of the EU. So, yes, quite normally, if the UK is still a Member State of the EU on 25 May 2018, then the GDPR will start applying in the UK just as it will in Estonia or France.

The fate of the GDPR after Brexit becomes effective will be as uncertain as the fate of all other EU legislative acts transposed in the UK or directly applicable there. But let’s imagine the GDPR will remain national law after Brexit, in one form or another. If this happens, it is likely to take on a life of its own, departing from harmonised application throughout the EU. First and foremost, the GDPR in the UK will not be applied in the light of the Charter of Fundamental Rights of the EU, and especially its Article 8 – the right to the protection of personal data. The Charter has played an extraordinary role in the strengthening of data protection in the EU since it became binding in 2009, being invoked by the Court of Justice of the EU in its landmark judgments – Google v Spain, Digital Rights Ireland and Schrems.

The Court held as far back as 2003 that “the provisions of Directive 95/46, in so far as they govern the processing of personal data liable to infringe fundamental freedoms, in particular the right to privacy, must necessarily be interpreted in the light of fundamental rights” (Österreichischer Rundfunk, para 68). This principle was repeated in most of the following cases interpreting Directive 95/46 and other relevant secondary law for this field, perhaps with the most notable results in Digital Rights Ireland and Schrems. 

See, for instance:

“As far as concerns the rules relating to the security and protection of data retained by providers of publicly available electronic communications services or of public communications networks, it must be held that Directive 2006/24 does not provide for sufficient safeguards, as required by Article 8 of the Charter, to ensure effective protection of the data retained against the risk of abuse and against any unlawful access and use of that data” (Digital Rights Ireland, para. 66).

“As regards the level of protection of fundamental rights and freedoms that is guaranteed within the European Union, EU legislation involving interference with the fundamental rights guaranteed by Articles 7 and 8 of the Charter must, according to the Court’s settled case-law, lay down clear and precise rules governing the scope and application of a measure and imposing minimum safeguards, so that the persons whose personal data is concerned have sufficient guarantees enabling their data to be effectively protected against the risk of abuse and against any unlawful access and use of that data. The need for such safeguards is all the greater where personal data is subjected to automatic processing and where there is a significant risk of unlawful access to that data” (Schrems, para. 91).

Applying data protection law outside the spectrum of fundamental rights will most likely not ensure sufficient protection for the individual. While the UK will still remain under the legal effect of the European Convention on Human Rights and its Article 8 – respect for private life – this by no means equates to the specific protection afforded to personal data by Article 8 of the Charter as interpreted and applied by the CJEU.

Not only will the Charter not be binding on the UK post-Brexit, but the Court of Justice of the EU will no longer have jurisdiction over the UK (unless some sort of spectacular agreement is negotiated for Brexit). Moreover, EU law will not enjoy supremacy over national law, as is the case right now. This means that British data protection law will be able to depart from the European standard (the GDPR) to the extent desired by the legislature. For instance, there will be nothing standing in the way of the British legislature adopting permissive exemptions from the rights of the data subject, pursuant to Article 23 GDPR.

So when I mentioned in the title that the GDPR in post-Brexit UK will in any case be left without its “heart”, I was referring to its application and interpretation in the light of the Charter of Fundamental Rights of the EU.

***

Find what you’re reading useful? Please consider supporting pdpecho.

Interested in the GDPR? See the latest posts:

CNIL just published the results of their GDPR public consultation: what’s in store for DPOs and data portability? (Part I)

CNIL’s public consultation on the GDPR: what’s in store for Data Protection Impact Assessments and certification mechanisms? (Part II)

The GDPR already started to appear in CJEU’s soft case-law (AG Opinion in Manni)

CNIL just published the results of their GDPR public consultation: what’s in store for DPOs and data portability? (Part I)

Gabriela Zanfir Fortuna

The French Data Protection Authority, the CNIL, made public this week the report on the public consultation it held among professionals between 16 June and 19 July 2016 about the General Data Protection Regulation (GDPR). The public consultation gathered 540 replies from 225 contributors.

The CNIL focused on four main themes in the consultation:

  • the data protection officer;
  • the right to data portability;
  • the data protection impact assessments;
  • the certification mechanism.

These are also the four themes in the action plan of the Article 29 Working Party for 2016.

This post (Part I) will summarise the results and action plan for the first two themes, while the last two will be dealt with in a second post (Part II). [Disclaimer: all quotations are translated from French].

1) On the data protection officer

According to Article 37 GDPR, both the controller and the processor must designate a data protection officer where the processing is carried out by a public authority (Article 37(1)(a)), where their core activities consist of processing operations which require regular and systematic monitoring of data subjects on a large scale ((1)(b)), or where their core activities consist of processing sensitive data on a large scale ((1)(c)).
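The three triggers work as a simple disjunction: if any one of them applies, designation is mandatory. A toy encoding can make that structure explicit (illustrative only, and the boolean inputs are an invented simplification – the underlying legal tests, such as what counts as “core activities” or “large scale”, require case-by-case assessment):

```python
def dpo_required(is_public_authority: bool,
                 core_large_scale_monitoring: bool,
                 core_large_scale_sensitive_data: bool) -> bool:
    """Rough sketch of Article 37(1) GDPR: a DPO must be designated
    if ANY of the three conditions (a), (b) or (c) holds."""
    return (is_public_authority
            or core_large_scale_monitoring
            or core_large_scale_sensitive_data)

# A private company whose core business is large-scale behavioural tracking
# falls under (1)(b), so designation is mandatory:
print(dpo_required(False, True, False))  # True
```

The point of the sketch is only that the conditions are alternative, not cumulative: a small private company with no large-scale processing falls outside all three and may still appoint a DPO voluntarily, which is exactly the scenario several contributors asked about.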

The report reveals that there are many more questions than answers or opinions about how Article 37 should be applied in practice. In fact, most of the contributions are questions from the contributors (see pages 2 to 4). They raise interesting points, such as:

  • What is considered to be a conflict of interest – who will not be able to be appointed?
  • Should the DPO be appointed before May 2018 (when the GDPR becomes applicable)?
  • Will the CNIL validate the mandatory or the optional designation of a DPO?
  • What exactly will be the role of the DPO in initiating and drafting the data protection impact assessments?
  • What are the internal consequences if the recommendations of the DPO are not respected?
  • Could the DPO become liable under criminal law for how he/she monitors compliance with the GDPR?
  • Should the DPO be in charge of keeping the register of processing operations, and should the register be communicated to the public?
  • Should only the contact details of the DPO be published, or also his/her identity?
  • Must the obligations in the GDPR also be applied to the appointment of a DPO made voluntarily (outside the three scenarios in Article 37(1))?
  • Can a DPO be, in fact, a team? Can a DPO be a legal person?
  • Are there any special conditions with regard to the DPO for small and medium enterprises?

The CNIL underlines that on this topic an important contribution came from large professional associations during discussions, in addition to the large number of replies received online.

In fact, according to the report, the CNIL acknowledges “the big expectations of professional associations and federations to receive clarifications with regard to the function of the DPO, as they want to prepare as soon as possible and in a sustainable way for the new obligations” (p. 5).

As for future steps, the CNIL recalls that the Article 29 Working Party will publish Guidelines to help controllers in a practical manner, according to the 2016 action plan. (There’s not much left of 2016, so hopefully we’ll see the Guidelines soon!). The CNIL announces they will also launch some national communication campaigns and they will intensify the training sessions and workshops with the current CILs (Correspondants Informatique et Libertés – a role similar to that of a DPO).

2) On the right to data portability


Article 20 GDPR provides that the data subject has the right to receive a copy of their data in a structured, commonly used and machine-readable format and to transmit those data to another controller, but only where the processing is based on consent or on a contract and is carried out by automated means.

First, the CNIL notes that there was “a very strong participation of the private sector submitting opinions or queries regarding the right to data portability, particularly interested in the scope of the new right, the costs its implementation will entail and its consequences for competition” (p. 6).

According to the report, the right to data portability is perceived as an instrument that allows regaining the trust of individuals in the processing of their personal data, bringing more transparency and more control over processing operations (p. 6).

On the other hand, the organisations that replied to the public consultation are concerned about the additional investments they will need to make to implement this right. They are also concerned about (p. 6):

  • “the risk of creating an imbalance in competition between European and American companies, as European companies are directly under the obligation to comply with this right, whereas American companies may try to circumvent the rules”. My comment here would be that they should not be concerned about that, because if they target the same European public to offer services, American companies will also be under a direct obligation to comply with this right.
  • “the immediate cost of implementing this right (for instance, the development of automatic means to extract data from databases), which cannot be charged to the individuals, but which will be a part of the management costs and will increase the costs for the services”.
  • “the level of responsibility if the data are mishandled or if the data handed over to the person are not up to date”.

The respondents to the public consultation seem to be a good resource for technical options to use in terms of the format needed to transfer data. Respondents argued in favor of open source formats, which will make reusing the data easier and which will be cheaper compared to proprietary solutions. Another suggested solution is the development of Application Program Interfaces (APIs) based on open standards, without a specific licence key. This way the persons will be able to use the tools of their choice.
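The open-format suggestion is easy to make concrete. Here is a minimal sketch (the function name, record layout and values are all invented for illustration) of a controller serialising a data subject’s records into two open, machine-readable formats, JSON and CSV, using only standard tooling, so that the individual – or a recipient controller – can reuse the data without any proprietary licence:

```python
import csv
import io
import json

def export_user_data(records, fmt="json"):
    """Serialise a data subject's records (a list of dicts) into an
    open, machine-readable format: 'json' or 'csv'."""
    if fmt == "json":
        return json.dumps(records, indent=2, sort_keys=True)
    if fmt == "csv":
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=sorted(records[0]))
        writer.writeheader()
        writer.writerows(records)
        return buf.getvalue()
    raise ValueError("unsupported format: " + fmt)

# Hypothetical records held by a controller about one data subject:
records = [{"email": "subject@example.com", "signup": "2016-07-19"}]
print(export_user_data(records, "json"))
print(export_user_data(records, "csv"))
```

Both outputs can be parsed by any other controller’s systems with ubiquitous free tooling, which is precisely the argument the respondents made for open formats over proprietary ones.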

One of the needs that emerged from the consultation was to clarify whether the data subject to the right to portability must be raw data, or whether transferring a “summary” of the data would suffice. Another question was whether the data could be requested by a competing company with a mandate from the data subject. There were also questions regarding the interplay between the right to data portability and the right of access, and how data security could be ensured for the transfer of the “ported” data.

In the concluding part, the CNIL acknowledges that two trends could already be seen in the replies: on the one hand, companies tend to want to limit as much as possible the applicability of the right to data portability, while on the other hand, representatives of civil society are looking to encourage persons to take their data into their own hands and to reinvent their use (p. 10).

According to the report, the Technology Subgroup of the Article 29 Working Party is currently drafting guidelines with regard to the right to data portability. “They will clarify the field of application of this right, taking into account all the questions raised by the participants in the consultation, and they will also detail ways to reply to portability requests”, according to the report (p. 10).

***

Find what you’re reading useful? Consider supporting pdpecho.

Click HERE for Part II of this post.

The GDPR already started to appear in CJEU’s soft case-law (AG Opinion in Manni)

CJEU’s AG Bot referred to the GDPR in his recent ‘right to be forgotten’ Opinion

It may only become applicable on 25 May 2018, but the GDPR already made its official debut in the case-law of the CJEU.

It was the last paragraph (§101) of the Opinion of AG Bot in Case C-398/15 Manni, published on 8 September, that specifically referred to Regulation 2016/679 (the official name of the GDPR). The case concerns the question of whether the right to erasure (the accurate name of the more famous “right to be forgotten”) as enshrined in Article 12 of Directive 95/46 also applies to the personal data of entrepreneurs recorded in the Public Registry of companies, if their organisation went bankrupt years ago. Curiously, the preliminary ruling question doesn’t specifically refer to the right to erasure, but to the obligation in Article 6(1)(e) for controllers not to retain data longer than necessary to achieve the purpose for which they were collected.

In fact, Mr Manni had requested his regional Chamber of Commerce to erase his personal data from the Public Registry of Companies after he found out that he was losing clients who performed background checks on him through a private company specialised in finding information in the Public Registry. This happened because Mr Manni had been an administrator of a company that was declared bankrupt more than 10 years before the facts in the main proceedings. The former company itself had meanwhile been struck off the Public Registry (§30).

Disclaimer! The Opinion is not yet available in English, only in a handful of other official languages of the EU. Therefore, the following quotes are all my translation from French or Romanian.

AG Bot advised the Court to reply to the preliminary ruling questions in the sense that all personal data in the Public Registry of companies should be retained there indefinitely, irrespective of the fact that companies to whose administrators the data refer are still active or not. “Public Registries of companies cannot achieve their main purpose, namely the consolidation of legal certainty by disclosing, in accordance with the transparency principle, legally accurate information, if access to this information would not be allowed indefinitely to all third parties” (§98).

The AG adds that “the choice of natural persons to get involved in the economic life through a commercial company implies a permanent requirement of transparency. For this main reason, detailed throughout the Opinion, I consider that the interference in the right to the protection of personal data that are registered in a Public Registry of companies, specifically ensuring their publicity for an indefinite period of time and aimed towards any person who asks for access to these data, is justified by the preponderant interest of third parties to access those data” (§100).

Restricting the circle of ‘interested third parties’ would be incompatible with the purpose of the Public Registry

Before reaching this conclusion, the AG dismissed a proposal by the Commission that suggested a limited access to the personal data of administrators of bankrupt companies could be ensured only for those third parties that “show a legitimate interest” in obtaining it.

The AG considered that this suggestion “cannot, at this stage of development of EU law, ensure a fair balance between the objective of protecting third parties and the right to the protection of personal data registered in Public Registries of companies” (§87). In this regard, he recalled that the objective of protecting the interests of third parties as enshrined in the First Council Directive 68/151 “is provided for in a sufficiently wide manner so as to encompass not only the creditors of a company, but also, in general, all persons that want to obtain information regarding that company” (§88).

Earlier, the AG had also found that the suggestion to anonymise data regarding the administrators of bankrupt companies is not compatible with the historical function of the Public Registry and with the objective to protect third parties that is inherent to such registries. “The objective to establish a full picture of a bankrupt company is incompatible with processing anonymous data” (§78).

Throughout the Opinion, the AG mainly interprets the principles underpinning the First Council Directive 68/151/EEC (of 9 March 1968 on co-ordination of safeguards which, for the protection of the interests of members and others, are required by Member States of companies within the meaning of the second paragraph of Article 58 of the Treaty, with a view to making such safeguards equivalent throughout the Community), and it is apparent that this directive enjoys precedence over Directive 95/46/EC.

Finally: the reference to the GDPR

The AG never refers in his analysis to Article 12 of Directive 95/46, which grants data subjects the right to erasure. However, in the very last paragraph of the Opinion, the AG does refer to Article 17(3)(b) and (d) of Regulation (EU) 2016/679 (yes, the GDPR). He applies Article 17 GDPR to the facts of the case and mentions that the preceding analysis “is compatible” with it, because “this Article provides that the right to erasure of personal data, or ‘the right to be forgotten’, does not apply to a processing operation ‘for compliance with a legal obligation which requires processing by Union or Member State law to which the controller is subject or for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller’ or ‘for archiving purposes in the public interest'” (§101).

While I find the Opinion of the AG clear and well argued, I have two comments. I wish he had referred more comprehensively to the fundamental rights aspect of the case when balancing the provisions of the two directives. But most of all, I wish he had analysed the right to erasure itself, the conditions that trigger it and the exemptions under Article 13 of Directive 95/46.

My bet on the outcome of the case: the Court will follow the AG’s Opinion to a large extent. However, it may focus more on the fundamental rights aspect of balancing the two directives, and it may actually analyse the content of the right to erasure and its exceptions. The outcome, though, is likely to be the same.

A small thing that bugs me about this case is that there is a difference between searching a Registry of Companies because you are interested in a company name and searching it because you are interested in a specific natural person. All third parties may very well be interested in finding out everything there is to know about bankrupt Company X, thus discovering that Mr Manni was its administrator. To me, this does not seem to be the same situation as searching the Public Registry of companies using Mr Manni’s name to find out all about Mr Manni’s background. In §88 the AG even mentions, when recognising the all-encompassing interest of every third party to access all information about a certain company indefinitely, that Directive 68/151 protects the interest of “all persons that want to obtain information regarding this company“.

I know the case is about keeping or deleting Mr Manni’s personal data from the Registry, and ultimately it is important to keep the information there due to the general interest in knowing everything about the history of a company. However, does it make any difference for the lawfulness of certain processing operations that the Registry of companies can be used to create profiles of natural persons? I don’t know, but it is something that bugged me while reading the Opinion. Moreover, if you compare this situation to the “clean slate” rules under which certain offenders have their data erased from the criminal record, it is even more bugging. (Note: at §34 the AG specifies he is only referring in his Opinion to the processing of personal data by the Chamber of Commerce and not by private companies specialising in providing background information about entrepreneurs.)

Fun fact #1

The GDPR made its ‘unofficial’ debut in the case-law of the CJEU in the Opinion of AG Jääskinen in C-131/12 Google v. Spain, delivered on 25 June 2013. In fact, it was precisely Article 17 that was referred to in that Opinion as well, in §110. There is another reference to the GDPR in §56, mentioning the new rules on the field of application of EU data protection law. Back then, the text of the GDPR was merely a proposal of the Commission – neither the EP nor the Council had adopted their own versions of the text before entering the trilogue which resulted in the adopted text of Regulation 2016/679.

Fun fact #2

AG Bot is also the AG who delivered the Opinion in the Schrems case, which the Court followed to a large extent in its judgment. There are fair chances the Court will again follow his Opinion.

***

Find what you’re reading useful? Consider supporting pdpecho.