
Why data protection law is uniquely equipped to let us fight a pandemic with personal data

Image by ar130405 from Pixabay

Data protection law is different from “privacy”. We data protection lawyers have been complacent lately and have failed to say this loud and clear to the general public. Perhaps happy to finally see this field of law take the front stage of public debate through the GDPR, we have not stopped anyone from saying that the GDPR is a privacy law.

The truth is, the GDPR is a “data protection” law (it stands for the General “Data Protection” Regulation). And this makes a world of difference these days, when governments, individuals, companies and public health authorities are looking at the collection of personal data and the digital tracking of people as a potentially effective way to stop the spread of the COVID-19 pandemic.

The GDPR is the culmination of about half a century of legislative developments in Europe, which saw data protection evolve from a preoccupation of regional laws, to national laws, to EU laws, to a fundamental right in the EU Charter of Fundamental Rights. A fundamental right (Article 8) which is provided for distinctly from the fundamental right to respect for private and family life (Article 7). What a wondrous distinction!

The right to the protection of personal data was conceived particularly to support societies in facing the reality of massive automation of systems fed with data about individuals. At the very beginning, the introduction of computerized databases in public administration prompted the adoption of detailed safeguards to ensure that the rights of individuals are not breached by the collection and use of their data.

In the following decades, waves of development added layers to those safeguards and shaped data protection law as we know it today: the need for a justification to collect and use personal data; fair information principles like purpose limitation and data minimization; transparency and fairness; control of data subjects over their own data through specific rights like access, correction and deletion; the need for a dedicated, independent supervisory authority to explain and enforce data protection law; and the accountability of whomever is responsible for the collection and use of personal data.

The right to data protection is procedural in nature. It does have a flavor of substantive protection, which will certainly grow in importance and is likely to be developed further in the age of AI and Machine Learning (I am thinking in particular of fairness), but at its core the right to data protection remains procedural. Data protection sets up specific measures or safeguards that must be implemented to reach its goal, in relation to personal data being collected and used.

Importantly, the goal of data protection is to ensure that information relating to individuals is collected and used in such a way that all their other fundamental rights are protected. This includes freedom of speech, the right to private life/privacy, the right to life, the right to security, the right to non-discrimination and so on. Even though I have not seen this spelled out anywhere, I believe it has also been developed to support the rule of law.

This is why data protection is uniquely equipped to let us fight the pandemic using personal data. It has literally been conceived and developed to allow the use of personal data by automated systems in a way that guarantees the rule of law and the respect of all fundamental rights. This might be the golden hour for data protection.

That is, if its imperatives are being applied to any technological or digital responses to the COVID-19 pandemic relying on personal data:

  • The proposed data flow must be clear, including all the categories of data that will be collected and used.
  • The purpose(s) must be clear, specific, granular and well defined.
  • A lawful ground for processing must be in place.
  • Any solution that relies on personal data must take data protection requirements into account from the outset (data protection by design).
  • The web of responsibility must be clear (who are the controllers and the processors?).
  • Personal data must not be shared, or made accessible, beyond the defined web of responsibility (for example, as laid down in controller-processor agreements).
  • There must be transparency, in an intelligible form, for the individuals whose personal data are collected.
  • The necessity of each item of personal data collected must be assessed (can the project achieve the same purpose without some of them?).
  • All personal data must be accurate.
  • Individuals must have a way to obtain access to their own data and to ask for correction or, where justified, erasure (as well as to exercise the other rights they have).
  • The security of the data must be ensured.
  • The personal data collected must be retained only for as long as necessary to achieve the purpose (afterwards, it must be deleted; anonymization may be accepted as an alternative to deletion, though this is still debated).
  • Data Protection Impact Assessments (even informal ones) should be conducted, and engaging with supervisory authorities to discuss any identified risks that cannot be mitigated could be helpful (and may even be obligatory in certain circumstances).
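For a team building such a solution, the checklist above can be sketched as a minimal validation routine. This is purely illustrative: the record fields and gap messages below are my own assumptions, not terms taken from the GDPR, and passing such a check is of course not legal compliance.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class ProcessingRecord:
    """Hypothetical record describing one data-collection activity."""
    purposes: List[str]                    # specific, well-defined purposes
    data_categories: List[str]             # categories of personal data collected
    lawful_ground: Optional[str]           # e.g. "consent", "public interest"
    controllers: List[str]                 # who decides why and how data is used
    processors: List[str] = field(default_factory=list)
    retention_end: Optional[date] = None   # when the data must be deleted
    dpia_done: bool = False                # was an impact assessment conducted?

def basic_compliance_gaps(rec: ProcessingRecord) -> List[str]:
    """Return the obvious gaps against the checklist (not legal advice)."""
    gaps = []
    if not rec.purposes:
        gaps.append("no specific purpose defined")
    if not rec.data_categories:
        gaps.append("data flow not described")
    if rec.lawful_ground is None:
        gaps.append("no lawful ground for processing")
    if not rec.controllers:
        gaps.append("web of responsibility unclear (no controller)")
    if rec.retention_end is None:
        gaps.append("no retention limit set")
    if not rec.dpia_done:
        gaps.append("no Data Protection Impact Assessment")
    return gaps
```

A record describing a contact-tracing purpose but lacking a lawful ground and a retention limit would report exactly those gaps, while a fully specified record would report none.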

Therefore, the data-based solutions proposed to diminish the effects of the COVID-19 pandemic are not being proposed and accepted in Europe in spite of the GDPR, as the media have been portraying it. It is almost as if data protection has been developing over the past half century precisely to give us the right instruments to face this challenge while preserving our freedoms and our democracies. I hope we will be smart enough to use them properly.

Planet49 CJEU Judgment brings some ‘Cookie Consent’ Certainty to Planet Online Tracking

The Court of Justice of the European Union published yesterday its long-awaited judgment in the Planet49 case, referred by a German court in proceedings initiated by a non-governmental consumer protection organization representing the participants in an online lottery. It dealt with questions which should have been clarified a long time ago, after Article 5(3) was introduced into Directive 2002/58 (the ‘ePrivacy Directive’) by a 2009 amendment, with Member States transposing and then applying its requirements inconsistently:

  • Is obtaining consent through a pre-ticked box valid when placing cookies on website users’ devices?
  • Must the notice given to the user when obtaining consent include the duration of the operation of the cookies being placed and whether or not third parties may have access to those cookies?
  • Does it matter for the application of the ePrivacy rules whether the data accessed through the cookies being placed is personal or non-personal?

The Court answered all of the above, while at the same time signaling to Member States that a disparate approach in transposing and implementing the ePrivacy Directive is not consistent with EU law, and setting clear guidance on what ‘specific’, ‘unambiguous’ and ‘informed’ consent means.

The core of the Court findings is that:

  • pre-ticked boxes do not amount to valid consent,
  • expiration date of cookies and third party sharing should be disclosed to users when obtaining consent,
  • different purposes should not be bundled under the same consent ask,
  • in order for consent to be valid, ‘an active behaviour with a clear view’ (which I read as ‘intention’) of consenting must be obtained (so claiming in notices that consent is given by users merely continuing to use the website very likely does not meet this threshold) and,
  • (quite consequential), these rules apply to cookies regardless of whether the data accessed is personal or not.
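To make the consequences of these findings concrete, here is a small, purely illustrative sketch of how a site operator might validate a stored consent record against them. The field names and the shape of the record are my own assumptions, not anything taken from the judgment or from any real consent-management tool:

```python
from dataclasses import dataclass

@dataclass
class CookieConsent:
    """Hypothetical record of one consent interaction, for one purpose."""
    purpose: str                    # a single, unbundled purpose
    pre_ticked: bool                # was the box checked by default?
    active_action: bool             # did the user actually click or tick?
    inferred_from_browsing: bool    # "consent" deduced from continued browsing
    duration_disclosed: bool        # was the cookie lifetime shown to the user?
    third_parties_disclosed: bool   # was third-party access shown to the user?

def is_valid_under_planet49(c: CookieConsent) -> bool:
    """Apply the Planet49 findings as simple boolean checks (illustrative only)."""
    if c.pre_ticked:                # pre-ticked boxes are not valid consent
        return False
    if c.inferred_from_browsing:    # continued browsing is not active behaviour
        return False
    if not c.active_action:         # consent requires an act done with intent
        return False
    # 'informed' consent: duration and third-party access must be disclosed
    return c.duration_disclosed and c.third_parties_disclosed
```

Under this sketch, an unchecked box that the user actively ticks, with cookie duration and third-party sharing disclosed, passes; a pre-ticked box fails regardless of how complete the disclosures are.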

Unfortunately, though, the Court did not tackle one other very important issue: what does ‘freely given’ consent mean? In other words, would requiring and obtaining consent for placing cookies with the purpose of online tracking for behavioural advertising as a condition to access an online service, such as an online lottery (as in Planet49’s case), be considered as ‘freely given’ consent?

An answer to this question would have affected all online publishers and online service providers that condition access to their services on allowing online behaviour tracking cookies to be installed on user devices, and that rely on ‘cookie walls’ as a source of income for their businesses. What is interesting is that the Court included a paragraph in the judgment specifically stating that it does not give its view on this issue because it was not asked to do so by the referring German court (paragraph 64). Notably, ‘freely given’ is the only one of the four conditions for valid consent that the Court did not assess in its judgment and that it specifically singled out as being left out in the open.

Finally, one very important point to highlight is that all of these findings were made under the rules for valid consent as provided by Directive 95/46. The Court even specified that its finding concerning ‘unambiguous’ consent is made under the old directive. This is relevant because the definition of consent in Article 2(h) of Directive 95/46 only refers to ‘any freely given specific and informed indication’ of agreement. However, Article 7(a) of the directive provides that the data subject’s consent may make processing lawful if it was given ‘unambiguously’.

With the GDPR, the four scattered conditions have been gathered under Article 4(11) and reinforced by clearer recitals. The fact remains that the conditions for valid consent were just as strong under Directive 95/46. The Court almost ostentatiously highlights that its interpretation is made on the conditions provided under the old legal regime and that it applies to the GDPR only ‘a fortiori‘ (paragraph 60); (see here for what a fortiori means in legal interpretation).

Consequently, it seems that consent obtained for placing cookies with the help of pre-ticked boxes, or through inaction, or through action without the intent to give consent, was unlawfully obtained even before the GDPR entered into force. It remains to be seen whether supervisory authorities will take action against the collections of data built on such unlawfully obtained consent, or whether they will take a clean-slate approach.

For a deeper dive into the key findings of the Planet49 CJEU judgment, read below:

Discrepancies in applying ePrivacy at Member State level, unjustifiable based on the Directive’s text

Before assessing the questions referred on substance, the Court makes some preliminary findings. Among them, it finds that ‘the need for a uniform application of EU law and the principle of equality require that the wording of a provision of EU law which makes no express reference to the law of the Member States for the purpose of determining its meaning and scope must normally be given an autonomous and uniform interpretation throughout the European Union’ (paragraph 47). Article 5(3) of the ePrivacy Directive does not provide any room for Member State law to determine the scope and meaning of its provisions, by being sufficiently clear and precise in what it asks the Member States to do (see paragraph 46 for the Court’s argument).

In practice, divergent transposition and implementation of the ePrivacy Directive has created different regimes across the Union, which had consequences for the effectiveness of its enforcement.

‘Unambiguous’ means ‘active behaviour’ and intent to give consent

The Court starts its assessment from a linguistic interpretation of the wording of Article 5(3) of Directive 2002/58. It notes that the provision doesn’t require a specific way of obtaining consent to the storage of and access to cookies on users’ devices. The Court observes that ‘the wording ‘given his or her consent’ does however lend itself to a literal interpretation according to which action is required on the part of the user in order to give his or her consent.

In that regard, it is clear from recital 17 of Directive 2002/58 that, for the purposes of that directive, a user’s consent may be given by any appropriate method enabling a freely given specific and informed indication of the user’s wishes, including by ticking a box when visiting an internet website‘ (paragraph 49).

The Court highlights that per Article 2(f) of Directive 2002/58 the meaning of a user’s ‘consent’ under the ePrivacy Directive is meant to be the same as that of a data subject’s consent under Directive 95/46 (paragraph 50). By referring to Article 2(h) of the former data protection directive, the Court observes that ‘the requirement of an ‘indication’ of the data subject’s wishes clearly points to active, rather than passive, behaviour’ (paragraph 52). The Court then concludes that ‘consent given in the form of a preselected tick in a checkbox does not imply active behaviour on the part of a website user’ (paragraph 52).

Interestingly, the Court points out that this interpretation of what ‘indication’ means ‘is borne out by Article 7 of Directive 95/46’ (paragraph 53), and in particular Article 7(a), which ‘provides that the data subject’s consent may make such processing lawful provided that the data subject has given his or her consent ‘unambiguously’’ (paragraph 54). So even if the definition of consent in Directive 95/46 does not refer to this condition in particular, the Court nevertheless anchored its main arguments in it.

The Court then made another important interpretation concerning what ‘unambiguous’ consent means: ‘Only active behaviour on the part of the data subject with a view to giving his or her consent may fulfil that requirement’ (paragraph 54). This wording (‘with a view to’) suggests that there is a condition of willfulness, of intent to give consent in order for the indication of consent to be lawful.

In addition, to be even clearer, the Court finds that ‘it would appear impossible in practice to ascertain objectively whether a website user had actually given his or her consent to the processing of his or her personal data by not deselecting a pre-ticked checkbox nor, in any event, whether that consent had been informed. It is not inconceivable that a user would not have read the information accompanying the preselected checkbox, or even would not have noticed that checkbox, before continuing with his or her activity on the website visited’ (paragraph 55).

A fortiori, it appears impossible in practice to ascertain objectively whether a website user has actually given his or her consent to the processing of his or her personal data by merely continuing with his or her activity on the website visited (continued browsing or scrolling), or whether that consent was informed, given that, unlike a pre-ticked checkbox, continued browsing does not even offer the user a box to uncheck. Also, just as the Court points out, it is not inconceivable that a user would not have read the information telling him or her that continued use of the website signifies consent.

With these two findings in paragraphs 54 and 55, the Court seems to clarify once and for all that informing users that continuing their activity on a website signifies consent to placing cookies on their device is not sufficient to obtain valid consent under the ePrivacy Directive, read in the light of both Directive 95/46 and the GDPR.

‘Specific’ means consent can’t be inferred from bundled purposes

The following condition that the Court analyzes is that of specificity. In particular, the Court finds that ‘specific’ consent means that ‘it must relate specifically to the processing of the data in question and cannot be inferred from an indication of the data subject’s wishes for other purposes’ (paragraph 58). This means that bundled consent will not be considered valid and that consent should be sought granularly for each purpose of processing.

‘Informed’ means being able to determine the consequences of any consent given

One of the questions sent for a preliminary ruling by the German Court concerned specific categories of information that should be disclosed to users in the context of obtaining consent for placing cookies. Article 5(3) of the ePrivacy Directive requires that the user is provided with ‘clear and comprehensive information’ in accordance with Directive 95/46 (now replaced by the GDPR). The question was whether this notice must also include (a) the duration of the operation of cookies and (b) whether or not third parties may have access to those cookies.

The Court clarified that providing ‘clear and comprehensive’ information means ‘that a user is in a position to be able to determine easily the consequences of any consent he or she might give and ensure that the consent given is well informed. It must be clearly comprehensible and sufficiently detailed so as to enable the user to comprehend the functioning of the cookies employed’ (paragraph 74). Therefore, it seems that using language that is easily comprehensible for the user is important, just as it is important to paint a full picture of the functioning of the cookies for which consent is sought.

The Court found specifically with regard to cookies that ‘aim to collect information for advertising purposes’ that ‘the duration of the operation of the cookies and whether or not third parties may have access to those cookies form part of the clear and comprehensive information‘ which must be provided to the user (paragraph 75).

Moreover, the Court adds that ‘information on the duration of the operation of cookies must be regarded as meeting the requirement of fair data processing‘ (paragraph 78). This is remarkable, since the Court doesn’t usually make findings in its data protection case-law with regard to the fairness of processing. Doubling down on its fairness considerations, the Court goes even further and links fairness of the disclosure of the retention time to the fact that ‘a long, or even unlimited, duration means collecting a large amount of information on users’ surfing behaviour and how often they may visit the websites of the organiser of the promotional lottery’s advertising partners’ (paragraph 78).

It is irrelevant if the data accessed by cookies is personal or anonymous, ePrivacy provisions apply regardless

The Court was specifically asked to clarify whether the cookie consent rules in the ePrivacy Directive apply differently depending on the nature of the data being accessed. In other words, does it matter whether the data being accessed by a cookie is personal or anonymized/aggregated/de-identified?

First of all, the Court points out that in the case at hand, ‘the storage of cookies … amounts to a processing of personal data’ (paragraph 67). That being said, the Court nonetheless notes that the provision analyzed merely refers to ‘information’ and does so ‘without characterizing that information or specifying that it must be personal data’ (paragraph 68).

The Court explained that this general framing of the provision ‘aims to protect the user from interference with his or her private sphere, regardless of whether or not that interference involves personal data’ (paragraph 69). This finding is particularly relevant for the current legislative debate over the revamp of the ePrivacy Directive. It is clear that the core difference between the GDPR framework and the ePrivacy regime is what they protect: the GDPR is concerned with ensuring the protection of personal data and fair data processing whenever personal data is being collected and used, while the ePrivacy framework is concerned with shielding the private sphere of an individual from any unwanted interference. That private sphere/private center of interest may include personal data or not.

The Court further refers to recital 24 of the ePrivacy Directive, which mentions that “any information stored in the terminal equipment of users of electronic communications networks are part of the private sphere of the users requiring protection under the European Convention for the Protection of Human Rights and Fundamental Freedoms. That protection applies to any information stored in such terminal equipment, regardless of whether or not it is personal data, and is intended, in particular, as is clear from that recital, to protect users from the risk that hidden identifiers and other similar devices enter those users’ terminal equipment without their knowledge” (paragraph 70).

Conclusion

The judgment of the CJEU in Planet49 provides some much needed certainty about how the ‘cookie banner’ and ‘cookie consent’ provisions in the ePrivacy Directive should be applied, after years of disparate approaches in national transposition laws and from supervisory authorities, which led to a lack of effectiveness in enforcement and, hence, compliance. The judgment does leave open one burning question: what does ‘freely given consent’ mean? It is important to note nonetheless that before ever reaching the ‘freely given’ question, any consent obtained for placing cookies (or similar technologies) on user devices will have to meet all of the other three conditions. If even one of them is not met, that consent is invalid.

***

You can refer to this summary by quoting G. Zanfir-Fortuna, ‘Planet49 CJEU Judgment brings some ‘Cookie Consent’ Certainty to Planet Online Tracking’, http://www.pdpecho.com, published on October 3, 2019.

Exam scripts are partly personal data and other practical findings of the CJEU in Nowak

The Court of Justice of the European Union (CJEU) gave its judgment in Case C-434/16 Nowak on 20 December 2017, and it is significant from several points of view:

  • It provides a good summarized description of what constitutes “personal data”, referring to both objective and subjective information, regardless of its sensitivity, and it also details what the “related to” criterion from the legal definition of personal data means;
  • It *almost* departs from its YS jurisprudence on the concept of personal data;
  • It applies the interpretation that the Article 29 Working Party gave to the “related to” criterion in its Opinion on personal data from 2007, highlighting thus the weight that the interpretation of data protection law given by the European DPAs might have;
  • It establishes that written answers submitted by a candidate during an exam are personal data of the candidate (this is relevant for all education services providers);
  • It also establishes that the questions of the exam do not fall in the category of “personal data” – hence, not the entire exam script is considered personal data, but only the answers submitted by the candidate;
  • It establishes that the comments reviewers make on the margins of one’s written answers to an exam are personal data of the person being examined, while also being personal data of the reviewer;
  • It establishes that exam scripts should be kept in an identifiable form only for as long as they can be challenged.

This comment looks closer at all of these findings.

Facts of the Case

Mr Nowak was a trainee accountant who requested access to his exam script from the Institute of Chartered Accountants of Ireland (CAI), after failing the examination. He first challenged the results of the exam with no success. He then submitted a subject access request to the CAI, asking to receive a copy of all his personal data held by the CAI. He obtained 17 documents, but the exam script was not among them.

Mr Nowak brought this to the attention of the Irish Data Protection Commissioner (DPC) by email, arguing that his exam script was also his personal data. The DPC answered by email that exam scripts “would not generally constitute personal data”. Mr Nowak then submitted a formal complaint to the DPC against the CAI. The official response of the DPC was to reject the complaint on the ground that it was “frivolous or vexatious” (the same ground used to reject the first complaint of Max Schrems challenging the EU-US Safe Harbor scheme).

Mr Nowak then challenged this decision of the Irish DPC in front of the Circuit Court, then the High Court and then the Court of Appeal, all of which decided against him. Finally, he challenged the decision of the Court of Appeal before the Supreme Court, which decided to stay proceedings and send questions for a preliminary ruling to the CJEU, since the case required interpretation of EU law – in particular, how the concept of “personal data” provided for by Directive 95/46 should be interpreted (a small procedural reminder here: courts of last instance are under an obligation to refer questions for a preliminary ruling to the CJEU in all cases that require the interpretation of EU law, per the last paragraph of Article 267 TFEU).

Questions referred

The Supreme Court asked the CJEU two questions (in summary):

  1. Is information recorded in/as answers given by an exam candidate capable of being personal data?
  2. If this is the case, then what factors are relevant in determining whether in a given case such information is personal data?

Pseudonymised data is personal data

First, recalling its Breyer jurisprudence, the Court establishes that, for information to be treated as personal data, it is of no relevance whether all the information enabling the identification of the data subject is in the hands of one person or whether the identifiers are separated (§31). In this particular case, it is not relevant “whether the examiner can or cannot identify the candidate at the time when he/she is correcting and marking the examination script” (§30).

The Court then looks at the definition of personal data from Directive 95/46, underlining that it has two elements: “any information” and “related to an identified or identifiable natural person”.

“Any information” means literally any information, be it objective or subjective

The Court recalls that the scope of Directive 95/46 is “very wide and the personal data covered … is varied” (§33).

“The use of the expression ‘any information’ in the definition of the concept of ‘personal data’ … reflects the aim of the EU legislature to assign a wide scope to that concept, which is not restricted to information that is sensitive or private, but potentially encompasses all kinds of information, not only objective but also subjective, in the form of opinions and assessments, provided that it ‘relates’ to the data subject.” (§34)

Save this paragraph, as it is a new jurisprudential source of describing what constitutes personal data – it is certainly a good summary, in line with the Court’s previous case-law (see an excellent overview of the Court’s approach to the definition of personal data here, p. 40 – 41). It makes clear that, for instance, comments on social media, reviews of products/companies, ratings and any other subjective assessments are personal data, as long as they relate to an identified or identifiable individual. This is also true for any sort of objective information (think shoe size), regardless of whether it is sensitive or private, as long as it relates to an identified or identifiable individual.

“Related to” must be judged in relation to “content, purpose or effect/consequences”

The condition for any information to be considered personal data is that it relates to a natural person. According to the Court, this means that “by reason of its content, purpose or effect, (it) is linked to a particular person” (§35). The Court thus applies the test developed by the Article 29 Working Party in its 2007 Opinion on the concept of personal data. Ten years ago, the DPAs wrote that “in order to consider that the data ‘relate’ to an individual, a ‘content’ element OR a ‘purpose’ element OR a ‘result’ element should be present” (2007 Opinion, p. 10).

The Court now adopted this test in its case-law, giving an indication of how important the common interpretation given by data protection authorities in official guidance is. However, the Court does not directly refer to the Opinion.

Applying the test to the facts of the case, the Court showed that the content of exam answers “reflects the extent of the candidate’s knowledge and competence in a given field and, in some cases, his intellect, thought processes, and judgment” (§37). Additionally, following AG Kokott’s Opinion, the Court also pointed out that “in the case of a handwritten script, the answers contain, in addition, information as to his handwriting” (§37).

The purpose of the answers is “to evaluate the candidate’s professional abilities and his suitability to practice the profession concerned” (§38) and the consequence of the answers “is liable to have an effect on his or her rights and interests, in that it may determine or influence, for example, the chance of entering the profession aspired to or of obtaining the post sought” (§39).

Comments of reviewers are two times personal data

The test is then applied to the comments of reviewers on the margin of a candidate’s answers. The Court showed that “The content of those comments reflects the opinion or the assessment of the examiner of the individual performance of the candidate in the examination, particularly of his or her knowledge and competences in the field concerned. The purpose of those comments is, moreover, precisely to record the evaluation by the examiner of the candidate’s performance, and those comments are liable to have effects for the candidate” (§43).

It is important to note here that meeting only one of the three criteria (content, purpose, effect) is enough to qualify information as “relating to” an individual, even if the Court found that in this particular case all of them are met. This is shown by the use of “or” in the enumeration made in §35, as shown above.
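The disjunctive nature of the test can be captured in a single line of code. This is only a toy illustration of the legal logic (the three boolean flags are my own abstraction, not the Court’s):

```python
def relates_to_individual(content: bool, purpose: bool, effect: bool) -> bool:
    """The Article 29 WP test applied in Nowak: information 'relates to'
    an individual if ANY one of the three elements is present."""
    return content or purpose or effect
```

So exam answers, where all three elements were found, pass the test; but so would information linked to a person by its effect alone.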

The Court also found that “the same information may relate to a number of individuals and may constitute for each of them, provided that those persons are identified or identifiable, personal data” (§45), having regard to the fact that the comments of the examiners are personal data of both the examiners and the “examinee”.

Information can be Personal data regardless of whether one is able to rectify it or not

The Irish DPC had argued that whether information qualifies as “personal data” should be affected by the fact that the consequence of that classification is, in principle, that the candidate has rights of access and rectification (§46). The logic here was that if data cannot be rectified, it cannot be considered personal – just as exam answers cannot be rectified after the exam has finished.

The Court (rightfully so) disagreed with this claim, following the opinion of the Advocate General and contradicting its own findings in Case C-141/12 YS (see a more detailed analysis of the interaction between the two judgments below). It argued that “a number of principles and safeguards, provided for by Directive 95/46, are attached to that classification and follow from that classification” (§47), meaning that protecting personal data goes far beyond the ability to access and rectify your data. This finding is followed by a summary of the fundamental mechanisms encompassed by data protection.

Data protection is a web of safeguards, accountability and individual rights

Starting from recital 25 of Directive 95/46 (yet again, how important recitals are! Think here of Recital 4 of the GDPR and the role it can play in future cases – “The processing of personal data should be designed to serve mankind”), the Court stated that:

“…the principles of protection provided for by that directive are reflected, on the one hand, in the obligations imposed on those responsible for processing data, obligations which concern in particular data quality, technical security, notification to the supervisory authority, and the circumstances under which processing can be carried out, and, on the other hand, in the rights conferred on individuals, the data on whom are the subject of processing, to be informed that processing is taking place, to consult the data, to request corrections and even to object to processing in certain circumstances” (§48).

The Court thus looks at data protection as a web of accountability, safeguards (reflected in technical security measures, data quality and conditions for the lawful processing of data) and rights conferred on individuals.

In this case, refusing to consider exam answers personal data just because they cannot be “corrected” after the exam would strip this information of the rest of the web of protections, such as being processed on a legitimate ground or being retained only for the necessary period of time. The Court does not phrase this finding in these terms, but it states that:

“Accordingly, if information relating to a candidate, contained in his or her answers submitted at a professional examination and in the comments made by the examiner with respect to those answers, were not to be classified as ‘personal data’, that would have the effect of entirely excluding that information from the obligation to comply not only with the principles and safeguards that must be observed in the area of personal data protection, and, in particular, the principles relating to the quality of such data and the criteria for making data processing legitimate, established in Articles 6 and 7 of Directive 95/46, but also with the rights of access, rectification and objection of the data subject, provided for in Articles 12 and 14 of that directive, and with the supervision exercised by the supervisory authority under Article 28 of that directive” (§49).

Furthermore, the Court shows that errors in the answers given to an exam do not constitute “inaccuracy” of personal data, because the level of knowledge of a candidate is revealed precisely by the errors in his or her answers, and revealing the level of knowledge is the purpose of this particular data processing. As the Court explains, “[i]t is apparent from Article 6(1)(d) of Directive 95/46 that the assessment of whether personal data is accurate and complete must be made in the light of the purpose for which that data was collected” (§53).

Exam scripts should only be kept in an identifiable form as long as they can be challenged

The Court further explained that both exam answers and reviewers’ comments can nevertheless be subject to “inaccuracy” in a data protection sense, “for example due to the fact that, by mistake, the examination scripts were mixed up in such a way that the answers of another candidate were ascribed to the candidate concerned, or that some of the cover sheets containing the answers of that candidate are lost, so that those answers are incomplete, or that any comments made by an examiner do not accurately record the examiner’s evaluation of the answers of the candidate concerned” (§54).

The Court also admitted the possibility that “a candidate may, under Article 12(b) of Directive 95/46, have the right to ask the data controller to ensure that his examination answers and the examiner’s comments with respect to them are, after a certain period of time, erased, that is to say, destroyed” (§55).

Another finding of the Court that will be useful to schools, universities and other educational institutions is that keeping exam scripts related to an identifiable individual is not necessary anymore after the examination procedure is closed and can no longer be challenged: “Taking into consideration the purpose of the answers submitted by an examination candidate and of the examiner’s comments with respect to those answers, their retention in a form permitting the identification of the candidate is, a priori, no longer necessary as soon as the examination procedure is finally closed and can no longer be challenged, so that those answers and comments have lost any probative value” (§55).

The Court distances itself from the findings in C-141/12 YS, but still wants to keep that jurisprudence alive

One of the biggest questions surrounding the judgment in Nowak was whether the Court would follow the AG’s Opinion and change its jurisprudence from C-141/12 YS. In that judgment, the Court found that the legal analysis used by the Dutch Ministry of Immigration in a specific case of asylum seekers is not personal data, and the main reason invoked was that “[i]n contrast to the data relating to the applicant for a residence permit which is in the minute and which may constitute the factual basis of the legal analysis contained therein, such an analysis … is not in itself liable to be the subject of a check of its accuracy by that applicant and a rectification under Article 12(b) of Directive 95/46” (§45).

The Court further noted: “In those circumstances, extending the right of access of the applicant for a residence permit to that legal analysis would not in fact serve the directive’s purpose of guaranteeing the protection of the applicant’s right to privacy with regard to the processing of data relating to him, but would serve the purpose of guaranteeing him a right of access to administrative documents, which is not however covered by Directive 95/46.” Finally, the finding was that “[i]t follows from all the foregoing considerations … that the data relating to the applicant for a residence permit contained in the minute and, where relevant, the data in the legal analysis contained in the minute are ‘personal data’ within the meaning of that provision, whereas, by contrast, that analysis cannot in itself be so classified” (§48).

Essentially, in YS the Court linked the ability of accessing and correcting personal data with the classification of information as personal data, finding that if the information cannot be corrected, then it cannot be accessed and it cannot be classified as personal data.

By contrast, following AG Kokott’s analysis, in Nowak the Court essentially states that classifying information as personal data must not be affected by the existence of the rights to access and rectification – in the sense that the possibility to effectively invoke them should not play a role in establishing that certain information is or is not personal data: “the question whether written answers submitted by a candidate at a professional examination and any comments made by an examiner with respect to those answers should be classified as personal data cannot be affected … by the fact that the consequence of that classification is, in principle, that the candidate has rights of access and rectification, pursuant to Article 12(a) and (b) of Directive 95/46” (§46).

However, the Court is certainly not ready to fully change its jurisprudence established in YS, and even refers to its judgment in YS in a couple of paragraphs. In the last paragraphs of Nowak, the Court links the ability to correct or erase data to the existence of the right of accessing that data (but not to classifying information as personal data).

The Court states that: “In so far as the written answers submitted by a candidate at a professional examination and any comments made by an examiner with respect to those answers are therefore liable to be checked for, in particular, their accuracy and the need for their retention… and may be subject to rectification or erasure…, the Court must hold that to give a candidate a right of access to those answers and to those comments… serves the purpose of that directive of guaranteeing the protection of that candidate’s right to privacy with regard to the processing of data relating to him (see, a contrario, judgment of 17 July 2014, YS and Others, C‑141/12 and C‑372/12, EU:C:2014:2081, paragraphs 45 and 46), irrespective of whether that candidate does or does not also have such a right of access under the national legislation applicable to the examination procedure”.

Despite showing an even deeper understanding of data protection elsewhere in the Nowak judgment, the Court sticks to some of its findings from YS, even if this means perpetuating a confusion between the fundamental right to respect for private life and the fundamental right to the protection of personal data: “it must be recalled that the protection of the fundamental right to respect for private life means, inter alia, that any individual may be certain that the personal data relating to him is correct and that it is processed in a lawful manner” (§57 in Nowak and §44 in YS). Lawful processing of personal data and the right to keep personal data accurate are, in fact, enshrined in Article 8 of the EU Charter – the right to the protection of personal data, and not in Article 7 – the right to respect for private life.

Obiter dictum 1: the curious insertion of “exam questions” in the equation

The Court also does something curious in these last paragraphs. It simply states, after the paragraphs referring to the YS judgment, that “the rights of access and rectification, under Article 12(a) and (b) of Directive 95/46, do not extend to the examination questions, which do not as such constitute the candidate’s personal data” (§58). The national court did not ask about this specific point. AG Kokott also does not address this issue at all in her Opinion. It might have been raised during the hearing, but no context is provided for it. The Court simply states that “Last, it must be said…” and follows this with the finding regarding exam questions.

While it is easy to see that the questions of a specific test, by themselves, are not personal data, as they do not relate, by their content, purpose or effect, to a specific individual, the situation is not as clear when the questions are part of the “solved” exam sheet of a specific candidate. The question is: are the answers of the test inextricably linked to the questions? Imagine a multiple-choice test where the candidate only gains access to his/her answers, without obtaining access to the questions of that test. The answers alone would be unintelligible. For instance, EPSO candidates have been trying for years to access their own exam sheets held by the EPSO agency of the European Union, with no success, precisely because EPSO only provides access to the series of letters chosen as answers in the multiple-choice test. Challenges to this practice have all failed, including those brought to the attention of the former Civil Service Tribunal of the CJEU (see this case, for example). This particular finding in Nowak closes the barely opened door for EPSO candidates to finally obtain access to their whole test sheet.

Obiter dictum 2: reminding Member States they can restrict the right of access

Without apparent reason and referring to the GDPR, the CJEU recalls, as another obiter dictum, under the same “it must be said” (§58 and §59), that both Directive 95/46 and the GDPR “provide for certain restrictions of those rights” (§59) – access, erasure etc.

It also specifically refers to grounds that can be invoked by Member States when limiting the right to access under the GDPR: when such a restriction constitutes a necessary measure to safeguard the rights and freedoms of others (§60,§61), or if it is done for other objectives of general public interest of the Union or of a Member State (§61).

These findings are not followed by any other considerations, as the Court concludes with a finding that had already been reached around §50: “the answer to the questions referred is that Article 2(a) of Directive 95/46 must be interpreted as meaning that, in circumstances such as those of the main proceedings, the written answers submitted by a candidate at a professional examination and any comments made by an examiner with respect to those answers constitute personal data, within the meaning of that provision” (§62).

If you want to have a look at a summary of AG Kokott’s excellent Conclusions in this case and then compare them to the judgment of the Court, click here. The Court did follow the Conclusions to a great extent.

 

Exam scripts and examiner’s corrections are personal data of the exam candidate (AG Kokott Opinion in Nowak)

AG Kokott delivered her Opinion on 20 July in Case C-434/16 Nowak v Data Protection Commissioner, concluding that “a handwritten examination script capable of being ascribed to an examination candidate, including any corrections made by examiners that it may contain, constitutes personal data within the meaning of Article 2(a) of Directive 95/46/EC” (Note: all highlights in this post are mine).
This is a really exciting Opinion because it provides insight into:

  • the definition of personal data,
  • the purpose and the functionality of the rights of the data subject,
  • the idea of abusing data protection related rights for non-data protection purposes,
  • how the same ‘data item’ can be personal data of two distinct data subjects (examiners and examinees),
  • what constitutes a “filing system” of personal data processed otherwise than by automated means.

But also because it technically (even if not literally) invites the Court to change its case-law on the definition of personal data, and specifically the finding that information consisting in a legal assessment of facts related to an individual does not qualify as personal data (see C-141/12 and C-372/12 YS and Others).

The proceedings were initially brought before the Irish courts by Mr Nowak, who, after failing an exam organised by a professional association of accountants (CAI) four times, asked for access to his exam sheet on the basis of the right of access to his own personal data. Mr Nowak submitted a request to access all his personal data held by CAI and received 17 items, none of which was the exam sheet. He then submitted a complaint to the Irish Data Protection Commissioner, who decided not to investigate it, arguing that an exam sheet is not personal data. The decision not to investigate on this ground was challenged in court. Once the case reached the Irish Supreme Court, it was referred to the Court of Justice of the EU to clarify whether an exam sheet falls under the definition of “personal data” (§9 to §14).

Analysis relevant both for Directive 95/46 and for the GDPR

Yet again, AG Kokott refers to the GDPR in her Conclusions, clarifying that “although the Data Protection Directive will shortly be repealed by the General Data Protection Regulation, which is not yet applicable, the latter will not affect the concept of personal data. Therefore, this request for a preliminary ruling is also of importance for the future application of the EU’s data protection legislation” (m.h.).

The nature of an exam paper is “strictly personal and individual”

First, the AG observes that “the scope of the Data Protection Directive is very wide and the personal data covered by the Directive is varied” (§18).

The Irish DPC argued that an exam script is not personal data because “examination exercises are normally formulated in abstract terms or relate to hypothetical situations”, which means that “answers to them are not liable to contain any information relating to an identified or identifiable individual” (§19).

This view was not followed by the AG, who explained that it is incongruent with the purpose of an exam. “In every case“, she wrote, “the aim of an examination — as opposed, for example, to a representative survey — is not to obtain information that is independent of an individual. Rather, it is intended to identify and record the performance of a particular individual, i.e. the examination candidate”  (§24; m.h.). Therefore, “every examination aims to determine the strictly personal and individual performance of an examination candidate. There is a good reason why the unjustified use in examinations of work that is not one’s own is severely punished as attempted deception” (§24; m.h.).

What about exam papers identified by codes?

In a clear indication that pseudonymized data are personal data, the AG further noted that an exam script is personal data also in those cases where instead of bearing the examination candidate’s name, the script has an identification number or bar code: “Under Article 2(a) of the Data Protection Directive, it is sufficient for the existence of personal information that the data subject may at least be indirectly identified. Thus, at least where the examination candidate asks for the script from the organisation that held the examination, that organisation can identify him by means of the identification number” (§28).

Characteristics of handwriting, personal data themselves 

The AG accepted the argument of Mr Nowak that answers to an exam that are handwritten “contain additional information about the examination candidate, namely about his handwriting” (§29). Therefore, the characteristics of the handwriting are personal data themselves. The AG explains that “a script that is handwritten is thus, in practice, a handwriting sample that could at least potentially be used at a later date as evidence to determine whether another text was also written in the examination candidate’s writing. It may thus provide indications of the identity of the author of the script” (§29). According to the AG, it’s not relevant whether such a handwriting sample is a suitable means of identifying the writer beyond doubt: “Many other items of personal data are equally incapable, in isolation, of allowing the identification of individuals beyond doubt” (§30).

Classifying information as ‘personal data’ is a stand-alone exercise (it does not depend on whether rights can be exercised)

The Irish DPC argued that one of the reasons why exam scripts are not personal data in this case is that the “purpose” of the right of access and the right to rectification of personal data precludes them from being “personal data” (§31). The DPC relied on Recital 41 of Directive 95/46, which specifies that any person must be able to exercise the right of access to data relating to him which is being processed, in order to verify in particular the accuracy of the data and the lawfulness of the processing. “The examination candidate will seek the correction of incorrect examination answers”, the argument goes (§31).

AG Kokott rebuts this argument by acknowledging that the classification of information as personal data “cannot be dependent on whether there are specific provisions about access to this information” or on possible problems with the rectification of data (§34). “If those factors were regarded as determinative, certain personal data could be excluded from the entire protective system of the Data Protection Directive, even though the rules applicable in their place do not ensure equivalent protection but fragmentary protection at best” (§34).

Even if the classification of information as “personal data” depended in any way on the purpose of the right of access, the AG makes it clear that this purpose is not strictly linked to rectification, blocking or erasure: “data subjects generally have a legitimate interest in finding out what information about them is processed by the controller” (§39). This finding is backed up by the use of “in particular” in Recital 41 of the Directive (§39).

The purpose of processing and… the passage of time, both relevant for obtaining access, rectification

After clarifying that it’s irrelevant what an individual wants to do with their data once accessed (see also the summary below on the ‘abuse of rights’), AG Kokott explains that a legitimate interest in correcting data related to an “exam script” is conceivable.

She starts from the premise that “the accuracy and completeness of personal data pursuant to Article 6(1)(d) must be judged by reference to the purpose for which the data was collected and processed” (§35). The AG further identifies the purpose of an exam script as determining  “the knowledge and skills of the examination candidate at the time of the examination, which is revealed precisely by his examination performance and particularly by the errors in the examination” (§35). “The existence of errors in the solution does not therefore mean that the personal data incorporated in the script is inaccurate”, the AG concludes (§35).

Rectification could be achieved if, for instance, “the script of another examination candidate had been ascribed to the data subject, which could be shown by means of, inter alia, the handwriting, or if parts of the script had been lost” (§36).

The AG also found that the legitimate interest of the individual to have access to their own data is strengthened by the passage of time, to the extent that their recollection of the contents of their answer is likely to be considerably weaker a few years after the exam. This makes it possible that “a genuine need for information, for whatever reasons, will be reflected in a possible request for access. In addition, there is greater uncertainty with the passing of time — in particular, once any time limits for complaints and checks have expired — about whether the script is still being retained. In such circumstances the examination candidate must at least be able to find out whether his script is still being retained” (§41).

Is Mr Nowak abusing his right of access under data protection law?

AG Kokott recalls the CJEU’s case-law on “abuse of rights” and the double test required by the Court to identify whether there has been any abuse of rights in a particular case (C-423/15 Kratzer and the case-law cited there at §38 to §40), which can be summed up as (§44):

i) has the purpose of the EU legislation in question been misused?

ii)  is the essential aim of the transaction to obtain an undue advantage?

The DPC submitted during the procedure that if exam scripts were considered personal data, “a misuse of the aim of the Data Protection Directive would arise in so far as a right of access under data protection legislation would allow circumvention of the rules governing the examination procedure and objections to examination decisions” (§45).

The AG considers that “any alleged circumvention of the procedure for the examination and objections to the examination results via the right of access laid down by data protection legislation would have to be dealt with using the provisions of the Data Protection Directive” and she specifically refers to the restrictions to the right of access laid down in Article 13 of the Directive with the aim “to protect certain interests specified therein” (§46). She also points out that if restricting access to exam scripts can’t be circumscribed to those exceptions, then “it must be recognised that the legislature has given precedence to the data protection requirements which are anchored in fundamental rights over any other interests affected in a specific instance” (§47).

The AG also looks at the exceptions to the right of access under the GDPR and finds that it is more nuanced than the Directive in this regard. “First, under Article 15(4) of the regulation, the right to obtain a copy of personal data is not to adversely affect the rights and freedoms of others. Second, Article 23 of the regulation sets out the grounds for a restriction of data protection guarantees in slightly broader terms than Article 13 of the Directive, since, in particular, protection of other important objectives of general public interest of the Union or of a Member State pursuant to Article 23(1)(e) of the regulation may justify restrictions” (§48).

However, it seems that she doesn’t find the slight broadening of the scope of exemptions in the GDPR as justifying the idea of an abuse of right in this particular case.

The AG also argues that “on the other hand, the mere existence of other national legislation that also deals with access to examination scripts is not sufficient to allow the assumption that the purpose of the Directive is being misused” (§49). She concludes that even if such misuse would be conceivable, the second limb of the “abuse of rights” test would not be satisfied: “it is still not apparent where the undue advantage lies if an examination candidate were to obtain access to his script via his right of access. In particular, no abuse can be identified in the fact that someone obtains information via the right of access which he could not otherwise have obtained” (§50).

The examiner’s corrections on the exam script are the examinee’s personal data and the examiner’s own personal data at the same time

The AG looks into whether any corrections made by the examiner on the examination script are also personal data with respect to the examination candidate (a question raised by some of the parties), even though she considers that the answer will not impact the result of the main proceedings (§52, §53).

It is apparent that the facts of this case resemble the facts of YS and Others, where the Court refused to extend the right of access to the draft legal analysis of an asylum application on the ground that this did not serve the purpose of the Data Protection Directive but would establish a right of access to administrative documents. The Court argued in YS that such an analysis “is not information relating to the applicant for a residence permit, but at most information about the assessment and application by the competent authority of the law to the applicant’s situation” (§59; see YS and Others, §40). The AG considers that the cases are similar only “at first glance”. But she doesn’t convincingly differentiate between the two cases in the arguments that follow.

However, she is convincing when explaining why the examiner’s corrections are “personal data”. AG Kokott explains that the purpose of the comments made by examiners on an exam script is “the evaluation of the examination performance and thus they relate indirectly to the examination candidate” (§61). It does not matter that the examiners don’t know the identity of the examination candidate who produced the script, as long as the candidate can be easily identified by the organisation holding the examination (§60 and §61).

The AG further adds that “comments on an examination script are typically inseparable from the script itself … because they would not have any informative value without it” (§62). And it is “precisely because of that close link between the examination script and any corrections made on it”, that “the latter also are personal data of the examination candidate pursuant to Article 2(a) of the Data Protection Directive” (§63).

In an important statement, the AG considers that “the possibility of circumventing the examination complaint procedure is not, by contrast, a reason for excluding the application of data protection legislation” (§64). “The fact that there may, at the same time, be additional legislation governing access to certain information is not capable of superseding data protection legislation. At most it would be admissible for the individuals concerned to be directed to the simultaneously existing rights of information, provided that these could be effectively claimed” (§64).

Finally, the AG points out “for the sake of completeness” that “corrections made by the examiner are, at the same time, his personal data”. AG Kokott sees the potential conflict between the right of the candidate to access their personal data and the right of the examiners to protect their personal data and underlines that the examiner’s rights “are an appropriate basis in principle for justifying restrictions to the right of access pursuant to Article 13(1)(g) of the Data Protection Directive if they outweigh the legitimate interests of the examination candidate” (§65).

The AG considers that “the definitive resolution to this potential conflict of interests is likely to be the destruction of the corrected script once it is no longer possible to carry out a subsequent check of the examination procedure because of the lapse of time” (§65).

An exam script forms part of a filing system

One last consideration made by AG Kokott is whether processing of an exam script would possibly fall outside the scope of Directive 95/46, considering that it does not seem to be processed using automated means (§66, §67).

The AG points out that the Directive also applies to personal data processed otherwise than by automated means as long as they form part of a “filing system”, even if this “filing system” is not electronically saved (§69).

“This concept covers any structured set of personal data which is accessible according to specific criteria. A physical set of examination scripts in paper form ordered alphabetically or according to other criteria meets those requirements” (§69), concludes the AG.

Conclusion. What will the Court say?

The Conclusions of AG Kokott in Nowak contain a thorough analysis, which brings several dimensions to the data protection debate that have been rarely considered by Courts – the self-standing importance of the right of access to one’s own data (beyond any ‘utilitarianism’ of needing it to obtain something else), the relevance of passage of time for the effectiveness of data protection rights, the limits of the critique that data protection rights may be used to achieve other purposes than data protection per se, the complexity of one data item being personal data of two different individuals (and the competing interests of those two individuals).

The Court will probably closely follow the Conclusions of the AG for most of the points she raised.

The only contentious point will be the classification of an examiner’s corrections as personal data of the examined candidate, because following the AG would mean that the Court reverses its case-law from YS and Others.

If we apply the criteria developed by AG Kokott in this Opinion, it is quite clear that the analysis concerning YS and their request for asylum is personal data: the legal analysis is closely linked to the facts concerning YS and the other asylum applicants, and the fact that there may be additional legislation governing access to certain information (administrative procedures in the case of YS) is not capable of superseding data protection legislation. Moreover, if we add to this the argument that access to one’s own personal data is valuable in itself and does not need to serve any other purpose, reversing this case-law is even more likely.

The only arguable difference between this case and YS and Others is that, unlike what the AG found in §62 (“comments on an examination script are typically inseparable from the script itself… because they would not have any informative value without it”), it is conceivable that a legal analysis in general may have value by itself. However, a legal analysis of particular facts is void of value when applied to different individual facts. In this sense, a legal analysis can also be considered inseparable from the particular facts it assesses. What would be relevant in classifying it as personal data would then remain the identifiability of the person that the particularities refer to…

I was never convinced by the argumentation of the Court (or AG Sharpston for that matter) in YS and Others and I would welcome either reversing this case-law (which would be compatible with what I was expecting the outcome of YS to be) or having a more convincing argumentation as to why such an analysis/assessment of an identified person’s specific situation is not personal data. However, I am not getting my hopes high. As AG Kokott observed, the issue in the main proceedings can be solved without getting into this particular detail. In any case, I will be looking forward to this judgment.

(Summary and analysis by dr. Gabriela Zanfir-Fortuna)

 

Summary of the Opinion of AG Kokott in Puškár (on effective judicial remedies and lawful grounds for processing other than consent)

The Conclusions of Advocate General Kokott in C-73/16 Puškár were published on 30 March and remained under the radar, even though they deal with a couple of very important questions for EU data protection law that may have wide implications: effective judicial remedies, lawful grounds for processing other than consent, the right to access one’s own personal data. As a bonus, the AG refers to and analyses Article 79 GDPR – the right to a judicial remedy.

The analysis regarding effective judicial remedies under Article 47 of the Charter and Directive 95/46 could be relevant for the debate on essential equivalence when it comes to adequacy decisions for international data transfers (for those of you who don’t remember, one of the two main findings in Schrems was that the Safe Harbor framework touched the essence of the right to an effective judicial remedy, thus breaching Article 47 of the Charter). In this sense, the AG finds that a measure which does not restrict the category of people who could in principle have recourse to judicial review does not touch the essence of this right. A contrario, a measure that does restrict these categories of people would touch the essence of the right to an effective judicial remedy and would therefore breach the Charter.

Finally, a question of great importance for EU law in general is also tackled: what should national courts do when the case-law of the CJEU and the case-law of the ECtHR diverge regarding the protection of fundamental rights?

Here is what you will further read:

  1. Facts of the case and questions referred to the CJEU
  2. Requiring claimants to exhaust administrative remedies before going to Court can be compatible with the right to effective judicial remedy
  3. Internal documents of a tax authority obtained without the consent of the authority must be admitted as evidence if they contain personal data of the person who obtained the documents
  4. The performance of a task in the public interest allows a tax authority to create a blacklist without the consent of the persons concerned, if this task was legally assigned to the tax authority and the list’s use is appropriate and necessary (Articles 7 and 8 of the Charter are not breached in this case)
  5. A missed opportunity to better define the difference between the right to privacy and the right to personal data protection
  6. Where ECtHR and CJEU case-law diverge, national courts have to ask the CJEU how to proceed when the ECtHR case-law provides a higher level of protection for the rights of a person
  7. What to expect from the Court

Note that all highlights in the post are made by the author.

  1. Facts of the case and questions referred to the CJEU

C-73/16 Puškár concerns the request of Mr Puškár to have his name removed from a blacklist kept by the Finance Directorate of Slovakia which contains names and national ID numbers of persons “who purport to act, as ‘fronts’, as company directors”. The list associates a legal person or persons with a natural person who supposedly acted on their behalf (§15) and is created for the purposes of tax administration and combating tax fraud (§23, 2nd question for a preliminary ruling). It transpires from several paragraphs of the Opinion that Mr Puškár found out about the list, and about the fact that he is on it, from a leak (§23, 2nd question; §72; §76). Instead of relying on the more straightforward right to erasure or right to object under data protection law, Mr Puškár claimed that “his inclusion in the above mentioned list infringes his personal rights, specifically the right to the protection of his good name, dignity and good reputation” (§16).

The Supreme Court rejected his claims, partly on procedural grounds, partly on substantive ones (§18). Later, the Constitutional Court found that “the Supreme Court infringed the fundamental right to the protection of personal data against unauthorised collection and other abuses, in addition to the right to privacy”, quashed its decision and sent the case back to the Supreme Court for retrial, grounding its findings on ECtHR case-law (§20). In the context of this second round of proceedings, the Supreme Court sent questions to the CJEU for a preliminary ruling, essentially asking it to clarify:

  • whether the right to an effective remedy under Article 47 of the Charter in the context of data protection is compatible with a national law requirement that a claimant must first exhaust the procedures available under administrative law (administrative complaints) before going to Court;
  • whether the legitimate grounds for processing under Directive 95/46 and Articles 7 and 8 of the Charter preclude tax authorities from creating such a blacklist without the consent of the individuals on the list;
  • whether the list obtained by the claimant without the consent of the tax authorities is admissible as evidence;
  • whether national courts should give precedence to the case-law of the CJEU or the case-law of the ECtHR on a specific topic where the two diverge.

  2. Requiring claimants to exhaust administrative remedies before going to Court can be compatible with the right to effective judicial remedy

To reply to the first question, AG Kokott looks at Articles 28(4) and 22 of Directive 95/46 and also at Article 79 of the General Data Protection Regulation, which will replace Directive 95/46 from 25 May 2018.

Article 28(4) of Directive 95/46 states that each supervisory authority (Data Protection Authority) is to hear claims lodged by any person concerning the protection of his rights and freedoms with regard to the processing of personal data. Article 22 provides that, without prejudice to the remedy referred to in Article 28(4), every person is to have a right to a judicial remedy for any breach of the rights guaranteed him by the national law applicable to the processing in question (§37, §38).

In practice, this means that an individual who brings Court proceedings for a breach of data protection law must also be able to initiate administrative proceedings before a DPA (a complaint lodged with the DPA).

The same rule is kept under Article 79 GDPR, slightly broadened: the right to a judicial remedy must be effective and must be granted without prejudice to any administrative or non-judicial remedy, including the right to lodge a complaint with a supervisory authority. 

AG Kokott explains that these rules still do not clarify “whether the bringing of legal proceedings may be made contingent upon exhaustion of another remedy. All that can be taken from Article 79 of the General Data Protection Regulation is that the judicial remedy must be effective. An obligation to exhaust some other remedy before bringing legal proceedings will consequently be impermissible if the judicial remedy is rendered ineffective as a result of this precondition” (§43).

The AG found that Article 47(1) of the Charter and the principle of effectiveness “ultimately embody the same legal principle” and that they can be examined jointly using the rules in Articles 47(1) and 52(1) of the Charter – which is the provision that enshrines the rules for limiting the exercise of the fundamental rights in the Charter (§51). Hence, the question is whether the obligation to exhaust administrative procedures before going to Court amounts to a justified interference with the right to an effective judicial remedy.

AG Kokott remarks that the interference is provided for by Slovakian law and that it does not touch the essence of the right to effective judicial remedy because “it does not restrict the category of people who could in principle have recourse to judicial review” (§56). [Small comment here: this means that a provision which would restrict the category of people who could in principle have recourse to judicial review touches the essence of the right in Article 47 Charter. Check out paragraphs 45 and 46 of the EDPS Opinion on the EU-US Umbrella Agreement commenting on the fact that Article 19 of the Agreement provides for the possibility of judicial redress only for citizens of the EU, excluding thus categories of individuals that would otherwise be covered by the Charter, such as asylum seekers and residents].

It remained to be analysed whether the interference complies with the principle of proportionality, which “requires that a measure be ‘appropriate, necessary and proportionate to the objective it pursues’” (§58). The AG accepts the Supreme Court’s submission that “the exhaustion of the administrative remedy represents a gain in efficiency, as it provides the administrative authority with an opportunity to remedy the alleged unlawful intervention, and saves it from unwanted court proceedings” (§59). The AG considers that “obligatory preliminary proceedings are undoubtedly appropriate for achieving the objectives” and that a “less onerous method” does not suggest itself as capable of realising them to the same extent (§62).

However, the AG points out that the “specific form” of the administrative remedy is important for determining whether the measure is appropriate in practice. This is particularly the case where there is uncertainty “as to whether the time limit for bringing an action begins to run before a decision has been made in the administrative action” (§64). Additionally, Article 47(2) of the Charter establishes the right of every person to have their case dealt with within a reasonable period of time. “While this right in fact relates to judicial proceedings, naturally it may not be undermined by a condition for the bringing of an action” (§67).

In conclusion, the AG considers that the right to effective judicial review under Article 47 Charter and the principle of effectiveness “do not preclude an obligation to exhaust an administrative remedy being a condition on bringing legal proceedings if the rules governing that remedy do not disproportionately impair the effectiveness of judicial protection. Consequently, the obligatory administrative remedy must not cause unreasonable delay or excessive costs for the overall legal remedy” (§71).

  3. Internal documents of a tax authority obtained without the consent of the authority must be admitted as evidence if they contain personal data of the person who obtained the documents

Essentially, the question asked by the Supreme Court is whether the contested list may be excluded as evidence due to the fact that it came into the possession of the claimant without the consent of the competent authorities (§72).

The AG considers that “a review should be carried out to determine whether the person affected has a right of access to the information in question. If this were the case, the interest in preventing unauthorized use would no longer merit protection” (§83).

Further, it is recalled that “under the second sentence of Article 8(2) of the Charter and Article 12 of the Data Protection Directive, everyone has the right of access to data which has been collected concerning him or her. This also applies in principle to data being recorded in the contested list. Furthermore, the persons so affected would, by virtue of the collection of the data, have to be informed of the use of the data, under either Article 10 or Article 11 of the Data Protection Directive” (§85).

While indeed Article 13 of the Directive allows this right to information to be restricted, it also “expressly requires that such restrictions be imposed by legislative measures” (§86). The AG acknowledged that “there is a potential risk that inspection and monitoring activities based on the list would be less effective if it were known who was named on that list” (§87). However, the national Court must examine:

  • “whether a restriction of the right of information of this kind is provided for” (§88) and
  • “where appropriate” if it is “justified” (§88). This is an indication that even if such an exemption would be provided for by law, a further analysis is needed to see whether the exemption is justified.

A key point the AG makes is that “even if there are indications of a legitimate interest in a hypothetical, legally justified non-disclosure of the list in question, the national courts must also examine whether in the individual case these outweigh the legitimate interests of the individual in bringing the proceedings” (§89). This is important because it is a clear indication that when a controller relies on their legitimate interest as a ground for processing, it always has to engage in a balancing exercise with the legitimate interests (and rights) of the data subject.

In conclusion, the AG established that, under the principle of a fair hearing in Article 47 of the Charter, a document obtained by the claimant without the consent of an authority cannot be refused as evidence when it contains personal data of the claimant which the authority is required to disclose to the claimant under Articles 12 and 13 of the Data Protection Directive.

  4. The performance of a task in the public interest allows a tax authority to create a blacklist without the consent of the persons concerned, if this task was legally assigned to the tax authority and the list’s use is appropriate and necessary (Articles 7 and 8 of the Charter are not breached in this case)

The Supreme Court wanted to know whether the fundamental rights to privacy (Article 7 of the Charter) and to the protection of personal data (Article 8 of the Charter), as well as the Data Protection Directive, prohibit a Member State from creating a list of personal data for the purposes of tax collection without the consent of the persons concerned.

The AG points out that “this question is primarily to be answered in the light of the Data Protection Directive, as this specifies the rights to privacy and data protection” (§95).

The AG further recalls that Article 7 of the Data Protection Directive allows the processing of personal data if it is based on one of the six lawful grounds for processing provided for therein (§99) [NB: of which only one is “consent”!]. While the AG acknowledges that three of the six conditions are applicable in this case (1 – performance of a task in the public interest [Article 7(e)]; 2 – legitimate interest of the controller [Article 7(f)]; and 3 – necessity of compliance with a legal obligation [Article 7(c)]), she considers the examination of the latter two “superfluous”: “This is because all parties acknowledge that tax collection and combating tax fraud are tasks in the public interest within the meaning of Article 7(e) of the Data Protection Directive” (§100).

A welcome clarification is further brought by the AG, who specifies that Article 7(e) of the Data Protection Directive “must be read in conjunction with the principles of Article 6. According to Article 6(1)(b), personal data must only be collected for specified, explicit and legitimate purposes. Within the scope of Article 7(e), the purpose of the data processing is inseparably linked to the delegated tasks. Consequently, the transfer of the task must clearly include the purpose of the processing” (§106).

This clarification is welcome because it reminds controllers that even if they correctly process personal data on one of the lawful grounds for processing (such as consent or legitimate interest) in compliance with Article 7 of the Directive, they still have to comply with all the other safeguards for processing personal data, including the principles for processing in Article 6 of the Directive (purpose limitation, data minimization etc.).

The AG remarks that the reference for a preliminary ruling does not specify the purpose of the contested list and leaves it to the Supreme Court to look further into this question (§107). Additionally, the AG considers that the Supreme Court “will have to examine whether the creation and use of the contested list and in particular the naming of Mr Puškár is necessary for the claimed public interest”. This is yet another reminder of how important “necessity” is for personal data protection in the EU legal framework (check out the EDPS’s recently published “Necessity Toolkit”).

Another very interesting point the AG brings forward is how naming a person on this blacklist constitutes “a considerable interference with the rights of the person concerned”, beyond the right to privacy in Article 7 of the Charter – it also touches (§110):

  • “his reputation and could lead to serious, practical disadvantages in his dealings with the tax authorities;
  • the presumption of innocence in Article 48(1) of the Charter;
  • the legal persons associated with the person concerned, which will be affected in terms of their freedom to conduct business under Article 16 of the Charter”.

This finding is a testament to the importance of complying with the right to the protection of personal data, as non-compliance would have consequences for several other fundamental rights.

As the AG explains, such a serious interference “can only be proportionate if there are sufficient grounds for the suspicion that the person concerned purported to act as a company director of the legal persons associated with him and in so doing undermined the public interest in the collection of taxes and combating tax fraud” (§111).

In conclusion, the tax authorities can create a blacklist such as the one in the main proceedings on the grounds of Article 7(e) of the Data Protection Directive, but this assumes that (§117):

  • “the task was legally assigned to the tax authorities,
  • the use of the list is appropriate and necessary for the purposes of the tax authorities and
  • there are sufficient grounds to suspect that these persons should be on the list”.
  5. A missed opportunity to better define the difference between the right to privacy and the right to personal data protection

Further, the AG spelled out that “neither the fundamental rights to privacy, Article 7 of the Charter, or data protection, Article 8, would in this case prevent the creation and use of the list” (§117).

The analysis leading to this conclusion was another missed opportunity to persuade the Court of Justice to better delineate the two fundamental rights protected by Articles 7 and 8 of the Charter, which the AG referred to jointly as “the fundamental rights to privacy and data protection”.

Without a clear analysis of what constitutes interference with each of the two rights, the AG referred to the “naming of a person on the contested list” as “affecting” both fundamental rights (§115). In the same paragraph, she further analysed “these interferences” en masse, writing that they are only justified “if they have a sufficient legal basis, respect the essence of both fundamental rights, and preserve the principle of proportionality” (§115). Considering that the legality and proportionality of the measure were addressed in previous sections, the AG merely stated that, despite “the adverse effects associated with inclusion on the contested list, those interferences do not meet the threshold of a breach of the essence of those rights”, before concluding that neither of the two Charter articles would prevent the creation of such a blacklist.

  6. Where ECtHR and CJEU case-law diverge, national courts have to ask the CJEU how to proceed, even if the ECtHR case-law provides a higher level of protection for the rights of a person

The last question is one that is extremely interesting for EU lawyers in general, not necessarily for EU data protection lawyers, because it tackles the issue of different levels of protection of the same fundamental right emerging from the case-law of the Court of Justice of the EU in Luxembourg, on one hand, and the European Court of Human Rights in Strasbourg, on the other hand.

As the AG summarizes it, “the fourth question is aimed at clarifying whether a national court may follow the case-law of the Court of Justice of the European Union where this conflicts with the case-law of the ECtHR” (§118). This issue is relevant in our field because Article 8 of the European Convention on Human Rights partially shares its material scope with Articles 7 and 8 of the EU Charter of Fundamental Rights (Article 8 of the Convention is more complex), and Article 52(3) of the Charter states that “the rights in the Charter, which correspond to rights guaranteed by the European Convention on the Protection of Human Rights and Fundamental Freedoms (ECHR), have the same meaning and scope as conferred by the ECHR” (§122). However, the second sentence of Article 52(3) of the Charter permits EU law to accord more extensive protection (§122).

The AG specifies that “EU law permits the Court of Justice to deviate from the case-law of the ECtHR only to the extent that the former ascribes more extensive protection to specific fundamental rights than the latter. This deviation in turn is only permitted provided that it does not also cause another fundamental right in the Charter corresponding to a right in the ECHR to be accorded less protection than in the case-law of the ECtHR. One thinks, for example, of cases in which a trade-off must be made between specific fundamental rights” (§123).

Not surprisingly, the AG advises that when the case-law of the two Courts comes into conflict, national courts should directly apply the case-law of the CJEU when it affords more protection to the fundamental rights in question, but should send a reference for a preliminary ruling to the CJEU asking which way to go when the case-law of the ECtHR affords enhanced protection to the fundamental right in question (§124 and §125). The AG’s argument is that the latter case “inevitably leads to a question of the interpretation of EU law with regard to the fundamental right in question and Article 52(3) of the Charter” which, if performed by the national Court, could further “amount to the view that the interpretation of the fundamental right in question by the Court of Justice is not compatible with Article 52(3)”.

As for the relevance of this question to the case at hand – it remains a mystery. The AG herself pointed out that “the admissibility of the question in this form is dubious, particularly as the Supreme Court does not state on which issue the two European courts supposedly are in conflict and the extent to which such a conflict is significant for the decision in the main proceedings” (§119).

  7. What to expect from the Court

How will the CJEU reply to these questions? My bet is that, in general, the Court will follow the AG on substance. However, it is possible that the Court will simplify the analysis and reformulate the questions in such a way that the answers will be structured around three main issues:

  • lawfulness of creating such a blacklist (and the lawful grounds for processing in the Data Protection Directive) and compatibility of this interference with both Article 7 and Article 8 of the Charter (I do hope, having low expectations nonetheless, that we will have more clarity of what constitutes interference with each of the two rights from the Court’s perspective);
  • compatibility of the procedural law of Slovakia in the field of data protection with Article 47 of the Charter (in fact, this may be the only point where the Court could reach a different result from the one proposed by the AG, in the sense that the condition to first exhaust administrative remedies before engaging in litigation may be considered a disproportionate interference with the right to an effective judicial remedy; it is also possible that the Court will refer for the first time directly to the GDPR);
  • the relationship between ECtHR and CJEU case-law on the same fundamental right.

Suggested citation: G. Zanfir-Fortuna, “Summary of the Opinion of AG Kokott in Puškár (on effective judicial remedies and lawful grounds for processing other than consent)”, pdpEcho.com, 24 April 2017.

***

If you find information on this blog useful and would like to read more of it, consider supporting pdpecho here: paypal.me/pdpecho.

CJEU in Manni: data subjects do not have the right to obtain erasure from the Companies Register, but they do have the right to object

by Gabriela Zanfir-Fortuna

The recent judgment of the CJEU in Case C-398/15 Manni (9 March 2017) brings several significant points to the EU data protection case-law:

  • Clarifies that an individual seeking to limit the access to his/her personal data published in a Companies Register does not have the right to obtain erasure of that data, not even after his/her company ceased to exist;
  • Clarifies that, however, that individual has the right to object to the processing of that data, based on his/her particular circumstances and on justified grounds;
  • Clarifies the link between the purpose of the processing activity and the data retention period, and underlines how important the purpose of the processing activity is when analysing whether a data subject can obtain erasure or blocking of data;
  • Provides insight into the balancing exercise between interests of third parties to have access to data published in the Companies Register and the rights of the individual to obtain erasure of the data and to object to its processing.

This commentary will highlight all points enumerated above.

1. Facts of the case

Mr Manni had requested his regional Chamber of Commerce to erase his personal data from the Public Registry of Companies, after he found out that he was losing clients who performed background checks on him through a private company specialised in finding information in the Public Registry. This happened because Mr Manni had been an administrator of a company that was declared bankrupt more than 10 years before the facts in the main proceedings. In fact, the former company itself had been struck off from the Public Registry (§23 to §29).

2. The question in Manni

The question that the CJEU had to answer in Manni was whether the obligation of Member States to keep public Companies Registers[1] and the requirement that personal data must be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected[2] must be interpreted as meaning that individuals must be allowed to “request the authority responsible for maintaining the Companies Register to limit, after a certain period has elapsed from the dissolution of the company concerned and on the basis of a case-by-case assessment, access to personal data concerning them and entered in that register” (§30).

3. Applicability of Directive 95/46 (Data Protection Directive – ‘DPD’)

First, the CJEU clarified that its analysis does not concern the processing of data by the specialised rating company, and only refers to the obligations of the public authority keeping the companies register (§31). Second, the CJEU ascertained that the provisions of the DPD are applicable in this case:

  • the identification data of Mr Manni recorded in the Register is personal data[3] – “the fact that information was provided as part of a professional activity does not mean that it cannot be characterized as personal data” (§34);
  • the authority keeping the register is a “controller”[4] that carries out “processing of personal data”[5] by “transcribing and keeping that information in the register and communicating it, where appropriate, on request to third parties” (§35).

4. The role of the data quality principles and the legitimate grounds for processing in ensuring a high level of protection of fundamental rights

Further, CJEU recalls its case-law stating that the DPD “seeks to ensure a high level of protection of the fundamental rights and freedoms of natural persons” (§37) and that the provisions of the DPD “must necessarily be interpreted in the light of the fundamental rights guaranteed by the Charter”, and especially Articles 7 – respect for private life and 8 – protection of personal data (§39). The Court recalls the content of Articles 7 and 8 and specifically lays out that the requirements under Article 8 Charter “are implemented inter alia in Articles 6, 7, 12, 14 and 28 of Directive 95/46” (§40).

The Court highlights the significance of the data quality principles and the legitimate grounds for processing under the DPD in the context of ensuring a high level of protection of fundamental rights:

“[S]ubject to the exceptions permitted under Article 13 of that directive, all processing of personal data must comply, first, with the principles relating to data quality set out in Article 6 of the directive and, secondly, with one of the criteria for making data processing legitimate listed in Article 7 of the directive” (§41 and case-law cited).

The Court applies this test in reverse order, which is indeed more logical: a processing activity should first be legitimate under one of the lawful grounds for processing, and only after ascertaining that this is the case should the question of compliance with the data quality principles arise.

The CJEU finds that in the case at hand the processing activity is legitimized by three lawful grounds (§42, §43):

  • compliance with a legal obligation [Article 7(c)];
  • the exercise of official authority or the performance of a task carried out in the public interest [Article 7(e)] and
  • the realization of a legitimate interest pursued by the controller or by the third parties to whom the data are disclosed [Article 7(f)].

5. The link between the data retention principle, the right to erasure and the right to object

Article 6(1)(e) of the DPD requires that personal data be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected or for which they are further processed. This means that controllers should retain personal data only for as long as it serves the purpose for which it was processed, and should then anonymise, erase or otherwise make that data unavailable. If the controller does not comply with this obligation, the data subject has two possible avenues to stop the processing: they can either ask for erasure of the data, or object to the processing on justified grounds relating to their particular situation.

CJEU explains that “in the event of failure to comply with the condition laid down in Article 6(1)(e)” of the DPD, “Member States guarantee the person concerned, pursuant to Article 12(b) thereof, the right to obtain from the controller, as appropriate, the erasure or blocking of the data concerned” (§46 and C-131/12 Google/Spain §70).

In addition, the Court explains, Member States also must “grant the data subject the right, inter alia in the cases referred to in Article 7(e) and (f) of that directive, to object at any time on compelling legitimate grounds relating to his particular situation to the processing of data relating to him, save where otherwise provided by national legislation”, pursuant to Article 14(a) DPD (§47).

The CJEU further explains that “the balancing to be carried out under subparagraph (a) of the first paragraph of Article 14 … enables account to be taken in a more specific manner of all the circumstances surrounding the data subject’s particular situation. Where there is a justified objection, the processing instigated by the controller may no longer involve those data” (§47).

6. The pivotal role of the purpose of the processing activity in granting the right to erasure and the right to object

After establishing these general rules, the Court decides that in order to establish whether data subjects have the “right to apply to the authority responsible for keeping the register to erase or block the personal data entered in that register after a certain period of time, or to restrict access to it, it is first necessary to ascertain the purpose of that registration” (§48).

The pivotal role of the purpose of the processing operation should not come as a surprise, given the fact that the data retention principle is tightly linked to accomplishing the purpose of the processing operation.

In this case, the Court looked closely at Directive 68/151 and explained at length that the purpose of the disclosure provided for by it is “to protect in particular the interests of third parties in relation to joint stock companies and limited liability companies, since the only safeguards they offer to third parties are their assets” (§49) and “to guarantee legal certainty in relation to dealings between companies and third parties in view of the intensification of trade between Member States” (§50). CJEU also referred to primary EU law, and specifically to Article 54(3)(g) EEC, one of the legal bases of the directive, which “refers to the need to protect the interests of third parties generally, without distinguishing or excluding any categories falling within the ambit of that term” (§51).

The Court further noted that Directive 68/151 makes no express provision regarding the necessity of keeping personal data in the Companies Register “also after the activity has ceased and the company concerned has been dissolved” (§52). However, the Court notes that “it is common ground that even after the dissolution of a company, rights and legal relations relating to it continue to exist” (§53) and “questions requiring such data may arise for many years after a company has ceased to exist” (§54).

Finally, the CJEU declared:

“in view of the range of possible scenarios … it seems impossible, at present, to identify a single time limit, as from the dissolution of a company, at the end of which the inclusion of such data in the register and their disclosure would no longer be necessary” (§55).

7. Conclusion A: there is no right to erasure

The Court concluded that “in those circumstances” the data retention principle in Article 6(1)(e) DPD and the right to erasure in Article 12(b) DPD do not guarantee for the data subjects referred to in Directive 68/151 a right to obtain “as a matter of principle, after a certain period of time from the dissolution of the company concerned, the erasure of personal data concerning them” (§56).

After already reaching this conclusion, the Court also explained that this interpretation of the provisions in question does not result in “disproportionate interference with the fundamental rights of the persons concerned, and particularly their right to respect for private life and their right to protection of personal data as guaranteed by Articles 7 and 8 of the Charter” (§57).

To this end, the Court took into account:

  • that Directive 68/151 requires “disclosure only for a limited number of personal data items” (§58) and
  • that “it appears justified that natural persons who choose to participate in trade through such a company are required to disclose the data relating to their identity and functions within that company, especially since they are aware of that requirement when they decide to engage in such activity” (§59).

8. Conclusion B: but there is a right to object

After acknowledging that, in principle, the need to protect the interests of third parties in relation to joint-stock companies and limited liability companies and to ensure legal certainty, fair trading and thus the proper functioning of the internal market take precedence over the right of the data subject to object under Article 14 DPD, the Court points out that

“it cannot be excluded, however, that there may be specific situations in which the overriding and legitimate reasons relating to the specific case of the person concerned justify exceptionally that access to personal data entered in the register is limited, upon expiry of a sufficiently long period after the dissolution of the company in question, to third parties who can demonstrate a specific interest in their consultation” (§60).

While the Court leaves it to the national courts to assess each case “having regard to all the relevant circumstances and taking into account the time elapsed since the dissolution of the company concerned”, it also points out that, in the case of Mr Manni, “the mere fact that, allegedly, the properties of a tourist complex built … do not sell because of the fact that potential purchasers of those properties have access to that data in the company register, cannot be regarded as constituting such a reason, in particular in view of the legitimate interest of those purchasers in having that information” (§63).

9. Post Scriptum

The Court took a very pragmatic approach in dealing with the case of Mr Manni. The principles of interpretation it laid down are solid – such an analysis indeed requires looking at the legitimate grounds for processing and the relevant data quality principle. Having the Court place strong emphasis on the significance of the purpose of the processing activity is welcome, as is more guidance on the balancing exercise of the rights and interests in question. In addition, a separate assessment of the right to obtain erasure and of the right to object is very helpful with a view towards the future – the full entry into force of the GDPR and its heightened rights of the data subject.

The aspect of the judgment that leaves some room for improvement is the analysis of the proportionality of the interference that the virtually unlimited publication of personal data in the Companies Register entails with Articles 7 and 8 of the Charter. The Court does tackle this, but lightly – and it brings only two arguments, after already declaring that the interference is not disproportionate. Moreover, the Court does not distinguish between interferences with Article 7 and interferences with Article 8.

Finally, I was happy to see that the predicted outcome of the case, as announced in the pdpEcho commentary on the Opinion of the Advocate General Bot, proved to be mainly correct: “the Court will follow the AG’s Opinion to a large extent. However, it may be more focused on the fundamental rights aspect of balancing the two Directives and it may actually analyse the content of the right to erasure and its exceptions. The outcome, however, is likely to be the same.”

Suggested citation: G. Zanfir-Fortuna, “CJEU in Manni: data subjects do not have the right to obtain erasure from the Companies Register, but they do have the right to object”, pdpEcho.com, 13 March 2017.


[1] Article 3 of Directive 68/151.

[2] Article 6(1)(e) of Directive 95/46.

[3] Article 2(a) of Directive 95/46.

[4] Article 2(d) of Directive 95/46.

[5] Article 2(b) of Directive 95/46.

***

If you find information on this blog useful and would like to read more of it, consider supporting pdpecho here: paypal.me/pdpecho.

Some end-of-the-year good news: People genuinely care about their privacy

Dear followers,

First, I would like to thank you for making this the most successful year in the five-year life of pdpEcho (I would especially like to thank those who supported the blog and thus helped me cover the cost of renting the blog’s .com name). I started this blog in my first year as a PhD student, to gather all the information I found interesting related to privacy and data protection. At that time I was trying to convince my classic “civilist” supervisor that data protection is also a matter of civil law, and that I could write a civil law thesis on this subject in Romanian, even though the Romanian literature on it counted only one book title, from 2004. In the five years that followed another book title was added to it, and the blog and I grew together (be it at different paces).

In recent months it has offered me a way to stay connected to the field while transitioning from Brussels to the US. But most importantly, it constantly reminded me that privacy is really not dead, as has been claimed numerous times. I cared about it, the people who found this blog every day cared about it, and as long as we care about privacy, it will never die.

I am writing this end-of-the-year post with some very good news from Europe: you and I are not the only ones who care about privacy. A vast majority of Europeans do too. The European Commission published a Eurobarometer on ePrivacy a few days ago, as a step towards the launch of the ePrivacy Directive reform later in January.

The results could not have been clearer:

“More than nine in ten respondents said it is important that personal information (such as their pictures, contact lists, etc.) on their computer, smartphone or tablet can only be accessed with their permission, and that it is important that the confidentiality of their e-mails and online instant messaging is guaranteed (both 92%)” (source, p. 2).

“More than seven in ten think both of these aspects are very important. More than eight in ten (82%) also say it is important that tools for monitoring their activities online (such as cookies) can only be used with their permission (82%), with 56% of the opinion this is very important” (source, p. 2).

Overwhelming support for encryption

Remarkably, 90% of those asked agreed “they should be able to encrypt their messages and calls, so they can only be read by the recipient”. Almost as many (89%) agree the default settings of their browser should stop their information from being shared (source, p. 3).

Respondents thought it is unacceptable to have their online activities monitored in exchange for unrestricted access to a certain website (64%), or to pay in order not to be monitored when using a website (74%). Almost as many (71%) say it is unacceptable for companies to share information about them without their permission, even if it helps companies provide new services they may like (source, p. 4).

You can find the detailed report here.

Therefore, there is serious cause to believe that our work and energy are well spent in this field.

The new year brings me several publishing projects that I am very much looking forward to, as well as two work projects on this side of the Atlantic. Nevertheless, I hope I will be able to keep up the work on pdpEcho, for which I hope to receive more feedback and even input from you.

In this note, I wish you all a Happy New Year, where all our fundamental rights will be valued and protected!

Gabriela


A million dollar question, literally: Can DPAs fine a controller directly on the basis of the GDPR, or do they need to wait for national laws?

by Gabriela Zanfir-Fortuna

The need to discuss the legal effect of the GDPR emerged as some opinions circulating in the privacy bubble claim that it will take at least a couple of years after the GDPR becomes applicable in 2018 before it has de facto legal effect at national level. The main argument for this thesis is that national parliaments of the Member States will need to take action in one way or another, or that national governments will need to issue executive orders to grant new powers to supervisory authorities, including the power to fine.

This post will bring forward some facts emerging from EU primary law and from the case-law of the Court of Justice of the EU (CJEU) that need to be taken into account before talking about such a de facto grace period.

The conclusion is that, just like all EU regulations, the GDPR is directly applicable and has immediate effect from the date it becomes applicable according to its publication in the EU Official Journal (in this case, 25 May 2018), with no other national measures being required to give it effect in the Member States (not even translations at national level). While it is true that it contains provisions that give a margin of appreciation to Member States if they wish to intervene, most of the articles are sufficiently clear, detailed and straightforward to allow direct application, if need be (for instance, if a Member State is late in adjusting and adapting its national data protection law).

1) EU regulations enjoy “direct applicability”: the rule is that they are “immediately applicable” and they don’t need national transposition

First and foremost, it is a fact emerging from the EU treaties that EU Regulations enjoy direct applicability, which means that once they become applicable they do not need to be transposed into national law.

This rule is set out in the second paragraph of Article 288 of the Treaty on the Functioning of the European Union, which states that:

“A regulation shall have general application. It shall be binding in its entirety and directly applicable in all Member States.”

On the contrary, according to the third paragraph of Article 288 TFEU, directives “shall be binding, as to the result to be achieved, upon each Member State to which it is addressed, but shall leave to the national authorities the choice of form and methods.”

Therefore, as the CJEU explained in settled case-law, “by virtue of the very nature of regulations and of their function in the system of sources of Community law, the provisions of those regulations generally have immediate effect in the national legal systems without it being necessary for the national authorities to adopt measures of application” (see Case C-278/02 Handlbauer, 2004, §25 and Case 93/71 Leonesio, 1972, §5) and in addition they also “operate to confer rights on individuals which the national courts have a duty to protect” (Case C-70/15 Lebek, 2016, §51).

However, the CJEU also ruled that “some of their provisions may nonetheless necessitate, for their implementation, the adoption of measures of application by the Member States” (Case C-278/02 Handlbauer, 2004, §26; C-403/98 Monte Arcosu, 2001, §26). But this is not the case for sufficiently clear and precise provisions, where Member States don’t enjoy any margin of manoeuvre. For instance, the Court found in Handlbauer that “this is not the case as regards Article 3(1) of Regulation No 2988/95 which, by fixing the limitation period for proceedings at four years as from the time when the irregularity is committed, leaves the Member States no discretion nor does it require them to adopt implementation measures” (§27).

Therefore, whenever an EU regulation leaves the Member States no discretion, nor does it require them to adopt implementation measures, the provisions of that regulation are directly and immediately applicable as they are.

2) EU regulations’ direct applicability does not depend on any national measure (not even a translation published in national official journals)

The CJEU explained as far back as 1973 that for EU regulations to take effect in national legal systems of Member States there is not even the need to have their texts translated and published in the national official journals.

Asked whether the provisions of a Regulation can be “introduced into the legal order of Member States by internal measures reproducing the contents of Community provisions in such a way that the subject-matter is brought under national law”, the Court replied that “the direct application of a Regulation means that its entry into force and its application in favour of or against those subject to it are independent of any measure of reception into national law” (Case 34/73 Variola, 1973, §9 and §10). AG Kokott explained that such measures include “any publicity by the Member States” (Opinion in C-161/06 Skoma-Lux, §54) in an Opinion that was substantially upheld by the Court in a judgment stating that the publication of a regulation in the Official Journal of the EU in an official language of a Member State is the only condition to give it effect and direct applicability in that Member State (Judgment in Case C-161/06).

The Court concluded in Variola that “a legislative measure under national law which reproduces the text of a directly applicable rule of Community law cannot in any way affect such direct applicability, or the Court’s jurisdiction under the Treaty” (operative part of the judgment). The Court also explained in Variola that “by virtue of the obligations arising from the Treaty and assumed on ratification, Member States are under a duty not to obstruct the direct applicability inherent in Regulations and other rules of Community law. Strict compliance with this obligation is an indispensable condition of simultaneous and uniform application of Community Regulations throughout the Community” (Case 34/73 Variola, 1973, §10).

3) National authorities could impose administrative penalties directly on the basis of a provision of a Regulation, where necessary 

The Court dealt with the question of national authorities imposing administrative fines directly on the basis of the provisions of an EU regulation in Case C-367/09 Belgisch Interventie- en Restitutiebureau, on the interpretation of provisions of Regulation 2988/95.

After recalling its case-law on direct applicability of EU regulations (§32), including the exemption that some provisions of a Regulation necessitate for their implementation the adoption of measures of application (§33), the CJEU found that in that specific case national authorities cannot impose fines directly on the basis of Articles 5 and 7 of Regulation 2988/95 because “those provisions merely lay down general rules for supervision and penalties for the purpose of safeguarding the EU’s financial interests (…). In particular, those provisions do not specify which of the penalties listed in Article 5 of Regulation No 2988/95 should be applied in the case of an irregularity detrimental to the EU’s financial interests nor the category of operators on whom such penalties are to be imposed in such cases” (§36).

Therefore, the Court did not question the possibility for a national authority to impose fines directly on the legal basis provided by a regulation. The CJEU went directly to analyse the content of the relevant provision and found that fines could not be imposed because of the general character of that provision, which required additional measures to be adopted both at Member State and at EU level (had the provisions been clearer, the authorities could have issued fines directly on the basis of the regulation).

One look at Article 83 GDPR and one can easily tell that this is not the case of that provision – it is clear who imposes fines, for what, against whom, on what criteria and what is the maximum amount for each category of fines. Neither is it the case of Article 58 on the powers of supervisory authorities. Article 83 GDPR allows Member States some discretion only if they wish to provide specific rules for fining public authorities (paragraph 7) and only if their legal system does not provide for administrative fines – in this case, the states are allowed to apply Article 83 in such a manner that the fine is initiated by the competent supervisory authority and imposed by competent national courts (paragraph 9).

4) Conclusion: beware of the GDPR from day 1

The GDPR, like all EU regulations, is directly applicable and has immediate effect in the legal order of Member States by virtue of its publication in the Official Journal of the EU and the conditions of applicability in time expressed therein, no additional national measures being required to give it effect.

While there are provisions that give Member States a margin of appreciation and a discretion to implement national measures, most of the provisions are sufficiently clear and precise to be applied as they are.

Of course there will be national data protection laws that will specify additional rules to the GDPR, giving effect to that margin of appreciation. But the national laws that will complement an EU regulation, such as the GDPR, are valid only as long as “they do not obstruct its direct applicability and do not conceal its [EU] nature, and if they specify that a discretion granted to them by that regulation is being exercised, provided that they adhere to the parameters laid down under it” (CJEU, Case C‑316/10 Danske Svineproducenter Justitsministeriet, §41).

As always, here is the fine print (or the caveat) whenever we are discussing the interpretation of EU law: only the CJEU has the authority to interpret EU law in a binding manner.

(Note: The author is grateful to dr. Mihaela Mazilu-Babel, who provided support with preliminary research for this post)

***


Even if post-Brexit UK adopts the GDPR, it will be left without its “heart”

Gabriela Zanfir Fortuna


There has been lately a wave of optimism among those looking for legal certainty that the GDPR will be adopted by the UK even after the country leaves the European Union. This wave was prompted by a declaration of the British Secretary of State, Karen Bradley, at the end of October, when she stated before a Committee of the Parliament that “We will be members of the EU in 2018 and therefore it would be expected and quite normal for us to opt into the GDPR and then look later at how best we might be able to help British business with data protection while maintaining high levels of protection for members of the public”. The Information Commissioner of the UK, Elizabeth Denham, welcomed the news. On the other hand, as Amberhawk explained in detail, this will not mean that the UK will automatically be considered as ensuring an adequate level of protection.

The truth is that as long as the UK is still a Member of the EU, it can’t opt in or opt out, for that matter, from regulations (other than the ones subject to the exemptions negotiated by the UK when it entered the Union – but this is not the case for the GDPR). They are “binding in their entirety” and “directly applicable”, according to Article 288 of the Treaty on the Functioning of the EU. So, yes, quite normally, if the UK is still a Member State of the EU on 25 May 2018, then the GDPR will start applying in the UK just as it will be applying in Estonia or France.

The fate of the GDPR after Brexit becomes effective will be as uncertain as the fate of all other EU legislative acts transposed in the UK or directly applicable in the UK. But let’s imagine the GDPR will remain national law after Brexit, in one form or another. If this happens, it is likely that it will take a life of its own, departing from harmonised application throughout the EU. First and foremost, the GDPR in the UK will not be applied in the light of the Charter of Fundamental Rights of the EU and especially its Article 8 – the right to the protection of personal data. The Charter played an extraordinary role in the strengthening of data protection in the EU after it became binding, in 2009, being invoked by the Court of Justice of the EU in its landmark judgments – Google v Spain, Digital Rights Ireland and Schrems.

The Court held as far back as 2003 that “the provisions of Directive 95/46, in so far as they govern the processing of personal data liable to infringe fundamental freedoms, in particular the right to privacy, must necessarily be interpreted in the light of fundamental rights” (Österreichischer Rundfunk, para 68). This principle was repeated in most of the following cases interpreting Directive 95/46 and other relevant secondary law for this field, perhaps with the most notable results in Digital Rights Ireland and Schrems. 

See, for instance:

“As far as concerns the rules relating to the security and protection of data retained by providers of publicly available electronic communications services or of public communications networks, it must be held that Directive 2006/24 does not provide for sufficient safeguards, as required by Article 8 of the Charter, to ensure effective protection of the data retained against the risk of abuse and against any unlawful access and use of that data” (Digital Rights Ireland, para. 66).

“As regards the level of protection of fundamental rights and freedoms that is guaranteed within the European Union, EU legislation involving interference with the fundamental rights guaranteed by Articles 7 and 8 of the Charter must, according to the Court’s settled case-law, lay down clear and precise rules governing the scope and application of a measure and imposing minimum safeguards, so that the persons whose personal data is concerned have sufficient guarantees enabling their data to be effectively protected against the risk of abuse and against any unlawful access and use of that data. The need for such safeguards is all the greater where personal data is subjected to automatic processing and where there is a significant risk of unlawful access to that data” (Schrems, para. 91).

Applying data protection law outside the spectrum of fundamental rights will most likely not ensure sufficient protection to the person. While the UK will still remain under the legal effect of the European Convention on Human Rights and its Article 8 – respect for private life – this by far does not equate to the specific protection ensured to personal data by Article 8 of the Charter as interpreted and applied by the CJEU.

Not only will the Charter not be binding for the UK post-Brexit, but the Court of Justice of the EU will no longer have jurisdiction on UK territory (unless some sort of spectacular agreement is negotiated for Brexit). Moreover, EU law will not enjoy supremacy over national law, as is the case right now. This means that British data protection law will be able to depart from the European standard (the GDPR) to the extent desired by the legislature. For instance, there will be nothing standing in the way of the British legislature adopting permissive exemptions to the rights of the data subject, pursuant to Article 23 GDPR.

So when I mentioned in the title that the GDPR in post-Brexit UK will in any case be left without its “heart”, I was referring to its application and interpretation in the light of the Charter of Fundamental Rights of the EU.

***



A look at political psychological targeting, EU data protection law and the US elections

Cambridge Analytica, a company that uses “data modeling and psychographic profiling” (according to its website), is credited with having decisively contributed to the outcome of the presidential election in the U.S. They did so by using “a hyper-targeted psychological approach” that allowed them to see trends among voters that no one else saw, and thus to model the candidate’s speech to resonate with those trends. According to Mashable, the same company also assisted the Leave.EU campaign that led to Brexit.

How do they do it?

“We collect up to 5,000 data points on over 220 million Americans, and use more than 100 data variables to model target audience groups and predict the behavior of like-minded people” (my emphasis), states their website (for comparison, the US has a population of 324 million). They further explain that “when you go beneath the surface and learn what people really care about you can create fully integrated engagement strategies that connect with every person at the individual level” (my emphasis).

According to Mashable, the company “uses a psychological approach to polling, harvesting billions of data from social media, credit card histories, voting records, consumer data, purchase history, supermarket loyalty schemes, phone calls, field operatives, Facebook surveys and TV watching habits”. This data “is bought or licensed from brokers or sourced from social media”.

(For a person who dedicated their professional life to personal data protection this sounds chilling.)

Legal implications

Under US privacy law this kind of practice seems to have no legal implications, as it doesn’t involve processing by any authority of the state, it’s not a matter of consumer protection and it doesn’t seem to fall, prima facie, under any piece of the piecemeal legislation dealing with personal data in the U.S. (please correct me if I’m wrong).

Under EU data protection law, this practice would raise a series of serious questions (see below), without even getting into the debate of whether this sort of intimate profiling would also breach the right to private life as protected by Article 7 of the EU Charter of Fundamental Rights and Article 8 of the European Convention on Human Rights (the right to personal data protection and the right to private life are protected separately in the EU legal order). Put simply, the right to data protection enshrines the “rules of the road” (safeguards) for data that is being processed on a lawful ground, while the right to private life protects the inner private sphere of a person altogether, meaning that it can prohibit unjustified interferences in the person’s private life. This post will only look at mass psychological profiling from the data protection perspective.

Does EU data protection law apply to the political profilers targeting US voters?

But why would EU data protection law even be applicable to a company creating profiles of 220 million Americans? Surprisingly, EU data protection law could indeed be relevant in this case, if it turns out that the company carrying out the profiling is based in the UK (London-based), as several websites claim in their articles (here, here and here).

Under Article 4(1)(a) of Directive 95/46, the national provisions adopted pursuant to the directive shall apply “where the processing is carried out in the context of the activities of an establishment of the controller on the territory of the Member State“. Therefore, the territorial application of Directive 95/46 is triggered by the place of establishment of the controller.  Moreover, Recital 18 of the Directive’s Preamble explains that “in order to ensure that individuals are not deprived of the protection to which they are entitled under this Directive, any processing of personal data in the Community (EU – n.) must be carried out in accordance with the law of one of the Member States” and that “in this connection, processing carried out under the responsibility of a controller who is established in a Member State should be governed by the law of that State” (see also CJEU Case C-230/14 Weltimmo, paras. 24, 25, 26).

There are, therefore, no exceptions to applying EU data protection rules to any processing of personal data that is carried out under the responsibility of a controller established in a Member State. Is it relevant here that the data subjects are not European citizens, and that they may not even be physically located within Europe? The answer is probably in the negative. Directive 95/46 provides that the data subjects it protects are “identified or identifiable natural persons”, without differentiating them based on their nationality. Neither does the Directive link its application to any territorial factor concerning the data subjects. Moreover, according to Article 8 of the EU Charter of Fundamental Rights, “everyone has the right to the protection of personal data concerning him or her”.

I must emphasise here that the Court of Justice of the EU is the only authority that can interpret EU law in a binding manner, and until the Court decides how to interpret EU law in a specific case, we can only engage in argumentative exercises. If the interpretation proposed above were found to have some merit, it would indeed be somewhat ironic to have the data of 220 million Americans protected by EU data protection rules.

What safeguards do persons have against psychological profiling for political purposes?

This kind of psychological profiling for political purposes would raise a number of serious questions. First of all, there is the question of whether this processing operation involves processing of “special categories of data”. According to Article 8(1) of Directive 95/46, “Member States shall prohibit the processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, and the processing of data concerning health or sex life.” There are several exceptions to this prohibition, of which only two would conceivably be applicable to this kind of profiling:

  • if the data subject has given his explicit consent to the processing of those data (letter a) or
  • the processing relates to data which are manifestly made public by the data subject (letter e).

In order for this kind of psychological profiling to be lawful, the controller must either obtain explicit consent to process all the data points used for every person profiled, or use only those data points that were manifestly made public by the person concerned.

Moreover, under Article 15(1) of Directive 95/46, the person has the right “not to be subject to a decision which produces legal effects concerning him or significantly affects him and which is based solely on automated processing of data intended to evaluate certain personal aspects relating to him, such as his performance at work, creditworthiness, reliability, conduct, etc.”. It remains open to interpretation, of course, to what extent psychological profiling for political purposes produces legal effects concerning a person or significantly affects him or her.

Another problem concerns the obligation of the controller to inform every person concerned that this kind of profiling is taking place (Articles 10 and 11 of Directive 95/46) and to give them details about the identity of the controller, the purposes of the processing and all the personal data that is being processed. In addition, the person should be informed that he or she has the right to ask for a copy of the data the controller holds about him or her and the right to ask for the erasure of that data if it was processed unlawfully (Article 12 of Directive 95/46).

Significantly, the person has the right to opt out of a processing operation, at any time, without giving reasons, if the data is being processed for the purposes of direct marketing (Article 14(b) of Directive 95/46). For instance, in the UK, the supervisory authority, the Information Commissioner’s Office, issued its Guidance for political campaigns in 2014 and gave the example of “a telephone call which seeks an individual’s opinions in order to use that data to identify those people likely to support the political party or referendum campaign at a future date in order to target them with marketing” as constituting direct marketing.

Some thoughts

  • The analysis of how EU data protection law is relevant to this kind of profiling would be more pertinent if it were made under the General Data Protection Regulation, which becomes applicable on 25 May 2018 and which contains a special provision on profiling.
  • The biggest fine issued so far by the UK supervisory authority is £350,000, this year. Under the GDPR, breaches of data protection rules will lead to fines of up to 20 million euro or 4% of the controller’s total global annual turnover for the preceding year, whichever is higher.
  • If any company based in the UK used this kind of psychological profiling and micro-targeting for the Brexit campaign, that processing operation would undoubtedly fall under EU data protection law. The same holds true for any analytics company that provides these services to political parties anywhere in the EU using the personal data of EU persons. Perhaps this is a good time to revisit the discussion we had at CPDP2016 on political behavioural targeting (who would have thought the topic would gain so much momentum this year?).
  • I wonder whether data protection rules should be the only “wall” (if they can even serve as one) between this sort of targeted-political-message-generating campaign profiling and the outcome of democratic elections.
  • Talking about ethics, data protection and big data together is becoming more urgent every day.
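For concreteness, the GDPR fine ceiling mentioned in the second bullet above is simply the greater of a fixed amount and a percentage of turnover. A minimal sketch of that arithmetic (the function name and the example turnover figure are mine, for illustration only; the actual fine in any given case is set by the supervisory authority, with the ceiling as an upper bound):

```python
def max_gdpr_fine(global_annual_turnover_eur: float) -> float:
    """Illustrative ceiling for the higher tier of GDPR administrative
    fines: the greater of EUR 20 million or 4% of the controller's
    total worldwide annual turnover for the preceding financial year."""
    return max(20_000_000.0, 0.04 * global_annual_turnover_eur)

# A company with EUR 1 billion in turnover: 4% (EUR 40 million) exceeds
# the fixed EUR 20 million floor, so the higher figure applies.
print(max_gdpr_fine(1_000_000_000))  # 40000000.0

# A smaller company with EUR 100 million in turnover: 4% is only
# EUR 4 million, so the EUR 20 million ceiling applies instead.
print(max_gdpr_fine(100_000_000))  # 20000000.0
```

The "whichever is higher" clause matters: for large multinationals the turnover-based figure dwarfs the fixed amount, which is precisely what makes the GDPR regime a far stronger deterrent than the £350,000 maximum noted above.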

***

Find what you’re reading useful? Consider supporting pdpecho.