
A US Bill from 1974 shares so much DNA with the GDPR, it could be its ancestor

America’s own GDPR was introduced in Congress in 1974. This Bill applied to both government and companies; it restricted international transfers and offered U.S. and foreign “data subjects” rights to access, erasure and even… explanation.

The U.S. has recently been working towards finally adopting comprehensive privacy and data protection rules, with efforts unfolding at both federal and state level. So far, only Californians can claim they have actually achieved something on the road to protecting their rights against the widespread collection and use of personal information. Other serious efforts are underway in Washington State, but they may end up being undermined by good intentions.

These developments are possible right now due to a combination of the global reach and notoriety of the EU’s General Data Protection Regulation (GDPR), the countless privacy scandals affecting Americans, and the absence in the U.S. of comprehensive statutory protections for privacy and other individual rights that may be affected by the collection and use of personal information.

But did you know this is not the first time the U.S. has had privacy law fever? In the late ’60s and early ’70s, American lawmakers were concerned about the rise of automated data processing and computerized databases. Serious efforts were put into analyzing how the rights of the American people could be protected against misuses and abuses of personal information. The Fair Credit Reporting Act was adopted in 1970. An influential Report was published in 1973 by the Department of Health, Education and Welfare (HEW), proposing a set of Fair Information Practice Principles built on an impressive, meticulous analysis (read it if you haven’t done so yet; bonus: it’s peppered with smart literary mottos between chapters). The Report called for comprehensive federal privacy legislation applicable both to government and companies.

About six months after the publication of the HEW Report, in January 1974, Bill S.3418 was introduced in the US Senate by three Senators — Ervin, Percy and Muskie, ‘to establish a Federal Privacy Board, to oversee the gathering and disclosure of information concerning individuals, and to provide management systems in all Federal agencies, State and local governments, and other organizations’.

This Bill was clearly ahead of its time and aged astoundingly well, especially when compared to some of the key characteristics of the GDPR — the current global gold standard for comprehensive data protection law:

It applied to both public and private sectors, at federal and state level

The Bill had a very broad scope of application. It covered the activity of “organizations”, defined as any Federal agency; the government of the District of Columbia; any authority of any State, local government, or other jurisdiction; and any public or private entity engaged in business for profit. It exempted from its rules only information systems of Federal agencies that were vital to national defense, criminal investigatory files of Federal, State or local law enforcement, and any information maintained by the press or news media, except for information related to their employees.

It created a Federal Privacy Board to oversee its application

The Federal Privacy Board would have been created as part of the Executive branch, composed of five members appointed by the President with the approval of the Senate, for a three-year mandate. The Board would have been granted effective powers to investigate violations of the law — including by being granted admission to the premises where any information system or computers were kept — to recommend either criminal or civil penalties, and to actually order any organization found in breach of the law ’to cease and desist such violation’.

It equally protected the rights of Americans and foreigners as data subjects

It’s quite difficult to believe it (especially in the context of the endless Transatlantic debates that ultimately led to the Judicial Redress Act), but this Bill explicitly protected “any data subject of a foreign nationality, whether residing in the United States or not” by requiring organizations to afford them “the same rights under this Act as are afforded to citizens in the United States”. Such a broad personal scope has been a characteristic of the European data protection law framework even before the GDPR; it is also what made possible the legal challenges brought in the UK against Cambridge Analytica by David Carroll, a U.S. citizen residing in New York.

It provided restrictions for international data transfers to jurisdictions which did not apply the protections enshrined in the Bill

Under this Bill, organizations were required to “transfer no personal information beyond the jurisdiction of the United States without specific authorization from the data subject or pursuant to a treaty or executive agreement in force guaranteeing that any foreign government or organization receiving personal information will comply with the applicable provisions of this Act with respect to such information”. The idea of restricting transfers of personal data to countries which do not ensure a similar level of protection is a staple of the EU data protection law regime and the source of some of the biggest EU-US tensions related to tech and data governance.

It provided for rights of access to, correction of, and “purging” of personal information. And for notification of purging to former recipients!

The Bill provided for an extensive right of access to one’s own personal information. It required organizations to grant data subjects “the right to inspect, in a form comprehensible” all personal information related to them, the nature of the sources of the information and the recipients of the personal information. In addition, it also granted individuals the right to challenge and correct information. As part of this right to challenge and correct information, the Bill even provided for a kind of “right to be forgotten”, since it asked organizations to “purge any such information that is found to be incomplete, inaccurate, not pertinent, not timely nor necessary to be retained, or can no longer be verified”. Moreover, the Bill also required organizations to “furnish to past recipients of such information notification that the item has been purged or corrected” at the request of the data subject.

It provided for transparency rights into statistical models and receiving some explanation

The same provision granting a right to challenge and correct personal information also referred to individuals wishing “to explain” information about them in information systems, but it is not clear how exactly organizations were expected to respond to such explanation requests. Elsewhere in the Bill, organizations “maintaining an information system that disseminates statistical reports or research findings based on personal information drawn from the system, or from systems of other organizations” were required to “make available to any data subject (without revealing trade secrets) methodology and materials necessary to validate statistical analyses” (!). Moreover, those organizations were also asked not to make information available for independent analysis “without guarantees that no personal information will be used in a way that might prejudice judgments about any data subject”.

It provided some rules even for collection of personal information

One of the key questions to ask about data protection legislation generally is whether it intervenes at the time of collection of personal data, as opposed to merely regulating its use. This Bill cared about collection too. It provided that organizations must “collect, maintain, use and disseminate only personal information necessary to accomplish a proper purpose of the organization”, “collect information to the greatest extent possible from the data subject directly” and even “collect no personal information concerning the political or religious beliefs, affiliations, and activities of data subjects which is maintained, used or disseminated in or by any information system operated by any governmental agency, unless authorized by law”.

There are other remarkable features of this Bill that are reminiscent of the GDPR, such as the broad definitions of personal information and of data subjects (“an individual about whom personal information is indexed or may be located under his name, personal number, or other identifiable particulars, in an information system”), and that show sophisticated thinking about managing the impact automated processing of personal data might have on the rights of individuals. Enforcement of the Bill included criminal and civil penalties applied with the help of the U.S. Attorney General and the Federal Privacy Board, as well as a private right of action limited to breaches of the right to access personal information.

So what happened to it? Throughout the legislative process in Congress, this Bill was almost completely rewritten and it ultimately became the US Privacy Act 1974 — a privacy law quite limited in scope (applicable only to Federal agencies) and ambitions compared to the initial proposal. The answer about what might have happened during this process to fundamentally rewrite the Bill is somewhere in these 1466 pages recording the debates around the US Privacy Act of 1974.

Even though it was a failed attempt to provide comprehensive data protection and privacy legislation in the U.S., it nonetheless shows how much common thinking is shared by Europe and America. At the same time as this Bill was introduced in the U.S. Senate, Europe was having its own data protection law fever, with many legislative proposals being discussed in Western Europe after the first data protection law was adopted in 1970 in the German Land of Hesse. But according to Frits Hondius, a Dutch scholar who documented these efforts in his volume “Emerging Data Protection in Europe”, published in 1975:

“A factor of considerable influence was the development of data protection on the American scene. Almost every issue that arose in Europe was also an issue in the United States, but at an earlier time and on a more dramatic scale. (…) The writings by American authors about privacy and computers (e.g. Westin and Miller), the 1966 congressional hearings, and the examples set by federal and state legislation, such as the US Fair Credit Reporting Act 1970 and the US Privacy Act 1974, have made a deep impact on data protection legislation in Europe.”

After a shared start in the late ‘60s and early ‘70s, the two privacy and data protection law regimes evolved in significantly different directions. Almost half a century later, it seems to be Europe’s turn to impact the data protection and privacy law debate in the U.S.

Exam scripts are partly personal data and other practical findings of the CJEU in Nowak

The Court of Justice of the European Union (CJEU) gave its judgment in Case C-434/16 Nowak on 20 December 2017, and it is significant from several points of view:

  • It provides a good summarized description of what constitutes “personal data”, referring to both objective and subjective information, regardless of its sensitivity, and it also details what the “related to” criterion from the legal definition of personal data means;
  • It *almost* departs from its YS jurisprudence on the concept of personal data;
  • It applies the interpretation that the Article 29 Working Party gave to the “related to” criterion in its Opinion on personal data from 2007, highlighting thus the weight that the interpretation of data protection law given by the European DPAs might have;
  • It establishes that written answers submitted by a candidate during an exam are personal data of the candidate (this is relevant for all education services providers);
  • It also establishes that the questions of the exam do not fall in the category of “personal data” – hence, not the entire exam script is considered personal data, but only the answers submitted by the candidate;
  • It establishes that the comments reviewers make on the margins of one’s written answers to an exam are personal data of the person being examined, while also being personal data of the reviewer;
  • It establishes that exam scripts should be kept in an identifiable form only as long as they can be challenged.

This comment looks closer at all of these findings.

Facts of the Case

Mr Nowak was a trainee accountant who requested access to his exam script from the Institute of Chartered Accountants of Ireland (CAI), after failing the examination. He first challenged the results of the exam with no success. He then submitted a subject access request to the CAI, asking to receive a copy of all his personal data held by the CAI. He obtained 17 documents, but the exam script was not among them.

Mr Nowak brought this to the attention of the Irish Data Protection Commissioner (DPC) through an email, arguing that his exam script was also his personal data. The DPC answered by email that exam scripts “would not generally constitute personal data”. Mr Nowak then submitted a formal complaint with the DPC against the CAI. The official response of the DPC was to reject the complaint on the ground that it was “frivolous or vexatious” (the same reason used to reject the first complaint of Max Schrems challenging the EU-US Safe Harbor scheme).

Mr Nowak then challenged this decision of the Irish DPC in front of the Circuit Court, then the High Court and then the Court of Appeal, which all decided against him. Finally, he challenged the decision of the Court of Appeal at the Supreme Court who decided to stay proceedings and send questions for a preliminary ruling to the CJEU, since the case required interpretation of EU law – in particular, how should the concept of “personal data” as provided for by EU Directive 95/46 be interpreted (a small procedural reminder here: Courts of last instance are under an obligation to send questions for a preliminary ruling to the CJEU in all cases that require the interpretation of EU law, per Article 267 TFEU last paragraph).

Questions referred

The Supreme Court asked the CJEU two questions (in summary):

  1. Is information recorded in/as answers given by an exam candidate capable of being personal data?
  2. If this is the case, then what factors are relevant in determining whether in a given case such information is personal data?

Pseudonymised data is personal data

First, recalling its Breyer jurisprudence, the Court establishes that, for information to be treated as personal data, it is of no relevance whether all the information enabling the identification of the data subject is in the hands of one person or whether the identifiers are separated (§31). In this particular case, it is not relevant “whether the examiner can or cannot identify the candidate at the time when he/she is correcting and marking the examination script” (§30).

The Court then looks at the definition of personal data from Directive 95/46, underlining that it has two elements: “any information” and “related to an identified or identifiable natural person”.

“Any information” means literally any information, be it objective or subjective

The Court recalls that the scope of Directive 95/46 is “very wide and the personal data covered … is varied” (§33).

“The use of the expression ‘any information’ in the definition of the concept of ‘personal data’ … reflects the aim of the EU legislature to assign a wide scope to that concept, which is not restricted to information that is sensitive or private, but potentially encompasses all kinds of information, not only objective but also subjective, in the form of opinions and assessments, provided that it ‘relates’ to the data subject.” (§34)

Save this paragraph, as it is a new jurisprudential source describing what constitutes personal data – it is certainly a good summary, in line with the Court’s previous case-law (see an excellent overview of the Court’s approach to the definition of personal data here, p. 40 – 41). It makes clear that, for instance, comments on social media, reviews of products/companies, ratings and any other subjective assessments are personal data, as long as they relate to an identified or identifiable individual. This is also true for any sort of objective information (think shoe size), regardless of whether it is sensitive or private, as long as it relates to an identified or identifiable individual.

“Related to” must be judged in relation to “content, purpose or effect/consequences”

The condition for any information to be considered personal data is that it relates to a natural person. According to the Court, this means that “by reason of its content, purpose or effect, (it) is linked to a particular person” (§35). The Court thus applies the test developed by the Article 29 Working Party in its 2007 Opinion on the concept of personal data. Ten years ago, the DPAs wrote that “in order to consider that the data ‘relate’ to an individual, a ‘content’ element OR a ‘purpose’ element OR a ‘result’ element should be present” (2007 Opinion, p. 10).

The Court now adopted this test in its case-law, giving an indication of how important the common interpretation given by data protection authorities in official guidance is. However, the Court does not directly refer to the Opinion.

Applying the test to the facts of the case, the Court showed that the content of exam answers “reflects the extent of the candidate’s knowledge and competence in a given field and, in some cases, his intellect, thought processes, and judgment” (§37). Additionally, following AG Kokott’s Opinion, the Court also pointed out that “in the case of a handwritten script, the answers contain, in addition, information as to his handwriting” (§37).

The purpose of the answers is “to evaluate the candidate’s professional abilities and his suitability to practice the profession concerned” (§38) and the consequence of the answers “is liable to have an effect on his or her rights and interests, in that it may determine or influence, for example, the chance of entering the profession aspired to or of obtaining the post sought” (§39).

Comments of reviewers are two times personal data

The test is then applied to the comments of reviewers on the margin of a candidate’s answers. The Court showed that “The content of those comments reflects the opinion or the assessment of the examiner of the individual performance of the candidate in the examination, particularly of his or her knowledge and competences in the field concerned. The purpose of those comments is, moreover, precisely to record the evaluation by the examiner of the candidate’s performance, and those comments are liable to have effects for the candidate” (§43).

It is important to note here that complying with only one of the three criteria (content, purpose, effects) is enough to qualify information as “relating to” an individual, even if the Court found that in this particular case all of them are met. This is shown by the use of “or” in the enumeration made in §35, as quoted above.

The Court also found that “the same information may relate to a number of individuals and may constitute for each of them, provided that those persons are identified or identifiable, personal data” (§45), having regard to the fact that the comments of the examiners are personal data of both the examiners and the “examinee”.

Information can be personal data regardless of whether one is able to rectify it

It was the Irish DPC that argued that qualifying information as “personal data” should be affected by the fact that the consequence of that classification is, in principle, that the candidate has rights of access and rectification (§46). The logic here was that if data cannot be rectified, it cannot be considered personal – just as exam answers cannot be rectified after the exam has finished.

The Court (rightfully so) disagreed with this claim, following the opinion of the Advocate General and contradicting its own findings in Case C-141/12 YS (see a more detailed analysis of the interaction between the two judgments below). It argued that “a number of principles and safeguards, provided for by Directive 95/46, are attached to that classification and follow from that classification” (§47), meaning that protecting personal data goes far beyond the ability to access and rectify your data. This finding is followed by a summary of the fundamental mechanisms encompassed by data protection.

Data protection is a web of safeguards, accountability and individual rights

Starting from recital 25 of Directive 95/46 (yet again, how important recitals are! Think here of Recital 4 of the GDPR and the role it can play in future cases – “The processing of personal data should be designed to serve mankind”), the Court stated that:

“…the principles of protection provided for by that directive are reflected, on the one hand, in the obligations imposed on those responsible for processing data, obligations which concern in particular data quality, technical security, notification to the supervisory authority, and the circumstances under which processing can be carried out, and, on the other hand, in the rights conferred on individuals, the data on whom are the subject of processing, to be informed that processing is taking place, to consult the data, to request corrections and even to object to processing in certain circumstances” (§48).

The Court thus looks at data protection as a web of accountability, safeguards (reflected in technical security measures, data quality, conditions for lawful processing data) and rights conferred to the individuals.

In this case, not considering exam answers personal data just because they cannot be “corrected” after the exam would strip this information of the wider web of protections, such as being processed on a legitimate ground, being retained only for the necessary period of time, and so on. The Court does not phrase this finding this way, but it states that:

“Accordingly, if information relating to a candidate, contained in his or her answers submitted at a professional examination and in the comments made by the examiner with respect to those answers, were not to be classified as ‘personal data’, that would have the effect of entirely excluding that information from the obligation to comply not only with the principles and safeguards that must be observed in the area of personal data protection, and, in particular, the principles relating to the quality of such data and the criteria for making data processing legitimate, established in Articles 6 and 7 of Directive 95/46, but also with the rights of access, rectification and objection of the data subject, provided for in Articles 12 and 14 of that directive, and with the supervision exercised by the supervisory authority under Article 28 of that directive” (§49).

Furthermore, the Court shows that errors in the answers given to an exam do not constitute “inaccuracy” of personal data, because the level of knowledge of a candidate is revealed precisely by the errors in his or her answers, and revealing the level of knowledge is the purpose of this particular data processing. As the Court explains, “[i]t is apparent from Article 6(1)(d) of Directive 95/46 that the assessment of whether personal data is accurate and complete must be made in the light of the purpose for which that data was collected” (§53).

Exam scripts should only be kept in an identifiable form as long as they can be challenged

The Court further explained that both exam answers and reviewers’ comments can nevertheless be subject to “inaccuracy” in a data protection sense, “for example due to the fact that, by mistake, the examination scripts were mixed up in such a way that the answers of another candidate were ascribed to the candidate concerned, or that some of the cover sheets containing the answers of that candidate are lost, so that those answers are incomplete, or that any comments made by an examiner do not accurately record the examiner’s evaluation of the answers of the candidate concerned” (§54).

The Court also admitted the possibility that “a candidate may, under Article 12(b) of Directive 95/46, have the right to ask the data controller to ensure that his examination answers and the examiner’s comments with respect to them are, after a certain period of time, erased, that is to say, destroyed” (§55).

Another finding of the Court that will be useful to schools, universities and other educational institutions is that keeping exam scripts related to an identifiable individual is not necessary anymore after the examination procedure is closed and can no longer be challenged: “Taking into consideration the purpose of the answers submitted by an examination candidate and of the examiner’s comments with respect to those answers, their retention in a form permitting the identification of the candidate is, a priori, no longer necessary as soon as the examination procedure is finally closed and can no longer be challenged, so that those answers and comments have lost any probative value” (§55).

The Court distances itself from the findings in C-141/12 YS, but still wants to keep that jurisprudence alive

One of the biggest questions surrounding the judgment in Nowak was whether the Court would follow the AG’s Opinion and change its jurisprudence from C-141/12 YS. In that judgment, the Court found that the legal analysis used by the Dutch Ministry of Immigration in a specific case of asylum seekers is not personal data, and the main reason invoked was that “[i]n contrast to the data relating to the applicant for a residence permit which is in the minute and which may constitute the factual basis of the legal analysis contained therein, such an analysis … is not in itself liable to be the subject of a check of its accuracy by that applicant and a rectification under Article 12(b) of Directive 95/46” (§45).

The Court further noted: “In those circumstances, extending the right of access of the applicant for a residence permit to that legal analysis would not in fact serve the directive’s purpose of guaranteeing the protection of the applicant’s right to privacy with regard to the processing of data relating to him, but would serve the purpose of guaranteeing him a right of access to administrative documents, which is not however covered by Directive 95/46.” Finally, the finding was that “[i]t follows from all the foregoing considerations … that the data relating to the applicant for a residence permit contained in the minute and, where relevant, the data in the legal analysis contained in the minute are ‘personal data’ within the meaning of that provision, whereas, by contrast, that analysis cannot in itself be so classified” (§48).

Essentially, in YS the Court linked the ability of accessing and correcting personal data with the classification of information as personal data, finding that if the information cannot be corrected, then it cannot be accessed and it cannot be classified as personal data.

By contrast, following AG Kokott’s analysis, in Nowak the Court essentially states that classifying information as personal data must not be affected by the existence of the rights to access and rectification – in the sense that the possibility to effectively invoke them should not play a role in establishing that certain information is or is not personal data: “the question whether written answers submitted by a candidate at a professional examination and any comments made by an examiner with respect to those answers should be classified as personal data cannot be affected … by the fact that the consequence of that classification is, in principle, that the candidate has rights of access and rectification, pursuant to Article 12(a) and (b) of Directive 95/46” (§46).

However, the Court is certainly not ready to fully change its jurisprudence established in YS, and even refers to its judgment in YS in a couple of paragraphs. In the last paragraphs of Nowak, the Court links the ability to correct or erase data to the existence of the right of accessing that data (but not to classifying information as personal data).

The Court states that: “In so far as the written answers submitted by a candidate at a professional examination and any comments made by an examiner with respect to those answers are therefore liable to be checked for, in particular, their accuracy and the need for their retention… and may be subject to rectification or erasure…, the Court must hold that to give a candidate a right of access to those answers and to those comments… serves the purpose of that directive of guaranteeing the protection of that candidate’s right to privacy with regard to the processing of data relating to him (see, a contrario, judgment of 17 July 2014, YS and Others, C‑141/12 and C‑372/12, EU:C:2014:2081, paragraphs 45 and 46), irrespective of whether that candidate does or does not also have such a right of access under the national legislation applicable to the examination procedure”.

After previously showing an even deeper understanding of data protection in its Nowak judgment, the Court sticks to some of its findings from YS, even if this means perpetuating a confusion between the fundamental right to respect for private life and the fundamental right to the protection of personal data: “it must be recalled that the protection of the fundamental right to respect for private life means, inter alia, that any individual may be certain that the personal data relating to him is correct and that it is processed in a lawful manner” (§57 in Nowak and §44 in YS). Lawful processing of personal data and the right to keep personal data accurate are, in fact, enshrined in Article 8 of the EU Charter – the right to the protection of personal data, and not in Article 7 – the right to respect for private life.

Obiter dictum 1: the curious insertion of “exam questions” in the equation

The Court also does something curious in these last paragraphs. It simply states, after the paragraphs referring to the YS judgment, that “the rights of access and rectification, under Article 12(a) and (b) of Directive 95/46, do not extend to the examination questions, which do not as such constitute the candidate’s personal data” (§58). The national court did not ask about this specific point. AG Kokott also does not address this issue at all in her Opinion. It might have been raised during the hearing, but no context is provided for it. The Court simply states that “Last, it must be said…” and follows it with the finding regarding test questions.

While it is easy to see that the questions of a specific test, by themselves, are not personal data – as their content, purpose or effect does not relate to a specific individual – the situation is not as clear when the questions are part of the “solved” exam sheet of a specific candidate. The question is: are the answers of the test inextricably linked to the questions? Imagine a multiple choice test where the candidate only gains access to his/her answers, without obtaining access to the questions of that test. The answers alone would be unintelligible. For instance, EPSO candidates have been trying for years to access their own exam sheets held by the EPSO agency of the European Union, with no success. This is exactly because EPSO only provides access to the series of letters chosen as answers in the multiple choice test. Challenges to this practice have all failed, including those brought to the attention of the former Civil Service Tribunal of the CJEU (see this case, for example). This particular finding in Nowak closes the barely opened door for EPSO candidates to finally have access to their whole test sheet.

Obiter dictum 2: reminding Member States they can restrict the right of access

Without any apparent reason, and referring to the GDPR, the CJEU recalls, as another obiter dictum under the same “it must be said” (§58 and §59), that both Directive 95/46 and the GDPR “provide for certain restrictions of those rights” (§59) – access, erasure, etc.

It also specifically refers to grounds that can be invoked by Member States when limiting the right of access under the GDPR: when such a restriction constitutes a necessary measure to safeguard the rights and freedoms of others (§60, §61), or if it is done for other objectives of general public interest of the Union or of a Member State (§61).

These findings are not followed by any other considerations, as the Court concludes with a finding that had already been reached around §50: “the answer to the questions referred is that Article 2(a) of Directive 95/46 must be interpreted as meaning that, in circumstances such as those of the main proceedings, the written answers submitted by a candidate at a professional examination and any comments made by an examiner with respect to those answers constitute personal data, within the meaning of that provision” (§62).

If you want to have a look at a summary of AG Kokott’s excellent Conclusions in this case and then compare them to the judgment of the Court, click here. The Court did follow the Conclusions to a great extent.

 

CJEU in Manni: data subjects do not have the right to obtain erasure from the Companies Register, but they do have the right to object

by Gabriela Zanfir-Fortuna

The recent judgment of the CJEU in Case C-398/15 Manni (9 March 2017) brings a couple of significant points to the EU data protection case-law:

  • Clarifies that an individual seeking to limit access to his/her personal data published in a Companies Register does not have the right to obtain erasure of that data, not even after his/her company has ceased to exist;
  • Clarifies that, however, that individual has the right to object to the processing of that data, based on his/her particular circumstances and on justified grounds;
  • Clarifies the link between the purpose of the processing activity and the data retention period, and underlines how important the purpose of the processing activity is when analysing whether a data subject can obtain erasure or blocking of data;
  • Provides insight into the balancing exercise between the interests of third parties in having access to data published in the Companies Register and the rights of the individual to obtain erasure of the data and to object to its processing.

This commentary will highlight all points enumerated above.

1. Facts of the case

Mr Manni had requested his regional Chamber of Commerce to erase his personal data from the Public Registry of Companies, after he found out that he was losing clients who performed background checks on him through a private company that specialised in finding information in the Public Registry. This happened because Mr Manni had been an administrator of a company that was declared bankrupt more than 10 years before the facts in the main proceedings. In fact, the former company itself had been struck off the Public Registry (§23 to §29).

2. The question in Manni

The question that the CJEU had to answer in Manni was whether the obligation of Member States to keep public Companies Registers[1] and the requirement that personal data must be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected[2] must be interpreted as meaning that individuals must be allowed to “request the authority responsible for maintaining the Companies Register to limit, after a certain period has elapsed from the dissolution of the company concerned and on the basis of a case-by-case assessment, access to personal data concerning them and entered in that register” (§30).

3. Applicability of Directive 95/46 (Data Protection Directive – ‘DPD’)

First, CJEU clarified that its analysis does not concern processing of data by the specialized rating company, and it only refers to the obligations of the public authority keeping the companies register (§31). Second, the CJEU ascertained that the provisions of the DPD are applicable in this case:

  • the identification data of Mr Manni recorded in the Register is personal data[3] – “the fact that information was provided as part of a professional activity does not mean that it cannot be characterized as personal data” (§34);
  • the authority keeping the register is a “controller”[4] that carries out “processing of personal data”[5] by “transcribing and keeping that information in the register and communicating it, where appropriate, on request to third parties” (§35).

4. The role of the data quality principles and the legitimate grounds for processing in ensuring a high level of protection of fundamental rights

Further, CJEU recalls its case-law stating that the DPD “seeks to ensure a high level of protection of the fundamental rights and freedoms of natural persons” (§37) and that the provisions of the DPD “must necessarily be interpreted in the light of the fundamental rights guaranteed by the Charter”, and especially Articles 7 – respect for private life and 8 – protection of personal data (§39). The Court recalls the content of Articles 7 and 8 and specifically lays out that the requirements under Article 8 Charter “are implemented inter alia in Articles 6, 7, 12, 14 and 28 of Directive 95/46” (§40).

The Court highlights the significance of the data quality principles and the legitimate grounds for processing under the DPD in the context of ensuring a high level of protection of fundamental rights:

“[S]ubject to the exceptions permitted under Article 13 of that directive, all processing of personal data must comply, first, with the principles relating to data quality set out in Article 6 of the directive and, secondly, with one of the criteria for making data processing legitimate listed in Article 7 of the directive” (§41 and case-law cited).

The Court applies this test in reverse order, which is, indeed, more logical. A processing activity should first be legitimate under one of the lawful grounds for processing, and only after ascertaining that this is the case should the question of compliance with the data quality principles arise.

CJEU finds that in the case at hand the processing activity is legitimized by three lawful grounds (§42, §43):

  • compliance with a legal obligation [Article 7(c)];
  • the exercise of official authority or the performance of a task carried out in the public interest [Article 7(e)] and
  • the realization of a legitimate interest pursued by the controller or by the third parties to whom the data are disclosed [Article 7(f)].

5. The link between the data retention principle, the right to erasure and the right to object

Article 6(1)(e) of the DPD requires that personal data are kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected or for which they are further processed. This means that controllers should retain personal data only for as long as it serves the purpose for which it was processed, and should then anonymise, erase or otherwise make that data unavailable. If the controller does not comply with this obligation, the data subject has two possible avenues to stop the processing: he/she can either ask for erasure of that data, or object to the processing on the basis of his/her particular situation and a justified objection.

CJEU explains that “in the event of failure to comply with the condition laid down in Article 6(1)(e)” of the DPD, “Member States guarantee the person concerned, pursuant to Article 12(b) thereof, the right to obtain from the controller, as appropriate, the erasure or blocking of the data concerned” (§46 and C-131/12 Google/Spain §70).

In addition, the Court explains, Member States also must “grant the data subject the right, inter alia in the cases referred to in Article 7(e) and (f) of that directive, to object at any time on compelling legitimate grounds relating to his particular situation to the processing of data relating to him, save where otherwise provided by national legislation”, pursuant to Article 14(a) DPD (§47).

The CJEU further explains that “the balancing to be carried out under subparagraph (a) of the first paragraph of Article 14 … enables account to be taken in a more specific manner of all the circumstances surrounding the data subject’s particular situation. Where there is a justified objection, the processing instigated by the controller may no longer involve those data” (§47).

6. The pivotal role of the purpose of the processing activity in granting the right to erasure and the right to object

After establishing these general rules, the Court decides that in order to establish whether data subjects have the “right to apply to the authority responsible for keeping the register to erase or block the personal data entered in that register after a certain period of time, or to restrict access to it, it is first necessary to ascertain the purpose of that registration” (§48).

The pivotal role of the purpose of the processing operation should not come as a surprise, given the fact that the data retention principle is tightly linked to accomplishing the purpose of the processing operation.

In this case, the Court looked closely at Directive 68/151 and explained at length that the purpose of the disclosure provided for by it is “to protect in particular the interests of third parties in relation to joint stock companies and limited liability companies, since the only safeguards they offer to third parties are their assets” (§49) and “to guarantee legal certainty in relation to dealings between companies and third parties in view of the intensification of trade between Member States” (§50). CJEU also referred to primary EU law, and specifically to Article 54(3)(g) EEC, one of the legal bases of the directive, which “refers to the need to protect the interests of third parties generally, without distinguishing or excluding any categories falling within the ambit of that term” (§51).

The Court further noted that Directive 68/151 makes no express provision regarding the necessity of keeping personal data in the Companies Register “also after the activity has ceased and the company concerned has been dissolved” (§52). However, the Court notes that “it is common ground that even after the dissolution of a company, rights and legal relations relating to it continue to exist” (§53) and “questions requiring such data may arise for many years after a company has ceased to exist” (§54).

Finally, CJEU declared:

“in view of the range of possible scenarios … it seems impossible, at present, to identify a single time limit, as from the dissolution of a company, at the end of which the inclusion of such data in the register and their disclosure would no longer be necessary” (§55).

7. Conclusion A: there is no right to erasure

The Court concluded that “in those circumstances” the data retention principle in Article 6(1)(e) DPD and the right to erasure in Article 12(b) DPD do not guarantee for the data subjects referred to in Directive 68/151 a right to obtain “as a matter of principle, after a certain period of time from the dissolution of the company concerned, the erasure of personal data concerning them” (§56).

After already reaching this conclusion, the Court also explained that this interpretation of the provisions in question does not result in “disproportionate interference with the fundamental rights of the persons concerned, and particularly their right to respect for private life and their right to protection of personal data as guaranteed by Articles 7 and 8 of the Charter” (§57).

To this end, the Court took into account:

  • that Directive 68/151 requires “disclosure only for a limited number of personal data items” (§58) and
  • that “it appears justified that natural persons who choose to participate in trade through such a company are required to disclose the data relating to their identity and functions within that company, especially since they are aware of that requirement when they decide to engage in such activity” (§59).

8. Conclusion B: but there is a right to object

After acknowledging that, in principle, the need to protect the interests of third parties in relation to joint-stock companies and limited liability companies and to ensure legal certainty, fair trading and thus the proper functioning of the internal market take precedence over the right of the data subject to object under Article 14 DPD, the Court points out that

“it cannot be excluded, however, that there may be specific situations in which the overriding and legitimate reasons relating to the specific case of the person concerned justify exceptionally that access to personal data entered in the register is limited, upon expiry of a sufficiently long period after the dissolution of the company in question, to third parties who can demonstrate a specific interest in their consultation” (§60).

While the Court leaves it to the national courts to assess each case “having regard to all the relevant circumstances and taking into account the time elapsed since the dissolution of the company concerned”, it also points out that, in the case of Mr Manni, “the mere fact that, allegedly, the properties of a tourist complex built … do not sell because of the fact that potential purchasers of those properties have access to that data in the company register, cannot be regarded as constituting such a reason, in particular in view of the legitimate interest of those purchasers in having that information” (§63).

9. Post Scriptum

The Court took a very pragmatic approach in dealing with the case of Mr Manni. The principles of interpretation it laid down are solid – such an analysis indeed requires looking at the legitimate grounds for processing and the relevant data quality principle. Having the Court place strong emphasis on the significance of the purpose of the processing activity is welcome, as is having more guidance on the balancing exercise between the rights and interests in question. In addition, a separate assessment of the right to obtain erasure and of the right to object is very helpful with a view to the future – the full entry into force of the GDPR and its heightened rights of the data subject.

The aspect of the judgment that leaves some room for improvement is the analysis of the proportionality of the interference that the virtually unlimited publication of personal data in the Companies Register causes with Articles 7 and 8 of the Charter. The Court does tackle this, but only lightly – and it brings just two arguments, only after having already declared that the interference is not disproportionate. Moreover, the Court does not distinguish between interferences with Article 7 and interferences with Article 8.

Finally, I was happy to see that the predicted outcome of the case, as announced in the pdpEcho commentary on the Opinion of the Advocate General Bot, proved to be mainly correct: “the Court will follow the AG’s Opinion to a large extent. However, it may be more focused on the fundamental rights aspect of balancing the two Directives and it may actually analyse the content of the right to erasure and its exceptions. The outcome, however, is likely to be the same.”

Suggested citation: G. Zanfir-Fortuna, “CJEU in Manni: data subjects do not have the right to obtain erasure from the Companies Register, but they do have the right to object”, pdpEcho.com, 13 March 2017.


[1] Article 3 of Directive 68/151.

[2] Article 6(1)(e) of Directive 95/46.

[3] Article 2(a) of Directive 95/46.

[4] Article 2(d) of Directive 95/46.

[5] Article 2(b) of Directive 95/46.

***

If you find information on this blog useful and would like to read more of it, consider supporting pdpecho here: paypal.me/pdpecho.

The GDPR already started to appear in CJEU’s soft case-law (AG Opinion in Manni)

CJEU’s AG Bot referred to the GDPR in his recent ‘right to be forgotten’ Opinion

It may only become applicable on 25 May 2018, but the GDPR already made its official debut in the case-law of the CJEU.

It was the last paragraph (§101) of the Conclusions of AG Bot in Case C-398/15 Manni, published on 8 September, that specifically referred to Regulation 2016/679 (the official name of the GDPR). The case concerns the question of whether the right to erasure (the accurate name of the more famous “right to be forgotten”) as enshrined in Article 12 of Directive 95/46 also applies in the case of personal data of entrepreneurs recorded in the Public Registry of companies, if their organisation went bankrupt years ago. Curiously, the preliminary ruling question doesn’t specifically refer to the right to erasure, but to the obligation in Article 6(1)(e) for controllers not to retain the data longer than necessary to achieve the purpose for which they were collected.

In fact, Mr Manni had requested his regional Chamber of Commerce to erase his personal data from the Public Registry of Companies, after he found out that he was losing clients who performed background checks on him through a private company that specialised in finding information in the Public Registry. This happened because Mr Manni had been an administrator of a company that was declared bankrupt more than 10 years before the facts in the main proceedings. In fact, the former company itself had been struck off the Public Registry (§30).

Disclaimer! The Opinion is not yet available in English, but in another handful of official languages of the EU. Therefore, the following quotes are all my translation from French or Romanian.

AG Bot advised the Court to reply to the preliminary ruling questions in the sense that all personal data in the Public Registry of companies should be retained there indefinitely, irrespective of whether the companies to whose administrators the data refer are still active. “Public Registries of companies cannot achieve their main purpose, namely the consolidation of legal certainty by disclosing, in accordance with the transparency principle, legally accurate information, if access to this information would not be allowed indefinitely to all third parties” (§98).

The AG adds that “the choice of natural persons to get involved in the economic life through a commercial company implies a permanent requirement of transparency. For this main reason, detailed throughout the Opinion, I consider that the interference with the right to the protection of personal data that are registered in a Public Registry of companies, specifically ensuring their publicity for an indefinite period of time and aimed towards any person who asks for access to these data, is justified by the preponderant interest of third parties to access those data” (§100).

Restricting the circle of ‘interested third parties’ would be incompatible with the purpose of the Public Registry

Before reaching this conclusion, the AG dismissed a proposal by the Commission suggesting that limited access to the personal data of administrators of bankrupt companies could be ensured only for those third parties that “show a legitimate interest” in obtaining it.

The AG considered that this suggestion “cannot, at this stage of development of EU law, ensure a fair balance between the objective of protecting third parties and the right to the protection of personal data registered in Public Registries of companies” (§87). In this regard, he recalled that the objective to protect the interest of third parties as enshrined in the First Council Directive 68/151  “is provided for in a sufficiently wide manner so as to encompass not only the creditors of a company, but also, in general, all persons that want to obtain information regarding that company” (§88).

Earlier, the AG had also found that the suggestion to anonymise data regarding the administrators of bankrupt companies is not compatible with the historical function of the Public Registry and with the objective to protect third parties that is inherent to such registries. “The objective to establish a full picture of a bankrupt company is incompatible with processing anonymous data” (§78).

Throughout the Opinion, the AG mainly interprets the principles underpinning the First Council Directive 68/151/EC (of 9 March 1968 on co-ordination of safeguards which, for the protection of the interests of members and others, are required by Member States of companies within the meaning of the second paragraph of Article 58 of the Treaty, with a view to making such safeguards equivalent throughout the Community)  and it is apparent that it enjoys precedence over Directive 95/46/EC.

Finally: the reference to the GDPR

The AG never refers in his analysis to Article 12 of Directive 95/46,  which grants data subjects the right to erasure. However, come the last paragraph of the Opinion, the AG does refer to Article 17(3)(b) and (d) from Regulation (EU) 2016/679 (yes, the GDPR). He applies Article 17 GDPR to the facts of the case and mentions that the preceding analysis “is compatible” with it, because “this Article provides that the right to erasure of personal data, or ‘the right to be forgotten’, does not apply to a processing operation ‘for compliance with a legal obligation which requires processing by Union or Member State law to which the controller is subject or for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller’ or ‘for archiving purposes in the public interest'” (§101).

While I find the Opinion of the AG clear and well argued, I have two comments. I wish he had referred more comprehensively to the fundamental rights aspect of the case when balancing the provisions of the two directives. But most of all, I wish he had analysed the right to erasure itself, the conditions that trigger it and the exemptions under Article 13 of Directive 95/46.

My bet on the outcome of the case: the Court will follow the AG’s Opinion to a large extent. However, it may be more focused on the fundamental rights aspect of balancing the two Directives and it may actually analyse the content of the right to erasure and its exceptions. The outcome, however, is likely to be the same.

A small thing that bugs me about this case is that I find there is a difference between searching a Registry of Companies out of interest in a company name and searching a Registry of Companies out of interest in a specific natural person. I mean, all third parties may very well be interested in finding out everything there is to know about bankrupt Company X, thus discovering that Mr Manni was the administrator. To me, this does not seem to be the same situation as searching the Public Registry of companies using Mr Manni’s name to find out all about Mr Manni’s background. In §88 the AG even mentions, when recognising the all-encompassing interest of every third party in accessing all information about a certain company indefinitely, that Directive 68/151 protects the interest of “all persons that want to obtain information regarding this company”. I know the case is about keeping or deleting the personal data of Mr Manni from the Registry. And ultimately it is important to keep the information there due to the general interest in knowing everything about the history of a company. However, does it make any difference for the lawfulness of certain processing operations related to the data in the Registry that the Registry of companies is used to create profiles of natural persons? I don’t know. But it’s something that bugged me while reading the Opinion. Moreover, if you compare this situation to the “clean slate” rules for certain offenders who have their data erased from the criminal record, it is even more troubling. (Note: at §34 the AG specifies he is only referring in his Opinion to the processing of personal data by the Chamber of Commerce and not by private companies specialising in providing background information about entrepreneurs.)

Fun fact #1

The GDPR made its ‘unofficial’ debut in the case-law of the CJEU in the Opinion of AG Jääskinen in C-131/12 Google v. Spain, delivered on 25 June 2013. In fact, it was precisely Article 17 that was referred to in that Opinion as well, in §110. There’s another reference to the GDPR in §56, mentioning the new rules on the field of application of EU data protection law. Back then, the text of the GDPR was merely a proposal of the Commission – neither the EP nor the Council had adopted their own versions of the text before entering the trilogue which resulted in the adopted text of Regulation 2016/679.

Fun fact #2

AG Bot is also the AG who delivered the Opinion in the Schrems case. The Court followed his Opinion to a large extent in its judgment. There is a fair chance the Court will again follow his Opinion.

***

Find what you’re reading useful? Consider supporting pdpecho.