
The CJEU decides lack of access to personal data does not unmake a joint controller: A look at Wirtschaftsakademie

Who is the controller?

The Court of Justice of the EU decided in Case C-210/16 Wirtschaftsakademie that Facebook and the administrator of a fan page created on Facebook are joint controllers under EU data protection law. The decision sent a mini shockwave through organizations that use Facebook Pages, just one week after the GDPR entered into force. What exactly does it mean that they are joint controllers, and what exactly do they have to do in order to be compliant? The judgment leaves these questions largely unanswered, but it gives some clues as to finding answers.

Being a joint controller means that page administrators share responsibility with Facebook for complying with EU data protection law for the processing of personal data occurring through their Facebook Page. As the Court highlighted, they have this responsibility even if they have no access at all to the personal data collected through cookies placed on the devices of visitors to the Facebook page, but only to the aggregated results of the data collection.

The judgment created a great deal of confusion. What has not yet been sufficiently emphasized in the reactions to the Wirtschaftsakademie judgment is that this shared responsibility is not equal: it depends on the stage of the processing the joint controller is involved in and on the actual control it has over the processing. This is, in any case, a better position to be in than that of a “controller” on whose behalf Facebook processes personal data, or that of a “co-controller” with Facebook. Either of those would have meant full legal liability for complying with data protection obligations for the personal data processed through the page. It is, however, a worse position than being a third party or a recipient that is not involved in any way in establishing the purposes and means of the processing, which would have meant no legal responsibility for the data being processed through the page. Technically, those were the other options the Court probably considered before taking the “joint controllership” path.

It is important to note that the Court did not specify at all which responsibilities fall on whom – not even with regard to providing notice. The failure of both Facebook and the page administrator to inform visitors about cookies being placed on their devices was the reason invoked by the DPA in the main national proceedings, but the Court remained silent on who is responsible for this obligation.

This summary looks at what the Court found, explains why it reached its conclusion, and tries to carve out some of the practical consequences of the judgment (also in relation to the GDPR).

This first part of the commentary on the judgment will only cover the findings related to “joint controllership”. The findings related to the competence of the German DPA will be analyzed in a second part. While the judgment interprets Directive 95/46, most of the findings will remain relevant under the GDPR as well, to the extent they interpret identical or very similar provisions of the two laws.

Facts of the Case

Wirtschaftsakademie is an organization that offers educational services and has a Facebook fan page. The Court noted that administrators of fan pages can obtain anonymous statistical information about visitors, made available to them free of charge. “That information is collected by means of evidence files (‘cookies’), each containing a unique user code, which are active for two years and are stored by Facebook on the hard disk of the computer or on other media of visitors to fan pages” (#15). The user code “is collected and processed when the fan pages are open” (#15).
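To make the mechanism described in #15 more tangible, here is a minimal, purely illustrative sketch of how a website can set a long-lived cookie carrying a unique user code. It is not Facebook's actual implementation; only the idea of a unique code and the two-year lifetime come from the Court's description, and the cookie name used below is a hypothetical placeholder.

```python
# Illustrative sketch only (not Facebook's actual code): a long-lived cookie
# carrying a unique user code, of the kind described by the Court in #15.
import uuid
from http.cookies import SimpleCookie

def build_tracking_cookie() -> str:
    cookie = SimpleCookie()
    # Hypothetical cookie name; the unique value lets later visits from the
    # same browser be linked together.
    cookie["c_user_code"] = uuid.uuid4().hex
    cookie["c_user_code"]["max-age"] = 2 * 365 * 24 * 60 * 60  # roughly two years
    cookie["c_user_code"]["path"] = "/"
    # The resulting Set-Cookie header is sent with the page; the browser then
    # returns the code on every subsequent request, which is when it is
    # "collected and processed".
    return cookie.output()

if __name__ == "__main__":
    print(build_tracking_cookie())
```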

The DPA of Schleswig-Holstein ordered Wirtschaftsakademie to close the fan page if it was not brought into compliance, on the ground that “neither Wirtschaftsakademie, nor Facebook, informed visitors to the Fan Page that Facebook, by means of cookies, collected personal data concerning them and then processed the data” (#16).

The decision of the DPA was challenged by Wirtschaftsakademie, which argued that “it was not responsible under data protection law for the processing of the data by Facebook or the cookies which Facebook installed” (#16). After the DPA lost in the lower instances, it appealed those decisions to the Federal Administrative Court, arguing that the main data protection law breach of Wirtschaftsakademie was the fact that it commissioned “an inappropriate supplier”, because the supplier “did not comply with data protection law” (#22).

The Federal Administrative Court sent several questions for a preliminary ruling to the CJEU, aiming to clarify whether Wirtschaftsakademie indeed had any legal responsibility for the cookies placed by Facebook through its Fan Page and whether the Schleswig-Holstein DPA had competence to enforce German data protection law against Facebook, considering that Facebook’s main establishment in the EU is in Ireland and its German presence is linked only to marketing (#24).

“High level of protection” and “effective and complete protection”

The Court starts its analysis by referring again to the aim of the Directive to “ensure a high level of protection of fundamental rights and freedoms, and in particular their right to privacy in respect to processing of personal data” (#26) – and it is to be expected that all analyses under the GDPR would start from the same point. This means that all interpretation of the general data protection law regime will be done in favor of protecting the fundamental rights of data subjects.

Based on the findings in Google Spain, the Court restates that “effective and complete protection of the persons concerned” requires a “broad definition of controller” (#28). Effective and complete protection is another criterion that the Court often takes into account when interpreting data protection law in favor of the individual and his or her rights.

In fact, one of the Court’s afterthoughts, after establishing that the administrator is a joint controller, was that “the recognition of joint responsibility of the operator of the social network and the administrator of a fan page hosted on that network in relation to the processing of the personal data of visitors to that page contributes to ensuring more complete protection of the rights of persons visiting a fan page” (#42).

The referring Court did not even consider the possibility that the administrator is a controller

Having set the stage in this way, the Court goes on to analyze the definition of “controller”. Note, though, that the referring Court never asked whether the administrator of the fan page is a controller or a joint controller, but asked whether it has any legal responsibility for failing to choose a compliant “operator of its information offering” while being an “entity that does not control the data processing within the meaning of Article 2(d) of Directive 95/46” (#24, question 1).

It seems that the referring Court did not even contemplate that the fan page administrator might have any control over the data; it was rather wondering whether only “controllers” have a legal responsibility to comply with data protection law under Directive 95/46, or whether other entities somehow involved in the processing could also bear some responsibility.

However, the Court does not exclude the possibility that the administrator may be a controller. First of all, it establishes that processing of personal data is taking place, as described at #15, and that the processing has at least one controller.

Facebook is “primarily” establishing means and purposes of the processing

The Court recalls the definition of “controller” in Article 2(d) of the Directive and highlights that “the concept does not necessarily refer to a single entity and may concern several actors taking part in that processing, with each of them then being subject to the applicable data protection provisions” (#29). The distribution of responsibilities in the last part of this finding is introduced by the Court without any such reference appearing in Article 2(d)[1].

This is important, because the next finding of the Court is that, in the present case, “Facebook Ireland must be regarded as primarily determining the purposes and means of processing the personal data of users of Facebook and persons visiting the fan pages hosted on Facebook” (#30). Reading this paragraph together with #29 suggests that Facebook will bear the bigger share of the obligations in a joint controllership situation with fan page administrators.

This idea is underlined by the following paragraph which refers to identifying the “extent” to which a fan page administrator “contributes… to determining, jointly with Facebook Ireland and Facebook Inc., the purposes and means of processing” (#31). To answer this question, the Court lays out its arguments in three layers:

1) It describes the processing of personal data at issue, mapping the data flows – pointing to the personal data being processed, data subjects and all entities involved:

  • The data processing at issue (placing of cookies on the Fan Page visitors’ device) is “essentially carried out by Facebook” (#33);
  • Facebook “receives, registers and processes” the information stored in the placed cookies not only when a visitor visits the Fan Page, but also when he or she visits services provided by other Facebook family companies and by “other companies that use the Facebook services” (#33);
  • Facebook partners and “even third parties” may use cookies to provide services to Facebook or to the businesses that advertise on Facebook (#33);
  • The creation of a fan page “involves the definition of parameters by the administrator, depending inter alia on the target audience … , which has an influence on the processing of personal data for the purpose of producing statistics based on visits to the fan page” (#36);
  • The administrator can request the “processing of demographic data relating to its target audience, including trends in terms of age, sex, relationship and occupation”, lifestyle, location, online behavior, which tell the administrator where to make special offers and better target the information it offers (#37);
  • The audience statistics compiled by Facebook are transmitted to the administrator “only in anonymized form” (#38);
  • The production of the anonymous statistics “is based on the prior collection, by means of cookies installed by Facebook …, and the processing of personal data of (the fan page) visitors for such statistical purposes” (#38);

2) It identifies the purposes of this processing:

  • There are two purposes of the processing:
    • “to enable Facebook to improve its system of advertising transmitted via its network” and
    • “to enable the fan page administrator to obtain statistics produced by Facebook from the visits of the page”, which is useful for “managing the promotion of its activity and making it aware of the profiles of the visitors who like its fan page or use its applications, so that it can offer them more relevant content” (#34);

3) It establishes a connection between the two entities that define the two purposes of processing:

  • Creating a fan page “gives Facebook the opportunity to place cookies on the computer or other device of a person visiting its fan page, whether or not that person has a Facebook account” (#35);
  • The administrator may “define the criteria in accordance with which the statistics are to be drawn up and even designate the categories of persons whose personal data is to be made use of by Facebook”, “with the help of filters made available by Facebook” (#36);
  • Therefore, the administrator “contributes to the processing of the personal data of visitors to its page” (#36);

One key point: not all joint controllers must have access to the personal data being processed

In what is the most impactful finding of this judgment, the Court relies on one of the old general principles of interpreting and applying the law, ubi lex non distinguit, nec nos distinguere debemus, and states that “Directive 95/46 does not, where several operators are jointly responsible for the same processing, require each of them to have access to the personal data concerned” (#38). Therefore, the fact that administrators have access only to anonymized data has no impact on the existence of their legal responsibility as joint controllers, since what matters is determining the purposes and means of the processing and the fact that at least one of the entities involved in the processing has access to and processes personal data. The fact that they only have access to anonymized data should nonetheless matter when establishing the degree of responsibility.
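As a rough illustration of why a joint controller can be a controller without ever touching personal data, consider the following sketch, which assumes hypothetical per-visitor records held only by the platform. The administrator defines the parameters (for example, which demographic breakdowns it wants) but receives only the aggregated counts; the underlying records, keyed by the unique cookie code, never leave the platform. The field names and data below are invented for the example.

```python
# Minimal sketch under assumed data: the platform holds per-visitor records
# (personal data); the page administrator only ever sees aggregated statistics.
from collections import Counter

def anonymised_page_stats(visits):
    """Aggregate per-visitor records into the anonymous figures handed to the administrator."""
    return {
        "by_age_band": Counter(v["age_band"] for v in visits),
        "by_sex": Counter(v["sex"] for v in visits),
    }

if __name__ == "__main__":
    # Hypothetical records; 'user_code' is the unique cookie identifier and
    # never reaches the administrator -- only the counts below do.
    visits = [
        {"user_code": "a1f3", "age_band": "25-34", "sex": "F"},
        {"user_code": "b7c9", "age_band": "25-34", "sex": "M"},
        {"user_code": "d2e8", "age_band": "35-44", "sex": "F"},
    ]
    print(anonymised_page_stats(visits))
```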

Hence, after describing the involvement of fan page administrators in the processing at issue – in particular their role in defining parameters for the processing depending on their target audience and in determining the purposes of the processing – the Court finds that “the administrator must be categorized, in the present case, as a controller responsible for that processing within the European Union, jointly with Facebook Ireland” (#39).

Enhanced responsibility for non-users visiting the page

The Court also made the point that fan pages can be visited by non-users of Facebook, implying that, were it not for the existence of that specific fan page they accessed while looking for information related to the administrator of the page, Facebook would not be able to place cookies on their devices and process personal data related to them for its own purposes and for the purposes of the fan page. “In that case, the fan page administrator’s responsibility for the processing of the personal data of those persons appears to be even greater, as the mere consultation of the home page by visitors automatically starts the processing of their personal data” (#42).

Jointly responsible, not equally responsible

Finally, after establishing that there is joint controllership and joint responsibility, the Court makes the very important point that the responsibility is not equal and it depends on the degree of involvement of the joint controller in the processing activity:

“The existence of joint responsibility does not necessarily imply equal responsibility of the various operators involved in the processing of personal data. On the contrary, those operators may be involved at different stages of that processing of personal data and to different degrees, so that the level of responsibility of each of them must be assessed with regard to all the relevant circumstances of the particular case” (#43).

Comments and conclusions

In the present case, the Court found early in the judgment that Facebook “primarily” establishes the means and purposes of the processing. This means that it is primarily responsible for compliance with data protection obligations. At the same time, the administrator of the fan page has responsibility to comply with some data protection provisions, as joint controller. The Court did not clarify, however, what exactly the administrator of the fan page must do in order to be compliant.

For instance, the Court does not go into analyzing whether the administrator complies with the Directive in this case – therefore, assuming that the judgment requires administrators to provide a data protection notice would be wrong. The lack of notice was a finding of the DPA in the initial proceedings. Moreover, the DPA ordered Wirtschaftsakademie to close its Facebook page because it found that neither Facebook nor the page administrator had informed visitors about the cookies being placed on their devices (#16).

The CJEU merely establishes that the administrator is a joint controller and that it shares responsibility for compliance with Facebook depending on the degree of their involvement in the processing.

The only clear message from the Court with regard to the extent of legal responsibility of the administrator as joint controller is that it has enhanced responsibility towards visitors of the fan page that are not Facebook users. This being said, it is very likely that informing data subjects is one of the obligations of the GDPR that can potentially fall on the shoulders of fan page administrators in the absence of Facebook stepping up and providing notice, since they can edit the interface with visitors to a certain extent.

Another message that is not so clear, but can be extracted from the judgment, is that the degree of responsibility of the joint controllers “must be assessed with regard to all the relevant circumstances of the particular case” (#43). This could mean that if the two joint controllers were to enter into a joint controllership agreement (as the GDPR now requires), the Courts and DPAs may be called upon to look at the reality of the processing in order to determine the responsibilities each of them has, so as to avoid a situation where the joint controller primarily responsible for establishing the means and purposes contractually distributes to the other joint controller obligations that the latter could not possibly comply with.

As for the relevance of these findings under the GDPR, the entire “joint controllership” part of the judgment is very likely to remain relevant, considering that the language the Court interpreted in Directive 95/46 is very similar to the language used in the GDPR (see Article 2(d) of the Directive and Article 4(7) GDPR). However, the GDPR does add a level of complexity to the situation of joint controllers, in Article 26. The Court may, in time, add to this case-law an analysis of the extent to which the joint controllership arrangement required by Article 26 is relevant for establishing the level of responsibility of a joint controller.

Given that the GDPR requires joint controllers to determine in a transparent manner their respective responsibilities for compliance through an arrangement, one consequence of the judgment is that such an arrangement should be concluded between Facebook and fan page administrators (Article 26(1) GDPR). The essence of the arrangement must then be made available to visitors of fan pages (Article 26(2) GDPR).

However, there is one obligation under the GDPR that, read together with the findings of the Court, results in a conundrum. Article 26(3) GDPR provides that the data subject may exercise his or her rights “in respect of and against each of the controllers”, regardless of how the responsibility is shared contractually between them. In the case at hand, the Court acknowledges that the administrator only has access to anonymized data. This means that even if data subjects made, for example, a request for access to or erasure of data to the administrator, it would not be in a position to fulfil such requests. One possibility is that any request made to a joint controller that does not have access to the data will be forwarded by the latter to the joint controller that does have access (what is important is that the data subject has a point of contact and, in the end, someone against whom they can claim their rights). This is yet another reason why a written agreement establishing the responsibility of each joint controller is useful. Practice will ultimately solve the conundrum, with DPAs and national Courts likely playing their part.


[1] “(d) ‘controller’ shall mean the natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data; where the purposes and means of processing are determined by national or Community laws or regulations, the controller or the specific criteria for his nomination may be designated by national or Community law;”

Brief case-law companion for the GDPR professional

This collection of quotes from relevant case-law has been compiled with the purpose of being useful to all those working with EU data protection law. The majority of the selected findings were part of a “Countdown to the GDPR” I conducted on social media, one month before the Regulation became applicable, under #KnowYourCaseLaw. There were a few reasons for this exercise.

First, data protection in the EU is much older and wider than the General Data Protection Regulation (GDPR) and it has already invited the highest Courts in Europe to weigh in on the protection of this right. Knowing what those Courts have said is essential.

Second, data protection law in the EU is not only a matter of pure EU law, but also a matter of protecting human rights under the legal framework of the Council of Europe (starting with Article 8 of the European Convention on Human Rights – ‘ECHR’). The interplay between these two legal regimes is very important, given that the EU recognizes fundamental rights protected by the ECHR as general principles of EU law – see Article 6(3) TEU.

Finally, knowing relevant case-law makes the difference between a good privacy professional and a great one.

What to expect

This is not a comprehensive collection of case-law and it does not provide background for the cases it addresses. The Handbook on European Data Protection Law (2018 edition) is a great resource if that is what you are looking for.

This is a collection of specific findings of the Court of Justice of the EU (CJEU), the European Court of Human Rights (ECtHR) and one bonus finding of the German Constitutional Court. There are certainly other interesting findings that have not been included here (how about an “Encyclopedia of interesting findings” for the next project?). The ones that have been included provide insight into specific issues, such as the definition of personal data, what constitutes data related to health, what freely given consent means, or what type of interference with fundamental rights profiling represents. Readers will even find a quote from a concurring opinion of an ECtHR judge that is prescient, to say the least.

Enjoy the read!

Brief Case-Law Companion for the GDPR Professional

Exam scripts and examiner’s corrections are personal data of the exam candidate (AG Kokott Opinion in Nowak)

AG Kokott delivered her Opinion on 20 July in Case C-434/16 Nowak v Data Protection Commissioner, concluding that “a handwritten examination script capable of being ascribed to an examination candidate, including any corrections made by examiners that it may contain, constitutes personal data within the meaning of Article 2(a) of Directive 95/46/EC” (Note: all highlights in this post are mine).
This is a really exciting Opinion because it provides insight into:

  • the definition of personal data,
  • the purpose and the functionality of the rights of the data subject,
  • the idea of abusing data protection related rights for non-data protection purposes,
  • how the same ‘data item’ can be personal data of two distinct data subjects (examiners and examinees),
  • what constitutes a “filing system” of personal data processed otherwise than by automated means.

But also because it technically (even if not literally) invites the Court to change its case-law on the definition of personal data, and specifically the finding that information consisting in a legal assessment of facts related to an individual does not qualify as personal data (see C-141/12 and C-372/12 YS and Others).

The proceedings were initially brought in front of the Irish Courts by Mr Nowak, who, after failing an exam organised by a professional association of accountants (CAI) four times, asked for access to see his exam sheet on the basis of the right to access his own personal data. Mr Nowak submitted a request to access all his personal data held by CAI and received 17 items, none of which was the exam sheet. He then submitted a complaint to the Irish Data Protection Commissioner, who decided not to investigate it, arguing that an exam sheet is not personal data. The decision not to investigate on this ground was challenged in front of a Court. Once the case reached the Irish Supreme Court, it was referred to the Court of Justice of the EU to clarify whether an exam sheet falls under the definition of “personal data” (§9 to §14).

Analysis relevant both for Directive 95/46 and for the GDPR

Yet again, AG Kokott refers to the GDPR in her Conclusions, clarifying that “although the Data Protection Directive will shortly be repealed by the General Data Protection Regulation, which is not yet applicable, the latter will not affect the concept of personal data. Therefore, this request for a preliminary ruling is also of importance for the future application of the EU’s data protection legislation” (m.h.).

The nature of an exam paper is “strictly personal and individual”

First, the AG observes that “the scope of the Data Protection Directive is very wide and the personal data covered by the Directive is varied” (§18).

The Irish DPC argued that an exam script is not personal data because “examination exercises are normally formulated in abstract terms or relate to hypothetical situations”, which means that “answers to them are not liable to contain any information relating to an identified or identifiable individual” (§19).

This view was not followed by the AG, who explained that it is incongruent with the purpose of an exam. “In every case“, she wrote, “the aim of an examination — as opposed, for example, to a representative survey — is not to obtain information that is independent of an individual. Rather, it is intended to identify and record the performance of a particular individual, i.e. the examination candidate”  (§24; m.h.). Therefore, “every examination aims to determine the strictly personal and individual performance of an examination candidate. There is a good reason why the unjustified use in examinations of work that is not one’s own is severely punished as attempted deception” (§24; m.h.).

What about exam papers identified by codes?

In a clear indication that pseudonymized data are personal data, the AG further noted that an exam script is personal data also in those cases where instead of bearing the examination candidate’s name, the script has an identification number or bar code: “Under Article 2(a) of the Data Protection Directive, it is sufficient for the existence of personal information that the data subject may at least be indirectly identified. Thus, at least where the examination candidate asks for the script from the organisation that held the examination, that organisation can identify him by means of the identification number” (§28).
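A toy sketch of this point about indirect identification, with entirely invented data: the script itself carries only a candidate number, but the body that held the examination keeps the register linking that number to a name, so the candidate remains identifiable and the script remains personal data.

```python
# Invented example of pseudonymisation: the code on the script is not anonymous,
# because the examining body holds the mapping back to the candidate.
scripts = {
    "candidate-1042": "handwritten answers to questions 1-5 ...",
}

# Held separately by the organisation that ran the examination.
candidate_register = {
    "candidate-1042": "A. N. Example",
}

def reidentify(candidate_code: str) -> str:
    """The examining body can link a script back to an individual at any time."""
    return candidate_register[candidate_code]

if __name__ == "__main__":
    print(reidentify("candidate-1042"))
```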

Characteristics of handwriting, personal data themselves 

The AG accepted the argument of Mr Nowak that answers to an exam that are handwritten “contain additional information about the examination candidate, namely about his handwriting” (§29). Therefore, the characteristics of the handwriting are personal data themselves. The AG explains that “a script that is handwritten is thus, in practice, a handwriting sample that could at least potentially be used at a later date as evidence to determine whether another text was also written in the examination candidate’s writing. It may thus provide indications of the identity of the author of the script” (§29). According to the AG, it’s not relevant whether such a handwriting sample is a suitable means of identifying the writer beyond doubt: “Many other items of personal data are equally incapable, in isolation, of allowing the identification of individuals beyond doubt” (§30).

Classifying information as ‘personal data’ is a stand-alone exercise (it does not depend on whether rights can be exercised)

The Irish DPC argued that one of the reasons why exam scripts are not personal data in this case is that the “purpose” of the right of access and the right to rectification of personal data precludes them from being “personal data” (§31). The DPC relied on the fact that Recital 41 of Directive 95/46 specifies that any person must be able to exercise the right of access to data relating to him which is being processed, in order to verify in particular the accuracy of the data and the lawfulness of the processing. “The examination candidate will seek the correction of incorrect examination answers”, the argument goes (§31).

AG Kokott rebuts this argument by acknowledging that the classification of information as personal data “cannot be dependent on whether there are specific provisions about access to this information” or on possible problems with the rectification of data (§34). “If those factors were regarded as determinative, certain personal data could be excluded from the entire protective system of the Data Protection Directive, even though the rules applicable in their place do not ensure equivalent protection but fragmentary protection at best” (§34).

Even if the classification of information as “personal data” depended in any way on the purpose of the right of access, the AG makes it clear that this purpose is not strictly linked to rectification, blocking or erasure: “data subjects generally have a legitimate interest in finding out what information about them is processed by the controller” (§39). This finding is backed up by the use of “in particular” in Recital 41 of the Directive (§39).

The purpose of processing and… the passage of time, both relevant for obtaining access, rectification

After clarifying that it is irrelevant what an individual wants to do with their data once accessed (see also the summary below on the ‘abuse of rights’), AG Kokott explains that a legitimate interest in correcting exam-script-related data is conceivable.

She starts from the premise that “the accuracy and completeness of personal data pursuant to Article 6(1)(d) must be judged by reference to the purpose for which the data was collected and processed” (§35). The AG further identifies the purpose of an exam script as determining  “the knowledge and skills of the examination candidate at the time of the examination, which is revealed precisely by his examination performance and particularly by the errors in the examination” (§35). “The existence of errors in the solution does not therefore mean that the personal data incorporated in the script is inaccurate”, the AG concludes (§35).

Rectification could be achieved if, for instance, “the script of another examination candidate had been ascribed to the data subject, which could be shown by means of, inter alia, the handwriting, or if parts of the script had been lost” (§36).

The AG also found that the legitimate interest of the individual to have access to their own data is strengthened by the passage of time, to the extent that their recollection of the contents of their answer is likely to be considerably weaker a few years after the exam. This makes it possible that “a genuine need for information, for whatever reasons, will be reflected in a possible request for access. In addition, there is greater uncertainty with the passing of time — in particular, once any time limits for complaints and checks have expired — about whether the script is still being retained. In such circumstances the examination candidate must at least be able to find out whether his script is still being retained” (§41).

Is Mr Nowak abusing his right of access under data protection law?

AG Kokott recalls the CJEU’s case-law on “abuse of rights” and the double test required by the Court to identify whether there has been any abuse of rights in a particular case (C-423/15 Kratzer and the case-law cited there at §38 to §40), which can be summed up as (§44):

i) has the purpose of the EU legislation in question been misused?

ii)  is the essential aim of the transaction to obtain an undue advantage?

The DPC submitted during the procedure that if exam scripts were considered personal data, “a misuse of the aim of the Data Protection Directive would arise in so far as a right of access under data protection legislation would allow circumvention of the rules governing the examination procedure and objections to examination decisions” (§45).

The AG considers that “any alleged circumvention of the procedure for the examination and objections to the examination results via the right of access laid down by data protection legislation would have to be dealt with using the provisions of the Data Protection Directive” and she specifically refers to the restrictions to the right of access laid down in Article 13 of the Directive with the aim “to protect certain interests specified therein” (§46). She also points out that if restricting access to exam scripts cannot be brought within those exceptions, then “it must be recognised that the legislature has given precedence to the data protection requirements which are anchored in fundamental rights over any other interests affected in a specific instance” (§47).

The AG also looks at the exceptions to the right of access under the GDPR and finds that it is more nuanced than the Directive in this regard. “First, under Article 15(4) of the regulation, the right to obtain a copy of personal data is not to adversely affect the rights and freedoms of others. Second, Article 23 of the regulation sets out the grounds for a restriction of data protection guarantees in slightly broader terms than Article 13 of the Directive, since, in particular, protection of other important objectives of general public interest of the Union or of a Member State pursuant to Article 23(1)(e) of the regulation may justify restrictions” (§48).

However, she does not seem to consider this slight broadening of the scope of exemptions in the GDPR as justifying a finding of abuse of rights in this particular case.

The AG also argues that “on the other hand, the mere existence of other national legislation that also deals with access to examination scripts is not sufficient to allow the assumption that the purpose of the Directive is being misused” (§49). She concludes that even if such misuse would be conceivable, the second limb of the “abuse of rights” test would not be satisfied: “it is still not apparent where the undue advantage lies if an examination candidate were to obtain access to his script via his right of access. In particular, no abuse can be identified in the fact that someone obtains information via the right of access which he could not otherwise have obtained” (§50).

Examiner’s corrections on the exam script are the examinee’s personal data and the examiner’s own personal data at the same time

The AG looks into whether any corrections made by the examiner on the examination script are also personal data with respect to the examination candidate (a question raised by some of the parties), even though she considers that the answer will not impact the result of the main proceedings (§52, §53).

It is apparent that the facts of this case resemble the facts of YS and Others, where the Court refused to extend the right of access to the draft legal analysis of an asylum application, on the grounds that this would not serve the purpose of the Data Protection Directive but would establish a right of access to administrative documents. The Court argued in YS that such an analysis “is not information relating to the applicant for a residence permit, but at most information about the assessment and application by the competent authority of the law to the applicant’s situation” (§59; see YS and Others, §40). The AG considers that the cases are similar only “at first glance”. However, she does not convincingly differentiate between the two cases in the arguments that follow.

However, she is convincing when explaining why the examiner’s corrections are “personal data”. AG Kokott explains that the purpose of the comments made by examiners on an exam script is “the evaluation of the examination performance and thus they relate indirectly to the examination candidate” (§61). It does not matter that the examiners don’t know the identity of the examination candidate who produced the script, as long as the candidate can be easily identified by the organisation holding the examination (§60 and §61).

The AG further adds that “comments on an examination script are typically inseparable from the script itself … because they would not have any informative value without it” (§62). And it is “precisely because of that close link between the examination script and any corrections made on it”, that “the latter also are personal data of the examination candidate pursuant to Article 2(a) of the Data Protection Directive” (§63).

In an important statement, the AG considers that “the possibility of circumventing the examination complaint procedure is not, by contrast, a reason for excluding the application of data protection legislation” (§64). “The fact that there may, at the same time, be additional legislation governing access to certain information is not capable of superseding data protection legislation. At most it would be admissible for the individuals concerned to be directed to the simultaneously existing rights of information, provided that these could be effectively claimed” (§64).

Finally, the AG points out “for the sake of completeness” that “corrections made by the examiner are, at the same time, his personal data”. AG Kokott sees the potential conflict between the right of the candidate to access their personal data and the right of the examiners to protect their personal data and underlines that the examiner’s rights “are an appropriate basis in principle for justifying restrictions to the right of access pursuant to Article 13(1)(g) of the Data Protection Directive if they outweigh the legitimate interests of the examination candidate” (§65).

The AG considers that “the definitive resolution to this potential conflict of interests is likely to be the destruction of the corrected script once it is no longer possible to carry out a subsequent check of the examination procedure because of the lapse of time” (§65).

An exam script forms part of a filing system

One last consideration made by AG Kokott is whether processing of an exam script would possibly fall outside the scope of Directive 95/46, considering that it does not seem to be processed using automated means (§66, §67).

The AG points out that the Directive also applies to personal data processed otherwise than by automated means as long as they form part of a “filing system”, even if this “filing system” is not electronically saved (§69).

“This concept covers any structured set of personal data which is accessible according to specific criteria. A physical set of examination scripts in paper form ordered alphabetically or according to other criteria meets those requirements” (§69), concludes the AG.

Conclusion. What will the Court say?

The Conclusions of AG Kokott in Nowak contain a thorough analysis, which brings to the data protection debate several dimensions that have rarely been considered by Courts – the self-standing importance of the right of access to one’s own data (beyond any ‘utilitarianism’ of needing it to obtain something else), the relevance of the passage of time for the effectiveness of data protection rights, the limits of the critique that data protection rights may be used to achieve purposes other than data protection per se, and the complexity of one data item being personal data of two different individuals (and the competing interests of those two individuals).

The Court will probably closely follow the Conclusions of the AG for most of the points she raised.

The only contentious point will be the classification of an examiner’s corrections as personal data of the examined candidate, because following the AG will mean that the Court would reverse its case-law from YS and Others.

If we apply the criteria developed by AG Kokott in this Opinion, it is quite clear that the analysis concerning YS and their request for asylum is personal data: the legal analysis is closely linked to the facts concerning YS and the other asylum applicants, and the fact that there may be additional legislation governing access to certain information (administrative procedures in the case of YS) is not capable of superseding data protection legislation. Moreover, if we add the argument that access to one’s own personal data is valuable in itself and does not need to serve any other purpose, reversing this case-law is even more likely.

The only arguable difference between this case and YS and Others is that, unlike what the AG found in §62 (“comments on an examination script are typically inseparable from the script itself… because they would not have any informative value without it”), it is conceivable that a legal analysis in general may have value by itself. However, a legal analysis of particular facts is devoid of value when applied to different individual facts. In this sense, a legal analysis can also be considered inseparable from the particular facts it assesses. What would be relevant in classifying it as personal data would then remain the identifiability of the person to whom the particularities refer…

I was never convinced by the argumentation of the Court (or AG Sharpston, for that matter) in YS and Others and I would welcome either reversing this case-law (which would be compatible with what I expected the outcome of YS to be) or a more convincing argumentation as to why such an analysis/assessment of an identified person’s specific situation is not personal data. However, I am not getting my hopes up. As AG Kokott observed, the issue in the main proceedings can be solved without getting into this particular detail. In any case, I will be looking forward to this judgment.

(Summary and analysis by Dr. Gabriela Zanfir-Fortuna)


Summary of the Opinion of AG Kokott in Puškár (on effective judicial remedies and lawful grounds for processing other than consent)

The Conclusions of Advocate General Kokott in C-73/16 Puškár were published on 30 March and remained under the radar, even though they deal with several very important questions for EU data protection law that may have wide implications: effective judicial remedies, lawful grounds for processing other than consent, and the right of access to one’s own personal data. As a bonus, the AG refers to and analyses Article 79 GDPR – the right to a judicial remedy.

The analysis regarding effective judicial remedies under Article 47 of the Charter and Directive 95/46 could be relevant for the debate on essential equivalence when it comes to adequacy decisions for international data transfers (for those of you who don’t remember, one of the two main findings in Schrems was that the Safe Harbor framework touched the essence of the right to effective judicial remedies, thus breaching Article 47 of the Charter). In this sense, the AG finds that a measure that does not restrict the category of people who could in principle have recourse to judicial review does not touch the essence of this right. A contrario, if a measure does restrict these categories of people, it would touch the essence of the right to an effective judicial remedy and would therefore breach the Charter.

Finally, a question of great importance for EU law in general is also tackled: what should national courts do when the case-law of the CJEU and the case-law of the ECtHR diverge regarding the protection of fundamental rights?

Here is what you will further read:

  1. Facts of the case and questions referred to the CJEU
  2. Requiring claimants to exhaust administrative remedies before going to Court can be compatible with the right to effective judicial remedy
  3. Internal documents of a tax authority obtained without the consent of the authority must be admitted as evidence if they contain personal data of the person who obtained the documents
  4. The performance of a task in the public interest allows a tax authority to create a black list without the consent of the persons concerned, if this task was legally assigned to the tax authority and the list’s use is appropriate and necessary (Article 7 and 8 Charter are not breached in this case)
  5. A missed opportunity to better define the difference between the right to privacy and the right to personal data protection
  6. Where ECtHR and CJEU case-law diverge, national courts have to ask the CJEU on how to proceed when the ECtHR case-law provides a higher level of protection for the rights of a person
  7. What to expect from the Court

Note that all highlights from the post are made by the author.

  1. Facts of the case and questions referred to the CJEU

C-73/16 Puškár concerns the request of Mr Puškár to have his name removed from a blacklist kept by the Finance Directorate of Slovakia, which contains names and national ID numbers of persons “who purport to act, as ‘fronts’, as company directors”. The list associates a legal person or persons with a natural person who supposedly acted on their behalf (§15) and is created for the purposes of tax administration and combating tax fraud (§23, 2nd question for a preliminary ruling). It transpires from several paragraphs of the Conclusions that Mr Puškár found out about the list, and about the fact that he is on it, from a leak (§23, 2nd question; §72; §76). Instead of relying on the more straightforward right to erasure or right to object under data protection law, Mr Puškár claimed that “his inclusion in the above mentioned list infringes his personal rights, specifically the right to the protection of his good name, dignity and good reputation” (§16).

The Supreme Court rejected his claims, partly on procedural issues, partly on substantive grounds (§18). Later, the Constitutional Court found that “the Supreme Court infringed the fundamental right to the protection of personal data against unauthorised collection and other abuses, in addition to the right to privacy”, quashed its decision and sent the case back to the Supreme Court for retrial, grounding its findings in ECtHR case-law (§20). In the context of these second-round proceedings, the Supreme Court sent questions for a preliminary ruling to the CJEU to essentially clarify:

  • whether the right to an effective remedy under Article 47 of the Charter in the context of data protection is compatible with a national law requirement that a claimant must first exhaust the procedures available under administrative law (administrative complaints) before going to Court;
  • whether the legitimate grounds for processing under Directive 95/46 and Articles 7 and 8 of the Charter preclude tax authorities from creating such a blacklist without the consent of the individuals on the list;
  • whether the list obtained by the claimant without the consent of the tax authorities is admissible as evidence;
  • whether national courts should give precedence to the case-law of the CJEU or the case-law of the ECtHR on a specific topic where the two diverge.
  2. Requiring claimants to exhaust administrative remedies before going to Court can be compatible with the right to effective judicial remedy

To reply to the first question, AG Kokott looks at Articles 28(4) and 22 of Directive 95/46 and also at Article 79 of the General Data Protection Regulation, which will replace Directive 95/46 as of 25 May 2018.

Article 28(4) of Directive 95/46 states that each supervisory authority (Data Protection Authority) is to hear claims lodged by any person concerning the protection of his rights and freedoms with regard to the processing of personal data. Article 22 provides that, without prejudice to the remedy referred to in Article 28(4), every person is to have a right to a judicial remedy for any breach of the rights guaranteed him by the national law applicable to the processing in question (§37, §38).

In practice, this means that an individual who engages in Court proceedings for a breach of data protection law must be able to also initiate administrative proceedings with a DPA (complaints lodged with DPAs).

The same rule is kept under Article 79 GDPR, slightly broadened: the right to a judicial remedy must be effective and must be granted without prejudice to any administrative or non-judicial remedy, including the right to lodge a complaint with a supervisory authority. 

AG Kokott explains that these rules still do not clarify “whether the bringing of legal proceedings may be made contingent upon exhaustion of another remedy. All that can be taken from Article 79 of the General Data Protection Regulation is that the judicial remedy must be effective. An obligation to exhaust some other remedy before bringing legal proceedings will consequently be impermissible if the judicial remedy is rendered ineffective as a result of this precondition” (§43).

The AG found that Article 47(1) of the Charter and the principle of effectiveness “ultimately embody the same legal principle” and that they can be examined jointly using the rules in Articles 47(1) and 52(1) of the Charter – which is the provision that enshrines the rules for limiting the exercise of the fundamental rights in the Charter (§51). Hence, the question is whether the obligation to exhaust administrative procedures before going to Court amounts to a justified interference with the right to an effective judicial remedy.

AG Kokott remarks that the interference is provided for by Slovakian law and that it does not touch the essence of the right to effective judicial remedy because “it does not restrict the category of people who could in principle have recourse to judicial review” (§56). [Small comment here: this means that a provision which would restrict the category of people who could in principle have recourse to judicial review would touch the essence of the right in Article 47 Charter. Check out paragraphs 45 and 46 of the EDPS Opinion on the EU-US Umbrella Agreement commenting on the fact that Article 19 of the Agreement provides for the possibility of judicial redress only for citizens of the EU, thus excluding categories of individuals that would otherwise be covered by the Charter, such as asylum seekers and residents].

It remained to be analysed whether the interference complies with the principle of proportionality, which “requires that a measure be ‘appropriate, necessary and proportionate to the objective it pursues’” (§58). The AG accepts the submission of the Supreme Court that “the exhaustion of the administrative remedy represents a gain in efficiency, as it provides the administrative authority with an opportunity to remedy the alleged unlawful intervention, and saves it from unwanted court proceedings” (§59). The AG considers that “obligatory preliminary proceedings are undoubtedly appropriate for achieving the objectives” and that a “less onerous method” does not suggest itself as capable of realising them to the same extent (§62).

However, the AG points out that the “specific form” of the administrative remedy is important to determine the appropriateness of the measure in practice. This condition applies in particular if there is uncertainty “as to whether the time limit for bringing an action begins to run before a decision has been made in the administrative action” (§64). Additionally, Article 47(2) Charter establishes the right of every person to have their case dealt with within a reasonable period of time. “While this right in fact relates to judicial proceedings, naturally it may not be undermined by a condition for the bringing of an action” (§67).

In conclusion, the AG considers that the right to effective judicial review under Article 47 Charter and the principle of effectiveness “do not preclude an obligation to exhaust an administrative remedy being a condition on bringing legal proceedings if the rules governing that remedy do not disproportionately impair the effectiveness of judicial protection. Consequently, the obligatory administrative remedy must not cause unreasonable delay or excessive costs for the overall legal remedy” (§71).

  3. Internal documents of a tax authority obtained without the consent of the authority must be admitted as evidence if they contain personal data of the person who obtained the documents

Essentially, the question asked by the Supreme Court is whether the contested list may be excluded as evidence due to the fact that it came into the possession of the claimant without the consent of the competent authorities (§72).

The AG considers that “a review should be carried out to determine whether the person affected has a right of access to the information in question. If this were the case, the interest in preventing unauthorized use would no longer merit protection” (§83).

Further, it is recalled that “under the second sentence of Article 8(2) of the Charter and Article 12 of the Data Protection Directive, everyone has the right of access to data which has been collected concerning him or her. This also applies in principle to data being recorded in the contested list. Furthermore, the persons so affected would, by virtue of the collection of the data, have to be informed of the use of the data, under either Article 10 or Article 11 of the Data Protection Directive” (§85).

While indeed Article 13 of the Directive allows this right to information to be restricted, it also “expressly requires that such restrictions be imposed by legislative measures” (§86). The AG acknowledged that “there is a potential risk that inspection and monitoring activities based on the list would be less effective if it were known who was named on that list” (§87). However, the national Court must examine:

  • “whether a restriction of the right of information of this kind is provided for” (§88) and
  • “where appropriate”, whether it is “justified” (§88). This is an indication that even if such an exemption were provided for by law, a further analysis would be needed to see whether the exemption is justified.

A key point the AG makes is that “even if there are indications of a legitimate interest in a hypothetical, legally justified non-disclosure of the list in question, the national courts must also examine whether in the individual case these outweigh the legitimate interests of the individual in bringing the proceedings” (§89). This is important because it is a clear indication that when a controller relies on their legitimate interest as a ground for processing, it always has to engage in a balancing exercise with the legitimate interests (and rights) of the data subject.

In conclusion, the AG established that refusing to accept as evidence a document obtained by the claimant without the consent of an authority is not possible under the principle of a fair hearing in Article 47 Charter when the document contains personal data of the claimant, which the authority is required to disclose to the claimant under Article 12 and 13 of the Data Protection Directive.

  4. The performance of a task in the public interest allows a tax authority to create a black list without the consent of the persons concerned, if this task was legally assigned to the tax authority and the list’s use is appropriate and necessary (Article 7 and 8 Charter are not breached in this case)

The Supreme Court wanted to know whether the fundamental right to privacy (Article 7 Charter) and protection of personal data (Article 8 Charter) and the Data Protection Directive prohibit a Member State from creating a list of personal data for the purposes of tax collection without the consent of the persons concerned.

The AG points out that “this question is primarily to be answered in the light of the Data Protection Directive, as this specifies the rights to privacy and data protection” (§95).

The AG further recalls that Article 7 of the Data Protection Directive allows processing of personal data if it is based on one of the six lawful grounds for processing provided for (§99) [NB: of which only one is “consent”!]. While the AG acknowledges that three of the six conditions are applicable in this case (1 – performance of a task in the public interest [Article 7(e)]; 2 – legitimate interest of the controller [Article 7(f)]; and 3 – necessity of compliance with a legal obligation [Article 7(c)]), she considers the examination of the latter two “superfluous”: “This is because all parties acknowledge that tax collection and combating tax fraud are tasks in the public interest within the meaning of Article 7(e) of the Data Protection Directive” (§100).

A much-welcomed clarification is further brought by the AG, who specifies that Article 7(e) of the Data Protection Directive “must be read in conjunction with the principles of Article 6. According to Article 6(1)(b), personal data must only be collected for specified, explicit and legitimate purposes. Within the scope of Article 7(e), the purpose of the data processing is inseparably linked to the delegated tasks. Consequently, the transfer of the task must clearly include the purpose of the processing” (§106).

This clarification is welcome because it reminds controllers that even if they correctly process personal data on one of the lawful grounds for processing (such as consent or legitimate interest) in compliance with Article 7 of the Directive, they still have to comply with all the other safeguards for processing personal data, including the principles for processing in Article 6 of the Directive (purpose limitation, data minimization etc.).

The AG remarks that the reference for a preliminary ruling does not specify the purpose of the contested list and leaves it to the Supreme Court to look further into this question (§107). Additionally, the AG considers that the Supreme Court “will have to examine whether the creation and use of the contested list and in particular the naming of Mr Puškár is necessary for the claimed public interest”. This is yet another reminder of how important “necessity” is for personal data protection in the EU legal framework (check out the EDPS’s recently published “Necessity Toolkit”).

Another very interesting point that the AG brings forward is how naming a person on this black list constitutes “a considerable interference with the rights of the person concerned”, beyond the right to privacy in Article 7 Charter – it also touches (§110):

  • “his reputation and could lead to serious, practical disadvantages in his dealings with the tax authorities;
  • the presumption of innocence in Article 48(1) of the Charter;
  • the legal persons associated with the person concerned, which will be affected in terms of their freedom to conduct business under Article 16 of the Charter”.

This finding is a testament to the importance of complying with the right to the protection of personal data, as non-compliance can have consequences for several other fundamental rights.

As the AG explains, “such a serious interference of this kind can only be proportionate if there are sufficient grounds for the suspicion that the person concerned purported to act as a company director of the legal persons associated with him and in so doing undermined the public interest in the collection of taxes and combating tax fraud” (§111).

In conclusion, the tax authorities can create a blacklist such as the one in the main proceedings on the grounds of Article 7(e) of the Data Protection Directive, but this assumes that (§117):

  • “the task was legally assigned to the tax authorities,
  • the use of the list is appropriate and necessary for the purposes of the tax authorities and
  • “there are sufficient grounds to suspect that these persons should be on the list”.

  1. A missed opportunity to better define the difference between the right to privacy and the right to personal data protection

Further, the AG spelled out that “neither the fundamental rights to privacy, Article 7 of the Charter, or data protection, Article 8, would in this case prevent the creation and use of the list” (§117).

The analysis leading to this conclusion was another missed opportunity to persuade the Court of Justice to delineate more clearly the two fundamental rights protected by Article 7 and Article 8 of the Charter. The AG referred to these simply as “the fundamental rights to privacy and data protection”.

Without a clear analysis of what constitutes interference with each of the two rights, the AG referred to the “naming of a person on the contested list” as “affecting” both fundamental rights (§115). In the same paragraph, she further analysed “these interferences” en masse, writing that they are only justified “if they have a sufficient legal basis, respect the essence of both fundamental rights, and preserve the principle of proportionality” (§115). Considering that the legality and proportionality of the measure were addressed in previous sections, the AG merely stated that, notwithstanding “the adverse effects associated with inclusion on the contested list, those interferences do not meet the threshold of a breach of the essence of those rights”, before concluding that neither of the two Charter articles would prevent the creation of such a blacklist.

  1. Where ECtHR and CJEU case-law diverge, national courts have to ask the CJEU on how to proceed, even if the ECtHR case-law provides a higher level of protection for the rights of a person

The last question is one that is extremely interesting for EU lawyers in general, not only for EU data protection lawyers, because it tackles the issue of different levels of protection of the same fundamental right emerging from the case-law of the Court of Justice of the EU in Luxembourg, on the one hand, and the European Court of Human Rights in Strasbourg, on the other.

As the AG summarizes it, “the fourth question is aimed at clarifying whether a national court may follow the case-law of the Court of Justice of the European Union where this conflicts with the case-law of the ECtHR” (§118). This issue is relevant in our field because Article 8 of the European Convention on Human Rights partially shares the same material scope as Article 7 and Article 8 of the EU Charter of Fundamental Rights (Article 8 of the Convention is more complex), and Article 52(3) of the Charter states that “the rights in the Charter, which correspond to rights guaranteed by the European Convention on the Protection of Human Rights and Fundamental Freedoms (ECHR), have the same meaning and scope as conferred by the ECHR” (§122). However, the second sentence of Article 52(3) of the Charter permits EU law to accord more extensive protection (§122).

The AG specifies that “EU law permits the Court of Justice to deviate from the case-law of the ECtHR only to the extent that the former ascribes more extensive protection to specific fundamental rights than the latter. This deviation in turn is only permitted provided that it does not also cause another fundamental right in the Charter corresponding to a right in the ECHR to be accorded less protection than in the case-law of the ECtHR. One thinks, for example, of cases in which a trade-off must be made between specific fundamental rights” (§123).

Not surprisingly, the AG advises that when the case-law of the two Courts comes in conflict, the national courts should directly apply the case-law of the CJEU when it affords more protection to the fundamental rights in question, but they should send a reference for a preliminary ruling to the CJEU to ask which way to go when the case-law of the ECtHR affords enhanced protection to the fundamental right in question (§124 and §125). The argument of the AG is that the latter case “inevitably leads to a question of the interpretation of EU law with regard to the fundamental right in question and Article 52(3) of the Charter” which, if performed by the national Court, could further “amount to the view that the interpretation of the fundamental right in question by the Court of Justice is not compatible with Article 52(3)”.

As for the relevance of this question to the case at hand – it remains a mystery. The AG herself pointed out that “the admissibility of the question in this form is dubious, particularly as the Supreme Court does not state on which issue the two European courts supposedly are in conflict and the extent to which such a conflict is significant for the decision in the main proceedings” (§119).

  1. What to expect from the Court

How will the CJEU reply to these questions? My bet is that, in general, the Court will follow the AG on substance. However, it is possible that the Court will simplify the analysis and reformulate the questions in such a way that the answers will be structured around three main issues:

  • lawfulness of creating such a blacklist (and the lawful grounds for processing in the Data Protection Directive) and compatibility of this interference with both Article 7 and Article 8 of the Charter (I do hope, with low expectations nonetheless, that we will have more clarity on what constitutes interference with each of the two rights from the Court’s perspective);
  • compatibility of Slovakia’s procedural law in the field of data protection with Article 47 of the Charter (in fact, this may be the only point where the Court could reach a different result from the one proposed by the AG, in the sense that the condition to first exhaust administrative remedies before engaging in litigation may be considered a disproportionate interference with the right to an effective judicial remedy; it is also possible that the Court will refer for the first time directly to the GDPR);
  • the relationship between ECtHR and CJEU case-law on the same fundamental right.

Suggested citation: G. Zanfir-Fortuna, “Summary of the Opinion of AG Kokott in Puškár (on effective judicial remedies and lawful grounds for processing other than consent)”, pdpEcho.com, 24 April 2017.

***

If you find information on this blog useful and would like to read more of it, consider supporting pdpecho here: paypal.me/pdpecho.

Door-to-door gathering of data by religious group goes to the CJEU

Non-automated processing | Filing system | Household Exemption | Controller | Religious community

The Court of Justice of the EU received questions for a preliminary ruling from Finland regarding the practice of a religious group (Jehovah’s Witnesses) to gather and record data after door-to-door visits, without informing the individuals concerned about this practice. The questions referred in Case C-25/17 Tietosuojavaltuutettu v Jehovah’s Witnesses concern the interpretation of several key points of Directive 95/46:

  1. Exceptions from the application of the Directive – and particularly the second indent of Article 3(2), which excludes processing “by a natural person in the course of a purely personal or household activity” from the material scope of the Directive. The referring court wants the CJEU to clarify whether this exception applies to gathering data and writing observations in a paper file connected to the door-to-door activity, by members of the religious group (Question 1).
  2. The concept of “filing system” as defined in Article 2(c) of the Directive. The question referred by the national Court is whether, taken as a whole, the manual collection of personal data (name and address and other information and characteristics of a person) carried out in connection with door-to-door evangelical work constitutes a filing system, being thus subject to the application of the Directive (Question 2).
  3. The concept of “controller” under Article 2(d) of the Directive. In particular, the referring court wants the CJEU to clarify whether in this situation the controller is considered to be the religious community as a whole, “even though the religious community claims that only the individual members carrying out evangelical work have access to the data collected” (Questions 3 and 4).

Without knowing the details of the case, and based only on the information available in the questions referred by the national Court, here is my bet on how the CJEU will reply:

  • The definition of “purely household activity” does not extend to the door-to-door evangelical work of a religious community; this exemption is to be interpreted strictly (“must be narrowly construed”; “must apply only in so far as is strictly necessary”), according to the CJEU in C-212/13 Rynes (§28 and §29). The CJEU also explained that this exception applies “only where it is carried out in the purely personal or household setting of the person processing the data” (§31) – which is not the case of representatives of a religious community gathering information during evangelical work.
  • The records the evangelical workers keep should be considered as constituting a “filing system”. This concept is defined as “any structured set of personal data which are accessible according to specific criteria, whether centralized, decentralized or dispersed on a functional or geographical basis”. According to Recital 15 of the Directive, data in a filing system is “structured according to specific criteria relating to individuals, so as to permit easy access to the personal data in question”. If the religious community were to claim that its records are not structured according to specific criteria – e.g. ZIP codes; members of the community/non-members; individuals interested in the community/individuals not interested – and that they do not allow easy access to the personal data in question, then the purpose of keeping a detailed record would not be achieved. In other words, an unstructured file would be incongruent with the purpose of the activity. While it is true that the Member States have been given a margin of appreciation to lay down different criteria for determining the constituents of a structured set of personal data and the different criteria governing access to such a set, those criteria must be compatible with the definition in the Directive. Moreover, applying the definition “loosely” would amount to a limitation of the protection of personal data, and such limitations must apply “only in so far as is strictly necessary” (Rynes §28, DRI §52).
  • The controller of this processing operation should be considered to be the religious community, as this entity establishes the purposes of the processing activity (the records are probably meant to facilitate the evangelical work of the community – the questions referred make no reference to the declared purpose of this activity, but it is only logical that such records are kept to facilitate the evangelical work) and the means of this activity (“by dividing up the areas in which the activity is carried out among members involved in evangelical work, supervising the work of those members and maintaining a list of individuals who do not wish to receive visits from evangelists” – according to the referring Court).

Since this new case provided an opportunity to discuss processing of personal data done by a religious community, there are a couple of additional points to be made.

First of all, according to Recital 35 of the Directive, “processing of personal data by official authorities for achieving aims, laid down in constitutional law or international public law, of officially recognized religious associations is carried out on important grounds of public interest”. This means that religious associations do not need to rely on consent or on their legitimate interest as lawful grounds for processing. However, relying on public interest as the lawful ground for processing does not mean that they do not have to comply with all the other obligations under data protection law. For instance, they still have to comply with the data quality principles, they still have to inform data subjects about the details of the processing activity and they still have to reply to requests for access, correction and erasure.

Second, some of the data gathered in such circumstances is sensitive data, as it refers to “religious beliefs” (Article 8 of the Directive, Article 9 of the GDPR). This means that the data should be processed with additional care and strengthened safeguards.

In case you are wondering whether the GDPR specifically addresses processing of data by churches and religious communities: Recital 35 of the Directive was transplanted into the GDPR as Recital 55. In addition, the GDPR enshrines a specific provision that covers “existing data protection rules of churches and religious associations” – Article 91. This provision allows Member States that have specific legislation (“comprehensive rules”) dedicated to churches and religious communities in place at the time of entry into force of the GDPR to continue to apply those rules, but only if “they are brought into line with this Regulation”. In addition, according to the second paragraph, processing of personal data by churches and religious associations that apply comprehensive national rules in accordance with the first paragraph “shall be subject to the supervision of an independent supervisory authority, which may be specific”. Again, the condition for this is that the specific supervisory authority must fulfil the requirements laid down for independent supervisory authorities in the GDPR.

***

Note: Thanks to Dr. Mihaela Mazilu-Babel for pointing out this new case.

Find what you’re reading useful? Please consider supporting pdpecho.


CJEU in Manni: data subjects do not have the right to obtain erasure from the Companies Register, but they do have the right to object

by Gabriela Zanfir-Fortuna

The recent judgment of the CJEU in Case C-398/15 Manni (9 March 2017) brings a couple of significant points to the EU data protection case-law:

  • Clarifies that an individual seeking to limit the access to his/her personal data published in a Companies Register does not have the right to obtain erasure of that data, not even after his/her company ceased to exist;
  • Clarifies that, however, that individual has the right to object to the processing of that data, based on his/her particular circumstances and on justified grounds;
  • Clarifies the link between the purpose of the processing activity and the data retention period, and underlines how important the purpose of the processing activity is when analysing whether a data subject can obtain erasure or blocking of data;
  • Provides insight into the balancing exercise between interests of third parties to have access to data published in the Companies Register and the rights of the individual to obtain erasure of the data and to object to its processing.

This commentary will highlight all points enumerated above.

1. Facts of the case

Mr Manni had requested his regional Chamber of Commerce to erase his personal data from the Public Registry of Companies, after he found out that he was losing clients who performed background checks on him through a private company that specialised in finding information in the Public Registry. This happened because Mr Manni had been an administrator of a company that was declared bankrupt more than 10 years before the facts in the main proceedings. In fact, the former company itself had been struck off the Public Registry (§23 to §29).

2. The question in Manni

The question that the CJEU had to answer in Manni was whether the obligation of Member States to keep public Companies Registers[1] and the requirement that personal data must be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected[2] must be interpreted as meaning that individuals must be allowed to “request the authority responsible for maintaining the Companies Register to limit, after a certain period has elapsed from the dissolution of the company concerned and on the basis of a case-by-case assessment, access to personal data concerning them and entered in that register” (§30).

3. Applicability of Directive 95/46 (Data Protection Directive – ‘DPD’)

First, the CJEU clarified that its analysis does not concern processing of data by the specialized rating company, and it only refers to the obligations of the public authority keeping the companies register (§31). Second, the CJEU ascertained that the provisions of the DPD are applicable in this case:

  • the identification data of Mr Manni recorded in the Register is personal data[3] – “the fact that information was provided as part of a professional activity does not mean that it cannot be characterized as personal data” (§34);
  • the authority keeping the register is a “controller”[4] that carries out “processing of personal data”[5] by “transcribing and keeping that information in the register and communicating it, where appropriate, on request to third parties” (§35).

4. The role of the data quality principles and the legitimate grounds for processing in ensuring a high level of protection of fundamental rights

Further, the CJEU recalls its case-law stating that the DPD “seeks to ensure a high level of protection of the fundamental rights and freedoms of natural persons” (§37) and that the provisions of the DPD “must necessarily be interpreted in the light of the fundamental rights guaranteed by the Charter”, especially Articles 7 – respect for private life – and 8 – protection of personal data (§39). The Court recalls the content of Articles 7 and 8 and specifically lays out that the requirements under Article 8 of the Charter “are implemented inter alia in Articles 6, 7, 12, 14 and 28 of Directive 95/46” (§40).

The Court highlights the significance of the data quality principles and the legitimate grounds for processing under the DPD in the context of ensuring a high level of protection of fundamental rights:

“[S]ubject to the exceptions permitted under Article 13 of that directive, all processing of personal data must comply, first, with the principles relating to data quality set out in Article 6 of the directive and, secondly, with one of the criteria for making data processing legitimate listed in Article 7 of the directive” (§41 and case-law cited).

The Court applies this test in reverse order, which is indeed more logical. A processing activity should first be legitimate under one of the lawful grounds for processing; only after ascertaining that this is the case should the question of compliance with the data quality principles arise.

The CJEU finds that in the case at hand the processing activity is legitimized by three lawful grounds (§42, §43):

  • compliance with a legal obligation [Article 7(c)];
  • the exercise of official authority or the performance of a task carried out in the public interest [Article 7(e)] and
  • the realization of a legitimate interest pursued by the controller or by the third parties to whom the data are disclosed [Article 7(f)].

5. The link between the data retention principle, the right to erasure and the right to object

Article 6(1)(e) of the DPD requires that personal data be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected or for which they are further processed. This means that controllers should retain personal data only for as long as it serves the purpose for which it was processed, and should then anonymise, erase or otherwise make that data unavailable. If the controller does not comply with this obligation, the data subject has two possible avenues to stop the processing: they can either ask for erasure of that data, or they can object to the processing based on their particular situation and a justified objection.

The CJEU explains that “in the event of failure to comply with the condition laid down in Article 6(1)(e)” of the DPD, “Member States guarantee the person concerned, pursuant to Article 12(b) thereof, the right to obtain from the controller, as appropriate, the erasure or blocking of the data concerned” (§46 and C-131/12 Google/Spain §70).

In addition, the Court explains, Member States also must “grant the data subject the right, inter alia in the cases referred to in Article 7(e) and (f) of that directive, to object at any time on compelling legitimate grounds relating to his particular situation to the processing of data relating to him, save where otherwise provided by national legislation”, pursuant to Article 14(a) DPD (§47).

The CJEU further explains that “the balancing to be carried out under subparagraph (a) of the first paragraph of Article 14 … enables account to be taken in a more specific manner of all the circumstances surrounding the data subject’s particular situation. Where there is a justified objection, the processing instigated by the controller may no longer involve those data” (§47).

6. The pivotal role of the purpose of the processing activity in granting the right to erasure and the right to object

After establishing these general rules, the Court decides that in order to establish whether data subjects have the “right to apply to the authority responsible for keeping the register to erase or block the personal data entered in that register after a certain period of time, or to restrict access to it, it is first necessary to ascertain the purpose of that registration” (§48).

The pivotal role of the purpose of the processing operation should not come as a surprise, given the fact that the data retention principle is tightly linked to accomplishing the purpose of the processing operation.

In this case, the Court looked closely at Directive 68/151 and explained at length that the purpose of the disclosure provided for by it is “to protect in particular the interests of third parties in relation to joint stock companies and limited liability companies, since the only safeguards they offer to third parties are their assets” (§49) and “to guarantee legal certainty in relation to dealings between companies and third parties in view of the intensification of trade between Member States” (§50). The CJEU also referred to primary EU law, and specifically to Article 54(3)(g) EEC, one of the legal bases of the directive, which “refers to the need to protect the interests of third parties generally, without distinguishing or excluding any categories falling within the ambit of that term” (§51).

The Court further noted that Directive 68/151 makes no express provision regarding the necessity of keeping personal data in the Companies Register “also after the activity has ceased and the company concerned has been dissolved” (§52). However, the Court notes that “it is common ground that even after the dissolution of a company, rights and legal relations relating to it continue to exist” (§53) and “questions requiring such data may arise for many years after a company has ceased to exist” (§54).

Finally, the CJEU declared:

“in view of the range of possible scenarios … it seems impossible, at present, to identify a single time limit, as from the dissolution of a company, at the end of which the inclusion of such data in the register and their disclosure would no longer be necessary” (§55).

7. Conclusion A: there is no right to erasure

The Court concluded that “in those circumstances” the data retention principle in Article 6(1)(e) DPD and the right to erasure in Article 12(b) DPD do not guarantee for the data subjects referred to in Directive 68/151 a right to obtain “as a matter of principle, after a certain period of time from the dissolution of the company concerned, the erasure of personal data concerning them” (§56).

After already reaching this conclusion, the Court also explained that this interpretation of the provisions in question does not result in “disproportionate interference with the fundamental rights of the persons concerned, and particularly their right to respect for private life and their right to protection of personal data as guaranteed by Articles 7 and 8 of the Charter” (§57).

To this end, the Court took into account:

  • that Directive 68/151 requires “disclosure only for a limited number of personal data items” (§58) and
  • that “it appears justified that natural persons who choose to participate in trade through such a company are required to disclose the data relating to their identity and functions within that company, especially since they are aware of that requirement when they decide to engage in such activity” (§59).

8. Conclusion B: but there is a right to object

After acknowledging that, in principle, the need to protect the interests of third parties in relation to joint-stock companies and limited liability companies and to ensure legal certainty, fair trading and thus the proper functioning of the internal market takes precedence over the right of the data subject to object under Article 14 DPD, the Court points out that

“it cannot be excluded, however, that there may be specific situations in which the overriding and legitimate reasons relating to the specific case of the person concerned justify exceptionally that access to personal data entered in the register is limited, upon expiry of a sufficiently long period after the dissolution of the company in question, to third parties who can demonstrate a specific interest in their consultation” (§60).

While the Court leaves it to the national courts to assess each case “having regard to all the relevant circumstances and taking into account the time elapsed since the dissolution of the company concerned”, it also points out that, in the case of Mr Manni, “the mere fact that, allegedly, the properties of a tourist complex built … do not sell because of the fact that potential purchasers of those properties have access to that data in the company register, cannot be regarded as constituting such a reason, in particular in view of the legitimate interest of those purchasers in having that information” (§63).

9. Post Scriptum

The Court took a very pragmatic approach in dealing with the case of Mr Manni. The principles of interpretation it laid down are solid – such an analysis indeed requires looking at the legitimate grounds for processing and the relevant data quality principle. The Court’s strong emphasis on the significance of the purpose of the processing activity is welcome, as is the additional guidance on the balancing exercise between the rights and interests in question. In addition, a separate assessment of the right to obtain erasure and of the right to object is very helpful with a view to the future – the GDPR becoming fully applicable, with its strengthened rights of the data subject.

The aspect of the judgment that leaves some room for improvement is the analysis of the proportionality of the interference that the virtually unlimited publication of personal data in the Companies Register entails for Articles 7 and 8 of the Charter. The Court does tackle this, but only lightly – and it brings forward two arguments only after having already declared that the interference is not disproportionate. Moreover, the Court does not distinguish between interferences with Article 7 and interferences with Article 8.

Finally, I was happy to see that the predicted outcome of the case, as announced in the pdpEcho commentary on the Opinion of the Advocate General Bot, proved to be mainly correct: “the Court will follow the AG’s Opinion to a large extent. However, it may be more focused on the fundamental rights aspect of balancing the two Directives and it may actually analyse the content of the right to erasure and its exceptions. The outcome, however, is likely to be the same.”

Suggested citation: G. Zanfir-Fortuna, “CJEU in Manni: data subjects do not have the right to obtain erasure from the Companies Register, but they do have the right to object”, pdpEcho.com, 13 March 2017.


[1] Article 3 of Directive 68/151.

[2] Article 6(1)(e) of Directive 95/46.

[3] Article 2(a) of Directive 95/46.

[4] Article 2(d) of Directive 95/46.

[5] Article 2(b) of Directive 95/46.

***

If you find information on this blog useful and would like to read more of it, consider supporting pdpecho here: paypal.me/pdpecho.

The right to be forgotten goes back to the CJEU (with Google, CNIL, sensitive data, freedom of speech)

The Conseil d’Etat announced today that it referred several questions to the Court of Justice of the EU concerning the interpretation of the right to be forgotten, pursuant to Directive 95/46 and following the CJEU’s landmark decision in the Google v Spain case.

The questions were raised within proceedings involving the application of four individuals to the Conseil d’Etat to have decisions issued by the CNIL (French DPA) quashed. These decisions rejected their requests for injunctions against Google to have certain Google Search results delisted.

According to the press release of the Conseil d’Etat, “these requests were aimed at removing links relating to various pieces of information: a video that explicitly revealed the nature of the relationship that an applicant was deemed to have entertained with a person holding a public office; a press article relating to the suicide committed by a member of the Church of Scientology, mentioning that one of the applicants was the public relations manager of that Church; various articles relating to criminal proceedings concerning an applicant; and articles relating the conviction of another applicant for having sexually aggressed minors”.

The Conseil d’Etat further explained that in order to rule on these claims, it has deemed necessary to answer a number of questions “raising serious issues with regard to the interpretation of European law in the light of the European Court of Justice’s judgment in its Google Spain case.

Such issues are in relation with the obligations applying to the operator of a search engine with regard to web pages that contain sensitive data, when collecting and processing such information is illegal or very narrowly framed by legislation, on the grounds of its content relating to sexual orientations, political, religious or philosophical opinions, criminal offences, convictions or safety measures. On that point, the cases brought before the Conseil d’Etat raise questions in close connection with the obligations that lie on the operator of a search engine, when such information is embedded in a press article or when the content that relates to it is false or incomplete”.

***

Find what you’re reading useful? Please consider supporting pdpecho.

CJEU case to follow: purpose limitation, processing sensitive data, non-material damage

A new case received by the General Court of the CJEU was published in the Official Journal of the EU in February, Case T-881/16 HJ v EMA.

A British citizen seeks to engage the non-contractual liability of the European Medicines Agency for breaching data protection law. The applicant claims that “the documents in his personal file, which were made public and accessible to any member of staff of the European Medicines Agency for a period of time, were not processed fairly and lawfully but were processed for purposes other than those for which they were collected without that change in purpose having been expressly authorised by the applicant”.

Further, the applicant claims that “the dissemination of that sensitive data consequently called into question the applicant’s integrity, causing him real and certain non-material harm”.

The applicant asks the Court to “order the defendant to pay the applicant the symbolic sum of EUR 1 by way of compensation for the non-material harm suffered”.

Even though the published summary does not mention the applicable law, it is clear that Regulation 45/2001 – the data protection regulation applicable to EU institutions and bodies (EMA is an EU body) – is relevant in this case. The rules of Regulation 45/2001 are fairly similar to those of Directive 95/46.

(Thanks to Dr. Mihaela Mazilu-Babel for bringing this case to my attention)

***

Find what you’re reading useful? Please consider supporting pdpecho.


Data retention, only possible under strict necessity: targeted retention and pre-authorised access to retained data

The Court of Justice of the European Union (‘the Court’ or ‘CJEU’) gave a second judgment this week on the compatibility of data retention measures with the fundamental rights of persons as guaranteed by the Charter of Fundamental Rights of the EU (in Joined Cases C-203/15 and C-698/15 Tele2Sverige). The Court confirmed all its findings from the earlier Digital Rights Ireland judgment and took the opportunity to clarify and nuance some of its initial key-findings (for an analysis of the DRI judgment, see my article published in 2015).

The two cases that were joined by the Court emerged in the fallout of the invalidation of the Data Retention Directive by the CJEU in the DRI judgment. Even if that Directive was declared invalid for breaching fundamental rights, most of the national laws that transposed it in the Member States were kept in force invoking Article 15(1) of the ePrivacy Directive. This Article provided for an exception to the rule of ensuring confidentiality of communications, which allowed Member States to “inter alia, adopt legislative measures providing for the retention of data for a limited period justified on the grounds laid down in this paragraph”. What the Member States seem to have disregarded with their decision to keep national data retention laws in force was that the same paragraph, last sentence, provided that “all the measures referred to in this paragraph (including data retention – my note) shall be in accordance with the general principles of Community law” (see §91 and §92 of the judgment). Respect for fundamental rights is one of those principles.

The Tele2Sverige case was initiated by a telecommunications service provider that followed the decision of the Court in DRI and stopped retaining data, because it considered that the national law requiring it to retain data was in breach of EU law. The Swedish authorities did not agree with this interpretation, and this is how the Court was given the opportunity to clarify the relationship between national data retention law and EU law after the invalidation of the Data Retention Directive. The Watson case originates in the UK, was initiated by individuals and refers to the Data Retention and Investigatory Powers Act 2014 (DRIPA).

In summary, the Court found that “national legislation which, for the purpose of fighting crime, provides for general and indiscriminate retention of all traffic and location data of all subscribers and registered users relating to all means of electronic communication” is in breach of Article 7 (right to private life), Article 8 (right to the protection of personal data) and Article 11 (right to freedom of speech) from the Charter of Fundamental Rights of the EU. The Court clarified that such legislation is precluded by Article 15(1) of the ePrivacy Directive. (See §1 from the executive part of the judgment)

Moreover, the Court found that national legislation in the field of the ePrivacy Directive that regulates the access of competent national authorities to retained data is incompatible with the three fundamental rights mentioned above where:

  1. the objective pursued by that access, in the context of fighting crime, is not restricted solely to fighting serious crime;
  2. access is not subject to prior review by a court or an independent administrative authority;
  3. there is no requirement that the data concerned should be retained within the European Union (§2 of the operative part of the judgment).

There are a couple of remarkable findings of the Court in the Tele2Sverige/Watson judgment, analysed below. Brace yourselves for a long post. But it’s worth it. I’ll be looking at (1) how indiscriminate retention of metadata interferes with freedom of speech, (2) why data retention is merely an exception to the principle of confidentiality of communications and must not become the rule, (3) why the Court considers retaining metadata on a generalised basis a far-reaching intrusion into the right to private life, (4) what “targeted retention” is and under what conditions the Court sees it as acceptable and, finally, (5) what the impact of all of this is on the Privacy Shield and PNR schemes.


(1) Indiscriminate retention of metadata interferes with freedom of speech

Even though none of the preliminary ruling questions asked the Court to look at the compliance of national data retention measures also in the light of Article 11 of the Charter (freedom of speech), the Court did so of its own motion.

This was needed so that the Court could finish what it began in DRI. In that previous case, the Court referred to Article 11 of the Charter in §28, replying to a specific preliminary ruling question, by mentioning that:

“it is not inconceivable that the retention of the data in question might have an effect on the use, by subscribers or registered users, of the means of communication covered by that directive and, consequently, on their exercise of the freedom of expression guaranteed by Article 11 of the Charter”.

However, it never analysed if that was the case. In §70, the Court just stated that, after finding the Directive to be invalid because it was not compliant with Articles 7 and 8 of the Charter, “there is no need to examine the validity of Directive 2006/24 in the light of Article 11 of the Charter”.

This time, the Court developed its argument. It started by underlining that data retention legislation such as that at issue in the main proceedings “raises questions relating to compatibility not only with Articles 7 and 8 of the Charter, which are expressly referred to in the questions referred for a preliminary ruling, but also with the freedom of expression guaranteed in Article 11 of the Charter” (§92).

The Court continued by emphasising that the importance of freedom of expression must be taken into consideration when interpreting Article 15(1) of the ePrivacy Directive “in the light of the particular importance accorded to that freedom in any democratic society” (§93). “That fundamental right (freedom of expression), guaranteed in Article 11 of the Charter, constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which, under Article 2 TEU, the Union is founded” (§93), it continues.

The Court justifies the link between data retention and freedom of expression by stating, slightly more confidently than in DRI, that:

“the retention of traffic and location data could nonetheless have an effect on the use of means of electronic communication and, consequently, on the exercise by the users thereof of their freedom of expression, guaranteed in Article 11 of the Charter” (§101)

The operative part of the judgment clearly states that Articles 7, 8 and 11 of the Charter preclude data retention legislation such as that in the main proceedings.

(2) The exception to the “principle of confidentiality” must not become the rule

The Court refers several times to a “principle of confidentiality of communications” (§85, §90, §95, §115). It explains in §85 that this principle is established by the ePrivacy Directive and “implies, inter alia, (…) that, as a general rule, any person other than the users is prohibited from storing, without the consent of the users concerned, the traffic data related to electronic communications. The only exceptions relate to persons lawfully authorised in accordance with Article 15(1) of that directive and to the technical storage necessary for conveyance of a communication.”

With regard to the first exception, the Court recalls that, because Article 15(1) is construed so as “to restrict the scope of the obligation of principle to ensure confidentiality of communications and related traffic data”, it “must, in accordance with the Court’s settled case-law, be interpreted strictly” (§89). The Court adds, using strong language:

“That provision cannot, therefore, permit the exception to that obligation of principle and, in particular, to the prohibition on storage of data, laid down in Article 5 of Directive 2002/58, to become the rule, if the latter provision is not to be rendered largely meaningless” (§89).

In any case, the Court adds, all exceptions adopted pursuant to Article 15(1) of the ePrivacy Directive must be in accordance with the general principles of EU law, which include the fundamental rights guaranteed by the Charter (§91) and must strictly have one of the objectives enumerated in Article 15(1) of the ePrivacy Directive (§90).

As for the second derogation from the principle, the Court looks at recitals 22 and 26 of the ePrivacy Directive and affirms that the retention of traffic data is permitted “only to the extent necessary and for the time necessary for the billing and marketing of services and the provision of value added services. (…) As regards, in particular, the billing of services, that processing is permitted only up to the end of the period during which the bill may be lawfully challenged or legal proceedings brought to obtain payment. Once that period has elapsed, the data processed and stored must be erased or made anonymous” (§85).

(3) A”very far-reaching” and “particularly serious” interference

The Court observed that the national legislation at issue in the main proceedings “provides for a general and indiscriminate retention of all traffic and location data of all subscribers and registered users relating to all means of electronic communication, and that it imposes on providers of electronic communications services an obligation to retain that data systematically and continuously, with no exceptions” (§97).

The data retained is metadata and is described in detail in §98. The Court confirmed its assessment in DRI that metadata “taken as a whole, is liable to allow very precise conclusions to be drawn concerning the private lives of the persons whose data has been retained, such as everyday habits, permanent or temporary places of residence, daily or other movements, the activities carried out, the social relationships of those persons and the social environments frequented by them” (§99). It also added that this data “provides the means (…) of establishing a profile of the individuals concerned, information that is no less sensitive, having regard to the right to privacy, than the actual content of communications” (§99).

The Court went further to emphasise that this kind of indiscriminate gathering of data represents a “very far-reaching” and “particularly serious” interference with the fundamental rights to private life and protection of personal data (§100). Moreover, “[t]he fact that the data is retained without the subscriber or registered user being informed is likely to cause the persons concerned to feel that their private lives are the subject of constant surveillance” (§100).

The Court indicates that such a far-reaching interference can only be justified by the objective of fighting serious crime (§102). And even in this case, the objective of fighting serious crime does not justify in itself “general and indiscriminate retention of all traffic and location data” (§103). The measures must, in addition, be strictly necessary to achieve this objective (§106).

The Court found that national legislation such as that at issue in the main proceedings does not comply with this requirement, because (§105):

  • it “covers, in a generalised manner, all subscribers and registered users and all means of electronic communication as well as all traffic data, provides for no differentiation, limitation or exception according to the objective pursued”.
  • “It is comprehensive in that it affects all persons using electronic communication services, even though those persons are not, even indirectly, in a situation that is liable to give rise to criminal proceedings”.
  • It “applies even to persons for whom there is no evidence capable of suggesting that their conduct might have a link, even an indirect or remote one, with serious criminal offences”.
  • “it does not provide for any exception, and consequently it applies even to persons whose communications are subject, according to rules of national law, to the obligation of professional secrecy”.

(4) Targeted data retention is permissible. Here is a list with all conditions:

The Court spells out that fundamental rights do not prevent a Member State from adopting “legislation permitting, as a preventive measure, the targeted retention of traffic and location data, for the purpose of fighting serious crime, provided that the retention of data is limited, with respect to:

  • the categories of data to be retained,
  • the means of communication affected,
  • the persons concerned and
  • the retention period adopted, to what is strictly necessary” (§108).

In addition, such legislation must:

  • “lay down clear and precise rules governing the scope and application of such a data retention measure and imposing minimum safeguards, so that the persons whose data has been retained have sufficient guarantees of the effective protection of their personal data against the risk of misuse.
  • indicate in what circumstances and under which conditions a data retention measure may, as a preventive measure, be adopted, thereby ensuring that such a measure is limited to what is strictly necessary” (§109).

Other conditions that need to be fulfilled for data retention legislation to be considered compatible with fundamental rights are indicated directly or indirectly by the Court in further paragraphs.

Such legislation must:

  • be restricted to “retention in relation to data pertaining to a particular time period and/or geographical area and/or a group of persons likely to be involved, in one way or another, in a serious crime, or
  • persons who could, for other reasons, contribute, through their data being retained, to fighting crime” (§106).
  • “meet objective criteria, that establish a connection between the data to be retained and the objective pursued. In particular, such conditions must be shown to be such as actually to circumscribe, in practice, the extent of that measure and, thus, the public affected” (§110).
  • “be based on objective evidence which makes it possible to identify a public whose data is likely to reveal a link, at least an indirect one, with serious criminal offences, and to contribute in one way or another to fighting serious crime or to preventing a serious risk to public security” (§111).
  • “lay down clear and precise rules indicating in what circumstances and under which conditions the providers of electronic communications services must grant the competent national authorities access to the data. (…) a measure of that kind must be legally binding under domestic law” (§117).
  • “lay down the substantive and procedural conditions governing the access of the competent national authorities to the retained data” (§118).
  • provide that data must be “retained within the European Union” (§122).
  • provide for “the irreversible destruction of the data at the end of the data retention period” (§122).
  • “ensure review, by an independent authority, of compliance with the level of protection guaranteed by EU law with respect to the protection of individuals in relation to the processing of personal data, that control being expressly required by Article 8(3) of the Charter” (§123).

Other specific conditions emerge with regard to access of competent authorities to the retained data. Access:

  • “can be granted, in relation to the objective of fighting crime, only to the data of individuals suspected of planning, committing or having committed a serious crime or of being implicated in one way or another in such a crime” (§119). [The Court refers here to the ECtHR cases of Zakharov and Szabó, after a long series of privacy-related cases where it did not refer at all to the ECtHR case-law].
  • must be subject to “a prior review carried out either by a court or by an independent administrative body” (…) “the decision of that court or body should be made following a reasoned request by those authorities submitted, inter alia, within the framework of procedures for the prevention, detection or prosecution of crime” (§120). The only exception for the prior review are “cases of validly established urgency” (§120).
  • must be notified by authorities to the persons affected “under the applicable national procedures, as soon as that notification is no longer liable to jeopardise the investigations being undertaken by those authorities. That notification is, in fact, necessary to enable the persons affected to exercise, inter alia, their right to a legal remedy” (§121).
  • must be restricted solely to fighting serious crime (§125).

(5) Possible effects on the Privacy Shield and on PNR schemes

This judgment could have indirect effects on the “Privacy Shield” and slightly more immediate effects on Passenger Name Records schemes.

The indirect effect on the Privacy Shield and on all other adequacy schemes could only manifest itself in the context of a challenge of such transfer instruments before the CJEU. The seriousness with which the Court of Justice detailed all the conditions that must be met by a legislative measure providing for a particular processing of personal data, in order to be compliant with the fundamental rights to private life and to the protection of personal data, strengthens the condition of “essential equivalence”.

In other words, it will be difficult to convince the Court that a third country that allows large-scale collection of metadata (and all the more so of the content of communications), with access to that data not subject to the supervision of an independent authority, provides an adequate level of protection that would lawfully allow transfers of data from the EU to that third country. (For comparison, in its Schrems judgment the CJEU referred to the Digital Rights Ireland case eight times, including in key findings.)

As for PNR schemes, the effects may come sooner and more directly, as we are waiting for the Court’s Opinion in Avis 1/15 on the compliance of the EU-Canada PNR agreement with fundamental rights. It is to be expected that the Court will copiously refer back to its new list of conditions for access by authorities to retained personal data when looking at how all PNR data is directly transferred by companies to law enforcement authorities in a third country, with no limitations.

***

Find what you’re reading useful? Please consider supporting pdpecho.

Greek judges asked the CJEU if they should dismiss evidence gathered under the national law that transposed the invalidated Data Retention Directive

Here is a new case at the Court of Justice of the EU that the data protection world will be looking forward to, as it addresses questions about the practical effects of the invalidation of the Data Retention Directive.


Case C-475/16 K. (yes, like those Kafka characters) concerns criminal proceedings against K. before Greek courts, which apparently involve evidence gathered under the Greek national law that transposed the now-invalidated Data Retention Directive. The Directive was invalidated in its entirety by the CJEU in 2014, after the Court found in its Digital Rights Ireland judgment that the provisions of the Directive breached Articles 7 (right to respect for private life) and 8 (right to the protection of personal data) of the Charter of Fundamental Rights.

The Greek judges sent in August a big set of questions for a preliminary ruling to the CJEU (17 questions). Among those, there are a couple of very interesting ones, because they deal with the effects in practice of the invalidation of an EU Directive and what happens with the national laws of the Member States that transposed the Directive.

For instance, the national judge asks whether national courts are obliged not to apply legislative measures transposing the annulled Directive and whether this obligation also means that they must dismiss evidence obtained as a consequence of those legislative measures (Question 3). The national judge also wants to know if maintaining the national law that transposes an invalidated Directive constitutes an obstacle to the establishment and functioning of the internal market (Question 16).

Another question raised by the national judge is whether the national legislation that transposed the annulled Data Retention Directive and that remained in force at national level after the annulment is still considered as falling under the scope of EU law (Question 4). The answer to this question is important because the EU Charter and the supremacy of EU law do not apply to situations that fall outside the scope of EU law.

The Greek judge didn’t miss the opportunity to also ask about the effect on the national law transposing the Data Retention Directive of the fact that this Directive was also enacted to implement a harmonised framework at the European level under Article 15(1) of the ePrivacy Directive (Question 5). The question is whether this fact is enough to bring the surviving national data retention laws under the scope of EU law.

Provided the Charter is considered applicable to the facts of the case, the national judge further wants to know whether national law that complies partly with the criteria set out in the Digital Rights Ireland decision still breaches Articles 7 and 8 of the Charter because it doesn’t comply with all of them (Question 13). For instance, the national judge estimates that the national law doesn’t comply with the requirement that the persons whose data are retained must be at least indirectly in a situation which is liable to give rise to criminal prosecutions (para 58 DRI), but it complies with the requirement that the national law must contain substantive and procedural conditions for the access of competent authorities to the retained data and objective criteria by which the number of persons authorised to access these data is limited to what is strictly necessary (paras 61, 62 DRI).

Lastly, it will also be interesting to see whether the Court decides to address the issue of what “serious crime” means in the context of limiting the exercise of fundamental rights (Questions 10 and 11).

If you would like to delve into some of these topics, have a look at the AG Opinion in the Tele2Sverige case, published on 19 July 2016. The judgment in that case is due on 21 December 2016. Also, have a look at this analysis of the Opinion.

As for a quick “what to expect” in the K. case from my side, here it is:

  • the CJEU will seriously re-organise the 17 questions and regroup them in 4 to 5 topics, also clarifying that it only deals with the interpretation of EU law, not national law or facts in national proceedings;
  • the national laws transposing the Data Retention Directive will probably be considered as being in the field of EU law – as they also regulate within the ambit of the ePrivacy Directive;
  • the Court will restate the criteria in DRI and probably clarify that all criteria must be complied with, no exceptions, in order for national measures to comply with the Charter;
  • the CJEU will probably not give indications to the national courts on whether they should admit or dismiss evidence collected on the basis of national law that does not comply with EU law – it’s too specific and the Court is ‘in the business’ of interpreting EU law; the best case scenario, which is possible, is that the Court will give some guidance on the obligations of Member States (and hopefully their authorities) regarding the effects of their transposing national laws when relevant EU secondary law is annulled;
  • as for what “serious crime” means in the context of limiting fundamental rights, let’s see about that. Probably the Court will give useful guidance.

***

Find what you’re reading useful? Please consider supporting pdpecho.