
Door-to-door gathering of data by religious group goes to the CJEU

Non-automated processing | Filing system | Household Exemption | Controller | Religious community

The Court of Justice of the EU received questions for a preliminary ruling from Finland regarding the practice of a religious group (Jehovah’s Witnesses) of gathering and recording data after door-to-door visits, without informing the individuals concerned about this practice. The questions referred in Case C-25/17 Tietosuojavaltuutettu v Jehovah’s Witnesses concern the interpretation of several key points of Directive 95/46:

  1. Exemptions from the application of the Directive – and particularly the second indent of Article 3(2), which excludes processing “by a natural person in the course of a purely personal or household activity” from the material scope of the Directive. The referring court wants the CJEU to clarify whether this exemption applies to the gathering of data and the recording of observations in paper files by members of the religious group in connection with their door-to-door activity (Question 1).
  2. The concept of “filing system”, as defined in Article 2(c) of the Directive. The question referred by the national court is whether, taken as a whole, the manual collection of personal data (name, address and other information and characteristics of a person) carried out in connection with door-to-door evangelical work constitutes a filing system, thus being subject to the application of the Directive (Question 2).
  3. The concept of “controller” under Article 2(d) of the Directive. In particular, the referring court wants the CJEU to clarify whether in this situation the controller is considered to be the religious community as a whole, “even though the religious community claims that only the individual members carrying out evangelical work have access to the data collected” (Questions 3 and 4).

Without knowing the details of the case, and based only on the information available in the questions referred by the national Court, here is my bet on how the CJEU will reply:

  • The definition of “purely household activity” does not extend to the door-to-door evangelical work of a religious community; this exemption is to be interpreted strictly (“must be narrowly construed”; “must apply only in so far as is strictly necessary”), according to the CJEU in C-212/13 Rynes (§28 and §29). The CJEU also explained that this exception applies “only where it is carried out in the purely personal or household setting of the person processing the data” (§31) – which is not the case of representatives of a religious community gathering information during evangelical work.
  • The records the evangelical workers keep should be considered as constituting a “filing system”. This concept is defined as “any structured set of personal data which are accessible according to specific criteria, whether centralized, decentralized or dispersed on a functional or geographical basis”. According to Recital 15 of the Directive, data in a filing system is “structured according to specific criteria relating to individuals, so as to permit easy access to the personal data in question”. If the religious community were to claim that its records are not structured according to specific criteria – e.g. ZIP codes; members of the community/non-members; individuals interested in the community/individuals not interested – and that they do not allow easy access to the personal data in question, then the purpose of keeping a detailed record would not be achieved. In other words, having an unstructured file is incongruent with the purpose of the activity. While it is true that the Member States have been given a margin of appreciation to lay down different criteria for determining the constituents of a structured set of personal data and the different criteria governing access to such a set, those criteria must be compatible with the definition in the Directive. Moreover, applying the definition “loosely” would amount to a limitation in relation to the protection of personal data, which must apply “only in so far as is strictly necessary” (Rynes §28, DRI §52).
  • The controller of this processing operation should be considered to be the religious community, as this entity establishes the purposes of the processing activity (there is no reference in the questions referred to the declared purpose of this activity, but it is only logical that such records are kept to facilitate the evangelical work of the community) and the means of this activity (“by dividing up the areas in which the activity is carried out among members involved in evangelical work, supervising the work of those members and maintaining a list of individuals who do not wish to receive visits from evangelists” – according to the referring court).

Since this new case provides an opportunity to discuss the processing of personal data by a religious community, there are a couple of additional points to be made.

First of all, according to Recital 35 of the Directive, “processing of personal data by official authorities for achieving aims, laid down in constitutional law or international public law, of officially recognized religious associations is carried out on important grounds of public interest”. This means that religious associations do not need to rely on consent or on their legitimate interest as lawful grounds for processing. However, relying on public interest as the lawful ground for processing does not mean that they do not have to comply with all the other obligations under data protection law. For instance, they still have to comply with the data quality principles, they still have to inform data subjects about the details of the processing activity and they still have to reply to requests for access, correction and erasure.

Second, some of the data gathered in such circumstances is sensitive data, as it refers to “religious beliefs” (Article 8 of the Directive, Article 9 of the GDPR). This means that the data should be processed with additional care and strengthened safeguards.

In case you are wondering whether the GDPR specifically addresses the processing of data by churches and religious communities: Recital 35 of the Directive was transplanted into the GDPR as Recital 55. In addition, the GDPR enshrines a specific provision that covers “existing data protection rules of churches and religious associations” – Article 91. This provision allows Member States that have specific legislation (“comprehensive rules”) dedicated to churches and religious communities in place at the time of entry into force of the GDPR to continue to apply those rules, but only if “they are brought into line with this Regulation”. In addition, according to the second paragraph, processing of personal data by churches and religious associations that apply comprehensive national rules under the first paragraph “shall be subject to the supervision of an independent supervisory authority, which may be specific”. Again, the condition for this to happen is that this specific supervisory authority must fulfil the conditions laid down for independent supervisory authorities in the GDPR.

***

Note: Thanks to Dr. Mihaela Mazilu-Babel for pointing out this new case.


CJEU in Manni: data subjects do not have the right to obtain erasure from the Companies Register, but they do have the right to object

by Gabriela Zanfir-Fortuna

The recent judgment of the CJEU in Case C-398/15 Manni (9 March 2017) brings a couple of significant points to the EU data protection case-law:

  • Clarifies that an individual seeking to limit access to his/her personal data published in a Companies Register does not have the right to obtain erasure of that data, not even after his/her company has ceased to exist;
  • Clarifies that, however, that individual has the right to object to the processing of that data, based on his/her particular circumstances and on justified grounds;
  • Clarifies the link between the purpose of the processing activity and the data retention period, and underlines how important the purpose of the processing activity is when analysing whether a data subject can obtain erasure or blocking of data;
  • Provides insight into the balancing exercise between the interests of third parties in having access to data published in the Companies Register and the rights of the individual to obtain erasure of the data and to object to its processing.

This commentary will highlight all points enumerated above.

1. Facts of the case

Mr Manni had requested his regional Chamber of Commerce to erase his personal data from the Public Registry of Companies, after he found out that he was losing clients who performed background checks on him through a private company specialised in finding information in the Public Registry. This happened because Mr Manni had been an administrator of a company that was declared bankrupt more than 10 years before the facts in the main proceedings. In fact, the former company itself had been struck off the Public Registry (§23 to §29).

2. The question in Manni

The question that the CJEU had to answer in Manni was whether the obligation of Member States to keep public Companies Registers[1] and the requirement that personal data must be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected[2] must be interpreted as meaning that individuals must be allowed to “request the authority responsible for maintaining the Companies Register to limit, after a certain period has elapsed from the dissolution of the company concerned and on the basis of a case-by-case assessment, access to personal data concerning them and entered in that register” (§30).

3. Applicability of Directive 95/46 (Data Protection Directive – ‘DPD’)

First, the CJEU clarified that its analysis does not concern the processing of data by the specialised rating company and refers only to the obligations of the public authority keeping the companies register (§31). Second, the CJEU ascertained that the provisions of the DPD are applicable in this case:

  • the identification data of Mr Manni recorded in the Register is personal data[3] – “the fact that information was provided as part of a professional activity does not mean that it cannot be characterized as personal data” (§34);
  • the authority keeping the register is a “controller”[4] that carries out “processing of personal data”[5] by “transcribing and keeping that information in the register and communicating it, where appropriate, on request to third parties” (§35).

4. The role of the data quality principles and the legitimate grounds for processing in ensuring a high level of protection of fundamental rights

Further, the CJEU recalls its case-law stating that the DPD “seeks to ensure a high level of protection of the fundamental rights and freedoms of natural persons” (§37) and that the provisions of the DPD “must necessarily be interpreted in the light of the fundamental rights guaranteed by the Charter”, especially Article 7 (respect for private life) and Article 8 (protection of personal data) (§39). The Court recalls the content of Articles 7 and 8 and specifically lays out that the requirements under Article 8 of the Charter “are implemented inter alia in Articles 6, 7, 12, 14 and 28 of Directive 95/46” (§40).

The Court highlights the significance of the data quality principles and the legitimate grounds for processing under the DPD in the context of ensuring a high level of protection of fundamental rights:

“[S]ubject to the exceptions permitted under Article 13 of that directive, all processing of personal data must comply, first, with the principles relating to data quality set out in Article 6 of the directive and, secondly, with one of the criteria for making data processing legitimate listed in Article 7 of the directive” (§41 and case-law cited).

The Court applies this test in reverse order, which is, indeed, more logical. A processing activity should, first, be legitimate under one of the lawful grounds for processing, and only after ascertaining that this is the case should the question of compliance with the data quality principles arise.

The CJEU finds that in the case at hand the processing activity is legitimized by three lawful grounds (§42, §43):

  • compliance with a legal obligation [Article 7(c)];
  • the exercise of official authority or the performance of a task carried out in the public interest [Article 7(e)] and
  • the realization of a legitimate interest pursued by the controller or by the third parties to whom the data are disclosed [Article 7(f)].

5. The link between the data retention principle, the right to erasure and the right to object

Article 6(1)(e) of the DPD requires that personal data be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected or for which they are further processed. This means that controllers should retain personal data only for as long as it serves the purpose for which it was processed, and should then anonymise, erase or otherwise make that data unavailable. If the controller does not comply with this obligation, the data subject has two possible avenues to stop the processing: he/she can either ask for erasure of that data, or object to the processing based on his/her particular situation and a justified objection.

CJEU explains that “in the event of failure to comply with the condition laid down in Article 6(1)(e)” of the DPD, “Member States guarantee the person concerned, pursuant to Article 12(b) thereof, the right to obtain from the controller, as appropriate, the erasure or blocking of the data concerned” (§46 and C-131/12 Google/Spain §70).

In addition, the Court explains, Member States also must “grant the data subject the right, inter alia in the cases referred to in Article 7(e) and (f) of that directive, to object at any time on compelling legitimate grounds relating to his particular situation to the processing of data relating to him, save where otherwise provided by national legislation”, pursuant to Article 14(a) DPD (§47).

The CJEU further explains that “the balancing to be carried out under subparagraph (a) of the first paragraph of Article 14 … enables account to be taken in a more specific manner of all the circumstances surrounding the data subject’s particular situation. Where there is a justified objection, the processing instigated by the controller may no longer involve those data” (§47).

6. The pivotal role of the purpose of the processing activity in granting the right to erasure and the right to object

After establishing these general rules, the Court decides that in order to establish whether data subjects have the “right to apply to the authority responsible for keeping the register to erase or block the personal data entered in that register after a certain period of time, or to restrict access to it, it is first necessary to ascertain the purpose of that registration” (§48).

The pivotal role of the purpose of the processing operation should not come as a surprise, given the fact that the data retention principle is tightly linked to accomplishing the purpose of the processing operation.

In this case, the Court looked closely at Directive 68/151 and explained at length that the purpose of the disclosure provided for by it is “to protect in particular the interests of third parties in relation to joint stock companies and limited liability companies, since the only safeguards they offer to third parties are their assets” (§49) and “to guarantee legal certainty in relation to dealings between companies and third parties in view of the intensification of trade between Member States” (§50). CJEU also referred to primary EU law, and specifically to Article 54(3)(g) EEC, one of the legal bases of the directive, which “refers to the need to protect the interests of third parties generally, without distinguishing or excluding any categories falling within the ambit of that term” (§51).

The Court further noted that Directive 68/151 makes no express provision regarding the necessity of keeping personal data in the Companies Register “also after the activity has ceased and the company concerned has been dissolved” (§52). However, the Court notes that “it is common ground that even after the dissolution of a company, rights and legal relations relating to it continue to exist” (§53) and “questions requiring such data may arise for many years after a company has ceased to exist” (§54).

Finally, the CJEU declared:

“in view of the range of possible scenarios … it seems impossible, at present, to identify a single time limit, as from the dissolution of a company, at the end of which the inclusion of such data in the register and their disclosure would no longer be necessary” (§55).

7. Conclusion A: there is no right to erasure

The Court concluded that “in those circumstances” the data retention principle in Article 6(1)(e) DPD and the right to erasure in Article 12(b) DPD do not guarantee for the data subjects referred to in Directive 68/151 a right to obtain “as a matter of principle, after a certain period of time from the dissolution of the company concerned, the erasure of personal data concerning them” (§56).

After already reaching this conclusion, the Court also explained that this interpretation of the provisions in question does not result in “disproportionate interference with the fundamental rights of the persons concerned, and particularly their right to respect for private life and their right to protection of personal data as guaranteed by Articles 7 and 8 of the Charter” (§57).

To this end, the Court took into account:

  • that Directive 68/151 requires “disclosure only for a limited number of personal data items” (§58) and
  • that “it appears justified that natural persons who choose to participate in trade through such a company are required to disclose the data relating to their identity and functions within that company, especially since they are aware of that requirement when they decide to engage in such activity” (§59).

8. Conclusion B: but there is a right to object

After acknowledging that, in principle, the need to protect the interests of third parties in relation to joint-stock companies and limited liability companies and to ensure legal certainty, fair trading and thus the proper functioning of the internal market take precedence over the right of the data subject to object under Article 14 DPD, the Court points out that

“it cannot be excluded, however, that there may be specific situations in which the overriding and legitimate reasons relating to the specific case of the person concerned justify exceptionally that access to personal data entered in the register is limited, upon expiry of a sufficiently long period after the dissolution of the company in question, to third parties who can demonstrate a specific interest in their consultation” (§60).

While the Court leaves it to the national courts to assess each case “having regard to all the relevant circumstances and taking into account the time elapsed since the dissolution of the company concerned”, it also points out that, in the case of Mr Manni, “the mere fact that, allegedly, the properties of a tourist complex built … do not sell because of the fact that potential purchasers of those properties have access to that data in the company register, cannot be regarded as constituting such a reason, in particular in view of the legitimate interest of those purchasers in having that information” (§63).

9. Post Scriptum

The Court took a very pragmatic approach in dealing with the case of Mr Manni. The principles of interpretation it laid down are solid – such an analysis indeed requires looking at the legitimate grounds for processing and the relevant data quality principle. The strong emphasis the Court places on the significance of the purpose of the processing activity is welcome, as is the additional guidance on the balancing exercise between the rights and interests in question. In addition, a separate assessment of the right to obtain erasure and of the right to object is very helpful with a view to the future – the full application of the GDPR and its heightened rights of the data subject.

The aspect of the judgment that leaves some room for improvement is the analysis of the proportionality of the interference that the virtually unlimited publication of personal data in the Companies Register entails with Articles 7 and 8 of the Charter. The Court does tackle this, but lightly – and it brings two arguments only after already declaring that the interference is not disproportionate. Moreover, the Court does not distinguish between interferences with Article 7 and interferences with Article 8.

Finally, I was happy to see that the predicted outcome of the case, as announced in the pdpEcho commentary on the Opinion of the Advocate General Bot, proved to be mainly correct: “the Court will follow the AG’s Opinion to a large extent. However, it may be more focused on the fundamental rights aspect of balancing the two Directives and it may actually analyse the content of the right to erasure and its exceptions. The outcome, however, is likely to be the same.”

Suggested citation: G. Zanfir-Fortuna, “CJEU in Manni: data subjects do not have the right to obtain erasure from the Companies Register, but they do have the right to object”, pdpEcho.com, 13 March 2017.


[1] Article 3 of Directive 68/151.

[2] Article 6(1)(e) of Directive 95/46.

[3] Article 2(a) of Directive 95/46.

[4] Article 2(d) of Directive 95/46.

[5] Article 2(b) of Directive 95/46.

***


The right to be forgotten goes back to the CJEU (with Google, CNIL, sensitive data, freedom of speech)

The Conseil d’Etat announced today that it referred several questions to the Court of Justice of the EU concerning the interpretation of the right to be forgotten, pursuant to Directive 95/46 and following the CJEU’s landmark decision in the Google v Spain case.

The questions were raised within proceedings involving the application of four individuals to the Conseil d’Etat to have decisions issued by the CNIL (French DPA) quashed. These decisions rejected their requests for injunctions against Google to have certain Google Search results delisted.

According to the press release of the Conseil d’Etat, “these requests were aimed at removing links relating to various pieces of information: a video that explicitly revealed the nature of the relationship that an applicant was deemed to have entertained with a person holding a public office; a press article relating to the suicide committed by a member of the Church of Scientology, mentioning that one of the applicants was the public relations manager of that Church; various articles relating to criminal proceedings concerning an applicant; and articles relating the conviction of another applicant for having sexually aggressed minors”.

The Conseil d’Etat further explained that in order to rule on these claims, it has deemed necessary to answer a number of questions “raising serious issues with regard to the interpretation of European law in the light of the European Court of Justice’s judgment in its Google Spain case.

“Such issues are in relation with the obligations applying to the operator of a search engine with regard to web pages that contain sensitive data, when collecting and processing such information is illegal or very narrowly framed by legislation, on the grounds of its content relating to sexual orientations, political, religious or philosophical opinions, criminal offences, convictions or safety measures. On that point, the cases brought before the Conseil d’Etat raise questions in close connection with the obligations that lie on the operator of a search engine, when such information is embedded in a press article or when the content that relates to it is false or incomplete”.

***


CJEU case to follow: purpose limitation, processing sensitive data, non-material damage

A new case received by the General Court of the CJEU was published in the Official Journal of the EU in February: Case T-881/16 HJ v EMA.

A British citizen seeks to engage the non-contractual liability of the European Medicines Agency for breaching data protection law. The applicant claims that “the documents in his personal file, which were made public and accessible to any member of staff of the European Medicines Agency for a period of time, were not processed fairly and lawfully but were processed for purposes other than those for which they were collected without that change in purpose having been expressly authorised by the applicant”.

Further, the applicant claims that “the dissemination of that sensitive data consequently called into question the applicant’s integrity, causing him real and certain non-material harm”.

The applicant asks the Court to “order the defendant to pay the applicant the symbolic sum of EUR 1 by way of compensation for the non-material harm suffered”.

Even though the published summary makes no mention of the applicable law, it is clear that Regulation 45/2001 – the data protection regulation applicable to EU institutions and bodies (EMA is an EU body) – is relevant in this case. The rules of Regulation 45/2001 are fairly similar to those of Directive 95/46.

(Thanks to Dr. Mihaela Mazilu-Babel for bringing this case to my attention)

***


Data retention, only possible under strict necessity: targeted retention and pre-authorised access to retained data

The Court of Justice of the European Union (‘the Court’ or ‘CJEU’) gave a second judgment this week on the compatibility of data retention measures with the fundamental rights of persons as guaranteed by the Charter of Fundamental Rights of the EU (Joined Cases C-203/15 and C-698/15 Tele2 Sverige and Watson). The Court confirmed all its findings from the earlier Digital Rights Ireland judgment and took the opportunity to clarify and nuance some of its initial key findings (for an analysis of the DRI judgment, see my article published in 2015).

The two cases that were joined by the Court emerged in the fallout of the invalidation of the Data Retention Directive by the CJEU in the DRI judgment. Even if that Directive was declared invalid for breaching fundamental rights, most of the national laws that transposed it in the Member States were kept in force invoking Article 15(1) of the ePrivacy Directive. This Article provided for an exception to the rule of ensuring confidentiality of communications, which allowed Member States to “inter alia, adopt legislative measures providing for the retention of data for a limited period justified on the grounds laid down in this paragraph”. What the Member States seem to have disregarded with their decision to keep national data retention laws in force was that the same paragraph, last sentence, provided that “all the measures referred to in this paragraph (including data retention – my note) shall be in accordance with the general principles of Community law” (see §91 and §92 of the judgment). Respect for fundamental rights is one of those principles.

The Tele2Sverige case was initiated by a telecommunications service provider that followed the decision of the Court in DRI and stopped retaining data, because it considered that the national law requiring it to retain data was in breach of EU law. The Swedish authorities did not agree with this interpretation, and this is how the Court was given the opportunity to clarify the relationship between national data retention law and EU law after the invalidation of the Data Retention Directive. The Watson case originates in the UK, was initiated by individuals and refers to the Data Retention and Investigatory Powers Act 2014 (DRIPA).

In summary, the Court found that “national legislation which, for the purpose of fighting crime, provides for general and indiscriminate retention of all traffic and location data of all subscribers and registered users relating to all means of electronic communication” is in breach of Article 7 (right to private life), Article 8 (right to the protection of personal data) and Article 11 (right to freedom of speech) of the Charter of Fundamental Rights of the EU. The Court clarified that such legislation is precluded by Article 15(1) of the ePrivacy Directive (see §1 of the operative part of the judgment).

Moreover, the Court found that national legislation in the field of the ePrivacy Directive that regulates the access of competent national authorities to retained data is incompatible with the three fundamental rights mentioned above, as long as:

  1. the objective pursued by that access, in the context of fighting crime, is not restricted solely to fighting serious crime;
  2. access is not subject to prior review by a court or an independent administrative authority;
  3. there is no requirement that the data concerned should be retained within the European Union (§2 of the operative part of the judgment).

There are a couple of remarkable findings of the Court in the Tele2Sverige/Watson judgment, analysed below. Brace yourselves for a long post. But it’s worth it. I’ll be looking at (1) how indiscriminate retention of metadata interferes with freedom of speech, (2) why data retention is merely an exception of the principle of confidentiality of communications and must not become the rule, (3) why the Court considers retaining on a generalised basis metadata is a far-reaching intrusion in the right to private life, (4) what is “targeted retention” and under what conditions the Court sees it acceptable and, finally (5) what is the impact of all of this on the Privacy Shield and PNR schemes.

 

(1) Indiscriminate retention of metadata interferes with freedom of speech

Even though none of the preliminary ruling questions asked the Court to look at the compliance of national data retention measures also in the light of Article 11 of the Charter (freedom of speech), the Court did so of its own motion.

This was needed so that the Court could finish what it began in DRI. In that previous case, the Court referred to Article 11 of the Charter in §28, replying to a specific preliminary ruling question, by mentioning that:

“it is not inconceivable that the retention of the data in question might have an effect on the use, by subscribers or registered users, of the means of communication covered by that directive and, consequently, on their exercise of the freedom of expression guaranteed by Article 11 of the Charter”.

However, it never analysed if that was the case. In §70, the Court just stated that, after finding the Directive to be invalid because it was not compliant with Articles 7 and 8 of the Charter, “there is no need to examine the validity of Directive 2006/24 in the light of Article 11 of the Charter”.

This time, the Court developed its argument. It started by underlining that data retention legislation such as that at issue in the main proceedings “raises questions relating to compatibility not only with Articles 7 and 8 of the Charter, which are expressly referred to in the questions referred for a preliminary ruling, but also with the freedom of expression guaranteed in Article 11 of the Charter” (§92).

The Court continued by emphasising that the importance of freedom of expression must be taken into consideration when interpreting Article 15(1) of the ePrivacy Directive “in the light of the particular importance accorded to that freedom in any democratic society” (§93). “That fundamental right (freedom of expression), guaranteed in Article 11 of the Charter, constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which, under Article 2 TEU, the Union is founded” (§93), it continues.

The Court justifies the link between data retention and freedom of expression by stating, slightly more confidently than in DRI, that:

“the retention of traffic and location data could nonetheless have an effect on the use of means of electronic communication and, consequently, on the exercise by the users thereof of their freedom of expression, guaranteed in Article 11 of the Charter” (§101)

The operative part of the judgment clearly states that Articles 7, 8 and 11 of the Charter preclude data retention legislation such as that in the main proceedings.

(2) The exception to the “principle of confidentiality” must not become the rule

The Court refers several times to a “principle of confidentiality of communications” (§85, §90, §95, §115). It explains in §85 that this principle is established by the ePrivacy Directive and “implies, inter alia, (…) that, as a general rule, any person other than the users is prohibited from storing, without the consent of the users concerned, the traffic data related to electronic communications. The only exceptions relate to persons lawfully authorised in accordance with Article 15(1) of that directive and to the technical storage necessary for conveyance of a communication.”

With regard to the first exception, the Court recalls that, because Article 15(1) is construed so as “to restrict the scope of the obligation of principle to ensure confidentiality of communications and related traffic data”, it “must, in accordance with the Court’s settled case-law, be interpreted strictly” (§89). The Court adds, using strong language:

“That provision cannot, therefore, permit the exception to that obligation of principle and, in particular, to the prohibition on storage of data, laid down in Article 5 of Directive 2002/58, to become the rule, if the latter provision is not to be rendered largely meaningless” (§89).

In any case, the Court adds, all exceptions adopted pursuant to Article 15(1) of the ePrivacy Directive must be in accordance with the general principles of EU law, which include the fundamental rights guaranteed by the Charter (§91) and must strictly have one of the objectives enumerated in Article 15(1) of the ePrivacy Directive (§90).

As for the second derogation to the principle, the Court looks at recitals 22 and 26 of the ePrivacy Directive and affirms that the retention of traffic data is permitted “only to the extent necessary and for the time necessary for the billing and marketing of services and the provision of value added services. (…) As regards, in particular, the billing of services, that processing is permitted only up to the end of the period during which the bill may be lawfully challenged or legal proceedings brought to obtain payment. Once that period has elapsed, the data processed and stored must be erased or made anonymous” (§85).

(3) A “very far-reaching” and “particularly serious” interference

The Court observed that the national legislation at issue in the main proceedings “provides for a general and indiscriminate retention of all traffic and location data of all subscribers and registered users relating to all means of electronic communication, and that it imposes on providers of electronic communications services an obligation to retain that data systematically and continuously, with no exceptions” (§97).

The data retained is metadata and is described in detail in §98. The Court confirmed its assessment in DRI that metadata “taken as a whole, is liable to allow very precise conclusions to be drawn concerning the private lives of the persons whose data has been retained, such as everyday habits, permanent or temporary places of residence, daily or other movements, the activities carried out, the social relationships of those persons and the social environments frequented by them” (§99). It also added that this data “provides the means (…) of establishing a profile of the individuals concerned, information that is no less sensitive, having regard to the right to privacy, than the actual content of communications” (§99).

The Court went further to emphasise that this kind of indiscriminate gathering of data represents a “very far-reaching” and “particularly serious” interference with the fundamental rights to private life and protection of personal data (§100). Moreover, “[t]he fact that the data is retained without the subscriber or registered user being informed is likely to cause the persons concerned to feel that their private lives are the subject of constant surveillance” (§100).

The Court indicates that such a far-reaching interference can only be justified by the objective of fighting serious crime (§102). And even in this case, the objective of fighting serious crime does not justify in itself “general and indiscriminate retention of all traffic and location data” (§103). The measures must, in addition, be strictly necessary to achieve this objective (§106).

The Court found that national legislation such as that at issue in the main proceedings does not comply with this requirement, because (§105):

  • it “covers, in a generalised manner, all subscribers and registered users and all means of electronic communication as well as all traffic data, provides for no differentiation, limitation or exception according to the objective pursued”.
  • “It is comprehensive in that it affects all persons using electronic communication services, even though those persons are not, even indirectly, in a situation that is liable to give rise to criminal proceedings”.
  • It “applies even to persons for whom there is no evidence capable of suggesting that their conduct might have a link, even an indirect or remote one, with serious criminal offences”.
  • “it does not provide for any exception, and consequently it applies even to persons whose communications are subject, according to rules of national law, to the obligation of professional secrecy”.

(4) Targeted data retention is permissible. Here is a list with all conditions:

The Court spells out that fundamental rights do not prevent a Member State from adopting “legislation permitting, as a preventive measure, the targeted retention of traffic and location data, for the purpose of fighting serious crime, provided that the retention of data is limited, with respect to:

  • the categories of data to be retained,
  • the means of communication affected,
  • the persons concerned and
  • the retention period adopted, to what is strictly necessary” (§108).

In addition, such legislation must:

  • “lay down clear and precise rules governing the scope and application of such a data retention measure and imposing minimum safeguards, so that the persons whose data has been retained have sufficient guarantees of the effective protection of their personal data against the risk of misuse.
  • indicate in what circumstances and under which conditions a data retention measure may, as a preventive measure, be adopted, thereby ensuring that such a measure is limited to what is strictly necessary” (§109).

Other conditions that need to be fulfilled for a data retention legislation to be considered compatible with fundamental rights are indicated directly or indirectly by the Court in further paragraphs.

Such legislation must:

  • be restricted to “retention in relation to data pertaining to a particular time period and/or geographical area and/or a group of persons likely to be involved, in one way or another, in a serious crime, or persons who could, for other reasons, contribute, through their data being retained, to fighting crime” (§106).
  • “meet objective criteria, that establish a connection between the data to be retained and the objective pursued. In particular, such conditions must be shown to be such as actually to circumscribe, in practice, the extent of that measure and, thus, the public affected” (§110).
  • “be based on objective evidence which makes it possible to identify a public whose data is likely to reveal a link, at least an indirect one, with serious criminal offences, and to contribute in one way or another to fighting serious crime or to preventing a serious risk to public security” (§111).
  • “lay down clear and precise rules indicating in what circumstances and under which conditions the providers of electronic communications services must grant the competent national authorities access to the data. (…) a measure of that kind must be legally binding under domestic law” (§117).
  • “lay down the substantive and procedural conditions governing the access of the competent national authorities to the retained data” (§118).
  • provide that data must be “retained within the European Union” (§122).
  • provide for “the irreversible destruction of the data at the end of the data retention period” (§122).
  • “ensure review, by an independent authority, of compliance with the level of protection guaranteed by EU law with respect to the protection of individuals in relation to the processing of personal data, that control being expressly required by Article 8(3) of the Charter” (§123).

Other specific conditions emerge with regard to access of competent authorities to the retained data. Access:

  • “can be granted, in relation to the objective of fighting crime, only to the data of individuals suspected of planning, committing or having committed a serious crime or of being implicated in one way or another in such a crime” (§119). [The Court refers here to the ECtHR cases of Zakharov and Szabó, after a long series of privacy-related cases in which it did not refer at all to the ECtHR case-law.]
  • must be subject to “a prior review carried out either by a court or by an independent administrative body” (…) “the decision of that court or body should be made following a reasoned request by those authorities submitted, inter alia, within the framework of procedures for the prevention, detection or prosecution of crime” (§120). The only exception for the prior review are “cases of validly established urgency” (§120).
  • must be notified by authorities to the persons affected “under the applicable national procedures, as soon as that notification is no longer liable to jeopardise the investigations being undertaken by those authorities. That notification is, in fact, necessary to enable the persons affected to exercise, inter alia, their right to a legal remedy” (§121).
  • must be restricted solely to fighting serious crime (§125).

(5) Possible effects on the Privacy Shield and on PNR schemes

This judgment could have indirect effects on the “Privacy Shield” and slightly more immediate effects on Passenger Name Records schemes.

The indirect effect on the Privacy Shield and on all other adequacy schemes could only manifest itself in the context of a challenge to such transfer instruments before the CJEU. The seriousness with which the Court of Justice detailed all the conditions that must be met by a legislative measure providing for a particular processing of personal data in order to be compliant with the fundamental rights to private life and to the protection of personal data strengthens the condition of “essential equivalence”.

In other words, it will be difficult to convince the Court that a third country that allows the collection of metadata (and all the more so of the content of communications) on a large scale, and access to that data without the supervision of an independent authority, provides an adequate level of protection that would lawfully allow transfers of data from the EU to that third country. (For comparison, the CJEU referred to the Digital Rights Ireland case eight times, including in key findings, in its Schrems judgment.)

As for PNR schemes, the effects may come sooner and more directly, as we are waiting for the Court’s Opinion in Avis 1/15 on the compliance of the EU-Canada PNR agreement with fundamental rights. It is to be expected that the Court will copiously refer back to its new list of conditions for access by authorities to retained personal data when looking at how all PNR data is directly transferred by companies to law enforcement authorities in a third country, with no limitations.

***


Greek judges asked the CJEU if they should dismiss evidence gathered under the national law that transposed the invalidated Data Retention Directive

Here is a new case at the Court of Justice of the EU that the data protection world will be looking forward to, as it addresses questions about the practical effects of the invalidation of the Data Retention Directive.


Case C-475/16 K. (yes, like those Kafka characters) concerns criminal proceedings against K. before Greek courts, which apparently involve evidence gathered under the Greek national law that transposed the now-invalidated Data Retention Directive. The Directive was invalidated in its entirety by the CJEU in 2014, after the Court found in its Digital Rights Ireland judgment that the provisions of the Directive breached Articles 7 (right to respect for private life) and 8 (right to the protection of personal data) of the Charter of Fundamental Rights.

The Greek judges sent a big set of questions for a preliminary ruling to the CJEU in August (17 questions). Among them, there are a couple of very interesting ones, because they deal with the practical effects of the invalidation of an EU Directive and with what happens to the national laws of the Member States that transposed it.

For instance, the national judge asks whether national courts are obliged not to apply legislative measures transposing the annulled Directive and whether this obligation also means that they must dismiss evidence obtained as a consequence of those legislative measures (Question 3). The national judge also wants to know if maintaining the national law that transposes an invalidated Directive constitutes an obstacle to the establishment and functioning of the internal market (Question 16).

Another question raised by the national judge is whether the national legislation that transposed the annulled Data Retention Directive and that remained in force at national level after the annulment is still considered as falling under the scope of EU law (Question 4). The answer to this question is important because the EU Charter and the supremacy of EU law do not apply to situations that fall outside the scope of EU law.

The Greek judge didn’t miss the opportunity to also ask about the effect on the national law transposing the Data Retention Directive of the fact that this Directive was also enacted to implement a harmonised framework at the European level under Article 15(1) of the ePrivacy Directive (Question 5). The question is whether this fact is enough to bring the surviving national data retention laws under the scope of EU law.

If the Charter is considered applicable to the facts of the case, the national judge further wants to know whether national law that complies only partly with the criteria set out in the Digital Rights Ireland decision still breaches Articles 7 and 8 of the Charter because it does not comply with all of them (Question 13). For instance, the national judge estimates that the national law does not comply with the requirement that the persons whose data are retained must be at least indirectly in a situation which is liable to give rise to criminal prosecutions (para 58 DRI), but that it complies with the requirement that the national law must contain substantive and procedural conditions for the access of competent authorities to the retained data and objective criteria by which the number of persons authorised to access these data is limited to what is strictly necessary (paras 61, 62 DRI).

Lastly, it will be also interesting to see whether the Court decides to address the issue of what “serious crime” means in the context of limiting the exercise of fundamental rights (Questions 10 and 11).

If you would like to delve into some of these topics, have a look at the AG Opinion in the Tele2Sverige case, published on 19 July 2016. The judgment in that case is due on 21 December 2016. Also, have a look at this analysis of the Opinion.

As for a quick “what to expect” in the K. case from my side, here it is:

  • the CJEU will seriously re-organise the 17 questions and regroup them in 4 to 5 topics, also clarifying that it only deals with the interpretation of EU law, not national law or facts in national proceedings;
  • the national laws transposing the Data Retention Directive will probably be considered as being in the field of EU law – as they also regulate within the ambit of the ePrivacy Directive;
  • the Court will restate the criteria in DRI and probably clarify that all criteria must be complied with, no exceptions, in order for national measures to comply with the Charter;
  • the CJEU will probably not give indications to the national courts on whether they should admit or dismiss evidence collected on the basis of national law that does not comply with EU law – it’s too specific and the Court is ‘in the business’ of interpreting EU law; the best case scenario, which is possible, is that the Court will give some guidance on the obligations of Member States (and hopefully their authorities) regarding the effects of their transposing national laws when the relevant EU secondary law is annulled;
  • as for what “serious crime” means in the context of limiting fundamental rights, let’s see about that. Probably the Court will give useful guidance.

***


Even if post-Brexit UK adopts the GDPR, it will be left without its “heart”

Gabriela Zanfir Fortuna


There has lately been a wave of optimism among those looking for legal certainty that the GDPR will be adopted by the UK even after the country leaves the European Union. This wave was prompted by a declaration of the British Secretary of State, Karen Bradley, at the end of October, when she stated before a Committee of the Parliament that “We will be members of the EU in 2018 and therefore it would be expected and quite normal for us to opt into the GDPR and then look later at how best we might be able to help British business with data protection while maintaining high levels of protection for members of the public”. The Information Commissioner of the UK, Elizabeth Denham, welcomed the news. On the other hand, as Amberhawk explained in detail, this will not mean that the UK will automatically be considered as ensuring an adequate level of protection.

The truth is that as long as the UK is still a member of the EU, it cannot opt into or out of regulations (other than those subject to the exemptions negotiated by the UK – but this is not the case for the GDPR). They are “binding in their entirety” and “directly applicable”, according to Article 288 of the Treaty on the Functioning of the EU. So, yes, quite normally, if the UK is still a Member State of the EU on 25 May 2018, then the GDPR will start applying in the UK just as it will be applying in Estonia or France.

The fate of the GDPR after Brexit becomes effective will be as uncertain as the fate of all other EU legislative acts transposed in the UK or directly applicable in the UK. But let’s imagine the GDPR will remain national law after Brexit, in one form or another. If this happens, it is likely that it will take on a life of its own, departing from harmonised application throughout the EU. First and foremost, the GDPR in the UK will not be applied in the light of the Charter of Fundamental Rights of the EU and especially its Article 8 – the right to the protection of personal data. The Charter played an extraordinary role in the strengthening of data protection in the EU after it became binding in 2009, being invoked by the Court of Justice of the EU in its landmark judgments – Google v Spain, Digital Rights Ireland and Schrems.

The Court held as far back as 2003 that “the provisions of Directive 95/46, in so far as they govern the processing of personal data liable to infringe fundamental freedoms, in particular the right to privacy, must necessarily be interpreted in the light of fundamental rights” (Österreichischer Rundfunk, para 68). This principle was repeated in most of the following cases interpreting Directive 95/46 and other relevant secondary law for this field, perhaps with the most notable results in Digital Rights Ireland and Schrems. 

See, for instance:

“As far as concerns the rules relating to the security and protection of data retained by providers of publicly available electronic communications services or of public communications networks, it must be held that Directive 2006/24 does not provide for sufficient safeguards, as required by Article 8 of the Charter, to ensure effective protection of the data retained against the risk of abuse and against any unlawful access and use of that data” (Digital Rights Ireland, para. 66).

“As regards the level of protection of fundamental rights and freedoms that is guaranteed within the European Union, EU legislation involving interference with the fundamental rights guaranteed by Articles 7 and 8 of the Charter must, according to the Court’s settled case-law, lay down clear and precise rules governing the scope and application of a measure and imposing minimum safeguards, so that the persons whose personal data is concerned have sufficient guarantees enabling their data to be effectively protected against the risk of abuse and against any unlawful access and use of that data. The need for such safeguards is all the greater where personal data is subjected to automatic processing and where there is a significant risk of unlawful access to that data” (Schrems, para. 91).

Applying data protection law outside the spectrum of fundamental rights will most likely not ensure sufficient protection for the person. While the UK will still remain under the legal effect of the European Convention on Human Rights and its Article 8 – respect for private life – this does not come close to the specific protection ensured to personal data by Article 8 of the Charter as interpreted and applied by the CJEU.

Not only will the Charter not be binding on the UK post-Brexit, but the Court of Justice of the EU will no longer have jurisdiction over the UK (unless some sort of spectacular agreement is negotiated for Brexit). Moreover, EU law will not enjoy supremacy over national law, as is the case right now. This means that British data protection law will be able to depart from the European standard (the GDPR) to the extent desired by the legislature. For instance, there will be nothing standing in the way of the British legislature adopting permissive exemptions from the rights of the data subject, pursuant to Article 23 GDPR.

So when I mentioned in the title that the GDPR in post-Brexit UK will in any case be left without its “heart”, I was referring to its application and interpretation in the light of the Charter of Fundamental Rights of the EU.

***



Section 5. The awkward two-level necessity test that convinced the AG that PNR schemes are acceptable

(Section 5 of the Analysis of the AG Opinion in the “PNR Canada” Case: unlocking an “unprecedented and delicate” matter)

After he establishes that the Court should carry out “a strict review of compliance with the requirements resulting from the principle of proportionality, and more particularly, from the adequacy of the level of protection of the fundamental rights guaranteed in the Union when Canada processes and uses the PNR data pursuant to the agreement envisaged” (§200), the AG further assesses whether the interference is “strictly necessary”.

He considers the “strict necessity” test as a component of the proportionality test, together with “the ability of the interference to achieve the ‘public security’ objective pursued by the Agreement”.

With regard to the latter criterion, the AG does not believe “there are any real obstacles to recognising that the interference constituted by the agreement envisaged is capable of attaining the objective of public security, in particular the objective of combating terrorism and serious transnational crime” (§205). “As the United Kingdom Government and the Commission, in particular, have claimed, the transfer of PNR data for analysis and retention provides the Canadian authorities with additional opportunities to identify passengers, hitherto not known and not suspected, who might have connections with other persons and/or passengers involved in a terrorist network or participating in serious transnational criminal activities” (§205).

In addition, the AG considers the statistics provided by the Commission and the UK relevant for finding that “the data constitutes a valuable tool for criminal investigations” (§205). He reaches this conclusion despite the fact that at §151, when summarizing the contributions of the parties before the Court, he recalls that “The Commission accepts that there are no precise statistics indicating the contribution which PNR data makes to the prevention and detection of crime and terrorism, and to the investigation and prosecution of offences of those types.”

With regard to the strict necessity of the interference, the AG establishes that its assessment “entails ascertaining whether the contracting parties have struck a ‘fair balance’ between the objective of combating terrorism and serious transnational crime and the objective of protecting personal data and respecting the private life of the persons concerned” (§207), referring to §77 of the Schecke judgment. That paragraph in Schecke seems to me to establish a different principle – namely that, when balancing two opposing rights, one of which is the right to the protection of personal data, it must be taken into account that “derogations and limitations in relation to the protection of personal data must apply only in so far as is strictly necessary”[1].

Nevertheless, the AG goes on to state that “the terms of the agreement envisaged must also consist of the measures least harmful to the rights recognised by Articles 7 and 8 of the Charter, while making an effective contribution to the public security objective pursued by the agreement envisaged” (§208). He explains:

“That means that it is not sufficient to imagine, in the abstract, the existence of alternative measures that would be less intrusive in the fundamental rights at issue. Those alternative measures must also be sufficiently effective, that is to say, their effectiveness must, in my view, be comparable with those provided for in the agreement envisaged, in order to attain the public security objective pursued by that agreement” (§208).

In quite a big leap, AG Mengozzi relies, for this twofold necessity test, on a paragraph of the Schwarz judgment (§53), which states that “the Court has not been made aware of any measures which would be both sufficiently effective in helping to achieve the aim of protecting against the fraudulent use of passports and less of a threat to the rights recognised by Articles 7 and 8 of the Charter than the measures deriving from the method based on the use of fingerprints.”

This twofold test is not used in either of the Court’s most recent landmark judgments – DRI, which relies heavily on the analysis of the condition of “necessity”, and Schrems. However, looking at strict necessity through this lens of proportionality and equivalent effectiveness persuaded the AG to conclude that PNR schemes, even if they constitute the kind of interference he accurately described in §176, are acceptable.

Comparing the wealth of PNR data to the data usually collected for border control purposes by immigration authorities, including Advance Passenger Information (API) and information collected by Canadian authorities for their eVA program, the AG concluded that “data of that type (API, eVA – my note) does not reveal information about the booking methods, payment methods used and travel habits, the cross-checking of which can be useful for the purposes of combating terrorism and other serious transnational criminal activities. Independently of the methods used to process that data, the API and the data required for the issue of an eVA are therefore not sufficient to attain with comparable effectiveness the public security objective pursued by the agreement envisaged” (§214).

The AG further justifies the fact that the PNR data of all passengers is transferred to the Canadian authorities, “even though there is no indication that their conduct may have a connection with terrorism or serious transnational crime” (§215), by arguing that “as the interested parties have explained, the actual interest of PNR schemes, whether they are adopted unilaterally or form the subject matter of an international agreement, is specifically to guarantee the bulk transfer of data that will allow the competent authorities to identify, with the assistance of automated processing and scenario tools or predetermined assessment criteria, individuals not known to the law enforcement services who may nonetheless present an ‘interest’ or a risk to public security and who are therefore liable to be subjected subsequently to more thorough individual checks” (§216).

Referring to the fact that the Agreement involves the transfer of data of all passengers between the Union and Canada, irrespective of whether or not they are suspects, he finds at §244 that “no other measure which, while limiting the number of persons whose PNR data is automatically processed by the Canadian competent authority, would be capable of attaining with comparable effectiveness the public security aim pursued by the contracting parties has been brought to the Court’s attention in the context of the present proceedings”.

The AG therefore concluded that “generally, the scope ratione personae of the agreement envisaged cannot be limited further without harming the very object of the PNR regimes” (§245).

Another characteristic of PNR schemes that is generally considered questionable – the lack of ex ante control over access to PNR data – is found justifiable by the AG in the light of the “fair balance” test for strict necessity: “the appropriate balance that must be struck between the effective pursuit of the fight against terrorism and serious transnational crime and respect for a high level of protection of the personal data of the passengers concerned does not necessarily require that a prior control of access to the PNR data must be envisaged” (§269).

Therefore, in the view of AG Mengozzi, the idea of PNR schemes seems to be compatible with the fundamental rights to data protection and respect for private life. However, the list of conditions he develops for the Agreement in the current case to be fully compliant with EU primary law is quite long and quite strict, and it spells bad news for other similar arrangements.

 

……………………………………………

[1] §77 of Schecke states this: “It is thus necessary to determine whether the Council of the European Union and the Commission balanced the European Union’s interest in guaranteeing the transparency of its acts and ensuring the best use of public funds against the interference with the right of the beneficiaries concerned to respect for their private life in general and to the protection of their personal data in particular. The Court has held in this respect that derogations and limitations in relation to the protection of personal data must apply only in so far as is strictly necessary (Satakunnan Markkinapörssi and Satamedia, paragraph 56).”