
Planet49 CJEU Judgment brings some ‘Cookie Consent’ Certainty to Planet Online Tracking

The Court of Justice of the European Union published yesterday its long-awaited judgment in the Planet49 case, referred by a German Court in proceedings initiated by a non-governmental consumer protection organization representing the participants in an online lottery. It dealt with questions that should have been clarified long ago, ever since Article 5(3) was introduced into Directive 2002/58 (the ‘ePrivacy Directive’) by a 2009 amendment, with Member States transposing and then applying its requirements inconsistently:

  • Is obtaining consent through a pre-ticked box valid when placing cookies on website users’ devices?
  • Must the notice given to the user when obtaining consent include the duration of the operation of the cookies being placed and whether or not third parties may have access to those cookies?
  • Does it matter for the application of the ePrivacy rules whether the data accessed through the cookies being placed is personal or non-personal?

The Court answered all of the above, while at the same time signaling to Member States that a disparate approach in transposing and implementing the ePrivacy Directive is not consistent with EU law, and setting clear guidance on what ‘specific’, ‘unambiguous’ and ‘informed’ consent means.

The core of the Court’s findings is that:

  • pre-ticked boxes do not amount to valid consent,
  • the expiration date of cookies and any third-party sharing must be disclosed to users when obtaining consent,
  • different purposes must not be bundled under the same consent request,
  • for consent to be valid, ‘an active behaviour with a clear view’ (which I read as ‘intention’) of consenting is required (so notices claiming that consent is given by users merely continuing to use the website very likely do not meet this threshold; see the illustrative sketch below), and
  • (quite consequentially) these rules apply to cookies regardless of whether the data accessed is personal or not.
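
To make the practical consequences of these findings easier to picture for anyone building a consent flow, here is a minimal sketch in TypeScript of how a per-purpose consent record could be modelled so that it only ever reflects an active, granular choice. It is purely illustrative: all names and fields are my own assumptions, and the judgment does not prescribe any particular implementation.

```typescript
// Illustrative sketch only: type and function names are assumptions,
// not a standard API or anything prescribed by the judgment.

interface ConsentRecord {
  purposeId: string;      // one record per purpose: consent must not be bundled
  granted: boolean;       // must default to false: no pre-ticked boxes
  activeAction: boolean;  // true only for a deliberate act (e.g. ticking a box),
                          // never inferred from scrolling or continued browsing
  timestamp: Date;
}

// Consent counts as valid for a single purpose only when it results from an
// active, intentional act of the user.
function isValidConsent(record: ConsentRecord): boolean {
  return record.granted && record.activeAction;
}

// A record created when the page loads, before any user action, is never valid.
const defaultRecord: ConsentRecord = {
  purposeId: "advertising",
  granted: false,
  activeAction: false,
  timestamp: new Date(),
};
console.log(isValidConsent(defaultRecord)); // false
```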

Unfortunately, though, the Court did not tackle one other very important issue: what does ‘freely given’ consent mean? In other words, would requiring and obtaining consent for placing cookies for the purpose of online tracking for behavioural advertising, as a condition of access to an online service such as an online lottery (as in Planet49’s case), be considered ‘freely given’ consent?

An answer to this question would have affected all online publishers and online service providers that condition access to their services on allowing online behavioural tracking cookies to be installed on user devices and that rely on ‘cookie walls’ as a source of income for their businesses. What is interesting is that the Court included a paragraph in the judgment specifically stating that it does not give its view on this issue because it was not asked to do so by the referring German Court (paragraph 64). Notably, ‘freely given’ is the only one of the four conditions for valid consent that the Court did not assess in its judgment and that it specifically singled out as being left open.

Finally, one very important point to highlight is that all of these findings were made under the rules for valid consent as provided by Directive 95/46. The Court even specified that its finding concerning ‘unambiguous’ consent is made under the old directive. This is relevant because the definition of consent in Article 2(h) of Directive 95/46 only refers to ‘any freely given specific and informed indication’ of agreement. However, Article 7(a) of the directive provides that the data subject’s consent may make processing lawful if it was given ‘unambiguously’.

With the GDPR, the four scattered conditions have been gathered under Article 4(11) and reinforced by clearer recitals. The fact remains that the conditions for valid consent were just as strong under Directive 95/46. The Court almost pointedly highlights that its interpretation is made on the conditions provided under the old legal regime and that it applies to the GDPR ‘a fortiori‘ (paragraph 60); (see here for what a fortiori means in legal interpretation).

Consequently, it seems that consent obtained for placing cookies with the help of pre-ticked boxes, or through inaction, or through action without the intent to give consent, was unlawfully obtained even prior to the GDPR entering into force. It remains to be seen whether any action by supervisory authorities will follow to tackle some of the collections of data built on unlawfully obtained consent, or whether they will take a clean-slate approach.

For a deeper dive into the key findings of the Planet49 CJEU judgment, read below:

Discrepancies in applying ePrivacy at Member State level, unjustifiable based on the Directive’s text

Before assessing the substance of the questions referred, the Court makes some preliminary findings. Among them, it finds that ‘the need for a uniform application of EU law and the principle of equality require that the wording of a provision of EU law which makes no express reference to the law of the Member States for the purpose of determining its meaning and scope must normally be given an autonomous and uniform interpretation throughout the European Union’ (paragraph 47). Article 5(3) of the ePrivacy Directive, being sufficiently clear and precise in what it asks of Member States, leaves no room for Member State law to determine the scope and meaning of its provisions (see paragraph 46 for the Court’s argument).

In practice, divergent transposition and implementation of the ePrivacy Directive has created different regimes across the Union, which had consequences for the effectiveness of its enforcement.

‘Unambiguous’ means ‘active behaviour’ and intent to give consent

The Court starts its assessment from a linguistic interpretation of the wording of Article 5(3) of Directive 2002/58. It notes that the provision doesn’t require a specific way of obtaining consent to the storage of and access to cookies on users’ devices. The Court observes that ‘the wording ‘given his or her consent’ does however lend itself to a literal interpretation according to which action is required on the part of the user in order to give his or her consent.

In that regard, it is clear from recital 17 of Directive 2002/58 that, for the purposes of that directive, a user’s consent may be given by any appropriate method enabling a freely given specific and informed indication of the user’s wishes, including by ticking a box when visiting an internet website‘ (paragraph 49).

The Court highlights that per Article 2(f) of Directive 2002/58 the meaning of a user’s ‘consent’ under the ePrivacy Directive is meant to be the same as that of a data subject’s consent under Directive 95/46 (paragraph 50). By referring to Article 2(h) of the former data protection directive, the Court observes that ‘the requirement of an ‘indication’ of the data subject’s wishes clearly points to active, rather than passive, behaviour’ (paragraph 52). The Court then concludes that ‘consent given in the form of a preselected tick in a checkbox does not imply active behaviour on the part of a website user’ (paragraph 52).

Interestingly, the Court points out that this interpretation of what ‘indication’ means ‘is borne out by Article 7 of Directive 95/46’ (paragraph 53), and in particular by Article 7(a), which ‘provides that the data subject’s consent may make such processing lawful provided that the data subject has given his or her consent ‘unambiguously’’ (paragraph 54). So even if the definition of consent in Directive 95/46 does not refer to this condition in particular, the Court nevertheless anchored its main arguments in it.

The Court then made another important interpretation concerning what ‘unambiguous’ consent means: ‘Only active behaviour on the part of the data subject with a view to giving his or her consent may fulfil that requirement’ (paragraph 54). This wording (‘with a view to’) suggests that there is a condition of willfulness, of intent to give consent in order for the indication of consent to be lawful.

In addition, to be even clearer, the Court finds that ‘it would appear impossible in practice to ascertain objectively whether a website user had actually given his or her consent to the processing of his or her personal data by not deselecting a pre-ticked checkbox nor, in any event, whether that consent had been informed. It is not inconceivable that a user would not have read the information accompanying the preselected checkbox, or even would not have noticed that checkbox, before continuing with his or her activity on the website visited’ (paragraph 55).

A fortiori, it appears impossible in practice to ascertain objectively whether a website user has actually given his or her consent to the processing of his or her personal data by merely continuing with his or her activity on the website visited (continuing browsing or scrolling), or whether that consent has been informed, given that the information shown to him or her does not even include a pre-ticked checkbox which would at least give the opportunity to uncheck the box. Also, just as the Court points out, it is not inconceivable that a user would not have read the information stating that by continuing to use the website they give consent.

With these two findings in paragraphs 54 and 55, the Court seems to clarify once and for all that informing users that continuing their activity on a website signifies consent to placing cookies on their device is not sufficient to obtain valid consent under the ePrivacy Directive, read in the light of both Directive 95/46 and the GDPR.

‘Specific’ means consent can’t be inferred from bundled purposes

The next condition the Court analyses is specificity. In particular, the Court finds that ‘specific’ consent means that ‘it must relate specifically to the processing of the data in question and cannot be inferred from an indication of the data subject’s wishes for other purposes’ (paragraph 58). This means that bundled consent will not be considered valid and that consent should be sought granularly for each purpose of processing.

‘Informed’ means being able to determine the consequences of any consent given

One of the questions sent for a preliminary ruling by the German Court concerned specific categories of information that should be disclosed to users in the context of obtaining consent for placing cookies. Article 5(3) of the ePrivacy Directive requires that the user is provided with ‘clear and comprehensive information’ in accordance with Directive 95/46 (now replaced by the GDPR). The question was whether this notice must also include (a) the duration of the operation of cookies and (b) whether or not third parties may have access to those cookies.

The Court clarified that providing ‘clear and comprehensive’ information means ‘that a user is in a position to be able to determine easily the consequences of any consent he or she might give and ensure that the consent given is well informed. It must be clearly comprehensible and sufficiently detailed so as to enable the user to comprehend the functioning of the cookies employed’ (paragraph 74). Therefore, using language that is easily comprehensible for the user is important, just as it is important to paint a full picture of the functioning of the cookies for which consent is sought.

The Court found specifically with regard to cookies that ‘aim to collect information for advertising purposes’ that ‘the duration of the operation of the cookies and whether or not third parties may have access to those cookies form part of the clear and comprehensive information‘ which must be provided to the user (paragraph 75).
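
For illustration only, the two elements the Court singles out could be surfaced in the notice shown to the user before consent is requested, roughly along the following lines. This is a hypothetical sketch: the field names and wording are my own assumptions, not language required by the judgment.

```typescript
// Hypothetical sketch of the per-purpose disclosure shown before consent is asked.

interface CookieDisclosure {
  purpose: string;                   // plain-language description of the cookies' function
  durationDays: number | "session";  // how long the cookies operate (expiration)
  thirdPartyRecipients: string[];    // third parties that may access the cookies
}

function renderNotice(d: CookieDisclosure): string {
  const duration =
    d.durationDays === "session" ? "for this session only" : `for ${d.durationDays} days`;
  const recipients =
    d.thirdPartyRecipients.length > 0
      ? `They can be accessed by: ${d.thirdPartyRecipients.join(", ")}.`
      : "No third parties have access to them.";
  return `${d.purpose} These cookies remain on your device ${duration}. ${recipients}`;
}

// Example (values are invented for illustration):
console.log(renderNotice({
  purpose: "Cookies used to show you personalised advertising.",
  durationDays: 365,
  thirdPartyRecipients: ["advertising partners of the lottery organiser"],
}));
```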

Moreover, the Court adds that ‘information on the duration of the operation of cookies must be regarded as meeting the requirement of fair data processing‘ (paragraph 78). This is remarkable, since the Court doesn’t usually make findings in its data protection case-law with regard to the fairness of processing. Doubling down on its fairness considerations, the Court goes even further and links fairness of the disclosure of the retention time to the fact that ‘a long, or even unlimited, duration means collecting a large amount of information on users’ surfing behaviour and how often they may visit the websites of the organiser of the promotional lottery’s advertising partners’ (paragraph 78).

It is irrelevant whether the data accessed by cookies is personal or anonymous: the ePrivacy provisions apply regardless

The Court was specifically asked to clarify whether the cookie consent rules in the ePrivacy Directive apply differently depending on the nature of the data being accessed. In other words, does it matter whether the data being accessed by the cookies is personal or anonymised/aggregated/de-identified?

First of all, the Court points out that in the case at hand, ‘the storage of cookies … amounts to a processing of personal data’ (paragraph 67). That being said, the Court nonetheless notes that the provision analyzed merely refers to ‘information’ and does so ‘without characterizing that information or specifying that it must be personal data’ (paragraph 68).

The Court explained that this general framing of the provision ‘aims to protect the user from interference with his or her private sphere, regardless of whether or not that interference involves personal data’ (paragraph 69). This finding is particularly relevant for the current legislative debate over the revamp of the ePrivacy Directive. It is clear that the core difference between the GDPR framework and the ePrivacy regime is what they protect: the GDPR is concerned with ensuring the protection of personal data and fair data processing whenever personal data is being collected and used, while the ePrivacy framework is concerned with shielding the private sphere of an individual from any unwanted interference. That private sphere/private center of interest may include personal data or not.

The Court further refers to recital 24 of the ePrivacy Directive, which mentions that “any information stored in the terminal equipment of users of electronic communications networks are part of the private sphere of the users requiring protection under the European Convention for the Protection of Human Rights and Fundamental Freedoms. That protection applies to any information stored in such terminal equipment, regardless of whether or not it is personal data, and is intended, in particular, as is clear from that recital, to protect users from the risk that hidden identifiers and other similar devices enter those users’ terminal equipment without their knowledge” (paragraph 70).

Conclusion

The judgment of the CJEU in Planet49 provides some much-needed certainty about how the ‘cookie banner’ and ‘cookie consent’ provisions in the ePrivacy Directive should be applied, after years of disparate approaches in national transposition laws and by supervisory authorities, which led to a lack of effectiveness in enforcement and, hence, compliance. The judgment does leave open one burning question: what does ‘freely given consent’ mean? It is important to note nonetheless that before the ‘freely given’ question is even reached, any consent obtained for placing cookies (or similar technologies) on user devices will have to meet all of the other three conditions. If even one of them is not met, that consent is invalid.

***

You can refer to this summary by quoting G. Zanfir-Fortuna, ‘Planet49 CJEU Judgment brings some ‘Cookie Consent’ Certainty to Planet Online Tracking’, http://www.pdpecho.com, published on October 3, 2019.

Data retention, only possible under strict necessity: targeted retention and pre-authorised access to retained data

The Court of Justice of the European Union (‘the Court’ or ‘CJEU’) gave a second judgment this week on the compatibility of data retention measures with the fundamental rights of persons as guaranteed by the Charter of Fundamental Rights of the EU (in Joined Cases C-203/15 and C-698/15 Tele2Sverige). The Court confirmed all its findings from the earlier Digital Rights Ireland judgment and took the opportunity to clarify and nuance some of its initial key findings (for an analysis of the DRI judgment, see my article published in 2015).

The two cases joined by the Court emerged in the fallout of the invalidation of the Data Retention Directive by the CJEU in the DRI judgment. Even though that Directive was declared invalid for breaching fundamental rights, most of the national laws that transposed it in the Member States were kept in force, invoking Article 15(1) of the ePrivacy Directive. This Article provided for an exception to the rule of ensuring confidentiality of communications, which allowed Member States to “inter alia, adopt legislative measures providing for the retention of data for a limited period justified on the grounds laid down in this paragraph”. What the Member States seem to have disregarded when deciding to keep national data retention laws in force was that the last sentence of the same paragraph provided that “all the measures referred to in this paragraph (including data retention – my note) shall be in accordance with the general principles of Community law” (see §91 and §92 of the judgment). Respect for fundamental rights is one of those principles.

The Tele2Sverige case was initiated by a telecommunications service provider that followed the decision of the Court in DRI and stopped retaining data, because it considered that the national law requiring it to retain data was in breach of EU law. The Swedish authorities did not agree with this interpretation, and this is how the Court was given the opportunity to clarify the relationship between national data retention law and EU law after the invalidation of the Data Retention Directive. The Watson case originates in the UK, was initiated by individuals and refers to the Data Retention and Investigatory Powers Act 2014 (DRIPA).

In summary, the Court found that “national legislation which, for the purpose of fighting crime, provides for general and indiscriminate retention of all traffic and location data of all subscribers and registered users relating to all means of electronic communication” is in breach of Article 7 (right to private life), Article 8 (right to the protection of personal data) and Article 11 (right to freedom of expression) of the Charter of Fundamental Rights of the EU. The Court clarified that such legislation is precluded by Article 15(1) of the ePrivacy Directive (see §1 of the operative part of the judgment).

Moreover, the Court found that national legislation in the field of the ePrivacy Directive that regulates the access of competent national authorities to retained data is incompatible with the three fundamental rights mentioned above, as long as:

  1. the objective pursued by that access, in the context of fighting crime, is not restricted solely to fighting serious crime;
  2. access is not subject to prior review by a court or an independent administrative authority;
  3. there is no requirement that the data concerned should be retained within the European Union (§2 of the operative part of the judgment).

There are several remarkable findings of the Court in the Tele2Sverige/Watson judgment, analysed below. Brace yourselves for a long post. But it’s worth it. I’ll be looking at (1) how indiscriminate retention of metadata interferes with freedom of speech, (2) why data retention is merely an exception to the principle of confidentiality of communications and must not become the rule, (3) why the Court considers that retaining metadata on a generalised basis is a far-reaching intrusion into the right to private life, (4) what “targeted retention” is and under what conditions the Court sees it as acceptable and, finally, (5) what the impact of all of this is on the Privacy Shield and PNR schemes.

 

(1) Indiscriminate retention of metadata interferes with freedom of speech

Even though none of the preliminary ruling questions asked the Court to look at the compliance of national data retention measures also in the light of Article 11 of the Charter (freedom of speech), the Court did so of its own motion.

This was needed so that the Court could finish what it began in DRI. In that previous case, the Court referred to Article 11 of the Charter in §28, replying to a specific preliminary ruling question, by mentioning that:

“it is not inconceivable that the retention of the data in question might have an effect on the use, by subscribers or registered users, of the means of communication covered by that directive and, consequently, on their exercise of the freedom of expression guaranteed by Article 11 of the Charter”.

However, it never analysed whether that was the case. In §70, the Court simply stated that, after finding the Directive to be invalid because it was not compliant with Articles 7 and 8 of the Charter, “there is no need to examine the validity of Directive 2006/24 in the light of Article 11 of the Charter”.

This time, the Court developed its argument. It started by underlining that data retention legislation such as that at issue in the main proceedings “raises questions relating to compatibility not only with Articles 7 and 8 of the Charter, which are expressly referred to in the questions referred for a preliminary ruling, but also with the freedom of expression guaranteed in Article 11 of the Charter” (§92).

The Court continued by emphasising that the importance of freedom of expression must be taken into consideration when interpreting Article 15(1) of the ePrivacy Directive “in the light of the particular importance accorded to that freedom in any democratic society” (§93). “That fundamental right (freedom of expression), guaranteed in Article 11 of the Charter, constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which, under Article 2 TEU, the Union is founded” (§93), it continues.

The Court justifies the link between data retention and freedom of expression by stating, slightly more confidently than in DRI, that:

“the retention of traffic and location data could nonetheless have an effect on the use of means of electronic communication and, consequently, on the exercise by the users thereof of their freedom of expression, guaranteed in Article 11 of the Charter” (§101)

The operative part of the judgment clearly states that Articles 7, 8 and 11 of the Charter preclude data retention legislation such as that in the main proceedings.

(2) The exception to the “principle of confidentiality” must not become the rule

The Court refers several times to a “principle of confidentiality of communications” (§85, §90, §95, §115). It explains in §85 that this principle is established by the ePrivacy Directive and “implies, inter alia, (…) that, as a general rule, any person other than the users is prohibited from storing, without the consent of the users concerned, the traffic data related to electronic communications. The only exceptions relate to persons lawfully authorised in accordance with Article 15(1) of that directive and to the technical storage necessary for conveyance of a communication.”

With regard to the first exception, the Court recalls that, because Article 15(1) is construed so as “to restrict the scope of the obligation of principle to ensure confidentiality of communications and related traffic data”, it “must, in accordance with the Court’s settled case-law, be interpreted strictly” (§89). The Court adds, using strong language:

“That provision cannot, therefore, permit the exception to that obligation of principle and, in particular, to the prohibition on storage of data, laid down in Article 5 of Directive 2002/58, to become the rule, if the latter provision is not to be rendered largely meaningless” (§89).

In any case, the Court adds, all exceptions adopted pursuant to Article 15(1) of the ePrivacy Directive must be in accordance with the general principles of EU law, which include the fundamental rights guaranteed by the Charter (§91) and must strictly have one of the objectives enumerated in Article 15(1) of the ePrivacy Directive (§90).

As for the second derogation from the principle, the Court looks at recitals 22 and 26 of the ePrivacy Directive and affirms that the retention of traffic data is permitted “only to the extent necessary and for the time necessary for the billing and marketing of services and the provision of value added services. (…) As regards, in particular, the billing of services, that processing is permitted only up to the end of the period during which the bill may be lawfully challenged or legal proceedings brought to obtain payment. Once that period has elapsed, the data processed and stored must be erased or made anonymous” (§85).

(3) A “very far-reaching” and “particularly serious” interference

The Court observed that the national data retention legislation at issue in the main proceedings “provides for a general and indiscriminate retention of all traffic and location data of all subscribers and registered users relating to all means of electronic communication, and that it imposes on providers of electronic communications services an obligation to retain that data systematically and continuously, with no exceptions” (§97).

The data retained is metadata and is described in detail in §98. The Court confirmed its assessment in DRI that metadata “taken as a whole, is liable to allow very precise conclusions to be drawn concerning the private lives of the persons whose data has been retained, such as everyday habits, permanent or temporary places of residence, daily or other movements, the activities carried out, the social relationships of those persons and the social environments frequented by them” (§99). It also added that this data “provides the means (…) of establishing a profile of the individuals concerned, information that is no less sensitive, having regard to the right to privacy, than the actual content of communications” (§99).

The Court went further to emphasise that this kind of indiscriminate gathering of data represents a “very far-reaching” and “particularly serious” interference with the fundamental rights to private life and protection of personal data (§100). Moreover, “[t]he fact that the data is retained without the subscriber or registered user being informed is likely to cause the persons concerned to feel that their private lives are the subject of constant surveillance” (§100).

The Court indicates that such a far-reaching interference can only be justified by the objective of fighting serious crime (§102). And even in this case, the objective of fighting serious crime does not justify in itself “general and indiscriminate retention of all traffic and location data” (§103). The measures must, in addition, be strictly necessary to achieve this objective (§106).

The Court found that national legislation such as that at issue in the main proceedings does not comply with this requirement, because (§105):

  • it “covers, in a generalised manner, all subscribers and registered users and all means of electronic communication as well as all traffic data, provides for no differentiation, limitation or exception according to the objective pursued”.
  • “It is comprehensive in that it affects all persons using electronic communication services, even though those persons are not, even indirectly, in a situation that is liable to give rise to criminal proceedings”.
  • It “applies even to persons for whom there is no evidence capable of suggesting that their conduct might have a link, even an indirect or remote one, with serious criminal offences”.
  • “it does not provide for any exception, and consequently it applies even to persons whose communications are subject, according to rules of national law, to the obligation of professional secrecy”.

(4) Targeted data retention is permissible. Here is a list of all the conditions:

The Court spells out that fundamental rights do not prevent a Member State from adopting “legislation permitting, as a preventive measure, the targeted retention of traffic and location data, for the purpose of fighting serious crime, provided that the retention of data is limited, with respect to:

  • the categories of data to be retained,
  • the means of communication affected,
  • the persons concerned and
  • the retention period adopted, to what is strictly necessary” (§108).

In addition, such legislation must:

  • “lay down clear and precise rules governing the scope and application of such a data retention measure and imposing minimum safeguards, so that the persons whose data has been retained have sufficient guarantees of the effective protection of their personal data against the risk of misuse.
  • indicate in what circumstances and under which conditions a data retention measure may, as a preventive measure, be adopted, thereby ensuring that such a measure is limited to what is strictly necessary” (§109).

Other conditions that need to be fulfilled for a data retention legislation to be considered compatible with fundamental rights are indicated directly or indirectly by the Court in further paragraphs.

Such legislation must:

  • be restricted to “retention in relation to data pertaining to a particular time period and/or geographical area and/or a group of persons likely to be involved, in one way or another, in a serious crime, or persons who could, for other reasons, contribute, through their data being retained, to fighting crime” (§106).
  • “meet objective criteria, that establish a connection between the data to be retained and the objective pursued. In particular, such conditions must be shown to be such as actually to circumscribe, in practice, the extent of that measure and, thus, the public affected” (§110).
  • “be based on objective evidence which makes it possible to identify a public whose data is likely to reveal a link, at least an indirect one, with serious criminal offences, and to contribute in one way or another to fighting serious crime or to preventing a serious risk to public security” (§111).
  • “lay down clear and precise rules indicating in what circumstances and under which conditions the providers of electronic communications services must grant the competent national authorities access to the data. (…) a measure of that kind must be legally binding under domestic law” (§117).
  • “lay down the substantive and procedural conditions governing the access of the competent national authorities to the retained data” (§118).
  • provide that data must be “retained within the European Union” (§122).
  • provide for “the irreversible destruction of the data at the end of the data retention period” (§122).
  • “ensure review, by an independent authority, of compliance with the level of protection guaranteed by EU law with respect to the protection of individuals in relation to the processing of personal data, that control being expressly required by Article 8(3) of the Charter” (§123).

Other specific conditions emerge with regard to the access of competent authorities to the retained data (a rough illustrative sketch of how these conditions combine follows after the list). Access:

  • “can be granted, in relation to the objective of fighting crime, only to the data of individuals suspected of planning, committing or having committed a serious crime or of being implicated in one way or another in such a crime” (§119). [The Court refers here to the ECtHR cases of Zakharov and Szabó, after a long series of privacy-related cases in which it did not refer at all to the ECtHR case-law.]
  • must be subject to “a prior review carried out either by a court or by an independent administrative body” (…) “the decision of that court or body should be made following a reasoned request by those authorities submitted, inter alia, within the framework of procedures for the prevention, detection or prosecution of crime” (§120). The only exception for the prior review are “cases of validly established urgency” (§120).
  • must be notified by authorities to the persons affected “under the applicable national procedures, as soon as that notification is no longer liable to jeopardise the investigations being undertaken by those authorities. That notification is, in fact, necessary to enable the persons affected to exercise, inter alia, their right to a legal remedy” (§121).
  • must be restricted solely to fighting serious crime (§125).
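
Purely as an illustration of how these cumulative conditions stack up, here is a rough TypeScript sketch of the gate that would conceptually have to be passed before access to retained data is granted. The types, names and simplifications are my own assumptions; the judgment addresses legislators and does not prescribe any implementation.

```typescript
// Illustrative sketch of the cumulative access conditions drawn from §119–§122.

interface AccessRequest {
  objective: "serious_crime" | "other_crime" | "other";  // access only for fighting serious crime
  targetsSuspectOrImplicatedPerson: boolean;  // §119: data of individuals suspected or implicated
  priorReviewBy: "court" | "independent_body" | "none";  // §120: prior independent review
  validlyEstablishedUrgency: boolean;         // §120: the only exception to prior review
  dataRetainedInEU: boolean;                  // §122: data must be retained within the EU
}

function accessPermitted(req: AccessRequest): boolean {
  const reviewOk = req.priorReviewBy !== "none" || req.validlyEstablishedUrgency;
  return (
    req.objective === "serious_crime" &&
    req.targetsSuspectOrImplicatedPerson &&
    reviewOk &&
    req.dataRetainedInEU
  );
}

// Notification of the persons affected (§121) must still follow, as soon as it
// no longer jeopardises the investigation.
```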

(5) Possible effects on the Privacy Shield and on PNR schemes

This judgment could have indirect effects on the “Privacy Shield” and slightly more immediate effects on Passenger Name Records schemes.

The indirect effect on the Privacy Shield and on all other adequacy schemes could only manifest itself in the context of a challenge of such transfer instruments before the CJEU. The seriousness with which the Court of Justice detailed all the conditions that must be met by a legislative measure providing for a particular processing of personal data in order to be compliant with the fundamental rights to private life and to the protection of personal data strengthens the standard of “essential equivalence”.

In other words, it will be difficult to convince the Court that a third country that allows collection of metadata (and all the more so content of communications) on a large scale, and access to that data that is not made under the supervision of an independent authority, provides an adequate level of protection that would lawfully allow transfers of data from the EU to that third country. (For comparison, the CJEU referred to the Digital Rights Ireland case 8 times, including in key findings, in its judgment in Schrems.)

As for PNR schemes, the effects may come sooner and more directly, as we await the Court’s Opinion in Avis 1/15 on the compliance of the EU-Canada PNR agreement with fundamental rights. It is to be expected that the Court will copiously refer back to its new list of conditions for access by authorities to retained personal data when looking at how all PNR data is directly transferred by companies to law enforcement authorities in a third country, with no limitations.

***


Greek judges asked the CJEU if they should dismiss evidence gathered under the national law that transposed the invalidated Data Retention Directive

Here is a new case at the Court of Justice of the EU that the data protection world will be looking forward to, as it addresses questions about the practical effects of the invalidation of the Data Retention Directive.


Case C-475/16 K. (yes, like those Kafka characters) concerns criminal proceedings against K. before Greek courts, which apparently involve evidence gathered under the Greek national law that transposed the now-invalidated Data Retention Directive. The Directive was invalidated in its entirety by the CJEU in 2014, after the Court found in its Digital Rights Ireland judgment that the provisions of the Directive breached Articles 7 (right to respect for private life) and 8 (right to the protection of personal data) of the Charter of Fundamental Rights.

The Greek judges sent in August a big set of questions for a preliminary ruling to the CJEU (17 questions). Among those, there are a couple of very interesting ones, because they deal with the practical effects of the invalidation of an EU Directive and what happens with the national laws of the Member States that transposed it.

For instance, the national judge asks whether national courts are obliged not to apply legislative measures transposing the annulled Directive and whether this obligation also means that they must dismiss evidence obtained as a consequence of those legislative measures (Question 3). The national judge also wants to know if maintaining the national law that transposes an invalidated Directive constitutes an obstacle to the establishment and functioning of the internal market (Question 16).

Another question raised by the national judge is whether the national legislation that transposed the annulled Data Retention Directive and that remained in force at national level after the annulment is still considered as falling under the scope of EU law (Question 4). The answer to this question is important because the EU Charter and the supremacy of EU law do not apply to situations that fall outside the scope of EU law.

The Greek judge didn’t miss the opportunity to also ask what effect it has on the national law transposing the Data Retention Directive that this Directive was also enacted to implement a harmonised framework at the European level under Article 15(1) of the ePrivacy Directive (Question 5). The question is whether this fact is enough to bring the surviving national data retention laws within the scope of EU law.

If the Charter is considered applicable to the facts of the case, the national judge further wants to know whether national law that complies only partly with the criteria set out in the Digital Rights Ireland decision still breaches Articles 7 and 8 of the Charter because it doesn’t comply with all of them (Question 13). For instance, the national judge estimates that the national law doesn’t comply with the requirement that the persons whose data are retained must be at least indirectly in a situation which is liable to give rise to criminal prosecutions (para 58 DRI), but that it complies with the requirement that the national law must contain substantive and procedural conditions for the access of competent authorities to the retained data and objective criteria by which the number of persons authorised to access these data is limited to what is strictly necessary (paras 61, 62 DRI).

Lastly, it will be also interesting to see whether the Court decides to address the issue of what “serious crime” means in the context of limiting the exercise of fundamental rights (Questions 10 and 11).

If you would like to delve into some of these topics, have a look at the AG Opinion in the Tele2Sverige case, published on 19 July 2016. The judgment in that case is due on 21 December 2016. Also, have a look at this analysis of the Opinion.

As for a quick “what to expect” in the K. case from my side, here it is:

  • the CJEU will seriously re-organise the 17 questions and regroup them into 4 or 5 topics, also clarifying that it only deals with the interpretation of EU law, not national law or the facts of national proceedings;
  • the national laws transposing the Data Retention Directive will probably be considered as being in the field of EU law – as they also regulate within the ambit of the ePrivacy Directive;
  • the Court will restate the criteria in DRI and probably clarify that all criteria must be complied with, no exceptions, in order for national measures to comply with the Charter;
  • the CJEU will probably not give indications to the national courts on whether they should admit or dismiss evidence collected on the basis of national law that does not comply with EU law – it’s too specific and the Court is ‘in the business’ of interpreting EU law; the best-case scenario, which is possible, is that the Court will give some guidance on the obligations of Member States (and hopefully their authorities) regarding the effects of their transposing national laws when the relevant EU secondary law is annulled;
  • as for what “serious crime” means in the context of limiting fundamental rights, let’s see about that. Probably the Court will give useful guidance.
