
A US Bill from 1974 shares so much DNA with the GDPR, it could be its ancestor

America’s own GDPR was introduced in Congress in 1974. This Bill applied to both government and companies, restricted international transfers, and offered U.S. and foreign “data subjects” rights to access, erasure and even… explanation.

The U.S. has recently been working towards finally adopting comprehensive privacy and data protection rules, with efforts unfolding at both federal and state level. Until now, only Californians can claim they have actually achieved something on the road to protecting their rights against the widespread collection and use of personal information. Other serious efforts are underway in Washington State, but they may end up being undermined by good intentions.

These developments are possible right now due to a combination of the EU General Data Protection Regulation’s (GDPR) global reach and notoriety, the countless privacy scandals affecting Americans, and the absence in the U.S. of comprehensive statutory protections for privacy and other individual rights that may be affected by the collection and use of personal information.

But did you know this is not the first time the U.S. is having privacy law fever? In the late ’60s and early ’70s, American lawmakers were concerned about the rise of automated data processing and computerized databases. Serious efforts were put into analyzing how the rights of the American people could be protected against misuses and abuses of personal information. The Fair Credit Reporting Act was adopted in 1970. An influential Report was published in 1973 by the Department of Health, Education and Welfare (HEW) proposing a set of Fair Information Practice Principles built on an impressive, meticulous analysis (read it if you haven’t done so yet; bonus: it’s peppered with smart literary mottos in between chapters). The Report called for comprehensive federal privacy legislation applicable both to government and companies.

About six months after the publication of the HEW Report, in January 1974, Bill S.3418 was introduced in the US Senate by three Senators — Ervin, Percy and Muskie, ‘to establish a Federal Privacy Board, to oversee the gathering and disclosure of information concerning individuals, and to provide management systems in all Federal agencies, State and local governments, and other organizations’.

This Bill was clearly ahead of its time and aged astoundingly well, especially when compared to some of the key characteristics of the GDPR — the current global gold standard for comprehensive data protection law:

It applied to both public and private sectors, at federal and state level

The Bill had a very broad scope of application. It covered the activity of “organizations”, defined as any Federal agency; the government of the District of Columbia; any authority of any State, local government, or other jurisdiction; and any public or private entity engaged in business for profit. It exempted only information systems of Federal agencies that were vital to national defense, the criminal investigatory files of Federal, State or local law enforcement, and information maintained by the press or news media, except for information related to their employees.

It created a Federal Privacy Board to oversee its application

The Federal Privacy Board would have been created within the Executive branch, composed of five members appointed by the President with the approval of the Senate, for a three-year mandate. The Board would have been granted effective powers to investigate violations of the law (including by being granted admission to the premises where any information system or computers were kept), to recommend either criminal or civil penalties, and to actually order any organization found in breach of the law ’to cease and desist such violation’.

It equally protected the rights of Americans and foreigners as data subjects

It’s quite difficult to believe (especially in the context of the endless Transatlantic debates that ultimately led to the Judicial Redress Act), but this Bill explicitly protected “any data subject of a foreign nationality, whether residing in the United States or not” by requiring organizations to afford them “the same rights under this Act as are afforded to citizens in the United States”. Such a broad personal scope has been a characteristic of the European data protection framework since before the GDPR; it is what made possible the legal challenges brought in the UK against Cambridge Analytica by David Carroll, a U.S. citizen residing in New York.

It provided restrictions for international data transfers to jurisdictions which did not apply the protections enshrined in the Bill

Under this Bill, organizations were required to “transfer no personal information beyond the jurisdiction of the United States without specific authorization from the data subject or pursuant to a treaty or executive agreement in force guaranteeing that any foreign government or organization receiving personal information will comply with the applicable provisions of this Act with respect to such information”. The idea of restricting transfers of personal data to countries which do not ensure a similar level of protection is a staple of the EU data protection law regime and the source of some of the biggest EU-US tensions related to tech and data governance.

It provided for rights of access to, correction of, and “purging” of personal information. And for notification of purging to former recipients!

The Bill provided for an extensive right of access to one’s own personal information. It required organizations to grant data subjects “the right to inspect, in a form comprehensible” all personal information related to them, the nature of the sources of the information and the recipients of the personal information. In addition, it also granted individuals the right to challenge and correct information. As part of this right to challenge and correct information, the Bill even provided for a kind of “right to be forgotten”, since it asked organizations to “purge any such information that is found to be incomplete, inaccurate, not pertinent, not timely nor necessary to be retained, or can no longer be verified”. Moreover, the Bill also required organizations to “furnish to past recipients of such information notification that the item has been purged or corrected” at the request of the data subject.

It provided for transparency rights into statistical models, and even for some explanation

The same provision granting a right to challenge and correct personal information also referred to individuals wishing “to explain” information about them in information systems, though it is not clear how exactly organizations were expected to respond to such explanation requests. Elsewhere in the Bill, organizations “maintaining an information system that disseminates statistical reports or research findings based on personal information drawn from the system, or from systems of other organizations” were required to “make available to any data subject (without revealing trade secrets) methodology and materials necessary to validate statistical analyses” (!). Moreover, those organizations were also asked not to make information available for independent analysis “without guarantees that no personal information will be used in a way that might prejudice judgments about any data subject”.

It provided some rules even for collection of personal information

One of the key questions to ask about data protection legislation generally is whether it intervenes at the time of collection of personal data, as opposed to merely regulating its use. This Bill cared about collection too. It provided that organizations must “collect, maintain, use and disseminate only personal information necessary to accomplish a proper purpose of the organization”, “collect information to the greatest extent possible from the data subject directly” and even “collect no personal information concerning the political or religious beliefs, affiliations, and activities of data subjects which is maintained, used or disseminated in or by any information system operated by any governmental agency, unless authorized by law”.

There are other remarkable features of this Bill that recall features of the GDPR, such as broad definitions of personal information and of data subjects (“an individual about whom personal information is indexed or may be located under his name, personal number, or other identifiable particulars, in an information system”), and that show sophisticated thinking about managing the impact automated processing of personal data might have on the rights of individuals. Enforcement of the Bill included criminal and civil penalties applied with the help of the U.S. Attorney General and the Federal Privacy Board, as well as a private right of action limited to breaches of the right to access personal information.

So what happened to it? Throughout the legislative process in Congress, this Bill was almost completely rewritten and ultimately became the US Privacy Act of 1974 — a privacy law quite limited in scope (applicable only to Federal agencies) and in ambition compared to the initial proposal. The answer to what happened during this process to so fundamentally rewrite the Bill lies somewhere in the 1,466 pages recording the debates around the US Privacy Act of 1974.

Though it was a failed attempt to provide comprehensive data protection and privacy legislation in the U.S., the Bill nonetheless shows how much common thinking Europe and America share. At the same time this Bill was introduced in the U.S. Senate, Europe was having its own data protection law fever, with many legislative proposals being discussed in Western Europe after the first data protection law was adopted in 1970 in the German Land of Hesse. But according to Frits Hondius, a Dutch scholar documenting these efforts in his volume “Emerging Data Protection in Europe”, published in 1975:

“A factor of considerable influence was the development of data protection on the American scene. Almost every issue that arose in Europe was also an issue in the United States, but at an earlier time and on a more dramatic scale. (…) The writings by American authors about privacy and computers (e.g. Westin and Miller), the 1966 congressional hearings, and the examples set by federal and state legislation, such as the US Fair Credit Reporting Act 1970 and the US Privacy Act 1974, have made a deep impact on data protection legislation in Europe.”

After a shared start in the late ‘60s and early ‘70s, the two privacy and data protection law regimes evolved in significantly different directions. Almost half a century later, it seems to be Europe’s turn to shape the data protection and privacy law debate in the U.S.

Planet49 CJEU Judgment brings some ‘Cookie Consent’ Certainty to Planet Online Tracking

The Court of Justice of the European Union published yesterday its long-awaited judgment in the Planet49 case, referred by a German court in proceedings initiated by a non-governmental consumer protection organization representing the participants in an online lottery. It dealt with questions that should have been clarified a long time ago — Article 5(3) was introduced into Directive 2002/58 (the ‘ePrivacy Directive’) by a 2009 amendment, and Member States have since transposed and applied its requirements inconsistently:

  • Is obtaining consent through a pre-ticked box valid when placing cookies on website users’ devices?
  • Must the notice given to the user when obtaining consent include the duration of the operation of the cookies being placed and whether or not third parties may have access to those cookies?
  • Does it matter for the application of the ePrivacy rules whether the data accessed through the cookies being placed is personal or non-personal?

The Court answered all of the above, while at the same time signaling to Member States that a disparate approach in transposing and implementing the ePrivacy Directive is not consistent with EU law, and setting clear guidance on what ‘specific’, ‘unambiguous’ and ‘informed’ consent means.

The core of the Court findings is that:

  • pre-ticked boxes do not amount to valid consent,
  • expiration date of cookies and third party sharing should be disclosed to users when obtaining consent,
  • different purposes should not be bundled under the same consent ask,
  • in order for consent to be valid, ‘an active behaviour with a clear view’ of consenting (which I read as an ‘intention’ to consent) must be present — so claiming in notices that consent is obtained by users merely continuing to use the website very likely does not meet this threshold — and,
  • (quite consequentially), these rules apply to cookies regardless of whether the data accessed is personal or not.

Unfortunately, though, the Court did not tackle one other very important issue: what does ‘freely given’ consent mean? In other words, would requiring and obtaining consent for placing cookies with the purpose of online tracking for behavioural advertising as a condition to access an online service, such as an online lottery (as in Planet49’s case), be considered as ‘freely given’ consent?

An answer to this question would have affected all online publishers and online service providers that condition access to their services on allowing online behaviour tracking cookies to be installed on user devices and that rely on ‘cookie walls’ as a source of income for their businesses. What is interesting is that the Court included a paragraph in the judgment specifically stating that it does not give its view on this issue because it was not asked to do so by the referring German court (paragraph 64). Notably, ‘freely given’ is the only one of the four conditions for valid consent that the Court did not assess in its judgment, and the one it specifically singled out as being left open.

Finally, one very important point to highlight is that the entirety of the findings were made under the rules for valid consent as they were provided by Directive 95/46. The Court even specified that its finding concerning ‘unambiguous’ consent is made under the old directive. This is relevant because the definition of consent in Article 2(h) of Directive 95/46 only refers to ‘any freely given specific and informed indication’ of agreement. However, Article 7(a) of the directive provides that the data subject’s consent may make a processing lawful if it was given ‘unambiguously’.

With the GDPR, the four scattered conditions have been gathered under Article 4(11) and reinforced by clearer recitals. The fact remains that the conditions for valid consent were just as strong under Directive 95/46. The Court pointedly highlights that its interpretation is made on the basis of the conditions provided under the old legal regime, and that they apply to the GDPR only ‘a fortiori’ (paragraph 60); (see here for what a fortiori means in legal interpretation).

Consequently, it seems that consent obtained for placing cookies with the help of pre-ticked boxes, or through inaction, or through action without the intent to give consent, even prior to the GDPR entering into force, was unlawfully obtained. It remains to be seen whether any action by supervisory authorities will follow to tackle some of the collections of data built on unlawfully obtained consent, or whether they will take a clean-slate approach.

For a deeper dive into the key findings of the Planet49 CJEU judgment, read below:

Discrepancies in applying ePrivacy at Member State level, unjustifiable based on Directive’s text

Before assessing the questions referred on substance, the Court makes some preliminary findings. Among them, it finds that ‘the need for a uniform application of EU law and the principle of equality require that the wording of a provision of EU law which makes no express reference to the law of the Member States for the purpose of determining its meaning and scope must normally be given an autonomous and uniform interpretation throughout the European Union’ (paragraph 47). Article 5(3) of the ePrivacy Directive does not provide any room for Member State law to determine the scope and meaning of its provisions, by being sufficiently clear and precise in what it asks the Member States to do (see paragraph 46 for the Court’s argument).

In practice, divergent transposition and implementation of the ePrivacy Directive has created different regimes across the Union, which had consequences for the effectiveness of its enforcement.

‘Unambiguous’ means ‘active behavior’ and intent to give consent

The Court starts its assessment from a linguistic interpretation of the wording of Article 5(3) of Directive 2002/58. It notes that the provision doesn’t require a specific way of obtaining consent to the storage of and access to cookies on users’ devices. The Court observes that ‘the wording ‘given his or her consent’ does however lend itself to a literal interpretation according to which action is required on the part of the user in order to give his or her consent.

In that regard, it is clear from recital 17 of Directive 2002/58 that, for the purposes of that directive, a user’s consent may be given by any appropriate method enabling a freely given specific and informed indication of the user’s wishes, including by ticking a box when visiting an internet website‘ (paragraph 49).

The Court highlights that per Article 2(f) of Directive 2002/58 the meaning of a user’s ‘consent’ under the ePrivacy Directive is meant to be the same as that of a data subject’s consent under Directive 95/46 (paragraph 50). By referring to Article 2(h) of the former data protection directive, the Court observes that ‘the requirement of an ‘indication’ of the data subject’s wishes clearly points to active, rather than passive, behaviour’ (paragraph 52). The Court then concludes that ‘consent given in the form of a preselected tick in a checkbox does not imply active behaviour on the part of a website user’ (paragraph 52).

Interestingly, the Court points out that this interpretation of what ‘indication’ means ‘is borne out by Article 7 of Directive 95/46’ (paragraph 53), and in particular by Article 7(a), under which the data subject’s consent may make processing lawful provided it has been given ‘unambiguously’ (paragraph 54). So even if the definition of consent in Directive 95/46 does not refer to this condition in particular, the Court nevertheless anchored its main arguments in it.

The Court then made another important interpretation concerning what ‘unambiguous’ consent means: ‘Only active behaviour on the part of the data subject with a view to giving his or her consent may fulfil that requirement’ (paragraph 54). This wording (‘with a view to’) suggests that there is a condition of willfulness, of intent to give consent in order for the indication of consent to be lawful.

In addition, to be even clearer, the Court finds that ‘it would appear impossible in practice to ascertain objectively whether a website user had actually given his or her consent to the processing of his or her personal data by not deselecting a pre-ticked checkbox nor, in any event, whether that consent had been informed. It is not inconceivable that a user would not have read the information accompanying the preselected checkbox, or even would not have noticed that checkbox, before continuing with his or her activity on the website visited’ (paragraph 55).

A fortiori, it appears impossible in practice to ascertain objectively whether a website user has actually given his or her consent to the processing of his or her personal data by merely continuing with his or her activity on the website visited (continued browsing or scrolling), or whether that consent has been informed, especially since the user is not even presented with a pre-ticked checkbox that would at least give the opportunity to uncheck the box. Also, just as the Court points out, it is not inconceivable that a user would not have read the information informing him or her that by continuing to use the website they give consent.

With these two findings in paragraphs 54 and 55 the Court seems to clarify once and for all that informing users that by continuing their activity on a website signifies consent to placing cookies on their device is not sufficient to obtain valid consent under the ePrivacy Directive read in the light of both Directive 95/46 and the GDPR.

‘Specific’ means consent can’t be inferred from bundled purposes

The following condition that the Court analyzes is that of specificity. In particular, the Court finds that ‘specific’ consent means that ‘it must relate specifically to the processing of the data in question and cannot be inferred from an indication of the data subject’s wishes for other purposes’ (paragraph 58). This means that bundled consent will not be considered valid and that consent should be sought granularly for each purpose of processing.

‘Informed’ means being able to determine the consequences of any consent given

One of the questions sent for a preliminary ruling by the German Court concerned specific categories of information that should be disclosed to users in the context of obtaining consent for placing cookies. Article 5(3) of the ePrivacy Directive requires that the user is provided with ‘clear and comprehensive information’ in accordance with Directive 95/46 (now replaced by the GDPR). The question was whether this notice must also include (a) the duration of the operation of cookies and (b) whether or not third parties may have access to those cookies.

The Court clarified that providing ‘clear and comprehensive’ information means ‘that a user is in a position to be able to determine easily the consequences of any consent he or she might give and ensure that the consent given is well informed. It must be clearly comprehensible and sufficiently detailed so as to enable the user to comprehend the functioning of the cookies employed’ (paragraph 74). Therefore, it seems that using language that is easily comprehensible for the user is important, just as it is important to paint a full picture of the functioning of the cookies for which consent is sought.

The Court found specifically with regard to cookies that ‘aim to collect information for advertising purposes’ that ‘the duration of the operation of the cookies and whether or not third parties may have access to those cookies form part of the clear and comprehensive information‘ which must be provided to the user (paragraph 75).

Moreover, the Court adds that ‘information on the duration of the operation of cookies must be regarded as meeting the requirement of fair data processing‘ (paragraph 78). This is remarkable, since the Court doesn’t usually make findings in its data protection case-law with regard to the fairness of processing. Doubling down on its fairness considerations, the Court goes even further and links fairness of the disclosure of the retention time to the fact that ‘a long, or even unlimited, duration means collecting a large amount of information on users’ surfing behaviour and how often they may visit the websites of the organiser of the promotional lottery’s advertising partners’ (paragraph 78).

It is irrelevant if the data accessed by cookies is personal or anonymous, ePrivacy provisions apply regardless

The Court was specifically asked to clarify whether the cookie consent rules in the ePrivacy Directive apply differently depending on the nature of the data being accessed. In other words, does it matter whether the data being accessed by cookies is personal or anonymized/aggregated/de-identified?

First of all, the Court points out that in the case at hand, ‘the storage of cookies … amounts to a processing of personal data’ (paragraph 67). That being said, the Court nonetheless notes that the provision analyzed merely refers to ‘information’ and does so ‘without characterizing that information or specifying that it must be personal data’ (paragraph 68).

The Court explained that this general framing of the provision ‘aims to protect the user from interference with his or her private sphere, regardless of whether or not that interference involves personal data’ (paragraph 69). This finding is particularly relevant for the current legislative debate over the revamp of the ePrivacy Directive. It is clear that the core difference between the GDPR framework and the ePrivacy regime is what they protect: the GDPR is concerned with ensuring the protection of personal data and fair data processing whenever personal data is being collected and used, while the ePrivacy framework is concerned with shielding the private sphere of an individual from any unwanted interference. That private sphere/private center of interest may include personal data or not.

The Court further refers to recital 24 of the ePrivacy Directive, which mentions that “any information stored in the terminal equipment of users of electronic communications networks are part of the private sphere of the users requiring protection under the European Convention for the Protection of Human Rights and Fundamental Freedoms. That protection applies to any information stored in such terminal equipment, regardless of whether or not it is personal data, and is intended, in particular, as is clear from that recital, to protect users from the risk that hidden identifiers and other similar devices enter those users’ terminal equipment without their knowledge” (paragraph 70).

Conclusion

The judgment of the CJEU in Planet49 provides some much needed certainty about how the ‘cookie banner’ and ‘cookie consent’ provisions in the ePrivacy Directive should be applied, after years of disparate approaches in national transposition laws and by supervisory authorities, which led to a lack of effectiveness in enforcement and, hence, in compliance. The judgment does leave open one burning question: what does ‘freely given consent’ mean? It is important to note nonetheless that before the ‘freely given’ question is even reached, any consent obtained for placing cookies (or similar technologies) on user devices will have to meet all of the other three conditions. If even one of them is not met, that consent is invalid.

***

You can refer to this summary by quoting G. Zanfir-Fortuna, ‘Planet49 CJEU Judgment brings some ‘Cookie Consent’ Certainty to Planet Online Tracking’, http://www.pdpecho.com, published on October 3, 2019.

The CJEU decides lack of access to personal data does not unmake a joint controller: A look at Wirtschaftsakademie

Who is the controller?

The Court of Justice of the EU decided in Case C-210/16 Wirtschaftsakademie that Facebook and the administrator of a fan page created on Facebook are joint controllers under EU data protection law. The decision sent a mini shockwave to organizations that use Facebook Pages, just one week after the GDPR entered into force. What exactly does it mean that they are joint controllers and what exactly do they have to do in order to be compliant? The judgment leaves these questions largely unanswered, but it gives some clues as to finding answers.

Being a joint controller means they share responsibility with Facebook to comply with EU data protection law for the processing of personal data occurring through their Facebook Page. As the Court highlighted, they bear this responsibility even if they have no access at all to the personal data collected through cookies placed on the devices of visitors to the Facebook page, but only to the aggregated results of the data collection.

The judgment created a great deal of confusion. What has not yet been sufficiently emphasized in the reactions to the Wirtschaftsakademie judgment is that this shared responsibility is not equal: it depends on the stage of the processing the joint controller is involved in and on the actual control it has over the processing. This is, in any case, a better position to be in than that of a “controller” on whose behalf Facebook processes personal data, or of a “co-controller” with Facebook — either of which would have meant full legal liability for complying with data protection obligations for the personal data processed through the page. It is, however, a worse position than being a third party or a recipient that is not involved in any way in establishing the purposes and means of the processing, which would have meant no legal responsibility for the data processed through the page. Technically, those were the other options the Court probably considered before taking the “joint controllership” path.

It is important to note that the Court did not specify at all who bears which responsibilities — not even with regard to providing notice. The failure of both Facebook and the page administrator to inform visitors about cookies being placed on their devices was the reason invoked by the DPA in the main national proceedings, yet the Court remained silent on who is responsible for this obligation.

This summary looks at what the Court found, explaining why it reached its conclusion, and trying to carve out some of the practical consequences of the judgment (also in relation to the GDPR).

This first part of the commentary on the judgment will only cover the findings related to “joint controllership”. The findings related to the competence of the German DPA will be analyzed in a second part. While the judgment interprets Directive 95/46, most of the findings will remain relevant under the GDPR as well, to the extent they interpret identical or very similar provisions of the two laws.

Facts of the Case

Wirtschaftsakademie is an organization that offers educational services and has a Facebook fan page. The Court described that administrators of fan pages can obtain anonymous statistical information available to them free of charge. “That information is collected by means of evidence files (‘cookies’), each containing a unique user code, which are active for two years and are stored by Facebook on the hard disk of the computer or on other media of visitors to fan pages” (#15). The user code “is collected and processed when the fan pages are open” (#15).

The DPA of Schleswig-Holstein ordered Wirtschaftsakademie to close the fan page if it was not brought into compliance, on the ground that “neither Wirtschaftsakademie, nor Facebook, informed visitors to the Fan Page that Facebook, by means of cookies, collected personal data concerning them and then processed the data” (#16).

The decision of the DPA was challenged by Wirtschaftsakademie, which argued that “it was not responsible under data protection law for the processing of the data by Facebook or the cookies which Facebook installed” (#16). After the DPA lost before the lower courts, it appealed those rulings to the Federal Administrative Court, arguing that Wirtschaftsakademie’s main breach of data protection law was that it commissioned “an inappropriate supplier”, because the supplier “did not comply with data protection law” (#22).

The Federal Administrative Court sent several questions for a preliminary ruling to the CJEU aiming to clarify whether Wirtschaftsakademie indeed had any legal responsibility for the cookies placed by Facebook through its fan page, and whether the Schleswig-Holstein DPA had competence to enforce German data protection law against Facebook, considering that Facebook’s main establishment in the EU is in Ireland and its German presence is linked only to marketing (#24).

“High level of protection” and “effective and complete protection”

The Court starts its analysis by referring again to the aim of the Directive to “ensure a high level of protection of fundamental rights and freedoms, and in particular their right to privacy in respect to processing of personal data” (#26) – and it is to be expected that all analyses under the GDPR would start from the same point. This means that all interpretation of the general data protection law regime will be done in favor of protecting the fundamental rights of data subjects.

Based on the findings in Google Spain, the Court restates that “effective and complete protection of the persons concerned” requires a “broad definition of controller” (#28). Effective and complete protection is another criterion that the Court often takes into account when interpreting data protection law in favor of the individual and his or her rights.

(In fact, one of the Court’s closing observations, after establishing that the administrator is a joint controller, was that “the recognition of joint responsibility of the operator of the social network and the administrator of a fan page hosted on that network in relation to the processing of the personal data of visitors to that page contributes to ensuring more complete protection of the rights of persons visiting a fan page” (#42).)

The referring Court did not even consider the possibility that the administrator is a controller

Having set the stage, the Court goes on to analyze the definition of “controller”. Note, though, that the referring Court never asked whether the administrator of the fan page is a controller or a joint controller; it asked whether the administrator has any legal responsibility for failing to choose a compliant “operator of its information offering” while being an “entity that does not control the data processing within the meaning of Article 2(d) of Directive 95/46” (#24 question 1).

It seems that the referring Court did not even take into account that the fan page administrator would have any control over the data, but was wondering whether only “controllers” have legal responsibility to comply with data protection law under Directive 95/46, or whether other entities somehow involved in the processing could also have some responsibility.

However, the Court does not exclude the possibility that the administrator may be a controller. First of all, it establishes that processing of personal data is taking place, as described at #15, and that the processing has at least one controller.

Facebook is “primarily” establishing means and purposes of the processing

The Court recalls the definition of “controller” in Article 2(d) of the Directive and highlights that “the concept does not necessarily refer to a single entity and may concern several actors taking part in that processing, with each of them then being subject to the applicable data protection provisions” (#29). The distribution of responsibilities in the last part of this finding is introduced by the Court without any corresponding reference in Article 2(d)[1].

This is important, because the next finding of the Court is that, in the present case, “Facebook Ireland must be regarded as primarily determining the purposes and means of processing the personal data of users of Facebook and persons visiting the fan pages hosted on Facebook” (#30). Reading this paragraph together with #29 means that Facebook will have a bigger share of the obligations in a joint controllership situation with fan pages administrators.

This idea is underlined by the following paragraph, which refers to identifying the “extent” to which a fan page administrator “contributes… to determining, jointly with Facebook Ireland and Facebook Inc., the purposes and means of processing” (#31). To answer this question, the Court lays out its arguments in three layers:

1) It describes the processing of personal data at issue, mapping the data flows – pointing to the personal data being processed, data subjects and all entities involved:

  • The data processing at issue (placing of cookies on the Fan Page visitors’ device) is “essentially carried out by Facebook” (#33);
  • Facebook “receives, registers and processes” the information stored in the placed cookies not only when a visitor visits the Fan Page, but also when he or she visits services provided by other Facebook family companies and by “other companies that use the Facebook services” (#33);
  • Facebook partners and “even third parties” may use cookies to provide services to Facebook or to the businesses that advertise on Facebook (#33);
  • The creation of a fan page “involves the definition of parameters by the administrator, depending inter alia on the target audience … , which has an influence on the processing of personal data for the purpose of producing statistics based on visits to the fan page” (#36);
  • The administrator can request the “processing of demographic data relating to its target audience, including trends in terms of age, sex, relationship and occupation”, lifestyle, location, online behavior, which tell the administrator where to make special offers and better target the information it offers (#37);
  • The audience statistics compiled by Facebook are transmitted to the administrator “only in anonymized form” (#38);
  • The production of the anonymous statistics “is based on the prior collection, by means of cookies installed by Facebook …, and the processing of personal data of (the fan page) visitors for such statistical purposes” (#38);

2) It identifies the purposes of this processing:

  • There are two purposes of the processing:
    • “to enable Facebook to improve its system of advertising transmitted via its network” and
    • “to enable the fan page administrator to obtain statistics produced by Facebook from the visits of the page”, which is useful for “managing the promotion of its activity and making it aware of the profiles of the visitors who like its fan page or use its applications, so that it can offer them more relevant content” (#34);

3) It establishes a connection between the two entities that define the two purposes of processing:

  • Creating a fan page “gives Facebook the opportunity to place cookies on the computer or other device of a person visiting its fan page, whether or not that person has a Facebook account” (#35);
  • The administrator may “define the criteria in accordance with which the statistics are to be drawn up and even designate the categories of persons whose personal data is to be made use of by Facebook”, “with the help of filters made available by Facebook” (#36);
  • Therefore, the administrator “contributes to the processing of the personal data of visitors to its page” (#36);

One key point: not all joint controllers must have access to the personal data being processed

In what is the most impactful finding of this judgment, the Court uses one of the old general principles of interpreting and applying the law, ubi lex non distinguit, nec nos distinguere debemus, and states that “Directive 95/46 does not, where several operators are jointly responsible for the same processing, require each of them to have access to the personal data concerned” (#38). Therefore, the fact that administrators have access only to anonymized data has no bearing on the existence of their legal responsibility as joint controllers, since the criterion that matters is determining the purposes and means of the processing, so long as at least one of the entities involved has access to and processes personal data. The fact that they only have access to anonymized data should nonetheless matter when establishing the degree of responsibility.

Hence, after describing the involvement of fan page administrators in the processing at issue – in particular their role in defining parameters for the processing depending on their target audience and in determining the purposes of the processing – the Court finds that “the administrator must be categorized, in the present case, as a controller responsible for that processing within the European Union, jointly with Facebook Ireland” (#39).

Enhanced responsibility for non-users visiting the page

The Court also made the point that fan pages can be visited by non-users of Facebook, implying that were it not for the existence of the specific fan page they accessed while looking for information related to its administrator, Facebook would not be able to place cookies on their devices and process personal data relating to them, for its own purposes and for those of the fan page. “In that case, the fan page administrator’s responsibility for the processing of the personal data of those persons appears to be even greater, as the mere consultation of the home page by visitors automatically starts the processing of their personal data” (#42).

Jointly responsible, not equally responsible

Finally, after establishing that there is joint controllership and joint responsibility, the Court makes the very important point that the responsibility is not equal and it depends on the degree of involvement of the joint controller in the processing activity:

“The existence of joint responsibility does not necessarily imply equal responsibility of the various operators involved in the processing of personal data. On the contrary, those operators may be involved at different stages of that processing of personal data and to different degrees, so that the level of responsibility of each of them must be assessed with regard to all the relevant circumstances of the particular case” (#43).

Comments and conclusions

In the present case, the Court found early in the judgment that Facebook “primarily” establishes the means and purposes of the processing. This means that it is primarily responsible for compliance with data protection obligations. At the same time, the administrator of the fan page has responsibility to comply with some data protection provisions, as joint controller. The Court did not clarify, however, what exactly the administrator of the fan page must do in order to be compliant.

For instance, the Court does not analyze whether the administrator complied with the Directive in this case – therefore, assuming that the judgment requires administrators to provide a data protection notice is wrong. The lack of notice was a finding of the DPA in the initial proceedings. Moreover, the DPA ordered Wirtschaftsakademie to close its Facebook page because it found that neither Facebook, nor the page administrator, had informed visitors about the cookies being placed on their devices (#16).

The CJEU merely establishes that the administrator is a joint controller and that it shares responsibility for compliance with Facebook depending on the degree of their involvement in the processing.

The only clear message from the Court with regard to the extent of legal responsibility of the administrator as joint controller is that it has enhanced responsibility towards visitors of the fan page that are not Facebook users. This being said, it is very likely that informing data subjects is one of the obligations of the GDPR that can potentially fall on the shoulders of fan page administrators in the absence of Facebook stepping up and providing notice, since they can edit the interface with visitors to a certain extent.

Another message that is less clear, but can be extracted from the judgment, is that the degree of responsibility of joint controllers “must be assessed with regard to all the relevant circumstances of the particular case” (#43). This could mean that if the two joint controllers were to enter into a joint controllership agreement (as the GDPR now requires), courts and DPAs may be called to look at the reality of the processing in order to determine the responsibilities of each. This would avoid a situation where the joint controller primarily responsible for establishing means and purposes contractually distributes to the other joint controller obligations that the latter could not possibly comply with.

As for the relevance of these findings under the GDPR, the entire “joint controllership” part of the judgment is very likely to remain relevant, considering that the language the Court interpreted in Directive 95/46 is very similar to the language used in the GDPR (compare Article 2(d) of the Directive with Article 4(7) GDPR). However, the GDPR does add a level of complexity to the situation of joint controllers, in Article 26. The Court could, eventually, add to this jurisprudence an analysis of the extent to which the joint controllership arrangement required by Article 26 is relevant to establishing the level of responsibility of a joint controller.

Given that the GDPR requires joint controllers to determine in a transparent manner their respective responsibilities for compliance through an arrangement, one consequence of the judgment is that such an arrangement should be concluded between Facebook and fan page administrators (Article 26(1) GDPR). The essence of the arrangement must then be made available to visitors of fan pages (Article 26(2) GDPR).

However, there is one obligation under the GDPR that, read together with the findings of the Court, results in a conundrum. Article 26(3) GDPR provides that the data subject may exercise his or her rights “in respect of and against each of the controllers”, regardless of how responsibility is shared contractually between them. In the case at hand, the Court acknowledges that the administrator only has access to anonymized data. This means that even if data subjects were to make, for example, a request for access to or erasure of data to the administrator, it would not be in a position to satisfy such requests. One possibility is that any request made to a joint controller that does not have access to the data will be forwarded to the joint controller that does (what matters is that the data subject has a point of contact and, ultimately, someone against whom to claim their rights). This is yet another reason why a written agreement establishing the responsibility of each joint controller is useful. Practice will ultimately solve the conundrum, with DPAs and national courts likely playing their part.


[1] “(d) ‘controller’ shall mean the natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data; where the purposes and means of processing are determined by national or Community laws or regulations, the controller or the specific criteria for his nomination may be designated by national or Community law;”

Brief case-law companion for the GDPR professional

This collection of quotes from relevant case-law has been compiled with the purpose of being useful to all those working with EU data protection law. The majority of the selected findings were part of a “Countdown to the GDPR” I conducted on social media, in the month before the Regulation became applicable, under #KnowYourCaseLaw. This exercise was prompted by several reasons.

First, data protection in the EU is much older and wider than the General Data Protection Regulation (GDPR) and it has already invited the highest Courts in Europe to weigh in on the protection of this right. Knowing what those Courts have said is essential.

Second, data protection law in the EU is not only a matter of pure EU law, but also a matter of protecting human rights under the legal framework of the Council of Europe (starting with Article 8 of the European Convention on Human Rights – ‘ECHR’). The interplay between these two legal regimes is very important, given that the EU recognizes fundamental rights protected by the ECHR as general principles of EU law – see Article 6(3) TEU.

Finally, knowing relevant case-law makes the difference between a good privacy professional and a great one.

What to expect

This is not a comprehensive collection of case-law and it does not provide background for the cases it addresses. The Handbook on European Data Protection Law (2018 edition) is a great resource if that is what you are looking for.

This is a collection of specific findings of the Court of Justice of the EU (CJEU), the European Court of Human Rights (ECtHR) and one bonus finding of the German Constitutional Court. There are certainly other interesting findings that have not been included here (how about an “Encyclopedia of interesting findings” for the next project?). The ones that have been included provide insight into specific issues, such as the definition of personal data, what constitutes data related to health, what freely given consent means, or what type of interference with fundamental rights profiling constitutes. Readers will even find a quote from a concurring opinion of an ECtHR judge that is prescient, to say the least.

Enjoy the read!

Brief Case-Law Companion for the GDPR Professional

A Conversation with Giovanni Buttarelli about The Future of Data Protection: setting the stage for an EU Digital Regulator

The nature of the digital economy is such that it will force the creation of multi-competent supervisory authorities sooner rather than later. What if the European Data Protection Board were to become, in the next 10 to 15 years, an EU Digital Regulator, looking at matters concerning data protection, consumer protection and competition law, with “personal data” as the common thread? This is the vision Giovanni Buttarelli, the European Data Protection Supervisor, laid out last week in a conversation we had at the IAPP Data Protection Congress in Brussels.

The conversation was a one-hour session in front of an overcrowded room in The Arc, a cozy amphitheater-like venue that invites bold ideas and stimulating exchanges.

To begin with, I reminded the Supervisor that at the very beginning of his mandate, in early 2015, he published the 5-year strategy of the EDPS. At that time the GDPR had not yet been adopted and the Internet of Things was taking off. Big Data had been a big thing for a while, and questions about the feasibility and effectiveness of a legal regime centered around each data item that can be traced back to an individual were popping up. The Supervisor wrote in his Strategy that the benefits brought by new technologies should not come at the expense of the fundamental rights of individuals and their dignity in the digital society.

Big data will need equally big data protection, he wrote then, thus suggesting that the answer to Big Data is not less data protection, but enhanced data protection.

I asked the Supervisor if he thinks that the GDPR is the “big data protection” he was expecting or whether we need something more than what the GDPR provides for. And the answer was that “the GDPR is only one piece of the puzzle”. Another piece of the puzzle will be the ePrivacy reform, and another one will be the reform of the regulation that provides data protection rules for the EU institutions and that creates the legal basis for the functioning of the EDPS. I also understood from our exchange that a big part of the puzzle will be effective enforcement of these rules.

The curious fate of the European Data Protection Board

One centerpiece of enforcement is the future European Data Protection Board, which is currently being set up in Brussels so as to be functional on 25 May 2018, when the GDPR becomes applicable. The European Data Protection Board will be a unique EU body: it will have a European nature, being funded by the EU budget, but it will be composed of commissioners from national data protection authorities who will adopt its decisions, and it will rely on a European Secretariat for its day-to-day activity. The Secretariat of the Board will be provided by dedicated staff of the European Data Protection Supervisor.

The Supervisor told the audience that he has either already hired or plans to hire a total of “17 geeks” adding to his staff, most of whom will be part of the European Data Protection Board Secretariat. The EDPB will be functional from Day 1 and, apparently, there are plans for some sort of inauguration of the EDPB to be celebrated at midnight on the night of 24 to 25 May next year.

These are my thoughts here: the nature of the EDPB is as unique as the nature of the EU (those of you who studied EU law certainly remember from law school how we were told that the EU is a sui generis type of economic and political organisation). In fact, the EDPB may very well serve as a test model for ensuring supervision and enforcement in other EU policy areas. The European Commission could test the waters to see whether such a mixed national/European enforcement mechanism is feasible.

There is a lot of pressure on effective enforcement when it comes to the GDPR. We dwelled on enforcement, and one question that inevitably appeared was about the trend taking shape in Europe of competition authorities and consumer protection authorities engaging in investigations together with, or in parallel with, data protection authorities (see here, here and here).

It’s time for a big change, and time for the EU to have a global approach, the Supervisor said – a change that will require some legislative action. “I’m not saying we will need a European FTC (the US Federal Trade Commission – ed.), but we will need a Digital EU Regulator“, he added. This Digital Regulator would have the powers to also look into competition and consumer protection issues raised by the processing of personal data (in addition, therefore, to data protection issues). Acknowledging that these days there is legislative fatigue in Brussels surrounding privacy and data protection, the Supervisor said he will not bring this idea to the attention of the EU legislator right now. But he certainly plans to do so, maybe even as soon as next year. The Supervisor thinks that the EDPB could morph into this kind of Digital Regulator sometime in the future.

The interplay among these three fields of law has been on the Supervisor’s mind for some time now. The EDPS has already issued four Opinions that set the stage for this proposal – see the Preliminary Opinion on “Privacy and competitiveness in the age of Big Data: the interplay between data protection, competition law and consumer protection in the digital economy“, Opinion 4/2015 “Towards a new digital ethics“, Opinion 7/2015 “Meeting the Challenges of Big Data“, and finally Opinion 8/2016 on “coherent enforcement of fundamental rights in the age of Big Data“. So this is certainly something the data protection bubble should keep its eyes on.

Enhanced global enforcement initiatives

Another question that had to be asked on enforcement was whether we should expect more concentrated and coordinated action of privacy commissioners on a global scale, in GPEN-like structures. The Supervisor revealed that the privacy commissioners that meet for the annual International Conference are “trying to complete an exercise about our future”. They are currently analyzing the idea of creating an entity with legal personality that will look into global enforcement cases.

Ethics comes on top of legal compliance

Another topic the conversation turned to was “ethics”. The EDPS has been at the forefront of bringing the ethics approach into privacy and data protection law debates, creating the Ethics Advisory Group at the beginning of 2016. I asked the Supervisor whether there is a danger that, by bringing such a volatile concept into the realm of data protection, companies would see an opportunity to circumvent strict compliance and rely on self-assessments that their uses of data are ethical.

“Ethics comes on top of data protection law implementation”, the Supervisor explained. As I understood it, ethics is brought into the data protection realm only after a controller or processor is already compliant with the law: when faced with equally lawful options, they should rely on ethics to take the right decision.

We discussed other things during this session as well, including the 2018 International Conference of Privacy Commissioners that will take place in Brussels, and the Supervisor received some interesting questions from the public at the end, including about the Privacy Shield. But a blog post can only be this long.


Note: The Supervisor’s quotes in this blog post are short because, as the moderator, I did my best to follow and steer the discussion rather than take notes. The quotes come from the brief notes I managed to take during the conversation.

Highlights of the draft LIBE report on the ePrivacy Reg

The draft Report prepared by MEP Marju Lauristin for the LIBE Committee containing amendments to the ePrivacy Regulation was published last week on the website of the European Parliament.

The MEP announced she will be presenting the Report to her colleagues in the LIBE Committee on 21 June. The draft Report will need to be adopted first by the LIBE Committee and, at a later stage, by the Plenary of the European Parliament. The Parliament will then sit in the trilogue together with the European Commission and the Council (once the latter also adopts an amended text), finding the compromise among the three versions of the text.

Overall, the proposed amendments strengthen privacy protections for individuals. The big debate of whether there should be an additional exemption to confidentiality of communications based on the legitimate interest of service providers and other parties to have access to electronic communications data was solved in the sense that no such exemption was proposed (following calls in this sense by the Article 29 Working Party, the European Data Protection Supervisor and a team of independent academics). The draft report also contains strong wording to support end-to-end encryption, as well as support for Do-Not-Track technology and a new definition of the principle of confidentiality of communications in the age of the Internet of Things.

Without pretending this is a comprehensive analysis, here are 20 points that caught my eye after a first reading of the amendments (added text is bolded and italicised):

1) Clarity regarding what legitimate grounds for processing prevail if both the GDPR and the ePrivacy Reg could apply to a processing operation: those of the ePrivacy Reg. (“Processing of electronic communications data by providers of electronic communications services should only be permitted in accordance with, and on a legal ground specifically provided for under, this Regulation” – Recital 5). The amendment to Recital 5 further clarifies the relationship between the GDPR and the ePrivacy Reg, specifying that the ePrivacy Reg “aims to provide additional and complementary safeguards taking into account the need for additional protection as regards the confidentiality of communications”.

2) The regulation should be applicable not only to information “related to” the terminal equipment of end-users, but also to information “processed by” it. (“…and to information related to or processed by the terminal equipment of end-users” – Article 2; see also the text proposed for Article 3(1)(c)). This clarifies the material scope of the Regulation, leaving less room for interpretation of what “information related to” means.

3) The link to the definitions of the Electronic Communications Code is removed. References to those definitions are replaced by self-standing definitions for “electronic communications network”, “electronic communications service”, “interpersonal communications service”, “number-based interpersonal communications service”, “number-independent interpersonal communications service” and “end-user”. For instance, the new definition proposed for “electronic communications service” is “a service provided via electronic communications networks, whether for remuneration or not, which encompasses one or more of the following: an ‘internet access service’ as defined in Article 2(2) of Regulation (EU) 2015/2120; an interpersonal communications service; a service consisting wholly or mainly in the conveyance of the signals, such as a transmission service used for the provision of a machine-to-machine service and for broadcasting, but excludes information conveyed as part of a broadcasting service to the public over an electronic communications network or service except to the extent that the information can be related to the identifiable subscriber or user receiving the information” (Amendment 49; my underline).

4) Limitation of the personal scope of key provisions of the Regulation to natural persons. The draft report proposes two definitions to delineate the personal scope of the Regulation – “end-users” and “users”. While an “end-user” is defined as “a legal entity or a natural person using or requesting a publicly available electronic communications service“, a “user” is defined as “any natural person using a publicly available electronic communications service (…)“. Key provisions of the Regulation are only applicable to users, especially the proposed principle of confidentiality of communications (see Amendments 58 and 59). This proposal may unnecessarily limit the scope of application of the right to respect for private life, which, as opposed to the right to the protection of personal data, is theoretically recognised as also protecting the privacy and confidentiality of communications of legal persons (the CJEU has not yet explicitly stated this; the recognition flows from correspondence with Article 8 ECHR and how it has been interpreted by the European Court of Human Rights; for an analysis, see p. 17 and following HERE). The current ePrivacy Directive equally protects the confidentiality of communications of both natural and legal persons.

5) Enhanced definition of “electronic communications metadata”, to also include “data broadcasted or emitted by the terminal equipment to identify users’ communications and/or the terminal equipment or its location and enable it to connect to a network or to another device“.

6) Enhanced definition of “direct marketing”, to also include advertising in video format, in addition to the written and oral formats, and advertising served or presented to persons, not only “sent”. Could this mean that the definition of direct marketing will cover street advertising panels reacting to the passer-by? Possibly.

7) Extension of the principle of confidentiality of communications to machine-to-machine communications. A new paragraph is added to Article 5 (Amendment 59) “Confidentiality of electronic communications shall also include terminal equipment and machine-to-machine communications when related to a user”.

8) “Permitted” processing of electronic communications data is replaced by “lawful” processing. This change of wording de-emphasises the character of “exemptions to a principle” that the permitted processing had relative to the general principle of confidentiality. This may have consequences when courts come to interpret the law.

9) While proposed wording for the existing lawful grounds for processing is stricter (processing is allowed “only if”; necessity is replaced with “technically strict necessity”), there are additional grounds for processing added (See Amendments 64 to 66, to Article 6; see also Amendments 77, 79, 80 to Article 8).

10) A “household exemption” is introduced, similar to the one provided for by the GDPR, enhanced with a “work purposes exemption”: “For the provision of a service explicitly requested by a user of an electronic communications service for their purely individual or individual work related usage (…)“. In such circumstances, electronic communications data may be processed “without the consent of all users”, but “only where such requested processing produces effects solely in relation to the user who requested the service” and “does not adversely affect the fundamental rights of another user or users“. This exemption raises some questions, the first of which is: does anyone use an electronic communications service for purposes other than “purely individual” or “work related” purposes? If you think so, leave a comment with examples. Another question is what “without the consent of all users” means (see Amendment 71, to Article 6).

11) An exception for tracking employees is included in the proposal. The collection of information from a user’s terminal equipment (for instance, via cookies) would be permitted “if it is necessary in the context of employment relationships“, but only to the extent the employee is using equipment made available by the employer and to the extent this monitoring “is strictly necessary for the functioning of the equipment by the employee” (see Amendment 82). It remains to be seen what “functioning of the equipment by the employee” means. This exemption seems to have the same effect as the one in Article 8(1)(a), which allows such collection of information if “it is strictly technically necessary for the sole purpose of carrying out the transmission of an electronic communication over an electronic communications network”. On the other hand, it should be kept in mind that the ePrivacy rules are not intended to apply to closed groups of end-users, such as corporate intranet networks, access to which is limited to members of an organisation (see Recital 13 and Amendment 11).

12) Consent for collecting information from terminal equipment “shall not be mandatory to access the service”. This means, for instance, that even if users do not consent to the placing of cookies tracking their activity online, they should still be allowed to access the service they are requesting. While this could be seen as simply a consequence of “freely given” consent, enshrining this wording in a legal provision certainly leaves no room for interpretation (see Amendment 78 to Article 8). Moreover, this safeguard is strengthened by a rule introduced as a separate paragraph of Article 8, according to which “No user shall be denied access to any information society service or functionality, regardless of whether this service is remunerated or not, on grounds that he or she has not given his or her consent under Article 8(1)(b) to the processing of personal information and/or the use of storage capabilities of his or her terminal equipment that is not necessary for the provision of that service or functionality.” (see Amendment 83). Such wording would probably put to rest concerns that personal data could be treated as “counter-performance” (equivalent to money) for services.

13) All further use of electronic communications data collected under ePrivacy rules is prohibited. A new paragraph inserted in Article 6 simply states that “Neither providers of electronic communications services, nor any other party, shall further process electronic communications data collected on the basis of this Regulation” (see Amendment 72).

13) Wi-fi tracking and similar practices involving the collection of information emitted by terminal equipment would only be possible with the informed consent of the user or if the data are anonymised and the risks are adequately mitigated (a third exception is, of course, when such data are accessed for the purpose of establishing a connection). This is a significant change compared to the Commission's text, which allowed such tracking in principle, provided the user is informed and given the possibility to opt out (“stop or minimise the collection”) (see Amendments 85, 86). The draft report also proposes a new paragraph to Article 8 containing measures to mitigate risks, including only collecting data for the purpose of “statistical counting”, anonymisation or deletion of data “immediately after the purpose is fulfilled”, and effective opt-out possibilities.

15) Significantly stronger obligations for privacy by default are proposed, with a clear preference for a Do-Not-Track mechanism. Article 10 is enhanced so that all software placed on the market must “by default, offer privacy protective settings to prevent other parties from storing information on the terminal equipment of a user and from processing information already stored on that equipment” (see Amendment 95). Opt-outs shall be available upon installation (Amendment 96). What is remarkable is a new obligation that “the settings shall include a signal which is sent to the other parties to inform them about the user's privacy settings. These settings shall be binding on, and enforceable against, any other party” (see Amendment 99). The rapporteur explains at the end of the Report that “the settings should allow for granulation of consent by the user, taking into account the functionality of cookies and tracking techniques and DNTs should send signals to the other parties informing them of the user's privacy settings. Compliance with these settings should be legally binding and enforceable against all other parties”.
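The signal the amendment envisages resembles the existing Do Not Track (“DNT”) HTTP header, which a user agent can attach to outgoing requests. A minimal sketch using Python's standard library follows; note that what the proposal would add is precisely the *binding* effect on recipients, which no code can supply – honouring the header today is voluntary:

```python
import urllib.request

# A user agent expresses the user's tracking preference by sending the
# "DNT" header with every request; the value "1" means the user objects
# to tracking. Under the proposed Article 10, recipients of this signal
# would be legally bound by it.
req = urllib.request.Request(
    "https://example.com/",  # hypothetical destination
    headers={"DNT": "1"},
)

# urllib normalises header names to capitalised form ("Dnt"):
assert req.get_header("Dnt") == "1"
```

The request object above is only constructed, not sent; the point is simply where in the protocol the privacy-settings signal travels.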

16) A national Do Not Call register is proposed for opting out of unsolicited voice-to-voice marketing calls (see Amendment 111).

17) End-to-end encryption is proposed as a default security measure for ensuring confidentiality of communications. Additionally, strong wording is included to prevent Member States from introducing measures amounting to backdoors. Under the title “integrity of the communications and information about security risks”, Article 17 is amended to include a newly introduced paragraph that states: “The providers of electronic communications services shall ensure that there is sufficient protection in place against unauthorised access or alterations to the electronic communications data, and that the confidentiality and safety of the transmission are also guaranteed by the nature of the means of transmission used or by state-of-the-art end-to-end encryption of the electronic communications data. Furthermore, when encryption of electronic communications data is used, decryption, reverse engineering or monitoring of such communications shall be prohibited. Member States shall not impose any obligations on electronic communications service providers that would result in the weakening of the security and encryption of their networks and services” (my highlight) (see Amendment 116).

18) The possibility of class actions for infringement of the ePrivacy Regulation is introduced. End-users would have “the right to mandate a not-for-profit body, organisation or association” to lodge complaints or to seek judicial remedies on their behalf (see Amendments 125 and 126).

19) Infringement of the obligations covered by Article 8 (cookies, wi-fi tracking) would be sanctioned with the first tier of fines (the highest one – up to EUR 20 million or 4% of global annual turnover), which is not the case in the Commission's proposal (see Amendment 131).
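The first-tier cap works as “whichever is higher” of the two figures, mirroring the fining structure of Article 83(5) GDPR. A small illustrative calculation (a sketch only, not legal advice – actual fines are set case by case by supervisory authorities):

```python
def max_fine_eur(global_annual_turnover_eur: float) -> float:
    """Upper bound of the first (highest) tier of fines:
    up to EUR 20 million or 4% of total worldwide annual turnover,
    whichever is higher. Illustrative only."""
    return max(20_000_000, 0.04 * global_annual_turnover_eur)

# For a company with EUR 1 billion turnover, 4% governs: EUR 40 million.
assert max_fine_eur(1_000_000_000) == 40_000_000
# For a smaller company, the flat EUR 20 million figure is the cap.
assert max_fine_eur(100_000_000) == 20_000_000
```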

As a bonus, here’s the 20th highlight:

20) Echoing the debate over data analytics using psychographic measurements to influence elections, the report amends an important recital (Recital 20) to refer to the fact that information on terminal equipment may reveal “very sensitive data”, including “details of the behaviour, psychological features, emotional condition and political and social preferences of an individual”. Among other reasons, this justifies the principle that any interference with the user's terminal equipment should be allowed only with the user's consent and for specific and transparent purposes (see Amendment 20; also, watch this video from last week's Digital Assembly in Malta, where around min. 6 the Rapporteur talks about this and points out that “without privacy there will be no democracy”).


Read more:

Other analyses of the LIBE draft report: HERE and HERE.

Overview of the ePrivacy initial proposal by the Commission: HERE.

***
Enjoy what you are reading? Consider supporting pdpEcho

Summary of the Opinion of AG Kokott in Puškár (on effective judicial remedies and lawful grounds for processing other than consent)

The Opinion of Advocate General Kokott in C-73/16 Puškár was published on 30 March and remained under the radar, even though it deals with a couple of very important questions for EU data protection law that may have wide implications: effective judicial remedies, lawful grounds for processing other than consent, the right to access one's own personal data. As a bonus, the AG refers to and analyses Article 79 GDPR – the right to a judicial remedy.

The analysis regarding effective judicial remedies under Article 47 Charter and Directive 95/46 could be relevant for the debate on essential equivalence when it comes to adequacy decisions for international data transfers (for those of you who don't remember, one of the two main findings in Schrems was that the Safe Harbor framework touched the essence of the right to effective judicial remedies, thus breaching Article 47 Charter). In this sense, the AG finds that a measure that does not restrict the category of people who could in principle have recourse to judicial review does not touch the essence of this right. A contrario, if a measure does restrict these categories of people, it would touch the essence of the right to an effective judicial remedy and therefore breach the Charter.

Finally, a question of great importance for EU law in general is also tackled: what should national courts do when the case-law of the CJEU and the case-law of the ECtHR diverge regarding the protection of fundamental rights?

Here is what you will further read:

  1. Facts of the case and questions referred to the CJEU
  2. Requiring claimants to exhaust administrative remedies before going to Court can be compatible with the right to effective judicial remedy
  3. Internal documents of a tax authority obtained without the consent of the authority must be admitted as evidence if they contain personal data of the person who obtained the documents
  4. The performance of a task in the public interest allows a tax authority to create a black list without the consent of the persons concerned, if this task was legally assigned to the tax authority and the list’s use is appropriate and necessary (Article 7 and 8 Charter are not breached in this case)
  5. A missed opportunity to better define the difference between the right to privacy and the right to personal data protection
  6. Where ECtHR and CJEU case-law diverge, national courts have to ask the CJEU on how to proceed when the ECtHR case-law provides a higher level of protection for the rights of a person
  7. What to expect from the Court

Note that all highlights from the post are made by the author.

  1. Facts of the case and questions referred to the CJEU

C-73/16 Puškár concerns the request of Mr Puškár to have his name removed from a blacklist kept by the Finance Directorate of Slovakia which contains names and national ID numbers for persons “who purport to act, as ‘fronts’, as company directors”. The list associates a legal person or persons with a natural person who supposedly acted on their behalf (§15) and is created for the purposes of tax administration and combating tax fraud (§23, 2nd question for a preliminary ruling). It transpires from several paragraphs of the Opinion that Mr Puškár found out about the list, and about the fact that he is on it, from a leak (§23, 2nd question; §72; §76). Instead of relying on the more straightforward right to erasure or right to object under data protection law, Mr Puškár claimed that “his inclusion in the above mentioned list infringes his personal rights, specifically the right to the protection of his good name, dignity and good reputation” (§16).

The Supreme Court rejected his claims, partly on procedural issues, partly on substantive grounds (§18). Later, the Constitutional Court found that “the Supreme Court infringed the fundamental right to the protection of personal data against unauthorised collection and other abuses, in addition to the right to privacy”, quashed its decision and sent the case back to the Supreme Court for retrial, grounding its findings on ECtHR case-law (§20). In the context of these second-round proceedings, the Supreme Court sent questions for a preliminary ruling to the CJEU to essentially clarify:

  • whether the right to an effective remedy under Article 47 of the Charter in the context of data protection is compatible with a national law requirement that a claimant must first exhaust the procedures available under administrative law (administrative complaints) before going to Court;
  • whether the legitimate grounds for processing under Directive 95/46 and Articles 7 and 8 of the Charter preclude tax authorities to create such a blacklist without the consent of the individuals on the list;
  • whether the list obtained by the claimant without the consent of the tax authorities is admissible as evidence;
  • whether national courts should give precedence to the case-law of the CJEU or the case-law of the ECtHR on a specific topic where the two diverge.
  2. Requiring claimants to exhaust administrative remedies before going to Court can be compatible with the right to effective judicial remedy

To reply to the first question, AG Kokott looks at Articles 28(4) and 22 of Directive 95/46 and also at Article 79 of the General Data Protection Regulation, which will replace Directive 95/46 from 25 May 2018.

Article 28(4) of Directive 95/46 states that each supervisory authority (Data Protection Authority) is to hear claims lodged by any person concerning the protection of his rights and freedoms with regard to the processing of personal data. Article 22 provides that, without prejudice to the remedy referred to in Article 28(4), every person is to have a right to a judicial remedy for any breach of the rights guaranteed him by the national law applicable to the processing in question (§37, §38).

In practice, this means that an individual who engages in Court proceedings for a breach of data protection law must be able to also initiate administrative proceedings with a DPA (complaints lodged with DPAs).

The same rule is kept under Article 79 GDPR, slightly broadened: the right to a judicial remedy must be effective and must be granted without prejudice to any administrative or non-judicial remedy, including the right to lodge a complaint with a supervisory authority. 

AG Kokott explains that these rules still do not clarify “whether the bringing of legal proceedings may be made contingent upon exhaustion of another remedy. All that can be taken from Article 79 of the General Data Protection Regulation is that the judicial remedy must be effective. An obligation to exhaust some other remedy before bringing legal proceedings will consequently be impermissible if the judicial remedy is rendered ineffective as a result of this precondition” (§43).

The AG found that Article 47(1) of the Charter and the principle of effectiveness “ultimately embody the same legal principle” and that they can be examined jointly using the rules in Articles 47(1) and 52(1) of the Charter – which is the provision that enshrines the rules for limiting the exercise of the fundamental rights in the Charter (§51). Hence, the question is whether the obligation to exhaust administrative procedures before going to Court amounts to a justified interference with the right to an effective judicial remedy.

AG Kokott remarks that the interference is provided for by Slovakian law and that it does not touch the essence of the right to effective judicial remedy because “it does not restrict the category of people who could in principle have recourse to judicial review” (§56). [Small comment here: this means that a provision which would restrict the category of people who could in principle have recourse to judicial review touches the essence of the right in Article 47 Charter. Check out paragraphs 45 and 46 of the EDPS Opinion on the EU-US Umbrella Agreement commenting on the fact that Article 19 of the Agreement provides for the possibility of judicial redress only for citizens of the EU, excluding thus categories of individuals that would otherwise be covered by the Charter, such as asylum seekers and residents].

It remained to be analysed whether the interference complies with the principle of proportionality, which “requires that a measure be ‘appropriate, necessary and proportionate to the objective it pursues’” (§58). The AG retains the submission of the Supreme Court that “the exhaustion of the administrative remedy represents a gain in efficiency, as it provides the administrative authority with an opportunity to remedy the alleged unlawful intervention, and saves it from unwanted court proceedings” (§59). The AG considers that “obligatory preliminary proceedings are undoubtedly appropriate for achieving the objectives” and that a “less onerous method” does not suggest itself as capable of realising them to the same extent (§62).

However, the AG points out that the “specific form” of the administrative remedy is important to determine the appropriateness of the measure in practice. This condition applies in particular if there is uncertainty “as to whether the time limit for bringing an action begins to run before a decision has been made in the administrative action” (§64). Additionally, Article 47(2) Charter establishes the right of every person to have their case dealt with within a reasonable period of time. “While this right in fact relates to judicial proceedings, naturally it may not be undermined by a condition for the bringing of an action” (§67).

In conclusion, the AG considers that the right to effective judicial review under Article 47 Charter and the principle of effectiveness “do not preclude an obligation to exhaust an administrative remedy being a condition on bringing legal proceedings if the rules governing that remedy do not disproportionately impair the effectiveness of judicial protection. Consequently, the obligatory administrative remedy must not cause unreasonable delay or excessive costs for the overall legal remedy” (§71).

  3. Internal documents of a tax authority obtained without the consent of the authority must be admitted as evidence if they contain personal data of the person who obtained the documents

Essentially, the question asked by the Supreme Court is whether the contested list may be excluded as evidence due to the fact that it came into the possession of the claimant without the consent of the competent authorities (§72).

The AG considers that “a review should be carried out to determine whether the person affected has a right of access to the information in question. If this were the case, the interest in preventing unauthorized use would no longer merit protection” (§83).

Further, it is recalled that “under the second sentence of Article 8(2) of the Charter and Article 12 of the Data Protection Directive, everyone has the right of access to data which has been collected concerning him or her. This also applies in principle to data being recorded in the contested list. Furthermore, the persons so affected would, by virtue of the collection of the data, have to be informed of the use of the data, under either Article 10 or Article 11 of the Data Protection Directive” (§85).

While indeed Article 13 of the Directive allows this right to information to be restricted, it also “expressly requires that such restrictions be imposed by legislative measures” (§86). The AG acknowledged that “there is a potential risk that inspection and monitoring activities based on the list would be less effective if it were known who was named on that list” (§87). However, the national Court must examine:

  • “whether a restriction of the right of information of this kind is provided for” (§88) and
  • “where appropriate” if it is “justified” (§88). This is an indication that even if such an exemption would be provided for by law, a further analysis is needed to see whether the exemption is justified.

A key point the AG makes is that “even if there are indications of a legitimate interest in a hypothetical, legally justified non-disclosure of the list in question, the national courts must also examine whether in the individual case these outweigh the legitimate interests of the individual in bringing the proceedings” (§89). This is important because it is a clear indication that when a controller relies on their legitimate interest as a ground for processing, it always has to engage in a balancing exercise with the legitimate interests (and rights) of the data subject.

In conclusion, the AG established that refusing to accept as evidence a document obtained by the claimant without the consent of an authority is not possible under the principle of a fair hearing in Article 47 Charter when the document contains personal data of the claimant, which the authority is required to disclose to the claimant under Article 12 and 13 of the Data Protection Directive.

  4. The performance of a task in the public interest allows a tax authority to create a black list without the consent of the persons concerned, if this task was legally assigned to the tax authority and the list’s use is appropriate and necessary (Article 7 and 8 Charter are not breached in this case)

The Supreme Court wanted to know whether the fundamental right to privacy (Article 7 Charter) and protection of personal data (Article 8 Charter) and the Data Protection Directive prohibit a Member State from creating a list of personal data for the purposes of tax collection without the consent of the persons concerned.

The AG points out that “this question is primarily to be answered in the light of the Data Protection Directive, as this specifies the rights to privacy and data protection” (§95).

The AG further recalls that Article 7 of the Data Protection Directive allows processing of personal data if it is based on one of the six lawful grounds for processing provided for (§99) [NB: of which only one is “consent”!]. While the AG acknowledges that three of the six conditions are applicable in this case (1 – performance of a task in the public interest [Article 7(e)]; 2 – legitimate interest of the controller [Article 7(f)] and 3 – necessity of compliance with a legal obligation [Article 7(c)]), she considers the examination of the latter 2 as “superfluous”: “This is because all parties acknowledge that tax collection and combating tax fraud are tasks in the public interest within the meaning of Article 7(e) of the Data Protection Directive” (§100).

A welcome clarification is further brought by the AG, who specifies that Article 7(e) of the Data Protection Directive “must be read in conjunction with the principles of Article 6. According to Article 6(1)(b), personal data must only be collected for specified, explicit and legitimate purposes. Within the scope of Article 7(e), the purpose of the data processing is inseparably linked to the delegated tasks. Consequently, the transfer of the task must clearly include the purpose of the processing” (§106).

This clarification is welcome because it reminds controllers that even if they correctly process personal data on one of the lawful grounds for processing (such as consent or legitimate interest) in compliance with Article 7 of the Directive, they still have to comply with all the other safeguards for processing personal data, including the principles for processing in Article 6 of the Directive (purpose limitation, data minimisation etc.).

The AG remarks that the reference for a preliminary ruling does not specify the purpose of the contested list and leaves it to the Supreme Court to look further into this question (§107). Additionally, the AG also considers that the Supreme Court “will have to examine whether the creation and use of the contested list and in particular the naming of Mr Puškár is necessary for the claimed public interest”. This is yet another reminder of how important “necessity” is for personal data protection in the EU legal framework (check out the EDPS's recently published “Necessity Toolkit”).

Another very interesting point that the AG brings forward is how naming a person on this black list constitutes “a considerable interference with the rights of the person concerned”, beyond the right to privacy in Article 7 Charter – it also touches (§110):

  • “his reputation and could lead to serious, practical disadvantages in his dealings with the tax authorities;
  • the presumption of innocence in Article 48(1) of the Charter;
  • the legal persons associated with the person concerned, which will be affected in terms of their freedom to conduct business under Article 16 of the Charter”.

This finding is a testament to the importance of complying with the right to the protection of personal data, as non-compliance would have consequences for several other fundamental rights.

As the AG explains, “such a serious interference of this kind can only be proportionate if there are sufficient grounds for the suspicion that the person concerned purported to act as a company director of the legal persons associated with him and in so doing undermined the public interest in the collection of taxes and combating tax fraud” (§111).

In conclusion, the tax authorities can create a blacklist such as the one in the main proceedings on the grounds of Article 7(e) of the Data Protection Directive, but this assumes that (§117):

  • “the task was legally assigned to the tax authorities,
  • the use of the list is appropriate and necessary for the purposes of the tax authorities and
  • there are sufficient grounds to suspect that these persons should be on the list”.
  5. A missed opportunity to better define the difference between the right to privacy and the right to personal data protection

Further, the AG spelled out that “neither the fundamental rights to privacy, Article 7 of the Charter, or data protection, Article 8, would in this case prevent the creation and use of the list” (§117).

The analysis to reach this conclusion was another missed opportunity to persuade the Court of Justice to better delineate the two fundamental rights protected by Article 7 and Article 8 of the Charter. The AG referred to these as “the fundamental rights to privacy and data protection”.

Without a clear analysis of what constitutes interference with the two rights, the AG referred to “naming of a person on the contested list” as “affecting” both fundamental rights (§115). In the same paragraph, she further analysed en masse “these interferences”, writing that they are only justified “if they have a sufficient legal basis, respect the essence of both fundamental rights, and preserve the principle of proportionality” (§ 115). Considering that the legality and proportionality of the measure were addressed in previous sections, the AG merely stated that “the adverse effects associated with inclusion on the contested list, those interferences do not meet the threshold of a breach of the essence of those rights” before concluding that neither of the two Charter articles would prevent the creation of such a blacklist.

  6. Where ECtHR and CJEU case-law diverge, national courts have to ask the CJEU on how to proceed, even if the ECtHR case-law provides a higher level of protection for the rights of a person

The last question is one that is extremely interesting for EU lawyers in general, not necessarily for EU data protection lawyers, because it tackles the issue of different levels of protection of the same fundamental right emerging from the case-law of the Court of Justice of the EU in Luxembourg, on one hand, and the European Court of Human Rights in Strasbourg, on the other hand.

As the AG summarizes it, “the fourth question is aimed at clarifying whether a national court may follow the case-law of the Court of Justice of the European Union where this conflicts with the case-law of the ECtHR” (§118). This issue is relevant in our field because Article 8 of the European Convention of Human Rights shares partially the same material scope of Article 7 and Article 8 of the EU Charter of Fundamental Rights (Article 8 of the Convention is more complex), and Article 52(3) of the Charter states that “the rights in the Charter, which correspond to rights guaranteed by the European Convention on the Protection of Human Rights and Fundamental Freedoms (ECHR), have the same meaning and scope as conferred by the ECHR” (§122). However, the second sentence of Article 52(3) of the Charter permits EU law to accord more extensive protection (§122).

The AG specifies that “EU law permits the Court of Justice to deviate from the case-law of the ECtHR only to the extent that the former ascribes more extensive protection to specific fundamental rights than the latter. This deviation in turn is only permitted provided that it does not also cause another fundamental right in the Charter corresponding to a right in the ECHR to be accorded less protection than in the case-law of the ECtHR. One thinks, for example, of cases in which a trade-off must be made between specific fundamental rights” (§123).

Not surprisingly, the AG advises that when the case-law of the two Courts comes in conflict, the national courts should directly apply the case-law of the CJEU when it affords more protection to the fundamental rights in question, but they should send a reference for a preliminary ruling to the CJEU to ask which way to go when the case-law of the ECtHR affords enhanced protection to the fundamental right in question (§124 and §125). The argument of the AG is that the latter case “inevitably leads to a question of the interpretation of EU law with regard to the fundamental right in question and Article 52(3) of the Charter” which, if performed by the national Court, could further “amount to the view that the interpretation of the fundamental right in question by the Court of Justice is not compatible with Article 52(3)”.

As for the relevance of this question to the case at hand – it remains a mystery. The AG herself pointed out that “the admissibility of the question in this form is dubious, particularly as the Supreme Court does not state on which issue the two European courts supposedly are in conflict and the extent to which such a conflict is significant for the decision in the main proceedings” (§119).

  7. What to expect from the Court

How will the CJEU reply to these questions? My bet is that, in general, the Court will follow the AG on substance. However, it is possible that the Court will simplify the analysis and reformulate the questions in such a way that the answers will be structured around three main issues:

  • lawfulness of creating such a blacklist (and the lawful grounds for processing in the Data Protection Directive) and compatibility of this interference with both Article 7 and Article 8 of the Charter (I do hope, having low expectations nonetheless, that we will have more clarity of what constitutes interference with each of the two rights from the Court’s perspective);
  • compatibility of procedural law of Slovakia in the field of data protection with Article 47 Charter (in fact, this may be the only point where the Court could lay out a different result than the one proposed by the AG, in the sense that the condition to exhaust first administrative remedies before engaging in litigation may be considered a non-proportionate interference with the right to effective judicial remedy; it is also possible that the Court will refer for the first time directly to the GDPR);
  • the relationship between ECtHR and CJEU case-law on the same fundamental right.

Suggested citation: G. Zanfir-Fortuna, “Summary of the Opinion of AG Kokott in Puškár (on effective judicial remedies and lawful grounds for processing other than consent)”, pdpEcho.com, 24 April 2017.

***

If you find information on this blog useful and would like to read more of it, consider supporting pdpecho here: paypal.me/pdpecho.

CNIL publishes GDPR compliance toolkit

CNIL published this week a useful guide for all organisations that want to start getting ready for GDPR compliance but are asking themselves “where to start?”. The French DPA created a dedicated page for the new “toolkit”, detailing each of the six proposed steps towards compliance and referring to available templates (such as a template for the Register of processing operations and a template for data breach notifications – both in FR).

According to the French DPA, “the new ‘accountability’ logic under the GDPR must be translated into a change of organisational culture and should put in motion internal and external competences”.

The six steps proposed are:

  1. Appointing a “pilot”/”orchestra conductor” [n. – metaphors used in the toolkit], famously known as “DPO”, even if the controller is not under the obligation to do so. Having a DPO will make things easier.
  2. Mapping all processing activities (this step goes far beyond data mapping: it covers the processing operations themselves, not only the data being processed, and also includes cataloguing the purposes of the processing operations and identifying all sub-contractors relevant to them);
  3. Prioritising the compliance actions to be taken, using as starting point the Register and structuring the actions on the basis of the risks the processing operations pose to the rights and freedoms of individuals whose data are processed. Such actions could be, for instance, making sure that they process only the personal data necessary to achieve the purposes envisaged or revising/updating the Notice given to individuals whose data are processed (Articles 12, 13 and 14 of the Regulation);
  4. Managing the risks, which means conducting DPIAs for all envisaged processing operations that may result in a high risk to the rights of individuals. CNIL notes that the DPIA should be carried out before the personal data are collected and before the processing operation is put in place, and that it should contain: a description of the processing operation and its purposes; an assessment of the necessity and proportionality of the proposed processing operation; an estimation of the risks posed to the rights and freedoms of the data subjects; and the measures proposed to address these risks in order to ensure compliance with the GDPR.
  5. Organising internal procedures that ensure continuous data protection compliance, taking into account all possible scenarios that could intervene in the lifecycle of a processing operation. The procedures could refer to handling complaints, ensuring data protection by design, preparing for possible data breaches and creating a training program for employees.
  6. Finally, and quite importantly, Documenting compliance. "The actions taken and documents drafted for each step should be reviewed and updated periodically in order to ensure continuous data protection", according to the CNIL. The French DPA provides a list of documents that should be part of the "GDPR compliance file", such as the Register of processing operations and the contracts with processors.
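By way of illustration only, the Register of processing operations mentioned in steps 2 and 6 is, at bottom, a structured inventory. The sketch below models one register entry as a simple data structure; the field names are hypothetical, loosely echoing the items listed in Article 30(1) GDPR, and are not taken from the CNIL template.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of one entry in a register of processing operations.
# Field names are illustrative (loosely based on Article 30(1) GDPR),
# not drawn from the CNIL template.
@dataclass
class ProcessingRecord:
    purpose: str                    # why the data are processed
    data_categories: List[str]      # which personal data are involved
    data_subjects: List[str]        # whose data are processed
    recipients: List[str] = field(default_factory=list)  # incl. sub-contractors
    retention: str = ""             # envisaged time limits for erasure
    security_measures: List[str] = field(default_factory=list)

# The register itself is then simply a list of such records.
register: List[ProcessingRecord] = [
    ProcessingRecord(
        purpose="payroll",
        data_categories=["name", "bank account"],
        data_subjects=["employees"],
        recipients=["external payroll processor"],
        retention="duration of employment + 5 years",
        security_measures=["access control", "encryption at rest"],
    )
]

print(len(register))         # number of documented processing operations
print(register[0].purpose)
```

Keeping the register in a structured form like this (rather than free-form text) makes step 3 – prioritising compliance actions per processing operation – considerably easier to automate.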

While this guidance is certainly helpful, it should be borne in mind that the only EU-wide official guidance is that adopted by the Article 29 Working Party. For the moment, the Working Party has published three Guidelines on the application of the GDPR – on the role of the DPO, on the right to data portability and on identifying the lead supervisory authority. The Group is expected to adopt guidance on Data Protection Impact Assessments during its next plenary.

If you are interested in other guidance issued by individual DPAs, here are some links:

NOTE: The guidance issued by CNIL was translated and summarised from French – do not use the translation as an official source. 

***

Find what you’re reading useful? Please consider supporting pdpecho.

WP29 published its 2017 priorities for GDPR guidance

The Article 29 Working Party published in mid-January its new set of priorities for providing GDPR guidance in 2017. This came after WP29 published in December three sets of much-awaited Guidelines on the application of the GDPR: on Data Protection Officers, on the right to data portability and on identifying the lead supervisory authority (pdpEcho intends to take a closer look at all of them in the following weeks). So what are the new priorities?

First of all, WP29 committed to finalise what was started in 2016 and was not adopted/finalised by the end of the year:

  • Guidelines on the certification mechanism;
  • Guidelines on processing likely to result in a high risk and Data Protection Impact Assessments;
  • Guidance on administrative fines;
  • Setting up admin details of the European Data Protection Board (e.g. IT, human resources, service level agreements and budget);
  • Preparing the one-stop-shop and the EDPB consistency mechanism.

Secondly, WP29 committed to start assessments and provide guidance on:

  • Consent;
  • Profiling;
  • Transparency.

Lastly, in order to take into account the changes brought by the GDPR, WP29 intends to update the already existing guidance on:

  • International data transfers;
  • Data breach notifications.

If you want to be a part of the process, there is good news. WP29 plans to organise another FabLab on April 5 and 6 on the new priorities for 2017, where "interested stakeholders will be invited to present their views and comments". For more details, check this link regularly.

It seems we’re going to have a busy year.


A million dollar question, literally: Can DPAs fine a controller directly on the basis of the GDPR, or do they need to wait for national laws?

by Gabriela Zanfir-Fortuna

The need to discuss the legal effect of the GDPR emerged because some opinions in the privacy bubble claim that it will take at least a couple of years before the GDPR de facto has legal effect at national level after it becomes applicable in 2018. The main argument for this thesis is that national parliaments of the Member States will need to take action in one way or another, or that national governments will need to issue executive orders granting new powers to supervisory authorities, including the power to fine.

This post will bring forward some facts emerging from EU primary law and from the case-law of the Court of Justice of the EU (CJEU) that need to be taken into account before talking about such a de facto grace period.

The conclusion is that, just like all EU regulations, the GDPR is directly applicable and has immediate effect from the date it becomes applicable according to its publication in the EU Official Journal (in this case, 25 May 2018), with no other national measures being required to give it effect in the Member States (not even translations at national level). While it is true that it contains provisions that give a margin of appreciation to Member States if they wish to intervene, most of the articles are sufficiently clear, detailed and straightforward to allow direct application, if need be (for instance, if a Member State is late in adjusting and adapting its national data protection law).

1) EU regulations enjoy “direct applicability”: the rule is that they are “immediately applicable” and they don’t need national transposition

First and foremost, it is a fact emerging from the EU treaties that EU Regulations enjoy direct applicability, which means that once they become applicable they do not need to be transposed into national law.

This rule is set out in the second paragraph of Article 288 of the Treaty on the Functioning of the European Union (TFEU), which states that:

“A regulation shall have general application. It shall be binding in its entirety and directly applicable in all Member States.”

On the contrary, according to the third paragraph of Article 288 TFEU, directives “shall be binding, as to the result to be achieved, upon each Member State to which it is addressed, but shall leave to the national authorities the choice of form and methods.”

Therefore, as the CJEU explained in settled case-law, "by virtue of the very nature of regulations and of their function in the system of sources of Community law, the provisions of those regulations generally have immediate effect in the national legal systems without it being necessary for the national authorities to adopt measures of application" (see Case C-278/02 Handlbauer, 2004, §25 and Case 93/71 Leonesio, 1972, §5), and in addition they also "operate to confer rights on individuals which the national courts have a duty to protect" (Case C-70/15 Lebek, 2016, §51).

However, the CJEU also ruled that "some of their provisions may nonetheless necessitate, for their implementation, the adoption of measures of application by the Member States" (Case C-278/02 Handlbauer, 2004, §26; C-403/98 Monte Arcosu, 2001, §26). But this is not the case for sufficiently clear and precise provisions, where Member States enjoy no margin of manoeuvre. For instance, the Court found in Handlbauer that "this is not the case as regards Article 3(1) of Regulation No 2988/95 which, by fixing the limitation period for proceedings at four years as from the time when the irregularity is committed, leaves the Member States no discretion nor does it require them to adopt implementation measures" (§27).

Therefore, whenever an EU regulation leaves the Member States no discretion, nor does it require them to adopt implementation measures, the provisions of that regulation are directly and immediately applicable as they are.

2) EU regulations’ direct applicability is not depending on any national measure (not even translation published in national official journals)

The CJEU explained as far back as 1973 that for EU regulations to take effect in national legal systems of Member States there is not even the need to have their texts translated and published in the national official journals.

Asked whether the provisions of a Regulation can be "introduced into the legal order of Member States by internal measures reproducing the contents of Community provisions in such a way that the subject-matter is brought under national law", the Court replied that "the direct application of a Regulation means that its entry into force and its application in favour of or against those subject to it are independent of any measure of reception into national law" (Case 34/73 Variola, 1973, §9 and §10). AG Kokott explained that such measures include "any publicity by the Member States" (Opinion in C-161/06 Skoma-Lux, §54), in an Opinion that was substantially upheld by the Court in a judgment holding that the publication of a regulation in the Official Journal of the EU in an official language of a Member State is the only condition needed to give it effect and direct applicability in that Member State (Judgment in Case C-161/06).

The Court concluded in Variola that “a legislative measure under national law which reproduces the text of a directly applicable rule of Community law cannot in any way affect such direct applicability, or the Court’s jurisdiction under the Treaty” (operative part of the judgment). The Court also explained in Variola that “by virtue of the obligations arising from the Treaty and assumed on ratification, Member States are under a duty not to obstruct the direct applicability inherent in Regulations and other rules of Community law. Strict compliance with this obligation is an indispensable condition of simultaneous and uniform application of Community Regulations throughout the Community” (Case 34/73 Variola, 1973, §10).

3) National authorities could impose administrative penalties directly on the basis of a provision of a Regulation, where necessary 

The Court dealt with the question of national authorities imposing administrative fines directly on the basis of the provisions of an EU regulation in Case C-367/09 Belgisch Interventie- en Restitutiebureau, on the interpretation of provisions of Regulation 2988/95.

After recalling its case-law on direct applicability of EU regulations (§32), including the exemption that some provisions of a Regulation necessitate for their implementation the adoption of measures of application (§33), the CJEU found that in that specific case national authorities cannot impose fines directly on the basis of Articles 5 and 7 of Regulation 2988/95 because “those provisions merely lay down general rules for supervision and penalties for the purpose of safeguarding the EU’s financial interests (…). In particular, those provisions do not specify which of the penalties listed in Article 5 of Regulation No 2988/95 should be applied in the case of an irregularity detrimental to the EU’s financial interests nor the category of operators on whom such penalties are to be imposed in such cases” (§36).

Therefore, the Court did not question the possibility for a national authority to impose fines directly on the legal basis provided by a regulation. The CJEU went directly to analyse the content of the relevant provision and found that fines could not be imposed because of the general character of that provision, which required additional measures to be adopted both at Member State and at EU level (had the provisions been clearer, the authorities could have issued fines directly on the basis of the regulation).

One look at Article 83 GDPR and one can easily tell that this is not the case for that provision – it is clear who imposes fines, for what, against whom, on what criteria and what the maximum amount for each category of fines is. Neither is it the case for Article 58 on the powers of supervisory authorities. Article 83 GDPR allows Member States some discretion only if they wish to provide specific rules for fining public authorities (paragraph 7) and only if their legal system does not provide for administrative fines – in that case, the states are allowed to apply Article 83 in such a manner that the fine is initiated by the competent supervisory authority and imposed by competent national courts (paragraph 9).

4) Conclusion: beware of the GDPR from day 1

The GDPR, like all EU regulations, is directly applicable and has immediate effect in the legal order of Member States by virtue of its publication in the Official Journal of the EU and the conditions of applicability in time expressed therein, no additional national measures being required to give it effect.

While there are provisions that give Member States a margin of appreciation and a discretion to implement national measures, most of the provisions are sufficiently clear and precise to be applied as they are.

Of course there will be national data protection laws that specify additional rules to the GDPR, giving effect to that margin of appreciation. But the national laws that complement an EU regulation, such as the GDPR, are valid only as long as "they do not obstruct its direct applicability and do not conceal its [EU] nature, and if they specify that a discretion granted to them by that regulation is being exercised, provided that they adhere to the parameters laid down under it" (CJEU, Case C‑316/10 Danske Svineproducenter v Justitsministeriet, §41).

As always, here is the fine print (or the caveat) whenever we discuss the interpretation of EU law: only the CJEU has the authority to interpret EU law in a binding manner.

(Note: The author is grateful to dr. Mihaela Mazilu-Babel, who provided support with preliminary research for this post)

***

Find what you’re reading useful? Please consider supporting pdpecho.