Door-to-door gathering of data by religious group goes to the CJEU

Non-automated processing | Filing system | Household Exemption | Controller | Religious community

The Court of Justice of the EU received a request for a preliminary ruling from Finland regarding the practice of a religious group (Jehovah’s Witnesses) of gathering and recording data after door-to-door visits, without informing the individuals concerned about this practice. The questions referred in Case C-25/17 Tietosuojavaltuutettu v Jehovah’s Witnesses concern the interpretation of several key points of Directive 95/46:

  1. Exceptions from the application of the Directive – and particularly the second indent of Article 3(2), which excludes processing “by a natural person in the course of a purely personal or household activity” from the material scope of the Directive. The referring court wants the CJEU to clarify whether this exception applies to the gathering of data and the writing of observations in a paper file in connection with the door-to-door activity by members of the religious group (Question 1).
  2. The concept of “filing system” as defined in Article 2(c) of the Directive. The question referred by the national Court is whether, taken as a whole, the manual collection of personal data (name, address and other information and characteristics of a person) carried out in connection with door-to-door evangelical work constitutes a filing system, being thus subject to the application of the Directive (Question 2).
  3. The concept of “controller” under Article 2(d) of the Directive. In particular, the referring court wants the CJEU to clarify whether in this situation the controller is considered to be the religious community as a whole, “even though the religious community claims that only the individual members carrying out evangelical work have access to the data collected” (Questions 3 and 4).

Without knowing the details of the case, and based only on the information available in the questions referred by the national Court, here is my bet on how the CJEU will reply:

  • The definition of “purely household activity” does not extend to the door-to-door evangelical work of a religious community; this exemption is to be interpreted strictly (“must be narrowly construed”; “must apply only in so far as is strictly necessary”), according to the CJEU in C-212/13 Rynes (§28 and §29). The CJEU also explained that this exception applies “only where it is carried out in the purely personal or household setting of the person processing the data” (§31) – which is not the case for representatives of a religious community gathering information during evangelical work.
  • The records the evangelical workers keep should be considered as constituting a “filing system”. This concept is defined as “any structured set of personal data which are accessible according to specific criteria, whether centralized, decentralized or dispersed on a functional or geographical basis”. According to Recital 15 of the Directive, data in a filing system is “structured according to specific criteria relating to individuals, so as to permit easy access to the personal data in question”. If the religious community were to claim that its records are not structured according to specific criteria – e.g. ZIP codes; members of the community/non-members; individuals interested in the community/individuals not interested – and that they do not allow easy access to the personal data in question, then the purpose of keeping a detailed record would not be achieved. In other words, an unstructured file is incongruent with the purpose of the activity. While it is true that the Member States have been given a margin of appreciation to lay down different criteria for determining the constituents of a structured set of personal data and the different criteria governing access to such a set, those criteria must be compatible with the definition in the Directive. Moreover, applying the definition loosely would amount to a limitation in relation to the protection of personal data, which must apply “only in so far as is strictly necessary” (Rynes §28, DRI §52).
  • The controller of this processing operation should be considered to be the religious community, as this entity establishes the purposes of the processing activity (the records are probably meant to facilitate the evangelical work of the community – there is no reference in the questions referred to the declared purpose of this activity, but it is only logical that such records are kept to facilitate the evangelical work) and the means of this activity (“by dividing up the areas in which the activity is carried out among members involved in evangelical work, supervising the work of those members and maintaining a list of individuals who do not wish to receive visits from evangelists” – according to the referring Court).

Since this new case provided an opportunity to discuss processing of personal data done by a religious community, there are a couple of additional points to be made.

First of all, according to Recital 35 of the Directive, “processing of personal data by official authorities for achieving aims, laid down in constitutional law or international public law, of officially recognized religious associations is carried out on important grounds of public interest”. This means that religious associations do not need to rely on consent or on their legitimate interest as lawful grounds for processing. However, relying on public interest as the lawful ground for processing does not mean that they do not have to comply with all the other obligations under data protection law. For instance, they still have to comply with the data quality principles, they still have to inform data subjects about the details of the processing activity and they still have to reply to requests for access, correction and erasure.

Second, some of the data gathered in such circumstances is sensitive data, as it refers to “religious beliefs” (Article 8 of the Directive, Article 9 of the GDPR). This means that the data should be processed with additional care and strengthened safeguards.

In case you are wondering whether the GDPR specifically addresses processing of data by religious communities and churches, Recital 35 of the Directive was transplanted to the GDPR as Recital 55. In addition, the GDPR enshrines a specific provision that covers “existing data protection rules of churches and religious associations” – Article 91. This provision allows Member States that have specific legislation (“comprehensive rules”) dedicated to churches and religious communities in place at the time of entry into force of the GDPR to continue to apply those rules, but only if “they are brought into line with this Regulation”. In addition, according to the second paragraph, processing of personal data done by churches and religious associations that apply comprehensive national rules according to the first paragraph “shall be subject to the supervision of an independent supervisory authority, which may be specific”. Again, the condition for this is that the specific supervisory authority must fulfil the conditions laid down for independent supervisory authorities in the GDPR.

***

Note: Thanks to Dr. Mihaela Mazilu-Babel for pointing out this new case.

Find what you’re reading useful? Please consider supporting pdpecho.


CNIL publishes GDPR compliance toolkit

CNIL published this week a useful guide for all organisations that want to start getting ready for GDPR compliance but are asking themselves “where to start?”. The French DPA created a dedicated page for the new “toolkit”, detailing each of the six proposed steps towards compliance and referring to available templates (such as a template for the Register of processing operations and a template for data breach notifications – both in FR).

According to the French DPA, “the new ‘accountability’ logic under the GDPR must be translated into a change of organisational culture and should put in motion internal and external competences”.

The six steps proposed are:

  1. Appointing a “pilot”/”orchestra conductor” [n. – metaphors used in the toolkit], famously known as “DPO”, even if the controller is not under the obligation to do so. Having a DPO will make things easier.
  2. Mapping all processing activities. The proposed step goes far beyond data mapping: it refers to the processing operations themselves, not only to the data being processed, and it also covers cataloguing the purposes of the processing operations and identifying all sub-contractors relevant to the processing operations;
  3. Prioritising the compliance actions to be taken, using the Register as a starting point and structuring the actions on the basis of the risks the processing operations pose to the rights and freedoms of the individuals whose data are processed. Such actions could be, for instance, making sure that only the personal data necessary to achieve the envisaged purposes are processed, or revising/updating the Notice given to the individuals whose data are processed (Articles 12, 13 and 14 of the Regulation);
  4. Managing the risks, which means conducting DPIAs for all processing operations envisaged that may potentially result in a high risk for the rights of individuals. CNIL mentions that the DPIA should be done before collecting personal data and before putting in place the processing operation and that it should contain a description of the processing operation and its purposes; an assessment of the necessity and the proportionality of the proposed processing operation; an estimation of the risks posed to the rights and freedoms of the data subjects and the measures proposed to address these risks in order to ensure compliance with the GDPR.
  5. Organising internal procedures that ensure continuous data protection compliance, taking into account all possible scenarios that could intervene in the lifecycle of a processing operation. The procedures could refer to handling complaints, ensuring data protection by design, preparing for possible data breaches and creating a training program for employees.
  6. Finally, and quite importantly, documenting compliance. “The actions taken and documents drafted for each step should be reviewed and updated periodically in order to ensure continuous data protection”, according to the CNIL. The French DPA provides a list of documents that should be part of the “GDPR compliance file”, such as the Register of processing operations and the contracts with processors.
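For organisations that track the Register electronically, the mapping and prioritisation steps above can be pictured as a simple record structure. The following is a minimal sketch only: the field names, the `high_risk` flag and the helper function are illustrative assumptions, not an official CNIL or GDPR template.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProcessingRecord:
    """One illustrative entry in a register of processing operations."""
    activity: str                 # e.g. "newsletter", "payroll"
    purposes: List[str]           # why the data is processed (step 2)
    data_categories: List[str]    # what personal data is involved (step 2)
    processors: List[str] = field(default_factory=list)  # sub-contractors (step 2)
    high_risk: bool = False       # flags the operation for a DPIA (step 4)

def dpia_candidates(register: List[ProcessingRecord]) -> List[str]:
    """Step 3: prioritise actions by risk -- list activities that need a DPIA."""
    return [r.activity for r in register if r.high_risk]

register = [
    ProcessingRecord("newsletter", ["direct marketing"], ["email address"]),
    ProcessingRecord("health survey", ["medical research"], ["health data"],
                     processors=["survey vendor"], high_risk=True),
]
print(dpia_candidates(register))  # ['health survey']
```

Keeping the register in this structured form also feeds step 6, since the same records can be exported as part of the compliance file.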

While this guidance is certainly helpful, it should be borne in mind that the only EU-wide official guidance is that adopted by the Article 29 Working Party. For the moment, the Working Party has published three sets of Guidelines for the application of the GDPR – on the role of the DPO, on the right to data portability and on identifying the lead supervisory authority. The Group is expected to adopt guidance on Data Protection Impact Assessments during its next plenary.

If you are interested in other guidance issued by individual DPAs, here are some links:

NOTE: The guidance issued by CNIL was translated and summarised from French – do not use the translation as an official source. 

***


CJEU in Manni: data subjects do not have the right to obtain erasure from the Companies Register, but they do have the right to object

by Gabriela Zanfir-Fortuna

The recent judgment of the CJEU in Case C-398/15 Manni (9 March 2017) brings a couple of significant points to the EU data protection case-law:

  • Clarifies that an individual seeking to limit the access to his/her personal data published in a Companies Register does not have the right to obtain erasure of that data, not even after his/her company ceased to exist;
  • Clarifies that, however, that individual has the right to object to the processing of that data, based on his/her particular circumstances and on justified grounds;
  • Clarifies the link between the purpose of the processing activity and the data retention period, and underlines how important the purpose of the processing activity is when analysing whether a data subject can obtain erasure or blocking of data;
  • Provides insight into the balancing exercise between interests of third parties to have access to data published in the Companies Register and the rights of the individual to obtain erasure of the data and to object to its processing.

This commentary will highlight all points enumerated above.

1. Facts of the case

Mr Manni had requested his regional Chamber of Commerce to erase his personal data from the Public Registry of Companies, after he found out that he was losing clients who performed background checks on him through a private company specialised in finding information in the Public Registry. This happened because Mr Manni had been an administrator of a company that was declared bankrupt more than 10 years before the facts in the main proceedings. In fact, the former company itself had been struck off the Public Registry (§23 to §29).

2. The question in Manni

The question that the CJEU had to answer in Manni was whether the obligation of Member States to keep public Companies Registers[1] and the requirement that personal data must be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected[2] must be interpreted as meaning that individuals must be allowed to “request the authority responsible for maintaining the Companies Register to limit, after a certain period has elapsed from the dissolution of the company concerned and on the basis of a case-by-case assessment, access to personal data concerning them and entered in that register” (§30).

3. Applicability of Directive 95/46 (Data Protection Directive – ‘DPD’)

First, CJEU clarified that its analysis does not concern processing of data by the specialized rating company, and it only refers to the obligations of the public authority keeping the companies register (§31). Second, the CJEU ascertained that the provisions of the DPD are applicable in this case:

  • the identification data of Mr Manni recorded in the Register is personal data[3] – “the fact that information was provided as part of a professional activity does not mean that it cannot be characterized as personal data” (§34);
  • the authority keeping the register is a “controller”[4] that carries out “processing of personal data”[5] by “transcribing and keeping that information in the register and communicating it, where appropriate, on request to third parties” (§35).

4. The role of the data quality principles and the legitimate grounds for processing in ensuring a high level of protection of fundamental rights

Further, CJEU recalls its case-law stating that the DPD “seeks to ensure a high level of protection of the fundamental rights and freedoms of natural persons” (§37) and that the provisions of the DPD “must necessarily be interpreted in the light of the fundamental rights guaranteed by the Charter”, and especially Articles 7 – respect for private life and 8 – protection of personal data (§39). The Court recalls the content of Articles 7 and 8 and specifically lays out that the requirements under Article 8 Charter “are implemented inter alia in Articles 6, 7, 12, 14 and 28 of Directive 95/46” (§40).

The Court highlights the significance of the data quality principles and the legitimate grounds for processing under the DPD in the context of ensuring a high level of protection of fundamental rights:

“[S]ubject to the exceptions permitted under Article 13 of that directive, all processing of personal data must comply, first, with the principles relating to data quality set out in Article 6 of the directive and, secondly, with one of the criteria for making data processing legitimate listed in Article 7 of the directive” (§41 and case-law cited).

The Court applies this test in reverse order, which is, indeed, more logical. A processing activity should first be legitimised under one of the lawful grounds for processing, and only after ascertaining that this is the case should the question of compliance with the data quality principles arise.

CJEU finds that in the case at hand the processing activity is legitimized by three lawful grounds (§42, §43):

  • compliance with a legal obligation [Article 7(c)];
  • the exercise of official authority or the performance of a task carried out in the public interest [Article 7(e)] and
  • the realization of a legitimate interest pursued by the controller or by the third parties to whom the data are disclosed [Article 7(f)].

5. The link between the data retention principle, the right to erasure and the right to object

Article 6(1)(e) of the DPD requires that personal data are kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected or for which they are further processed. This means that controllers should retain personal data only for as long as it serves the purpose for which it was processed, and should then anonymise, erase or otherwise make that data unavailable. If the controller does not comply with this obligation, the data subject has two possible avenues to stop the processing: he or she can either ask for erasure of that data, or object to the processing on the basis of his or her particular situation and a justified objection.
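To make the retention logic concrete, here is a minimal sketch in Python of an Article 6(1)(e)-style check; the function name, fields and grace period are hypothetical illustrations of the principle, not anything prescribed by the Directive or the judgment:

```python
from datetime import date, timedelta
from typing import Optional

def kept_longer_than_necessary(purpose_ended_on: Optional[date],
                               today: date,
                               grace: timedelta = timedelta(0)) -> bool:
    """True once data outlives the purpose it was collected for (cf. Art. 6(1)(e) DPD).

    A None end date means the purpose is still active, so retention remains lawful.
    The optional grace period stands in for a controller-defined retention policy.
    """
    if purpose_ended_on is None:
        return False
    return today > purpose_ended_on + grace

# Purpose still active: no obligation to erase or block yet.
print(kept_longer_than_necessary(None, date(2017, 3, 13)))              # False
# Purpose ended over a year ago: erasure or blocking can be requested.
print(kept_longer_than_necessary(date(2016, 1, 1), date(2017, 3, 13)))  # True
```

Manni shows the limit of any such mechanical rule: for Companies Registers the Court found it impossible to fix a single cut-off date, which is precisely why the case turned on the right to object instead.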

CJEU explains that “in the event of failure to comply with the condition laid down in Article 6(1)(e)” of the DPD, “Member States guarantee the person concerned, pursuant to Article 12(b) thereof, the right to obtain from the controller, as appropriate, the erasure or blocking of the data concerned” (§46 and C-131/12 Google/Spain §70).

In addition, the Court explains, Member States also must “grant the data subject the right, inter alia in the cases referred to in Article 7(e) and (f) of that directive, to object at any time on compelling legitimate grounds relating to his particular situation to the processing of data relating to him, save where otherwise provided by national legislation”, pursuant to Article 14(a) DPD (§47).

The CJEU further explains that “the balancing to be carried out under subparagraph (a) of the first paragraph of Article 14 … enables account to be taken in a more specific manner of all the circumstances surrounding the data subject’s particular situation. Where there is a justified objection, the processing instigated by the controller may no longer involve those data” (§47).

6. The pivotal role of the purpose of the processing activity in granting the right to erasure and the right to object

After establishing these general rules, the Court decides that in order to establish whether data subjects have the “right to apply to the authority responsible for keeping the register to erase or block the personal data entered in that register after a certain period of time, or to restrict access to it, it is first necessary to ascertain the purpose of that registration” (§48).

The pivotal role of the purpose of the processing operation should not come as a surprise, given the fact that the data retention principle is tightly linked to accomplishing the purpose of the processing operation.

In this case, the Court looked closely at Directive 68/151 and explained at length that the purpose of the disclosure provided for by it is “to protect in particular the interests of third parties in relation to joint stock companies and limited liability companies, since the only safeguards they offer to third parties are their assets” (§49) and “to guarantee legal certainty in relation to dealings between companies and third parties in view of the intensification of trade between Member States” (§50). CJEU also referred to primary EU law, and specifically to Article 54(3)(g) EEC, one of the legal bases of the directive, which “refers to the need to protect the interests of third parties generally, without distinguishing or excluding any categories falling within the ambit of that term” (§51).

The Court further noted that Directive 68/151 makes no express provision regarding the necessity of keeping personal data in the Companies Register “also after the activity has ceased and the company concerned has been dissolved” (§52). However, the Court notes that “it is common ground that even after the dissolution of a company, rights and legal relations relating to it continue to exist” (§53) and “questions requiring such data may arise for many years after a company has ceased to exist” (§54).

Finally, CJEU declared:

“in view of the range of possible scenarios … it seems impossible, at present, to identify a single time limit, as from the dissolution of a company, at the end of which the inclusion of such data in the register and their disclosure would no longer be necessary” (§55).

7. Conclusion A: there is no right to erasure

The Court concluded that “in those circumstances” the data retention principle in Article 6(1)(e) DPD and the right to erasure in Article 12(b) DPD do not guarantee for the data subjects referred to in Directive 68/151 a right to obtain “as a matter of principle, after a certain period of time from the dissolution of the company concerned, the erasure of personal data concerning them” (§56).

After already reaching this conclusion, the Court also explained that this interpretation of the provisions in question does not result in “disproportionate interference with the fundamental rights of the persons concerned, and particularly their right to respect for private life and their right to protection of personal data as guaranteed by Articles 7 and 8 of the Charter” (§57).

To this end, the Court took into account:

  • that Directive 68/151 requires “disclosure only for a limited number of personal data items” (§58) and
  • that “it appears justified that natural persons who choose to participate in trade through such a company are required to disclose the data relating to their identity and functions within that company, especially since they are aware of that requirement when they decide to engage in such activity” (§59).

8. Conclusion B: but there is a right to object

After acknowledging that, in principle, the need to protect the interests of third parties in relation to joint-stock companies and limited liability companies and to ensure legal certainty, fair trading and thus the proper functioning of the internal market take precedence over the right of the data subject to object under Article 14 DPD, the Court points out that

“it cannot be excluded, however, that there may be specific situations in which the overriding and legitimate reasons relating to the specific case of the person concerned justify exceptionally that access to personal data entered in the register is limited, upon expiry of a sufficiently long period after the dissolution of the company in question, to third parties who can demonstrate a specific interest in their consultation” (§60).

While the Court leaves it to the national courts to assess each case “having regard to all the relevant circumstances and taking into account the time elapsed since the dissolution of the company concerned”, it also points out that, in the case of Mr Manni, “the mere fact that, allegedly, the properties of a tourist complex built … do not sell because of the fact that potential purchasers of those properties have access to that data in the company register, cannot be regarded as constituting such a reason, in particular in view of the legitimate interest of those purchasers in having that information” (§63).

9. Post Scriptum

The Court took a very pragmatic approach in dealing with the case of Mr Manni. The principles of interpretation it laid down are solid – such an analysis indeed requires looking at the legitimate grounds for processing and the relevant data quality principle. Having the Court place strong emphasis on the significance of the purpose of the processing activity is welcome, as is having more guidance on the balancing exercise between the rights and interests in question. In addition, a separate assessment of the right to obtain erasure and of the right to object is very helpful with a view towards the future – the full entry into force of the GDPR and its heightened data subject rights.

The aspect of the judgment that leaves some room for improvement is the analysis of the proportionality of the interference that the virtually unlimited publication of personal data in the Companies Register represents for Articles 7 and 8 of the Charter. The Court does tackle this, but lightly – and it brings only two arguments, after having already declared that the interference is not disproportionate. Moreover, the Court does not distinguish between interferences with Article 7 and interferences with Article 8.

Finally, I was happy to see that the predicted outcome of the case, as announced in the pdpEcho commentary on the Opinion of the Advocate General Bot, proved to be mainly correct: “the Court will follow the AG’s Opinion to a large extent. However, it may be more focused on the fundamental rights aspect of balancing the two Directives and it may actually analyse the content of the right to erasure and its exceptions. The outcome, however, is likely to be the same.”

Suggested citation: G. Zanfir-Fortuna, “CJEU in Manni: data subjects do not have the right to obtain erasure from the Companies Register, but they do have the right to object”, pdpEcho.com, 13 March 2017.


[1] Article 3 of Directive 68/151.

[2] Article 6(1)(e) of Directive 95/46.

[3] Article 2(a) of Directive 95/46.

[4] Article 2(d) of Directive 95/46.

[5] Article 2(b) of Directive 95/46.

***

If you find information on this blog useful and would like to read more of it, consider supporting pdpecho here: paypal.me/pdpecho.

The right to be forgotten goes back to the CJEU (with Google, CNIL, sensitive data, freedom of speech)

The Conseil d’Etat announced today that it referred several questions to the Court of Justice of the EU concerning the interpretation of the right to be forgotten, pursuant to Directive 95/46 and following the CJEU’s landmark decision in the Google v Spain case.

The questions were raised within proceedings involving the application of four individuals to the Conseil d’Etat to have decisions issued by the CNIL (French DPA) quashed. These decisions rejected their requests for injunctions against Google to have certain Google Search results delisted.

According to the press release of the Conseil d’Etat, “these requests were aimed at removing links relating to various pieces of information: a video that explicitly revealed the nature of the relationship that an applicant was deemed to have entertained with a person holding a public office; a press article relating to the suicide committed by a member of the Church of Scientology, mentioning that one of the applicants was the public relations manager of that Church; various articles relating to criminal proceedings concerning an applicant; and articles relating the conviction of another applicant for having sexually aggressed minors”.

The Conseil d’Etat further explained that, in order to rule on these claims, it deemed it necessary to answer a number of questions “raising serious issues with regard to the interpretation of European law in the light of the European Court of Justice’s judgment in its Google Spain case”.

“Such issues are in relation with the obligations applying to the operator of a search engine with regard to web pages that contain sensitive data, when collecting and processing such information is illegal or very narrowly framed by legislation, on the grounds of its content relating to sexual orientations, political, religious or philosophical opinions, criminal offences, convictions or safety measures. On that point, the cases brought before the Conseil d’Etat raise questions in close connection with the obligations that lie on the operator of a search engine, when such information is embedded in a press article or when the content that relates to it is false or incomplete”.

***


CJEU case to follow: purpose limitation, processing sensitive data, non-material damage

A new case received by the General Court of the CJEU was published in the Official Journal of the EU in February, Case T-881/16 HJ v EMA.

A British citizen seeks to engage the non-contractual liability of the European Medicines Agency for breaching data protection law. The applicant claims that “the documents in his personal file, which were made public and accessible to any member of staff of the European Medicines Agency for a period of time, were not processed fairly and lawfully but were processed for purposes other than those for which they were collected without that change in purpose having been expressly authorised by the applicant”.

Further, the applicant claims that “the dissemination of that sensitive data consequently called into question the applicant’s integrity, causing him real and certain non-material harm”.

The applicant asks the Court to “order the defendant to pay the applicant the symbolic sum of EUR 1 by way of compensation for the non-material harm suffered”.

Even though the published summary makes no mention of the applicable law, it is clear that Regulation 45/2001 – the data protection regulation applicable to EU institutions and bodies (the EMA is an EU body) – is relevant in this case. The rules of Regulation 45/2001 are fairly similar to those of Directive 95/46.

(Thanks to Dr. Mihaela Mazilu-Babel for bringing this case to my attention)

***



Will the ePrivacy Regulation overshadow the GDPR in the age of IoT?

by Gabriela Zanfir-Fortuna

The ePrivacy draft regulation, published by the European Commission on Jan. 10, updates and upgrades Directive 2002/58/EC (the “ePrivacy directive”), the source of the infamous “cookies banner.” Under its official name – Proposal for a Regulation Concerning the Respect for Private Life and the Protection of Personal Data in Electronic Communications, the draft ePrivacy regulation reorganizes and even re-conceptualizes the system of protecting the privacy of electronic communications.

Armed with equally large fines and equally wide territorial application, the future ePrivacy rules may end up overshadowing the GDPR in the age of the internet of things due to their wide material scope of application which could potentially cover all data related to connected devices.

Protecting the fundamental right to confidentiality

With the proposal for an ePrivacy regulation distinct from the GDPR, the EU makes it clear that the two sets of rules correspond to different fundamental rights: The GDPR is primarily an expression of the fundamental right to the protection of personal data as enshrined in Article 8 of the EU Charter of Fundamental Rights, while the ePrivacy draft regulation details the right to respect for private life, as enshrined in Article 7 of the Charter (see Recital 1 of the proposal).

This differentiation is of great consequence, affecting the manner in which EU courts will interpret and apply the rules. The right to respect for private life is construed so as to restrict interferences with private life to a minimum, whereas the right to the protection of personal data is construed so as to provide “rules of the road” for how personal data must be used.

A telling example of ePrivacy rules intended to protect private life is the reference in the draft to the fact that “terminal equipment of end-users of electronic communications networks and any information relating to the usage of such terminal equipment … are part of the private sphere of the end-users requiring protection under the Charter of Fundamental Rights of the EU and the European Convention of Human Rights” (emphasis added). (Recital 20)

In addition, many member states (e.g., Germany, Italy, Netherlands, Greece, Romania, Denmark, Belgium, Poland, Estonia, Bulgaria, Czech Republic) provide in their Constitutions for a separate right to the “secret of correspondence” (distinct from the right to respect for private life), which can be restricted under limited situations and usually only for objectives of public safety. The rules of the proposal fall under the scope of this particular right, being thus capable of igniting national constitutional reviews – for instance, concerning the rules that allow access to content of communications.

Electronic communications data and information on smart devices, the centerpiece of the ePrivacy framework

The new system of protecting the confidentiality of communications is built around two concepts:

  • “Electronic communications data,” which includes electronic communications content and electronic communications metadata; and
  • “Information related to the terminal equipment of end-users.”

Article 2(1) of the proposal establishes that the regulation “applies to processing of electronic communications data carried out in connection with the provision and the use of electronic communications services and to information related to the terminal equipment of end-users”.

This represents an important change compared to the current regime of the ePrivacy directive, which is centered on processing personal data (see Article 3(1) of Directive 2002/58). For the new rules to be applicable, it will not matter whether the data going through electronic communications channels fall under the GDPR definition of personal data or not. Recital 4 of the proposal explains that “electronic communications data may include personal data as defined in [the GDPR],” meaning they may also not include personal data.

This article was originally published on iapp.org. Read the rest of the article HERE.

WP29 published its 2017 priorities for GDPR guidance

The Article 29 Working Party published in mid-January its new set of priorities for providing GDPR guidance in 2017. This came after WP29 published in December three sets of much-awaited Guidelines on the application of the GDPR: on Data Protection Officers, on the right to data portability and on identifying the lead supervisory authority (pdpEcho intends to take a closer look at all of them in the following weeks). So what are the new priorities?

First of all, WP29 committed to finalise what was started in 2016 and was not adopted/finalised by the end of the year:

  • Guidelines on the certification mechanism;
  • Guidelines on processing likely to result in a high risk and Data Protection Impact Assessments;
  • Guidance on administrative fines;
  • Setting up admin details of the European Data Protection Board (e.g. IT, human resources, service level agreements and budget);
  • Preparing the one-stop-shop and the EDPB consistency mechanism.

Secondly, WP29 committed to start assessments and provide guidance for:

  • Consent;
  • Profiling;
  • Transparency.

Lastly, in order to take into account the changes brought by the GDPR, WP29 intends to update the already existing guidance on:

  • International data transfers;
  • Data breach notifications.

If you want to be a part of the process, there is good news. WP29 plans to organise another FabLab on April 5 and 6 on the new priorities for 2017, where “interested stakeholders will be invited to present their views and comments”. For more details, regularly check this link.

It seems we’re going to have a busy year.


What’s new in research: networks of control built on digital tracking, new models of internet governance

pdpEcho is kicking off 2017 with a brief catalogue of interesting recently published research that sets the tone for the new year.

First, Wolfie Christl and Sarah Spiekermann‘s report on “Networks of Control”, published last month, is a must-read for anyone who wants to understand how the digital economy runs on the streams of data we all generate, while reflecting on the ethical implications of this economic model and proposing new models that would try to keep the surveillance society at bay. Second, a new report of the Global Commission on Internet Governance explores global governance gaps created by existing global governance structures developed in the analog age. Third, the US National Academy of Sciences recently published a report with concrete proposals on how to reconcile the use of different public and private sources of data for government statistics with privacy and confidentiality. Last, a volume by Angela Daly, recently published by Hart Publishing, explores how EU competition law, sector-specific regulation, data protection and human rights law could tackle concentrations of power for the benefit of users.


  1. “Networks of control. A Report on Corporate Surveillance, Digital Tracking, Big Data & Privacy”, by Wolfie Christl, Sarah Spiekermann  [OPEN ACCESS]

“Around the same time as Apple introduced its first smartphone and Facebook reached 30 million users in 2007, online advertisers started to use individual-level data to profile and target users individually (Deighton and Johnson 2013, p. 45). Less than ten years later, ubiquitous and real-time corporate surveillance has become a “convenient byproduct of ordinary daily transactions and interactions” (De Zwart et al 2014, p. 746). We have entered a surveillance society as David Lyon foresaw it already in the early 1990s; a society in which the practices of “social sorting”, the permanent monitoring and classification of the whole population through information technology and software algorithms, have silently become an everyday reality” (p. 118).

One of the realities we need to take into account when assessing this phenomenon is that “Opting out of digital tracking becomes increasingly difficult. Individuals can hardly avoid consenting to data collection without opting out of much of modern life. In addition, persons who don’t participate in data collection, who don’t have social networking accounts or too thin credit reports, could be judged as “suspicious” and “too risky” in advance” (p. 129).

The authors of the report explain that the title “Networks of Control” is justified “by the fact that there is not one single corporate entity that by itself controls today’s data flows. Many companies co-operate at a large scale to complete their profiles about us through various networks they have built up” (p. 7). They also explain that they want to close a gap created by the fact that “the full degree and scale of personal data collection, use and – in particular – abuse has not been scrutinized closely enough”, despite the fact that “media and special interest groups are aware of these developments for a while now” (p. 7).

What I found valuable in the approach of the study is that it also brings forward a topic that is rarely discussed when analysing Big Data, digital tracking and so on: the attempt of such practices to change behaviour at scale. “Data richness is increasingly used to correct us or incentivize us to correct ourselves. It is used to “nudge” us to act differently. As a result of this continued nudging, influencing and incentivation, our autonomy suffers” (p. 7).

A chapter authored by Professor Sarah Spiekermann explores the ethical implications of the networks of control. She applies three ethical normative theories to personal data markets: “The Utilitarian calculus, which is the original philosophy underlying modern economics (Mill 1863/1987). The Kantian duty perspective, which has been a cornerstone for what we historically call “The Enlightenment” (Kant 1784/2009), and finally Virtue Ethics, an approach to life that originates in Aristotle’s thinking about human flourishing and has seen considerable revival over the past 30 years (MacIntyre 1984)” (p. 131).

Methodologically, the report is based on “a systematic literature review and analysis of hundreds of documents and builds on previous research by scholars in various disciplines such as computer science, information technology, data security, economics, marketing, law, media studies, sociology and surveillance studies” (p. 10).

2. Global Commission on Internet Governance “Corporate Accountability for a Free and Open Internet”, by Rebecca MacKinnon, Nathalie Maréchal and Priya Kumar  [OPEN ACCESS]

The report shows that “as of July 2016, more than 3.4 billion people were estimated to have joined the global population of Internet users, a population with fastest one-year growth in India (a stunning 30 percent) followed by strong double digit growth in an assortment of countries across Africa (Internet Live Stats 2016a; 2016b)” (p. 1).

“Yet the world’s newest users have less freedom to speak their minds, gain access to information or organize around civil, political and religious interests than those who first logged on to the Internet five years ago” (p. 1).

Within this framework, the report explores the fact that “ICT sector companies have played a prominent role in Internet governance organizations, mechanisms and processes over the past two decades. Companies in other sectors also play an expanding role in global governance. Multinational companies wield more power than many governments over not only digital information flows but also the global flow of goods, services and labour: one-third of world trade is between corporations, and another third is intra-firm, between subsidiaries of the same multinational enterprise” (p. 5).

The authors also look at the tensions between governments and global companies with regard to requests for access to data, to weaken encryption and facilitate censorship in ways that contravene international human rights standards.

3. “Innovations in Federal Statistics: Combining Data Sources While Protecting Privacy”, by National Academy of Sciences [OPEN ACCESS]. 

The tension between privacy, on the one hand, and statistical data and censuses, on the other, compelled the German Constitutional Court to create “the right to informational self-determination” in the 1980s. Could statistics bring a significant reform of this sort to the US? Never say never.

According to epic.org, the US National Academy of Sciences recently published a report that examines how disparate federal data sources can be used for policy research while protecting privacy.

The study shows that in the decentralised US statistical system there are 13 agencies whose mission is primarily the creation and dissemination of statistics, and more than 100 agencies that engage in statistical activities. There is a need for stronger coordination and collaboration to enable access to and evaluation of administrative and private-sector data sources for federal statistics. For this purpose, the report advises that “a new entity or an existing entity should be designated to facilitate secure access to data for statistical purposes to enhance the quality of federal statistics. Privacy protections would have to be fundamental to the mission of this entity“. Moreover, “the data for which it has responsibility would need to have legal protections for confidentiality and be protected using the strongest privacy protocols offered to personally identifiable information while permitting statistical use”.

One of the conclusions of the report is that “Federal statistical agencies should adopt modern database, cryptography, privacy-preserving and privacy-enhancing technologies”. 

4. Private Power, Online Information Flows and EU Law. Mind The Gap, by Angela Daly, Hart Publishing [50 pounds]

“This monograph examines how European Union law and regulation address concentrations of private economic power which impede free information flows on the Internet to the detriment of Internet users’ autonomy. In particular, competition law, sector specific regulation (if it exists), data protection and human rights law are considered and assessed to the extent they can tackle such concentrations of power for the benefit of users.

Using a series of illustrative case studies, of Internet provision, search, mobile devices and app stores, and the cloud, the work demonstrates the gaps that currently exist in EU law and regulation. It is argued that these gaps exist due, in part, to current overarching trends guiding the regulation of economic power, namely neoliberalism, by which only the situation of market failure can invite ex ante rules, buoyed by the lobbying of regulators and legislators by those in possession of such economic power to achieve outcomes which favour their businesses.

Given this systemic, and extra-legal, nature of the reasons as to why the gaps exist, solutions from outside the system are proposed at the end of each case study. This study will appeal to EU competition lawyers and media lawyers.”

Enjoy the read! (Unless the reform of the EU e-Privacy rules is taking much of your time these days – in this case, bookmark the reports of interest and save them for later).

Some end-of-the-year good news: People genuinely care about their privacy

Dear followers,

First, I would like to thank you for making this the most successful year in the five-year life of pdpEcho (I would especially like to thank those who supported the blog and thus helped me cover the cost of renting the blog’s .com name). I started this blog in my first year as a PhD student to gather all the information I found interesting related to privacy and data protection. At that time I was trying to convince my classic “civilist” supervisor that data protection is also a matter of civil law. And that I could write a civil law thesis on this subject in Romanian, even though the Romanian literature on it counted only one book title, from 2004. In the five years that followed another book title was added to it, and the blog and I grew together (be it at different paces).

In recent months it offered me a way to keep myself connected to the field while transitioning from Brussels to the US. But most importantly, it constantly reminded me that privacy is really not dead, as has been claimed numerous times. I cared about it, the people who found this blog every day cared about it, and as long as we care about privacy, it will never die.

I am writing this end-of-the-year post with some very good news from Europe: you and I are not the only ones who care about privacy. A vast majority of Europeans do, too. The European Commission published a few days ago a Eurobarometer on ePrivacy, as a step towards the launch of the ePrivacy Directive reform later in January.

The results could not have been clearer:

“More than nine in ten respondents said it is important that personal information (such as their pictures, contact lists, etc.) on their computer, smartphone or tablet can only be accessed with their permission, and that it is important that the confidentiality of their e-mails and online instant messaging is guaranteed (both 92%)” (source, p. 2).

“More than seven in ten think both of these aspects are very important. More than eight in ten (82%) also say it is important that tools for monitoring their activities online (such as cookies) can only be used with their permission (82%), with 56% of the opinion this is very important” (source, p. 2).

Overwhelming support for encryption

Remarkably, 90% of those asked agreed “they should be able to encrypt their messages and calls, so they can only be read by the recipient”. Almost as many (89%) agree the default settings of their browser should stop their information from being shared (source, p. 3).

Respondents thought it is unacceptable to have their online activities monitored in exchange for unrestricted access to a certain website (64%), or to pay in order not to be monitored when using a website (74%). Almost as many (71%) say it is unacceptable for companies to share information about them without their permission, even if it helps companies provide new services they may like (source, p. 4).

You can find here the detailed report.

Therefore, there is serious cause to believe that our work and energy are well spent in this field.

The new year brings me several publishing projects that I am very much looking forward to, as well as two work projects on this side of the Atlantic. Nevertheless, I hope I will be able to keep up the work on pdpEcho, for which I hope to receive more feedback and even input from you.

In this note, I wish you all a Happy New Year, where all our fundamental rights will be valued and protected!

Gabriela


Data retention, only possible under strict necessity: targeted retention and pre-authorised access to retained data

The Court of Justice of the European Union (‘the Court’ or ‘CJEU’) gave a second judgment this week on the compatibility of data retention measures with the fundamental rights of persons as guaranteed by the Charter of Fundamental Rights of the EU (in Joined Cases C-203/15 and C-698/15 Tele2Sverige). The Court confirmed all its findings from the earlier Digital Rights Ireland judgment and took the opportunity to clarify and nuance some of its initial key findings (for an analysis of the DRI judgment, see my article published in 2015).

The two cases that were joined by the Court emerged in the fallout of the invalidation of the Data Retention Directive by the CJEU in the DRI judgment. Even though that Directive was declared invalid for breaching fundamental rights, most of the national laws that transposed it in the Member States were kept in force by invoking Article 15(1) of the ePrivacy Directive. This Article provided for an exception to the rule of ensuring confidentiality of communications, which allowed Member States to “inter alia, adopt legislative measures providing for the retention of data for a limited period justified on the grounds laid down in this paragraph”. What the Member States seem to have disregarded when they decided to keep national data retention laws in force is that the same paragraph, in its last sentence, provided that “all the measures referred to in this paragraph (including data retention – my note) shall be in accordance with the general principles of Community law” (see §91 and §92 of the judgment). Respect for fundamental rights is one of those principles.

The Tele2Sverige case was initiated by a telecommunications service provider that followed the Court’s decision in DRI and stopped retaining data, because it considered that the national law requiring it to retain data was in breach of EU law. The Swedish authorities did not agree with this interpretation, and this is how the Court was given the opportunity to clarify the relationship between national data retention law and EU law after the invalidation of the Data Retention Directive. The Watson case originates in the UK, was initiated by individuals and refers to the Data Retention and Investigatory Powers Act 2014 (DRIPA).

In summary, the Court found that “national legislation which, for the purpose of fighting crime, provides for general and indiscriminate retention of all traffic and location data of all subscribers and registered users relating to all means of electronic communication” is in breach of Article 7 (right to private life), Article 8 (right to the protection of personal data) and Article 11 (right to freedom of expression) of the Charter of Fundamental Rights of the EU. The Court clarified that such legislation is precluded by Article 15(1) of the ePrivacy Directive. (See §1 of the operative part of the judgment)

Moreover, the Court found that national legislation in the field of the ePrivacy Directive that regulates the access of competent national authorities to retained data is incompatible with the three fundamental rights mentioned above, as long as:

  1. the objective pursued by that access, in the context of fighting crime, is not restricted solely to fighting serious crime;
  2. access is not subject to prior review by a court or an independent administrative authority;
  3. there is no requirement that the data concerned should be retained within the European Union (§2 of the operative part of the judgment).

There are a couple of remarkable findings of the Court in the Tele2Sverige/Watson judgment, analysed below. Brace yourselves for a long post. But it’s worth it. I’ll be looking at (1) how indiscriminate retention of metadata interferes with freedom of speech, (2) why data retention is merely an exception of the principle of confidentiality of communications and must not become the rule, (3) why the Court considers retaining on a generalised basis metadata is a far-reaching intrusion in the right to private life, (4) what is “targeted retention” and under what conditions the Court sees it acceptable and, finally (5) what is the impact of all of this on the Privacy Shield and PNR schemes.

 

(1) Indiscriminate retention of metadata interferes with freedom of speech

Even though none of the preliminary ruling questions asked the Court to look at the compliance of national data retention measures also in the light of Article 11 of the Charter (freedom of expression), the Court did so of its own motion.

This was needed so that the Court could finish what it had begun in DRI. In that previous case, the Court referred to Article 11 of the Charter in §28, replying to a specific preliminary ruling question, by mentioning that:

“it is not inconceivable that the retention of the data in question might have an effect on the use, by subscribers or registered users, of the means of communication covered by that directive and, consequently, on their exercise of the freedom of expression guaranteed by Article 11 of the Charter”.

However, it never analysed if that was the case. In §70, the Court just stated that, after finding the Directive to be invalid because it was not compliant with Articles 7 and 8 of the Charter, “there is no need to examine the validity of Directive 2006/24 in the light of Article 11 of the Charter”.

This time, the Court developed its argument. It started by underlining that data retention legislation such as that at issue in the main proceedings “raises questions relating to compatibility not only with Articles 7 and 8 of the Charter, which are expressly referred to in the questions referred for a preliminary ruling, but also with the freedom of expression guaranteed in Article 11 of the Charter” (§92).

The Court continued by emphasising that the importance of freedom of expression must be taken into consideration when interpreting Article 15(1) of the ePrivacy Directive “in the light of the particular importance accorded to that freedom in any democratic society” (§93). “That fundamental right (freedom of expression), guaranteed in Article 11 of the Charter, constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which, under Article 2 TEU, the Union is founded” (§93), it continues.

The Court justifies the link between data retention and freedom of expression by stating, slightly more confidently than in DRI, that:

“the retention of traffic and location data could nonetheless have an effect on the use of means of electronic communication and, consequently, on the exercise by the users thereof of their freedom of expression, guaranteed in Article 11 of the Charter” (§101)

The operative part of the judgment clearly states that Articles 7, 8 and 11 of the Charter preclude data retention legislation such as that in the main proceedings.

(2) The exception to the “principle of confidentiality” must not become the rule

The Court refers several times to a “principle of confidentiality of communications” (§85, §90, §95, §115). It explains in §85 that this principle is established by the ePrivacy Directive and “implies, inter alia, (…) that, as a general rule, any person other than the users is prohibited from storing, without the consent of the users concerned, the traffic data related to electronic communications. The only exceptions relate to persons lawfully authorised in accordance with Article 15(1) of that directive and to the technical storage necessary for conveyance of a communication.”

With regard to the first exception, the Court recalls that, because Article 15(1) is construed so as “to restrict the scope of the obligation of principle to ensure confidentiality of communications and related traffic data”, it “must, in accordance with the Court’s settled case-law, be interpreted strictly” (§89). The Court adds, using strong language:

“That provision cannot, therefore, permit the exception to that obligation of principle and, in particular, to the prohibition on storage of data, laid down in Article 5 of Directive 2002/58, to become the rule, if the latter provision is not to be rendered largely meaningless” (§89).

In any case, the Court adds, all exceptions adopted pursuant to Article 15(1) of the ePrivacy Directive must be in accordance with the general principles of EU law, which include the fundamental rights guaranteed by the Charter (§91) and must strictly have one of the objectives enumerated in Article 15(1) of the ePrivacy Directive (§90).

As for the second derogation to the principle, the Court looks at recitals 22 and 26 of the ePrivacy Directive and affirms that the retention of traffic data is permitted “only to the extent necessary and for the time necessary for the billing and marketing of services and the provision of value added services. (…) As regards, in particular, the billing of services, that processing is permitted only up to the end of the period during which the bill may be lawfully challenged or legal proceedings brought to obtain payment. Once that period has elapsed, the data processed and stored must be erased or made anonymous” (§85).

(3) A “very far-reaching” and “particularly serious” interference

The Court observed that the national data retention legislation at issue in the main proceedings “provides for a general and indiscriminate retention of all traffic and location data of all subscribers and registered users relating to all means of electronic communication, and that it imposes on providers of electronic communications services an obligation to retain that data systematically and continuously, with no exceptions” (§97).

The data retained is metadata and is described in detail in §98. The Court confirmed its assessment in DRI that metadata “taken as a whole, is liable to allow very precise conclusions to be drawn concerning the private lives of the persons whose data has been retained, such as everyday habits, permanent or temporary places of residence, daily or other movements, the activities carried out, the social relationships of those persons and the social environments frequented by them” (§99). It also added that this data “provides the means (…) of establishing a profile of the individuals concerned, information that is no less sensitive, having regard to the right to privacy, than the actual content of communications” (§99).

The Court went further to emphasise that this kind of indiscriminate gathering of data represents a “very far-reaching” and “particularly serious” interference with the fundamental rights to private life and protection of personal data (§100). Moreover, “[t]he fact that the data is retained without the subscriber or registered user being informed is likely to cause the persons concerned to feel that their private lives are the subject of constant surveillance” (§100).

The Court indicates that such a far-reaching interference can only be justified by the objective of fighting serious crime (§102). And even in this case, the objective of fighting serious crime does not justify in itself “general and indiscriminate retention of all traffic and location data” (§103). The measures must, in addition, be strictly necessary to achieve this objective (§106).

The Court found that national legislation such as that at issue in the main proceedings does not comply with this requirement, because (§105):

  • it “covers, in a generalised manner, all subscribers and registered users and all means of electronic communication as well as all traffic data, provides for no differentiation, limitation or exception according to the objective pursued”.
  • “It is comprehensive in that it affects all persons using electronic communication services, even though those persons are not, even indirectly, in a situation that is liable to give rise to criminal proceedings”.
  • It “applies even to persons for whom there is no evidence capable of suggesting that their conduct might have a link, even an indirect or remote one, with serious criminal offences”.
  • “it does not provide for any exception, and consequently it applies even to persons whose communications are subject, according to rules of national law, to the obligation of professional secrecy”.

(4) Targeted data retention is permissible. Here is a list with all conditions:

The Court spells out that fundamental rights do not prevent a Member State from adopting “legislation permitting, as a preventive measure, the targeted retention of traffic and location data, for the purpose of fighting serious crime, provided that the retention of data is limited, with respect to:

  • the categories of data to be retained,
  • the means of communication affected,
  • the persons concerned and
  • the retention period adopted, to what is strictly necessary” (§108).

In addition, such legislation must:

  • “lay down clear and precise rules governing the scope and application of such a data retention measure and imposing minimum safeguards, so that the persons whose data has been retained have sufficient guarantees of the effective protection of their personal data against the risk of misuse.
  • indicate in what circumstances and under which conditions a data retention measure may, as a preventive measure, be adopted, thereby ensuring that such a measure is limited to what is strictly necessary” (§109).

Other conditions that need to be fulfilled for a data retention legislation to be considered compatible with fundamental rights are indicated directly or indirectly by the Court in further paragraphs.

Such legislation must:

  • be restricted to “retention in relation to data pertaining to a particular time period and/or geographical area and/or a group of persons likely to be involved, in one way or another, in a serious crime, or
  • persons who could, for other reasons, contribute, through their data being retained, to fighting crime” (§106).
  • “meet objective criteria, that establish a connection between the data to be retained and the objective pursued. In particular, such conditions must be shown to be such as actually to circumscribe, in practice, the extent of that measure and, thus, the public affected” (§110).
  • “be based on objective evidence which makes it possible to identify a public whose data is likely to reveal a link, at least an indirect one, with serious criminal offences, and to contribute in one way or another to fighting serious crime or to preventing a serious risk to public security” (§111).
  • “lay down clear and precise rules indicating in what circumstances and under which conditions the providers of electronic communications services must grant the competent national authorities access to the data. (…) a measure of that kind must be legally binding under domestic law” (§117).
  • “lay down the substantive and procedural conditions governing the access of the competent national authorities to the retained data” (§118).
  • provide that data must be “retained within the European Union” (§122).
  • provide for “the irreversible destruction of the data at the end of the data retention period” (§122).
  • must “ensure review, by an independent authority, of compliance with the level of protection guaranteed by EU law with respect to the protection of individuals in relation to the processing of personal data, that control being expressly required by Article 8(3) of the Charter” (§123).

Other specific conditions emerge with regard to the access of competent authorities to the retained data. Access:

  • “can be granted, in relation to the objective of fighting crime, only to the data of individuals suspected of planning, committing or having committed a serious crime or of being implicated in one way or another in such a crime” (§119). [The Court refers here to the ECtHR cases of Zakharov and Szabó, after a long series of privacy-related cases in which it did not refer to the ECtHR case-law at all].
  • must be subject to “a prior review carried out either by a court or by an independent administrative body” (…) “the decision of that court or body should be made following a reasoned request by those authorities submitted, inter alia, within the framework of procedures for the prevention, detection or prosecution of crime” (§120). The only exception to the prior review is “cases of validly established urgency” (§120).
  • must be notified by authorities to the persons affected “under the applicable national procedures, as soon as that notification is no longer liable to jeopardise the investigations being undertaken by those authorities. That notification is, in fact, necessary to enable the persons affected to exercise, inter alia, their right to a legal remedy” (§121).
  • must be restricted solely to fighting serious crime (§125).

(5) Possible effects on the Privacy Shield and on PNR schemes

This judgment could have indirect effects on the “Privacy Shield” and slightly more immediate effects on Passenger Name Record (PNR) schemes.

The indirect effect on the Privacy Shield and on all other adequacy schemes could manifest only in the context of a challenge to such transfer instruments before the CJEU. The thoroughness with which the Court of Justice detailed all the conditions that a legislative measure providing for a particular processing of personal data must meet in order to comply with the fundamental rights to private life and to the protection of personal data strengthens the requirement of “essential equivalence”.

In other words, it will be difficult to convince the Court that a third country which allows large-scale collection of metadata (and, all the more so, of the content of communications), and which grants access to that data without the supervision of an independent authority, provides an adequate level of protection that would lawfully allow transfers of data from the EU to that third country. (For comparison, the CJEU referred to the Digital Rights Ireland case eight times, and in key findings, in its judgment in Schrems.)

As for PNR schemes, the effects may come sooner and more directly, as we are awaiting the Court’s Opinion in Avis 1/15 on the compatibility of the EU-Canada PNR agreement with fundamental rights. It is to be expected that the Court will refer copiously back to its new list of conditions for access by authorities to retained personal data when examining how all PNR data is transferred directly by companies to law enforcement authorities in a third country, with no limitations.

***

Find what you’re reading useful? Please consider supporting pdpecho.