Summary of the Opinion of AG Kokott in Puškár (on effective judicial remedies and lawful grounds for processing other than consent)

The Opinion of Advocate General Kokott in C-73/16 Puškár was published on 30 March and flew under the radar, even though it deals with several very important questions for EU data protection law that may have wide implications: effective judicial remedies, lawful grounds for processing other than consent, and the right of access to one’s own personal data. As a bonus, the AG refers to and analyses Article 79 GDPR – the right to a judicial remedy.

The analysis regarding effective judicial remedies under Article 47 Charter and Directive 95/46 could be relevant for the debate on essential equivalence when it comes to adequacy decisions for international data transfers (for those of you who don’t remember, one of the two main findings in Schrems was that the Safe Harbor framework touched the essence of the right to effective judicial remedies, thus breaching Article 47 Charter). In this sense, the AG finds that a measure that does not restrict the category of people who could in principle have recourse to judicial review does not touch the essence of this right. A contrario, if a measure does restrict these categories of people, it would touch the essence of the right to an effective judicial remedy and would therefore breach the Charter.

Finally, a question of great importance for EU law in general is also tackled: what should national courts do when the case-law of the CJEU and the case-law of the ECtHR diverge regarding the protection of fundamental rights?

Here is what you will further read:

  1. Facts of the case and questions referred to the CJEU
  2. Requiring claimants to exhaust administrative remedies before going to Court can be compatible with the right to effective judicial remedy
  3. Internal documents of a tax authority obtained without the consent of the authority must be admitted as evidence if they contain personal data of the person who obtained the documents
  4. The performance of a task in the public interest allows a tax authority to create a blacklist without the consent of the persons concerned, if this task was legally assigned to the tax authority and the list’s use is appropriate and necessary (Articles 7 and 8 of the Charter are not breached in this case)
  5. A missed opportunity to better define the difference between the right to privacy and the right to personal data protection
  6. Where ECtHR and CJEU case-law diverge, national courts have to ask the CJEU on how to proceed when the ECtHR case-law provides a higher level of protection for the rights of a person
  7. What to expect from the Court

Note that all highlights from the post are made by the author.

  1. Facts of the case and questions referred to the CJEU

C-73/16 Puškár concerns the request of Mr Puškár to have his name removed from a blacklist kept by the Finance Directorate of Slovakia which contains names and national ID numbers of persons “who purport to act, as ‘fronts’, as company directors”. The list associates a legal person or persons with a natural person who supposedly acted on their behalf (§15) and is created for the purposes of tax administration and combating tax fraud (§23, 2nd question for a preliminary ruling). It transpires from several paragraphs of the Opinion that Mr Puškár found out about the list and the fact that he is on it from a leak (§23, 2nd question; §72; §76). Instead of relying on the more straightforward right to erasure or right to object under data protection law, Mr Puškár claimed that “his inclusion in the above mentioned list infringes his personal rights, specifically the right to the protection of his good name, dignity and good reputation” (§16).

The Supreme Court rejected his claims, partly on procedural grounds, partly on substantive grounds (§18). Later, the Constitutional Court found that “the Supreme Court infringed the fundamental right to the protection of personal data against unauthorised collection and other abuses, in addition to the right to privacy”, quashed its decision and sent the case back to the Supreme Court for retrial, grounding its findings on ECtHR case-law (§20). In the context of this second round of proceedings, the Supreme Court sent questions for a preliminary ruling to the CJEU to essentially clarify:

  • whether the right to an effective remedy under Article 47 of the Charter in the context of data protection is compatible with a national law requirement that a claimant must first exhaust the procedures available under administrative law (administrative complaints) before going to Court;
  • whether the legitimate grounds for processing under Directive 95/46 and Articles 7 and 8 of the Charter preclude tax authorities from creating such a blacklist without the consent of the individuals on the list;
  • whether the list obtained by the claimant without the consent of the tax authorities is admissible as evidence;
  • whether national courts should give precedence to the case-law of the CJEU or the case-law of the ECtHR on a specific topic where the two diverge.
  2. Requiring claimants to exhaust administrative remedies before going to Court can be compatible with the right to effective judicial remedy

To reply to the first question, AG Kokott looks at Articles 28(4) and 22 of Directive 95/46 and also at Article 79 of the General Data Protection Regulation, which will replace Directive 95/46 as of 25 May 2018.

Article 28(4) of Directive 95/46 states that each supervisory authority (Data Protection Authority) is to hear claims lodged by any person concerning the protection of his rights and freedoms with regard to the processing of personal data. Article 22 provides that, without prejudice to the remedy referred to in Article 28(4), every person is to have a right to a judicial remedy for any breach of the rights guaranteed him by the national law applicable to the processing in question (§37, §38).

In practice, this means that an individual who brings Court proceedings for a breach of data protection law must also be able to initiate administrative proceedings before a DPA (i.e. lodge a complaint with the DPA).

The same rule is kept under Article 79 GDPR, slightly broadened: the right to a judicial remedy must be effective and must be granted without prejudice to any administrative or non-judicial remedy, including the right to lodge a complaint with a supervisory authority. 

AG Kokott explains that these rules still do not clarify “whether the bringing of legal proceedings may be made contingent upon exhaustion of another remedy. All that can be taken from Article 79 of the General Data Protection Regulation is that the judicial remedy must be effective. An obligation to exhaust some other remedy before bringing legal proceedings will consequently be impermissible if the judicial remedy is rendered ineffective as a result of this precondition” (§43).

The AG found that Article 47(1) of the Charter and the principle of effectiveness “ultimately embody the same legal principle” and that they can be examined jointly using the rules in Articles 47(1) and 52(1) of the Charter – which is the provision that enshrines the rules for limiting the exercise of the fundamental rights in the Charter (§51). Hence, the question is whether the obligation to exhaust administrative procedures before going to Court amounts to a justified interference with the right to an effective judicial remedy.

AG Kokott remarks that the interference is provided for by Slovakian law and that it does not touch the essence of the right to effective judicial remedy because “it does not restrict the category of people who could in principle have recourse to judicial review” (§56). [Small comment here: this means that a provision which would restrict the category of people who could in principle have recourse to judicial review touches the essence of the right in Article 47 Charter. Check out paragraphs 45 and 46 of the EDPS Opinion on the EU-US Umbrella Agreement commenting on the fact that Article 19 of the Agreement provides for the possibility of judicial redress only for citizens of the EU, excluding thus categories of individuals that would otherwise be covered by the Charter, such as asylum seekers and residents].

It remained to be analysed whether the interference complies with the principle of proportionality, which “requires that a measure be ‘appropriate, necessary and proportionate to the objective it pursues’” (§58). The AG accepts the submission of the Supreme Court that “the exhaustion of the administrative remedy represents a gain in efficiency, as it provides the administrative authority with an opportunity to remedy the alleged unlawful intervention, and saves it from unwanted court proceedings” (§59). The AG considers that “obligatory preliminary proceedings are undoubtedly appropriate for achieving the objectives” and that a “less onerous method” does not suggest itself as capable of realising them to the same extent (§62).

However, the AG points out that the “specific form” of the administrative remedy is important for determining the appropriateness of the measure in practice. This applies in particular if there is uncertainty “as to whether the time limit for bringing an action begins to run before a decision has been made in the administrative action” (§64). Additionally, Article 47(2) of the Charter establishes the right of every person to have their case dealt with within a reasonable period of time. “While this right in fact relates to judicial proceedings, naturally it may not be undermined by a condition for the bringing of an action” (§67).

In conclusion, the AG considers that the right to effective judicial review under Article 47 Charter and the principle of effectiveness “do not preclude an obligation to exhaust an administrative remedy being a condition on bringing legal proceedings if the rules governing that remedy do not disproportionately impair the effectiveness of judicial protection. Consequently, the obligatory administrative remedy must not cause unreasonable delay or excessive costs for the overall legal remedy” (§71).

  3. Internal documents of a tax authority obtained without the consent of the authority must be admitted as evidence if they contain personal data of the person who obtained the documents

Essentially, the question asked by the Supreme Court is whether the contested list may be excluded as evidence due to the fact that it came into the possession of the claimant without the consent of the competent authorities (§72).

The AG considers that “a review should be carried out to determine whether the person affected has a right of access to the information in question. If this were the case, the interest in preventing unauthorized use would no longer merit protection” (§83).

Further, it is recalled that “under the second sentence of Article 8(2) of the Charter and Article 12 of the Data Protection Directive, everyone has the right of access to data which has been collected concerning him or her. This also applies in principle to data being recorded in the contested list. Furthermore, the persons so affected would, by virtue of the collection of the data, have to be informed of the use of the data, under either Article 10 or Article 11 of the Data Protection Directive” (§85).

While indeed Article 13 of the Directive allows this right to information to be restricted, it also “expressly requires that such restrictions be imposed by legislative measures” (§86). The AG acknowledged that “there is a potential risk that inspection and monitoring activities based on the list would be less effective if it were known who was named on that list” (§87). However, the national Court must examine:

  • “whether a restriction of the right of information of this kind is provided for” (§88) and
  • “where appropriate” if it is “justified” (§88). This is an indication that even if such an exemption were provided for by law, a further analysis is needed to see whether the exemption is justified.

A key point the AG makes is that “even if there are indications of a legitimate interest in a hypothetical, legally justified non-disclosure of the list in question, the national courts must also examine whether in the individual case these outweigh the legitimate interests of the individual in bringing the proceedings” (§89). This is important because it is a clear indication that when a controller relies on their legitimate interest as a ground for processing, it always has to engage in a balancing exercise with the legitimate interests (and rights) of the data subject.

In conclusion, the AG established that the principle of a fair hearing in Article 47 of the Charter precludes refusing to accept as evidence a document obtained by the claimant without the consent of an authority, when that document contains personal data of the claimant which the authority is required to disclose to the claimant under Articles 12 and 13 of the Data Protection Directive.

  4. The performance of a task in the public interest allows a tax authority to create a blacklist without the consent of the persons concerned, if this task was legally assigned to the tax authority and the list’s use is appropriate and necessary (Articles 7 and 8 of the Charter are not breached in this case)

The Supreme Court wanted to know whether the fundamental right to privacy (Article 7 Charter) and protection of personal data (Article 8 Charter) and the Data Protection Directive prohibit a Member State from creating a list of personal data for the purposes of tax collection without the consent of the persons concerned.

The AG points out that “this question is primarily to be answered in the light of the Data Protection Directive, as this specifies the rights to privacy and data protection” (§95).

The AG further recalls that Article 7 of the Data Protection Directive allows processing of personal data if it is based on one of the six lawful grounds for processing provided for (§99) [NB: of which only one is “consent”!]. While the AG acknowledges that three of the six conditions are applicable in this case (1 – performance of a task in the public interest [Article 7(e)]; 2 – legitimate interest of the controller [Article 7(f)]; and 3 – necessity of compliance with a legal obligation [Article 7(c)]), she considers the examination of the latter two “superfluous”: “This is because all parties acknowledge that tax collection and combating tax fraud are tasks in the public interest within the meaning of Article 7(e) of the Data Protection Directive” (§100).

A much-welcomed clarification is further brought by the AG, who specifies that Article 7(e) of the Data Protection Directive “must be read in conjunction with the principles of Article 6. According to Article 6(1)(b), personal data must only be collected for specified, explicit and legitimate purposes. Within the scope of Article 7(e), the purpose of the data processing is inseparably linked to the delegated tasks. Consequently, the transfer of the task must clearly include the purpose of the processing” (§106).

This clarification is welcome because it reminds controllers that even if they correctly process personal data on one of the lawful grounds for processing (such as consent or legitimate interest) in compliance with Article 7 of the Directive, they still have to comply with all the other safeguards for processing personal data, including the principles for processing in Article 6 of the Directive (purpose limitation, data minimization, etc.).

The AG remarks that the reference for a preliminary ruling does not specify the purpose of the contested list and leaves it to the Supreme Court to look further into this question (§107). Additionally, the AG considers that the Supreme Court “will have to examine whether the creation and use of the contested list and in particular the naming of Mr Puškár is necessary for the claimed public interest”. This is yet another reminder of how important “necessity” is for personal data protection in the EU legal framework (check out the EDPS’s recently published “Necessity Toolkit”).

Another very interesting point that the AG brings forward is that naming a person on this blacklist constitutes “a considerable interference with the rights of the person concerned” that goes beyond the right to privacy in Article 7 Charter – it also touches (§110):

  • “his reputation and could lead to serious, practical disadvantages in his dealings with the tax authorities;
  • the presumption of innocence in Article 48(1) of the Charter;
  • the legal persons associated with the person concerned, which will be affected in terms of their freedom to conduct business under Article 16 of the Charter”.

This finding is a testament to the importance of complying with the right to the protection of personal data, as non-compliance can have consequences for several other fundamental rights.

As the AG explains, “such a serious interference of this kind can only be proportionate if there are sufficient grounds for the suspicion that the person concerned purported to act as a company director of the legal persons associated with him and in so doing undermined the public interest in the collection of taxes and combating tax fraud” (§111).

In conclusion, the tax authorities can create a blacklist such as the one in the main proceedings on the grounds of Article 7(e) of the Data Protection Directive, but this assumes that (§117):

  • “the task was legally assigned to the tax authorities,
  • the use of the list is appropriate and necessary for the purposes of the tax authorities and
  • there are sufficient grounds to suspect that these persons should be on the list”.
  5. A missed opportunity to better define the difference between the right to privacy and the right to personal data protection

Further, the AG spelled out that “neither the fundamental rights to privacy, Article 7 of the Charter, or data protection, Article 8, would in this case prevent the creation and use of the list” (§117).

The analysis to reach this conclusion was another missed opportunity to persuade the Court of Justice to better delineate the two fundamental rights protected by Article 7 and Article 8 of the Charter. The AG referred to these as “the fundamental rights to privacy and data protection”.

Without a clear analysis of what constitutes interference with the two rights, the AG referred to “naming of a person on the contested list” as “affecting” both fundamental rights (§115). In the same paragraph, she further analysed en masse “these interferences”, writing that they are only justified “if they have a sufficient legal basis, respect the essence of both fundamental rights, and preserve the principle of proportionality” (§ 115). Considering that the legality and proportionality of the measure were addressed in previous sections, the AG merely stated that, despite “the adverse effects associated with inclusion on the contested list, those interferences do not meet the threshold of a breach of the essence of those rights”, before concluding that neither of the two Charter articles would prevent the creation of such a blacklist.

  6. Where ECtHR and CJEU case-law diverge, national courts have to ask the CJEU on how to proceed when the ECtHR case-law provides a higher level of protection for the rights of a person

The last question is one that is extremely interesting for EU lawyers in general, not only for EU data protection lawyers, because it tackles the issue of different levels of protection of the same fundamental right emerging from the case-law of the Court of Justice of the EU in Luxembourg, on the one hand, and the European Court of Human Rights in Strasbourg, on the other hand.

As the AG summarizes it, “the fourth question is aimed at clarifying whether a national court may follow the case-law of the Court of Justice of the European Union where this conflicts with the case-law of the ECtHR” (§118). This issue is relevant in our field because Article 8 of the European Convention on Human Rights partially shares the same material scope as Articles 7 and 8 of the EU Charter of Fundamental Rights (Article 8 of the Convention is more complex), and Article 52(3) of the Charter states that “the rights in the Charter, which correspond to rights guaranteed by the European Convention on the Protection of Human Rights and Fundamental Freedoms (ECHR), have the same meaning and scope as conferred by the ECHR” (§122). However, the second sentence of Article 52(3) of the Charter permits EU law to accord more extensive protection (§122).

The AG specifies that “EU law permits the Court of Justice to deviate from the case-law of the ECtHR only to the extent that the former ascribes more extensive protection to specific fundamental rights than the latter. This deviation in turn is only permitted provided that it does not also cause another fundamental right in the Charter corresponding to a right in the ECHR to be accorded less protection than in the case-law of the ECtHR. One thinks, for example, of cases in which a trade-off must be made between specific fundamental rights” (§123).

Not surprisingly, the AG advises that when the case-law of the two Courts comes into conflict, the national courts should directly apply the case-law of the CJEU when it affords more protection to the fundamental rights in question, but they should send a reference for a preliminary ruling to the CJEU to ask which way to go when the case-law of the ECtHR affords enhanced protection to the fundamental right in question (§124 and §125). The argument of the AG is that the latter case “inevitably leads to a question of the interpretation of EU law with regard to the fundamental right in question and Article 52(3) of the Charter” which, if performed by the national Court, could further “amount to the view that the interpretation of the fundamental right in question by the Court of Justice is not compatible with Article 52(3)”.

As for the relevance of this question to the case at hand – it remains a mystery. The AG herself pointed out that “the admissibility of the question in this form is dubious, particularly as the Supreme Court does not state on which issue the two European courts supposedly are in conflict and the extent to which such a conflict is significant for the decision in the main proceedings” (§119).

  7. What to expect from the Court

How will the CJEU reply to these questions? My bet is that, in general, the Court will follow the AG on substance. However, it is possible that the Court will simplify the analysis and reformulate the questions in such a way that the answers will be structured around three main issues:

  • lawfulness of creating such a blacklist (and the lawful grounds for processing in the Data Protection Directive) and compatibility of this interference with both Article 7 and Article 8 of the Charter (I do hope, having low expectations nonetheless, that we will get more clarity on what constitutes interference with each of the two rights from the Court’s perspective);
  • compatibility of the procedural law of Slovakia in the field of data protection with Article 47 of the Charter (in fact, this may be the only point where the Court could reach a different result than the one proposed by the AG, in the sense that the condition to exhaust administrative remedies before engaging in litigation may be considered a disproportionate interference with the right to effective judicial remedy; it is also possible that the Court will refer for the first time directly to the GDPR);
  • the relationship between ECtHR and CJEU case-law on the same fundamental right.

Suggested citation: G. Zanfir-Fortuna, “Summary of the Opinion of AG Kokott in Puškár (on effective judicial remedies and lawful grounds for processing other than consent)”, pdpEcho.com, 24 April 2017.

***

If you find information on this blog useful and would like to read more of it, consider supporting pdpecho here: paypal.me/pdpecho.

Door-to-door gathering of data by religious group goes to the CJEU

Non-automated processing | Filing system | Household Exemption | Controller | Religious community

The Court of Justice of the EU received questions for a preliminary ruling from Finland regarding the practice of a religious group (Jehovah’s Witnesses) to gather and record data after door-to-door visits, without informing the individuals concerned about this practice. The questions referred in Case C-25/17 Tietosuojavaltuutettu v Jehovah’s Witnesses concern the interpretation of several key points of Directive 95/46:

  1. Exceptions from the application of the Directive – and particularly the second indent of Article 3(2), which excludes processing “by a natural person in the course of a purely personal or household activity” from the material scope of the Directive. The referring court wants the CJEU to clarify whether this exception applies to gathering data and writing observations in a paper file in connection with the door-to-door activity, by members of the religious group (Question 1).
  2. The concept of “filing system” as defined in Article 2(c) of the Directive. The question referred by the national Court is whether, taken as a whole, the manual collection of personal data (name and address and other information and characteristics of a person) carried out in connection with door-to-door evangelical work constitutes a filing system, being thus subject to the application of the Directive (Question 2).
  3. The concept of “controller” under Article 2(d) of the Directive. In particular, the referring court wants the CJEU to clarify whether in this situation the controller is considered to be the religious community as a whole, “even though the religious community claims that only the individual members carrying out evangelical work have access to the data collected” (Questions 3 and 4).

Without knowing the details of the case, and based only on the information available in the questions referred by the national Court, here is my bet on how the CJEU will reply:

  • The definition of “purely household activity” does not extend to the door-to-door evangelical work of a religious community; this exemption is to be interpreted strictly (“must be narrowly construed”; “must apply only in so far as is strictly necessary”), according to the CJEU in C-212/13 Rynes (§28 and §29). The CJEU also explained that this exception applies “only where it is carried out in the purely personal or household setting of the person processing the data” (§31) – which is not the case of representatives of a religious community gathering information during evangelical work.
  • The records the evangelical workers keep should be considered as constituting a “filing system”. This concept is defined as “any structured set of personal data which are accessible according to specific criteria, whether centralized, decentralized or dispersed on a functional or geographical basis”. According to Recital 15 of the Directive, data in a filing system is “structured according to specific criteria relating to individuals, so as to permit easy access to the personal data in question”. If the religious community claimed that its records are not structured according to specific criteria (e.g. ZIP codes; members of the community/non-members; individuals interested in the community/individuals not interested) and that they do not allow easy access to the personal data in question, then the purpose of keeping a detailed record would not be achieved. In other words, having an unstructured file is incongruent with the purpose of the activity. While it is true that the Member States have been given a margin of appreciation to lay down different criteria for determining the constituents of a structured set of personal data and the different criteria governing access to such a set, the criteria must be compatible with the definition in the Directive. Moreover, applying the definition “loosely” would amount to a limitation in relation to the protection of personal data, which must apply “only in so far as is strictly necessary” (Rynes §28, DRI §52).
  • The controller of this processing operation should be considered to be the religious community, as this entity establishes the purposes of the processing activity (the records are probably meant to facilitate the evangelical work of the community – there is no reference in the questions referred to the declared purpose of this activity, but it is only logical that such records are kept for this purpose) and the means of this activity (“by dividing up the areas in which the activity is carried out among members involved in evangelical work, supervising the work of those members and maintaining a list of individuals who do not wish to receive visits from evangelists” – according to the referring Court).

Since this new case provided an opportunity to discuss processing of personal data done by a religious community, there are a couple of additional points to be made.

First of all, according to Recital 35 of the Directive, “processing of personal data by official authorities for achieving aims, laid down in constitutional law or international public law, of officially recognized religious associations is carried out on important grounds of public interest”. This means that religious associations do not need to rely on consent or on their legitimate interest as lawful grounds for processing. However, relying on public interest as the lawful ground for processing does not mean that they don’t have to comply with all the other obligations under data protection law. For instance, they still have to comply with the data quality principles, they still have to inform data subjects about the details of the processing activity and they still have to reply to requests for access, correction and erasure.

Second, some of the data gathered in such circumstances is sensitive data, as it refers to “religious beliefs” (Article 8 of the Directive, Article 9 of the GDPR). This means that the data should be processed with additional care and strengthened safeguards.

In case you are wondering whether the GDPR specifically addresses processing of data by churches and religious communities: Recital 35 of the Directive was transplanted to the GDPR as Recital 55. In addition, the GDPR enshrines a specific provision that covers “existing data protection rules of churches and religious associations” – Article 91. This provision allows Member States that have specific legislation (“comprehensive rules”) dedicated to churches and religious communities in place at the time of entry into force of the GDPR to continue to apply those rules, but only if “they are brought into line with this Regulation”. In addition, according to the second paragraph, processing of personal data done by churches and religious associations that apply comprehensive national rules according to the first paragraph “shall be subject to the supervision of an independent supervisory authority, which may be specific”. Again, the condition for this to happen is that this specific supervisory authority must fulfil the conditions laid down for independent supervisory authorities in the GDPR.

***

Note: Thanks to Dr. Mihaela Mazilu-Babel for pointing out this new case.

Find what you’re reading useful? Please consider supporting pdpecho.

 

CNIL publishes GDPR compliance toolkit

CNIL published this week a useful guide for all organisations that want to start getting ready for GDPR compliance but are asking themselves “where to start?”. The French DPA created a dedicated page for the new “toolkit”, detailing each of the six proposed steps towards compliance and referring to available templates (such as a template for the Register of processing operations and a template for data breach notifications – both in FR).

According to the French DPA, “the new ‘accountability’ logic under the GDPR must be translated into a change of organisational culture and should put in motion internal and external competences”.

The six steps proposed are:

  1. Appointing a “pilot”/”orchestra conductor” [n. – metaphors used in the toolkit], famously known as “DPO”, even if the controller is not under the obligation to do so. Having a DPO will make things easier.
  2. Mapping all processing activities (the proposed step goes far beyond data mapping, as it refers to the processing operations themselves, not only to the data being processed; it also covers cataloging the purposes of the processing operations and identifying all sub-contractors relevant to the processing operations);
  3. Prioritising the compliance actions to be taken, using the Register as a starting point and structuring the actions on the basis of the risks the processing operations pose to the rights and freedoms of the individuals whose data are processed. Such actions could be, for instance, making sure that only the personal data necessary to achieve the purposes envisaged are processed, or revising/updating the Notice given to individuals whose data are processed (Articles 12, 13 and 14 of the Regulation);
  4. Managing the risks, which means conducting DPIAs for all processing operations envisaged that may potentially result in a high risk for the rights of individuals. CNIL mentions that the DPIA should be done before collecting personal data and before putting in place the processing operation and that it should contain a description of the processing operation and its purposes; an assessment of the necessity and the proportionality of the proposed processing operation; an estimation of the risks posed to the rights and freedoms of the data subjects and the measures proposed to address these risks in order to ensure compliance with the GDPR.
  5. Organising internal procedures that ensure continuous data protection compliance, taking into account all possible scenarios that could intervene in the lifecycle of a processing operation. The procedures could refer to handling complaints, ensuring data protection by design, preparing for possible data breaches and creating a training program for employees.
  6. Finally, and quite importantly, Documenting compliance. “The actions taken and documents drafted for each step should be reviewed and updated periodically in order to ensure continuous data protection”, according to the CNIL. The French DPA provides a list of documents that should be part of the “GDPR compliance file”, such as the Register of processing operations and the contracts with processors.

While this guidance is certainly helpful, it should be kept in mind that the only EU-wide official guidance is the one adopted by the Article 29 Working Party. For the moment, the Working Party has published three sets of Guidelines on the application of the GDPR – on the role of the DPO, on the right to data portability and on identifying the lead supervisory authority. The Group is expected to adopt guidance on Data Protection Impact Assessments during its next plenary.

If you are interested in other guidance issued by individual DPAs, here are some links:

NOTE: The guidance issued by CNIL was translated and summarised from French – do not use the translation as an official source. 

***

Find what you’re reading useful? Please consider supporting pdpecho.

CJEU in Manni: data subjects do not have the right to obtain erasure from the Companies Register, but they do have the right to object

by Gabriela Zanfir-Fortuna

The recent judgment of the CJEU in Case C-398/15 Manni (9 March 2017) brings a couple of significant points to the EU data protection case-law:

  • Clarifies that an individual seeking to limit the access to his/her personal data published in a Companies Register does not have the right to obtain erasure of that data, not even after his/her company ceased to exist;
  • Clarifies that, however, that individual has the right to object to the processing of that data, based on his/her particular circumstances and on justified grounds;
  • Clarifies the link between the purpose of the processing activity and the data retention period, and underlines how important is the purpose of the processing activity when analysing whether a data subject can obtain erasure or blocking of data.
  • Provides insight into the balancing exercise between interests of third parties to have access to data published in the Companies Register and the rights of the individual to obtain erasure of the data and to object to its processing.

This commentary will highlight all points enumerated above.

1. Facts of the case

Mr Manni had requested his regional Chamber of Commerce to erase his personal data from the Public Registry of Companies, after he found out that he was losing clients who performed background checks on him through a private company that specialised in finding information in the Public Registry. This happened because Mr Manni had been an administrator of a company that was declared bankrupt more than 10 years before the facts in the main proceedings. In fact, the former company itself had been struck off the Public Registry (§23 to §29).

2. The question in Manni

The question that the CJEU had to answer in Manni was whether the obligation of Member States to keep public Companies Registers[1] and the requirement that personal data must be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected[2] must be interpreted as meaning that individuals must be allowed to “request the authority responsible for maintaining the Companies Register to limit, after a certain period has elapsed from the dissolution of the company concerned and on the basis of a case-by-case assessment, access to personal data concerning them and entered in that register” (§30).

3. Applicability of Directive 95/46 (Data Protection Directive – ‘DPD’)

First, CJEU clarified that its analysis does not concern processing of data by the specialized rating company, and it only refers to the obligations of the public authority keeping the companies register (§31). Second, the CJEU ascertained that the provisions of the DPD are applicable in this case:

  • the identification data of Mr Manni recorded in the Register is personal data[3] – “the fact that information was provided as part of a professional activity does not mean that it cannot be characterized as personal data” (§34);
  • the authority keeping the register is a “controller”[4] that carries out “processing of personal data”[5] by “transcribing and keeping that information in the register and communicating it, where appropriate, on request to third parties” (§35).

4. The role of the data quality principles and the legitimate grounds for processing in ensuring a high level of protection of fundamental rights

Further, CJEU recalls its case-law stating that the DPD “seeks to ensure a high level of protection of the fundamental rights and freedoms of natural persons” (§37) and that the provisions of the DPD “must necessarily be interpreted in the light of the fundamental rights guaranteed by the Charter”, and especially Articles 7 – respect for private life and 8 – protection of personal data (§39). The Court recalls the content of Articles 7 and 8 and specifically lays out that the requirements under Article 8 Charter “are implemented inter alia in Articles 6, 7, 12, 14 and 28 of Directive 95/46” (§40).

The Court highlights the significance of the data quality principles and the legitimate grounds for processing under the DPD in the context of ensuring a high level of protection of fundamental rights:

“[S]ubject to the exceptions permitted under Article 13 of that directive, all processing of personal data must comply, first, with the principles relating to data quality set out in Article 6 of the directive and, secondly, with one of the criteria for making data processing legitimate listed in Article 7 of the directive” (§41 and case-law cited).

The Court applies this test in reverse order, which is, indeed, more logical: a processing activity should first be legitimate under one of the lawful grounds for processing, and only after ascertaining that this is the case should the question of compliance with the data quality principles arise.

CJEU finds that in the case at hand the processing activity is legitimized by three lawful grounds (§42, §43):

  • compliance with a legal obligation [Article 7(c)];
  • the exercise of official authority or the performance of a task carried out in the public interest [Article 7(e)] and
  • the realization of a legitimate interest pursued by the controller or by the third parties to whom the data are disclosed [Article 7(f)].

5. The link between the data retention principle, the right to erasure and the right to object

Article 6(1)(e) of the DPD requires that personal data are kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected or for which they are further processed. This means that controllers should retain personal data only for as long as it serves the purpose for which it was processed, and should then anonymise, erase or otherwise make that data unavailable. If the controller does not comply with this obligation, the data subject has two possible avenues to stop the processing: they can either ask for erasure of the data or object to the processing on the basis of their particular situation and a justified objection.

CJEU explains that “in the event of failure to comply with the condition laid down in Article 6(1)(e)” of the DPD, “Member States guarantee the person concerned, pursuant to Article 12(b) thereof, the right to obtain from the controller, as appropriate, the erasure or blocking of the data concerned” (§46 and C-131/12 Google/Spain §70).

In addition, the Court explains, Member States also must “grant the data subject the right, inter alia in the cases referred to in Article 7(e) and (f) of that directive, to object at any time on compelling legitimate grounds relating to his particular situation to the processing of data relating to him, save where otherwise provided by national legislation”, pursuant to Article 14(a) DPD (§47).

The CJEU further explains that “the balancing to be carried out under subparagraph (a) of the first paragraph of Article 14 … enables account to be taken in a more specific manner of all the circumstances surrounding the data subject’s particular situation. Where there is a justified objection, the processing instigated by the controller may no longer involve those data” (§47).

6. The pivotal role of the purpose of the processing activity in granting the right to erasure and the right to object

After establishing these general rules, the Court decides that in order to establish whether data subjects have the “right to apply to the authority responsible for keeping the register to erase or block the personal data entered in that register after a certain period of time, or to restrict access to it, it is first necessary to ascertain the purpose of that registration” (§48).

The pivotal role of the purpose of the processing operation should not come as a surprise, given the fact that the data retention principle is tightly linked to accomplishing the purpose of the processing operation.

In this case, the Court looked closely at Directive 68/151 and explained at length that the purpose of the disclosure provided for by it is “to protect in particular the interests of third parties in relation to joint stock companies and limited liability companies, since the only safeguards they offer to third parties are their assets” (§49) and “to guarantee legal certainty in relation to dealings between companies and third parties in view of the intensification of trade between Member States” (§50). CJEU also referred to primary EU law, and specifically to Article 54(3)(g) EEC, one of the legal bases of the directive, which “refers to the need to protect the interests of third parties generally, without distinguishing or excluding any categories falling within the ambit of that term” (§51).

The Court further noted that Directive 68/151 makes no express provision regarding the necessity of keeping personal data in the Companies Register “also after the activity has ceased and the company concerned has been dissolved” (§52). However, the Court notes that “it is common ground that even after the dissolution of a company, rights and legal relations relating to it continue to exist” (§53) and “questions requiring such data may arise for many years after a company has ceased to exist” (§54).

Finally, CJEU declared:

“in view of the range of possible scenarios … it seems impossible, at present, to identify a single time limit, as from the dissolution of a company, at the end of which the inclusion of such data in the register and their disclosure would no longer be necessary” (§55).

7. Conclusion A: there is no right to erasure

The Court concluded that “in those circumstances” the data retention principle in Article 6(1)(e) DPD and the right to erasure in Article 12(b) DPD do not guarantee for the data subjects referred to in Directive 68/151 a right to obtain “as a matter of principle, after a certain period of time from the dissolution of the company concerned, the erasure of personal data concerning them” (§56).

After already reaching this conclusion, the Court also explained that this interpretation of the provisions in question does not result in “disproportionate interference with the fundamental rights of the persons concerned, and particularly their right to respect for private life and their right to protection of personal data as guaranteed by Articles 7 and 8 of the Charter” (§57).

To this end, the Court took into account:

  • that Directive 68/151 requires “disclosure only for a limited number of personal data items” (§58) and
  • that “it appears justified that natural persons who choose to participate in trade through such a company are required to disclose the data relating to their identity and functions within that company, especially since they are aware of that requirement when they decide to engage in such activity” (§59).

8. Conclusion B: but there is a right to object

After acknowledging that, in principle, the need to protect the interests of third parties in relation to joint-stock companies and limited liability companies and to ensure legal certainty, fair trading and thus the proper functioning of the internal market take precedence over the right of the data subject to object under Article 14 DPD, the Court points out that

“it cannot be excluded, however, that there may be specific situations in which the overriding and legitimate reasons relating to the specific case of the person concerned justify exceptionally that access to personal data entered in the register is limited, upon expiry of a sufficiently long period after the dissolution of the company in question, to third parties who can demonstrate a specific interest in their consultation” (§60).

While the Court leaves it to the national courts to assess each case “having regard to all the relevant circumstances and taking into account the time elapsed since the dissolution of the company concerned”, it also points out that, in the case of Mr Manni, “the mere fact that, allegedly, the properties of a tourist complex built … do not sell because of the fact that potential purchasers of those properties have access to that data in the company register, cannot be regarded as constituting such a reason, in particular in view of the legitimate interest of those purchasers in having that information” (§63).

9. Post Scriptum

The Court took a very pragmatic approach in dealing with the case of Mr Manni. The principles of interpretation it laid down are solid – such an analysis indeed requires looking at the legitimate grounds for processing and the relevant data quality principle. Having the Court place strong emphasis on the significance of the purpose of the processing activity is welcome, just like having more guidance on the balancing exercise of the rights and interests in question. In addition, a separate assessment of the right to obtain erasure and of the right to object is very helpful with a view to the future – the full application of the GDPR and its heightened rights of the data subject.

The aspect of the judgment that leaves some room for improvement is analysing the proportionality of the interference of the virtually unlimited publishing of personal data in the Companies Register with Articles 7 and 8 of the Charter. The Court does tackle this, but lightly – and it brings two arguments only after already declaring that the interference is not disproportionate. Moreover, the Court does not distinguish between interferences with Article 7 and interferences with Article 8.

Finally, I was happy to see that the predicted outcome of the case, as announced in the pdpEcho commentary on the Opinion of the Advocate General Bot, proved to be mainly correct: “the Court will follow the AG’s Opinion to a large extent. However, it may be more focused on the fundamental rights aspect of balancing the two Directives and it may actually analyse the content of the right to erasure and its exceptions. The outcome, however, is likely to be the same.”

Suggested citation: G. Zanfir-Fortuna, “CJEU in Manni: data subjects do not have the right to obtain erasure from the Companies Register, but they do have the right to object”, pdpEcho.com, 13 March 2017.


[1] Article 3 of Directive 68/151.

[2] Article 6(1)(e) of Directive 95/46.

[3] Article 2(a) of Directive 95/46.

[4] Article 2(d) of Directive 95/46.

[5] Article 2(b) of Directive 95/46.

***

If you find information on this blog useful and would like to read more of it, consider supporting pdpecho here: paypal.me/pdpecho.

The right to be forgotten goes back to the CJEU (with Google, CNIL, sensitive data, freedom of speech)

The Conseil d’Etat announced today that it referred several questions to the Court of Justice of the EU concerning the interpretation of the right to be forgotten, pursuant to Directive 95/46 and following the CJEU’s landmark decision in the Google v Spain case.

The questions were raised within proceedings involving the application of four individuals to the Conseil d’Etat to have decisions issued by the CNIL (French DPA) quashed. These decisions rejected their requests for injunctions against Google to have certain Google Search results delisted.

According to the press release of the Conseil d’Etat, “these requests were aimed at removing links relating to various pieces of information: a video that explicitly revealed the nature of the relationship that an applicant was deemed to have entertained with a person holding a public office; a press article relating to the suicide committed by a member of the Church of Scientology, mentioning that one of the applicants was the public relations manager of that Church; various articles relating to criminal proceedings concerning an applicant; and articles relating the conviction of another applicant for having sexually aggressed minors.”

The Conseil d’Etat further explained that in order to rule on these claims, it has deemed it necessary to answer a number of questions “raising serious issues with regard to the interpretation of European law in the light of the European Court of Justice’s judgment in its Google Spain case”.

Such issues are in relation with the obligations applying to the operator of a search engine with regard to web pages that contain sensitive data, when collecting and processing such information is illegal or very narrowly framed by legislation, on the grounds of its content relating to sexual orientations, political, religious or philosophical opinions, criminal offences, convictions or safety measures. On that point, the cases brought before the Conseil d’Etat raise questions in close connection with the obligations that lie on the operator of a search engine, when such information is embedded in a press article or when the content that relates to it is false or incomplete”.

***

Find what you’re reading useful? Please consider supporting pdpecho.

CJEU case to follow: purpose limitation, processing sensitive data, non-material damage

A new case received by the General Court of the CJEU was published in the Official Journal of the EU in February, Case T-881/16 HJ v EMA.

A British citizen seeks to engage the non-contractual liability of the European Medicines Agency for breaching data protection law. The applicant claims that “the documents in his personal file, which were made public and accessible to any member of staff of the European Medicines Agency for a period of time, were not processed fairly and lawfully but were processed for purposes other than those for which they were collected without that change in purpose having been expressly authorised by the applicant”.

Further, the applicant claims that “the dissemination of that sensitive data consequently called into question the applicant’s integrity, causing him real and certain non-material harm”.

The applicant asks the Court to “order the defendant to pay the applicant the symbolic sum of EUR 1 by way of compensation for the non-material harm suffered”.

Even if in the published summary there is no mention of the applicable law, it is clear that Regulation 45/2001 is relevant in this case – the data protection regulation applicable to EU institutions and bodies (EMA is an EU body). The rules of Regulation 45/2001 are fairly similar to those of Directive 95/46.

(Thanks dr. Mihaela Mazilu-Babel for bringing this case to my attention)

***

Find what you’re reading useful? Please consider supporting pdpecho.

 

 

Will the ePrivacy Regulation overshadow the GDPR in the age of IoT?

by Gabriela Zanfir-Fortuna

The ePrivacy draft regulation, published by the European Commission on Jan. 10, updates and upgrades Directive 2002/58/EC (the “ePrivacy directive”), the source of the infamous “cookies banner.” Under its official name – Proposal for a Regulation Concerning the Respect for Private Life and the Protection of Personal Data in Electronic Communications, the draft ePrivacy regulation reorganizes and even re-conceptualizes the system of protecting the privacy of electronic communications.

Armed with equally large fines and equally wide territorial application, the future ePrivacy rules may end up overshadowing the GDPR in the age of the internet of things due to their wide material scope of application which could potentially cover all data related to connected devices.

Protecting the fundamental right to confidentiality

With the proposal for an ePrivacy regulation distinct from the GDPR, the EU makes it clear that the two sets of rules correspond to different fundamental rights: The GDPR is primarily an expression of the fundamental right to the protection of personal data as enshrined in Article 8 of the EU Charter of Fundamental Rights, while the ePrivacy draft regulation details the right to respect for private life, as enshrined in Article 7 of the Charter (see Recital 1 of the proposal).

This differentiation is of great consequence, affecting the manner in which EU courts will interpret and apply the rules. The protection of the right to private life is construed so as to restrict interferences to the private life to the minimum, whereas the right to the protection of personal data is construed so as to provide for “rules of the road” on how personal data must be used.

A telling example of ePrivacy rules intended to protect private life is the reference in the draft to the fact that “terminal equipment of end-users of electronic communications networks and any information relating to the usage of such terminal equipment … are part of the private sphere of the end-users requiring protection under the Charter of Fundamental Rights of the EU and the European Convention of Human Rights” (emphasis added). (Recital 20)

In addition, many member states (e.g., Germany, Italy, Netherlands, Greece, Romania, Denmark, Belgium, Poland, Estonia, Bulgaria, Czech Republic) provide in their Constitutions for a separate right to the “secret of correspondence” (distinct from the right to respect for private life), which can be restricted under limited situations and usually only for objectives of public safety. The rules of the proposal fall under the scope of this particular right, being thus capable of igniting national constitutional reviews – for instance, concerning the rules that allow access to content of communications.

Electronic communications data and information on smart devices, the centerpiece of the ePrivacy framework

The new system of protecting the confidentiality of communications is built around two concepts:

  • “Electronic communications data,” which includes electronic communications content and electronic communications metadata; and
  • “Information related to the terminal equipment of end-users.”

Article 2(1) of the proposal establishes that the regulation “applies to processing of electronic communications data carried out in connection with the provision and the use of electronic communications services and to information related to the terminal equipment of end-users”.

This represents an important change compared to the current regime of the ePrivacy directive, which is centered on processing personal data (see Article 3(1) of Directive 2002/58). For the new rules to be applicable, it will not matter whether the data going through electronic communications channels fall under the GDPR definition of personal data or not. Recital 4 of the proposal explains that “electronic communications data may include personal data as defined in [the GDPR],” meaning they may also not include personal data.

This article was originally published on iapp.org. Read the rest of the article HERE.

WP29 published its 2017 priorities for GDPR guidance

The Article 29 Working Party published in mid-January its new set of priorities for providing GDPR guidance in 2017. This came after WP29 published in December three sets of much-awaited Guidelines on the application of the GDPR: on Data Protection Officers, on the right to data portability and on identifying the lead supervisory authority (pdpEcho intends to take a closer look at all of them in the following weeks). So what are the new priorities?

First of all, WP29 committed to finalise the work started in 2016 that was not adopted by the end of the year:

  • Guidelines on the certification mechanism;
  • Guidelines on processing likely to result in a high risk and Data Protection Impact Assessments;
  • Guidance on administrative fines;
  • Setting up admin details of the European Data Protection Board (e.g. IT, human resources, service level agreements and budget);
  • Preparing the one-stop-shop and the EDPB consistency mechanism.

Secondly, WP29 committed to start assessments and provide guidance on:

  • Consent;
  • Profiling;
  • Transparency.

Lastly, in order to take into account the changes brought by the GDPR, WP29 intends to update the already existing guidance on:

  • International data transfers;
  • Data breach notifications.

If you want to be a part of the process, there is good news: WP29 wants to organise another FabLab on April 5 and 6 on the new priorities for 2017, where “interested stakeholders will be invited to present their views and comments”. For more details, regularly check this link.

It seems we’re going to have a busy year.

 

What’s new in research: networks of control built on digital tracking, new models of internet governance

pdpEcho is kicking off 2017 with a brief catalogue of interesting recently published research that sets the tone for the new year.

First, Wolfie Christl and Sarah Spiekermann‘s report on “Networks of Control”, published last month, is a must-read for anyone who wants to understand how the digital economy functions on the streams of data we all generate, while reflecting on the ethical implications of this economic model and proposing new models that would try to keep the surveillance society at bay. Second, a new report of the Global Commission on Internet Governance explores global governance gaps created by existing governance structures developed in the analog age. Third, the US National Academy of Sciences recently published a report with concrete proposals on how to reconcile the use of different public and private sources of data for government statistics with privacy and confidentiality. Last, a volume by Angela Daly, recently published by Hart Publishing, explores how EU competition law, sector-specific regulation, data protection and human rights law could tackle concentrations of power for the benefit of users.

 

1. “Networks of Control. A Report on Corporate Surveillance, Digital Tracking, Big Data & Privacy”, by Wolfie Christl and Sarah Spiekermann [OPEN ACCESS]

“Around the same time as Apple introduced its first smartphone and Facebook reached 30 million users in 2007, online advertisers started to use individual-level data to profile and target users individually (Deighton and Johnson 2013, p. 45). Less than ten years later, ubiquitous and real-time corporate surveillance has become a “convenient byproduct of ordinary daily transactions and interactions” (De Zwart et al 2014, p. 746). We have entered a surveillance society as David Lyon foresaw it already in the early 1990s; a society in which the practices of “social sorting”, the permanent monitoring and classification of the whole population through information technology and software algorithms, have silently become an everyday reality” (p. 118).

One of the realities we need to take into account when assessing this phenomenon is that “Opting out of digital tracking becomes increasingly difficult. Individuals can hardly avoid consenting to data collection without opting out of much of modern life. In addition, persons who don’t participate in data collection, who don’t have social networking accounts or too thin credit reports, could be judged as “suspicious” and “too risky” in advance” (p. 129).

The authors of the report explain that the title “Networks of Control” is justified “by the fact that there is not one single corporate entity that by itself controls today’s data flows. Many companies co-operate at a large scale to complete their profiles about us through various networks they have built up” (p. 7). They also explain that they want to close a gap created by the fact that “the full degree and scale of personal data collection, use and – in particular – abuse has not been scrutinized closely enough”, despite the fact that “media and special interest groups are aware of these developments for a while now” (p. 7).

What I found valuable in the approach of the study is that it also brings forward a topic that is rarely discussed when analysing Big Data, digital tracking and so on: the attempt of such practices to change behaviour at scale. “Data richness is increasingly used to correct us or incentivize us to correct ourselves. It is used to “nudge” us to act differently. As a result of this continued nudging, influencing and incentivation, our autonomy suffers (p. 7)”.

A chapter authored by Professor Sarah Spiekermann explores the ethical implications of the networks of control. She applies three ethical normative theories to personal data markets: “The Utilitarian calculus, which is the original philosophy underlying modern economics (Mill 1863/1987). The Kantian duty perspective, which has been a cornerstone for what we historically call “The Enlightenment” (Kant 1784/2009), and finally Virtue Ethics, an approach to life that originates in Aristotle’s thinking about human flourishing and has seen considerable revival over the past 30 years (MacIntyre 1984)” (p. 131).

Methodologically, the report is based on “a systematic literature review and analysis of hundreds of documents and builds on previous research by scholars in various disciplines such as computer science, information technology, data security, economics, marketing, law, media studies, sociology and surveillance studies” (p. 10).

2. Global Commission on Internet Governance “Corporate Accountability for a Free and Open Internet”, by Rebecca MacKinnon, Nathalie Maréchal and Priya Kumar  [OPEN ACCESS]

The report shows that “as of July 2016, more than 3.4 billion people were estimated to have joined the global population of Internet users, a population with fastest one-year growth in India (a stunning 30 percent) followed by strong double digit growth in an assortment of countries across Africa (Internet Live Stats 2016a; 2016b)” (p. 1).

“Yet the world’s newest users have less freedom to speak their minds, gain access to information or organize around civil, political and religious interests than those who first logged on to the Internet five years ago” (p. 1).

Within this framework, the report explores the fact that “ICT sector companies have played a prominent role in Internet governance organizations, mechanisms and processes over the past two decades. Companies in other sectors also play an expanding role in global governance. Multinational companies wield more power than many governments over not only digital information flows but also the global flow of goods, services and labour: one-third of world trade is between corporations, and another third is intra-firm, between subsidiaries of the same multinational enterprise” (p. 5).

The authors also look at the tensions between governments and global companies with regard to requests for access to data, to weaken encryption and facilitate censorship in ways that contravene international human rights standards.

3. “Innovations in Federal Statistics: Combining Data Sources While Protecting Privacy”, by National Academy of Sciences [OPEN ACCESS]. 

The tension between privacy on the one hand and statistical data and censuses on the other compelled the German Constitutional Court to create, in the 1980s, “the right to informational self-determination”. Could statistics bring a similarly significant reform to the US? Never say never.

According to epic.org, the US National Academy of Sciences recently published a report that examines how disparate federal data sources can be used for policy research while protecting privacy.

The study shows that in the decentralised US statistical system there are 13 agencies whose mission is primarily the creation and dissemination of statistics, and more than 100 agencies that engage in statistical activities. There is a need for stronger coordination and collaboration to enable access to and evaluation of administrative and private-sector data sources for federal statistics. For this purpose, the report advises that “a new entity or an existing entity should be designated to facilitate secure access to data for statistical purposes to enhance the quality of federal statistics. Privacy protections would have to be fundamental to the mission of this entity”. Moreover, “the data for which it has responsibility would need to have legal protections for confidentiality and be protected using the strongest privacy protocols offered to personally identifiable information while permitting statistical use”.

One of the conclusions of the report is that “Federal statistical agencies should adopt modern database, cryptography, privacy-preserving and privacy-enhancing technologies”. 

4. Private Power, Online Information Flows and EU Law. Mind The Gap, by Angela Daly, Hart Publishing [50 pounds]

“This monograph examines how European Union law and regulation address concentrations of private economic power which impede free information flows on the Internet to the detriment of Internet users’ autonomy. In particular, competition law, sector specific regulation (if it exists), data protection and human rights law are considered and assessed to the extent they can tackle such concentrations of power for the benefit of users.

Using a series of illustrative case studies, of Internet provision, search, mobile devices and app stores, and the cloud, the work demonstrates the gaps that currently exist in EU law and regulation. It is argued that these gaps exist due, in part, to current overarching trends guiding the regulation of economic power, namely neoliberalism, by which only the situation of market failure can invite ex ante rules, buoyed by the lobbying of regulators and legislators by those in possession of such economic power to achieve outcomes which favour their businesses.

Given this systemic, and extra-legal, nature of the reasons as to why the gaps exist, solutions from outside the system are proposed at the end of each case study. This study will appeal to EU competition lawyers and media lawyers.”

Enjoy the read! (Unless the reform of the EU e-Privacy rules is taking much of your time these days – in this case, bookmark the reports of interest and save them for later).
***
Enjoy what you are reading? Consider supporting pdpEcho

Some end-of-the-year good news: People genuinely care about their privacy

Dear followers,

First, I would like to thank you for making this the most successful year in the five-year life of pdpEcho (I would especially like to thank those who supported the blog and thus helped me cover the cost of its .com domain name). I started this blog in my first year as a PhD student to gather all the information I found interesting on privacy and data protection. At that time I was trying to convince my classic “civilist” supervisor that data protection is also a matter of civil law, and that I could write a civil law thesis on the subject in Romanian, even though the Romanian literature on it counted only one book title, from 2004. In the five years that followed another book title was added, and the blog and I grew together (albeit at different paces).

In recent months the blog offered me a way to stay connected to the field while transitioning from Brussels to the US. But most importantly, it constantly reminded me that privacy is really not dead, as has been claimed numerous times. I cared about it, the people who found this blog every day cared about it, and as long as we care about privacy, it will never die.

I am writing this end-of-the-year post with some very good news from Europe: you and I are not the only ones who care about privacy. A vast majority of Europeans do, too. The European Commission published a few days ago a Eurobarometer survey on ePrivacy, as a step towards the launch of the ePrivacy Directive reform later in January.

The results could not have been clearer:

“More than nine in ten respondents said it is important that personal information (such as their pictures, contact lists, etc.) on their computer, smartphone or tablet can only be accessed with their permission, and that it is important that the confidentiality of their e-mails and online instant messaging is guaranteed (both 92%)” (source, p. 2).

“More than seven in ten think both of these aspects are very important. More than eight in ten (82%) also say it is important that tools for monitoring their activities online (such as cookies) can only be used with their permission (82%), with 56% of the opinion this is very important” (source, p. 2).

Overwhelming support for encryption

Remarkably, 90% of those asked agreed “they should be able to encrypt their messages and calls, so they can only be read by the recipient”. Almost as many (89%) agree the default settings of their browser should stop their information from being shared (source, p. 3).

Respondents thought it unacceptable to have their online activities monitored in exchange for unrestricted access to a certain website (64%), or to pay in order not to be monitored when using a website (74%). Almost as many (71%) say it is unacceptable for companies to share information about them without their permission, even if it helps companies provide new services they may like (source, p. 4).

You can find the detailed report here.

Therefore, there is serious cause to believe that our work and energy are well spent in this field.

The new year brings me several publishing projects that I am very much looking forward to, as well as two work projects on this side of the Atlantic. Nevertheless, I hope to be able to keep up the work on pdpEcho, and I look forward to receiving more feedback and even input from you.

On this note, I wish you all a Happy New Year, in which all our fundamental rights will be valued and protected!

Gabriela