Category Archives: Europe

WP29 published its 2017 priorities for GDPR guidance

The Article 29 Working Party published in mid-January its new set of priorities for providing GDPR guidance in 2017. This came after WP29 published in December three sets of much-awaited Guidelines on the application of the GDPR: on Data Protection Officers, on the right to data portability and on identifying the lead supervisory authority (pdpEcho intends to take a closer look at all of them in the following weeks). So what are the new priorities?

First of all, WP29 committed to finalising what was started in 2016 but was not adopted by the end of the year:

  • Guidelines on the certification mechanism;
  • Guidelines on processing likely to result in a high risk and Data Protection Impact Assessments;
  • Guidance on administrative fines;
  • Setting up admin details of the European Data Protection Board (e.g. IT, human resources, service level agreements and budget);
  • Preparing the one-stop-shop and the EDPB consistency mechanism.

Secondly, WP29 committed to starting assessments and providing guidance on:

  • Consent;
  • Profiling;
  • Transparency.

Lastly, in order to take into account the changes brought by the GDPR, WP29 intends to update the already existing guidance on:

  • International data transfers;
  • Data breach notifications.

If you want to be a part of the process, there is good news. WP29 plans to organise another FabLab on April 5 and 6 on the new priorities for 2017, where “interested stakeholders will be invited to present their views and comments”. For more details, regularly check this link.

It seems we’re going to have a busy year.

 

EU Commission’s leaked plan for the data economy: new rules for IoT liability and sharing “non-personal data”

It seems that it’s the season of EU leaks on internet and digital policy. One day after the draft new e-Privacy regulation was leaked (to Politico), another document appeared online (published by Euractiv) before its adoption and release – a Communication from the European Commission on “Building a European data economy”.

It announces at least two revisions of existing legal acts: the Database Copyright Directive (96/9) and the Product Liability Directive (85/374). New legislative measures may also be needed to achieve the objectives announced in the draft Communication. However, the Commission is not clear about this and leaves a lot of the decision-making for after the results of wide stakeholder and public consultations are processed.

The common thread of most of the policy areas covered by the Communication is “non-personal data”. The Commission starts from the premise that while the GDPR allows for the free movement of personal data within the EU, there are currently no common rules among Member States for sharing, accessing or transferring “non-personal data”. Moreover, the Commission notes that the number of national data localisation measures is growing.

“The issue of the free movement of data concerns all types of data: enterprises and actors in the data economy deal with a mixture of personal and non-personal data, machine generated or created by individuals, and data flows and data sets regularly combine these different types of data”, according to the draft Communication.

And what is truly challenging is that “enterprises and actors in the data economy will be dealing with a mixture of personal and non-personal data; data flows and datasets will regularly combine both. Any policy measure must take account of this economic reality”.

If you are wondering what is meant by “non-personal data”, the draft Communication provides some guidance to understand what it refers to. For instance, the draft Communication mentions that “personal data can be turned into non-personal data through the process of anonymisation” and that “the bulk of machine-generated data are not personal data”. Therefore, anonymisation and de-identification techniques will gain even more importance.
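The distinction the draft Communication draws matters in practice: simply hashing or removing direct identifiers is pseudonymisation, not full anonymisation, and pseudonymised data remain personal data under EU law. As a purely illustrative sketch (the field names, salt and helper function are invented for this example, not drawn from the Communication), a basic de-identification step might look like this:

```python
import hashlib

def pseudonymise(record, salt, direct_identifiers=("name", "email")):
    """Drop direct identifiers and replace them with a salted hash.

    Note: this is pseudonymisation, not anonymisation -- as long as
    re-identification is possible (e.g. via the salt), the output
    remains personal data under EU data protection law.
    """
    out = {k: v for k, v in record.items() if k not in direct_identifiers}
    key = "|".join(str(record.get(k, "")) for k in direct_identifiers)
    out["pseudonym"] = hashlib.sha256((salt + key).encode()).hexdigest()[:16]
    return out

record = {"name": "Jane Doe", "email": "jane@example.com", "postcode": "1050"}
print(pseudonymise(record, salt="s3cret"))
```

The remaining quasi-identifiers (here, the postcode) may still allow re-identification when combined with other datasets, which is precisely why robust anonymisation techniques will gain importance if “non-personal data” becomes a distinct legal category.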

While the GDPR covers how personal data are used in the EU, the proposals that will be made on the basis of this Communication envisage the use of all the other data.

So what does the Commission propose?

Several objectives are announced, most of them dealing with the free flow of and access to “non-personal data”, while another objective looks at reforming liability rules to accommodate algorithms, Artificial Intelligence and the Internet of Things.

Free flow of and access to non-personal data

  • According to the draft Communication, any Member State action affecting data storage or processing should be guided by a ‘principle of free movement of data within the EU’.
  • Broader use of open, well-documented Application Programming Interfaces (APIs) could be considered, through technical guidance, including identification and spreading of best practice for companies and public sector bodies.
  • The Commission could issue guidance based on the Trade Secrets Directive, copyright legislation and the Database Directive on how data control rights should be addressed in contracts. The Commission intends to launch the review of the Database Directive in 2017.
  • Access for public interest purposes – public authorities could be granted access to data where this would be in the general interest and would considerably improve the functioning of the public sector, for example access for statistical offices to business data or the optimization of traffic management systems on the basis of real-time data from private vehicles.
  • Selling and acquiring databases could be regulated. “Access against remuneration”: a framework based on fair, non-discriminatory terms could be developed for data holders, such as manufacturers, service providers or other parties, to provide access to the data they hold against remuneration. The Communication is not clear whether this proposal could also cover personal data. In any case, on several occasions throughout the draft Communication, it is mentioned or implied that the GDPR takes precedence over any new rules that would impact the protection of personal data.
  • A data producer’s right to use and licence the use of data could be introduced; by “data producer”, the Commission understands “the owner or long-term user of the device”. This approach would “open the possibility for users to exploit their data and thereby contribute to unlocking machine-generated data”.
  • Developing further rights to data portability (building on the GDPR data portability right and on the proposed rules on contract for the supply of digital content, further rights to portability of non-personal data could be introduced). The initiatives for data portability would be accompanied by sector specific experiments on standards (which would involve a multi-stakeholder collaboration including standard setters, industry, the technical community, and public authorities).

Rethinking liability rules for the IoT and AI era

Even though Artificial Intelligence is not mentioned as such in the draft Communication, it is clear that the scenario of algorithms making decisions is also envisaged by the announced objective to reform product liability rules, alongside IoT. As the draft Communication recalls, currently, the Products Liability Directive establishes the principle of strict liability, i.e. liability without fault: where a defective product causes damage to a consumer, the manufacturers may be liable even without negligence or fault on their part. The current rules are addressed only to the producer, always require a defect, and require that the causal link between the defect and the damage be proven.

The Commission proposed two approaches, which will be subject to consultation:

  • “Risk-generating or risk-management approaches: liability would be assigned to the market players generating a major risk for others and benefitting from the relevant device, product or service or to those which are best placed to minimize or avoid the realization of the risk.”
  • “Voluntary or mandatory insurance schemes: they would compensate the parties who suffered the damage; this approach would need to provide legal protection to investments made by business while reassuring victims regarding fair compensation or appropriate insurance in case of damage.”

“Connected and automated driving” – used as test case

The Commission intends to test all the proposed legal solutions, after engaging in wide consultations, in a real life scenario and proposes “connected and automated driving” as the test case.

Finally, read all of these objectives and proposals bearing in mind that they come from a draft document that was leaked to Euractiv. It is possible that by the time this Communication is adopted and published (and there is no indication as to when it will be officially published) its content will have been altered.

***

Find what you’re reading useful? Please consider supporting pdpecho.

Greek judges asked the CJEU if they should dismiss evidence gathered under the national law that transposed the invalidated Data Retention Directive

Here is a new case at the Court of Justice of the EU that the data protection world will be looking forward to, as it addresses questions about the practical effects of the invalidation of the Data Retention Directive.


Case C-475/16 K. (yes, like those Kafka characters) concerns criminal proceedings against K. before Greek courts, which apparently involve evidence gathered under the Greek national law that transposed the now-invalidated Data Retention Directive. The Directive was invalidated in its entirety by the CJEU in 2014, after the Court found in its Digital Rights Ireland judgment that the provisions of the Directive breached Articles 7 (right to respect for private life) and 8 (right to the protection of personal data) of the Charter of Fundamental Rights.

The Greek judges sent to the CJEU in August a big set of questions for a preliminary ruling (17 questions). Among them, there are a couple of very interesting ones, because they deal with the practical effects of the invalidation of an EU Directive and what happens to the national laws of the Member States that transposed it.

For instance, the national judge asks whether national courts are obliged not to apply legislative measures transposing the annulled Directive and whether this obligation also means that they must dismiss evidence obtained as a consequence of those legislative measures (Question 3). The national judge also wants to know if maintaining the national law that transposes an invalidated Directive constitutes an obstacle to the establishment and functioning of the internal market (Question 16).

Another question raised by the national judge is whether the national legislation that transposed the annulled Data Retention Directive and that remained in force at national level after the annulment is still considered as falling under the scope of EU law (Question 4). The answer to this question is important because the EU Charter and the supremacy of EU law do not apply to situations that fall outside the scope of EU law.

The Greek judge didn’t miss the opportunity to also ask about the effect on the national law transposing the Data Retention Directive of the fact that this Directive was also enacted to implement a harmonised framework at the European level under Article 15(1) of the ePrivacy Directive (Question 5). The question is whether this fact is enough to bring the surviving national data retention laws under the scope of EU law.

If the Charter is considered applicable to the facts of the case, the national judge further wants to know whether a national law that complies only partly with the criteria set out in the Digital Rights Ireland decision still breaches Articles 7 and 8 of the Charter because it does not comply with all of them (Question 13). For instance, the national judge estimates that the national law does not comply with the requirement that the persons whose data are retained must be at least indirectly in a situation which is liable to give rise to criminal prosecutions (para 58 DRI), but that it complies with the requirement that the national law must contain substantive and procedural conditions for the access of competent authorities to the retained data, and objective criteria by which the number of persons authorised to access these data is limited to what is strictly necessary (paras 61, 62 DRI).

Lastly, it will be also interesting to see whether the Court decides to address the issue of what “serious crime” means in the context of limiting the exercise of fundamental rights (Questions 10 and 11).

If you would like to delve into some of these topics, have a look at the AG Opinion in the Tele2 Sverige case, published on 19 July 2016. The judgment in that case is due on 21 December 2016. Also, have a look at this analysis of the Opinion.

As for a quick “what to expect” in the K. case from my side, here it is:

  • the CJEU will seriously re-organise the 17 questions and regroup them into 4 or 5 topics, also clarifying that it only deals with the interpretation of EU law, not national law or the facts of national proceedings;
  • the national laws transposing the Data Retention Directive will probably be considered as being in the field of EU law – as they also regulate within the ambit of the ePrivacy Directive;
  • the Court will restate the criteria in DRI and probably clarify that all criteria must be complied with, no exceptions, in order for national measures to comply with the Charter;
  • the CJEU will probably not give indications to the national courts on whether they should admit or dismiss evidence collected on the basis of national law that does not comply with EU law – it’s too specific and the Court is ‘in the business’ of interpreting EU law; the best-case scenario, which is possible, is that the Court will give some guidance on the obligations of Member States (and hopefully their authorities) regarding the effects of their transposing national laws when the relevant EU secondary law is annulled;
  • as for what “serious crime” means in the context of limiting fundamental rights, let’s see about that. Probably the Court will give useful guidance.

***

Find what you’re reading useful? Please consider supporting pdpecho.

Even if post Brexit-UK adopts the GDPR, it will be left without its “heart”

Gabriela Zanfir Fortuna


There has been lately a wave of optimism among those looking for legal certainty that the GDPR will be adopted by the UK even after the country leaves the European Union. This wave was prompted by a declaration of the British Secretary of State, Karen Bradley, at the end of October, when she stated before a Committee of the Parliament that “We will be members of the EU in 2018 and therefore it would be expected and quite normal for us to opt into the GDPR and then look later at how best we might be able to help British business with data protection while maintaining high levels of protection for members of the public”. The Information Commissioner of the UK, Elizabeth Denham, welcomed the news. On the other hand, as Amberhawk explained in detail, this will not mean that the UK will automatically be considered as ensuring an adequate level of protection.

The truth is that as long as the UK is still a Member of the EU, it cannot opt into or out of regulations (other than those subject to the exemptions negotiated by the UK when it entered the Union – but this is not the case for the GDPR). They are “binding in their entirety” and “directly applicable”, according to Article 288 of the Treaty on the Functioning of the EU. So, yes, quite normally, if the UK is still a Member State of the EU on 25 May 2018, the GDPR will start applying in the UK just as it will in Estonia or France.

The fate of the GDPR after Brexit becomes effective will be as uncertain as the fate of all other EU legislative acts transposed in the UK or directly applicable in the UK. But let’s imagine the GDPR will remain national law after Brexit, in one form or another. If this happens, it is likely that it will take on a life of its own, departing from harmonised application throughout the EU. First and foremost, the GDPR in the UK will not be applied in the light of the Charter of Fundamental Rights of the EU, and especially its Article 8 – the right to the protection of personal data. The Charter played an extraordinary role in the strengthening of data protection in the EU after it became binding in 2009, being invoked by the Court of Justice of the EU in its landmark judgments – Google v Spain, Digital Rights Ireland and Schrems.

The Court held as far back as 2003 that “the provisions of Directive 95/46, in so far as they govern the processing of personal data liable to infringe fundamental freedoms, in particular the right to privacy, must necessarily be interpreted in the light of fundamental rights” (Österreichischer Rundfunk, para 68). This principle was repeated in most of the following cases interpreting Directive 95/46 and other relevant secondary law for this field, perhaps with the most notable results in Digital Rights Ireland and Schrems. 

See, for instance:

“As far as concerns the rules relating to the security and protection of data retained by providers of publicly available electronic communications services or of public communications networks, it must be held that Directive 2006/24 does not provide for sufficient safeguards, as required by Article 8 of the Charter, to ensure effective protection of the data retained against the risk of abuse and against any unlawful access and use of that data” (Digital Rights Ireland, para. 66).

“As regards the level of protection of fundamental rights and freedoms that is guaranteed within the European Union, EU legislation involving interference with the fundamental rights guaranteed by Articles 7 and 8 of the Charter must, according to the Court’s settled case-law, lay down clear and precise rules governing the scope and application of a measure and imposing minimum safeguards, so that the persons whose personal data is concerned have sufficient guarantees enabling their data to be effectively protected against the risk of abuse and against any unlawful access and use of that data. The need for such safeguards is all the greater where personal data is subjected to automatic processing and where there is a significant risk of unlawful access to that data” (Schrems, para. 91).

Applying data protection law outside the spectrum of fundamental rights will most likely not ensure sufficient protection to the person. While the UK will remain under the legal effect of the European Convention on Human Rights and its Article 8 – respect for private life – this by no means equates to the specific protection ensured to personal data by Article 8 of the Charter as interpreted and applied by the CJEU.

Not only will the Charter not be binding on the UK post-Brexit, but the Court of Justice of the EU will no longer have jurisdiction over the UK (unless some sort of spectacular agreement is negotiated for Brexit). Moreover, EU law will not enjoy supremacy over national law, as is the case right now. This means that British data protection law will be able to depart from the European standard (the GDPR) to the extent the legislature finds desirable. For instance, there will be nothing standing in the way of the British legislature adopting permissive exemptions to the rights of the data subject, pursuant to Article 23 GDPR.

So when I mentioned in the title that the GDPR in the post-Brexit UK will in any case be left without its “heart”, I was referring to its application and interpretation in the light of the Charter of the Fundamental Rights of the EU.

***

Find what you’re reading useful? Please consider supporting pdpecho.

Interested in the GDPR? See the latest posts:

CNIL just published the results of their GDPR public consultation: what’s in store for DPOs and data portability? (Part I)

CNIL’s public consultation on the GDPR: what’s in store for Data Protection Impact Assessments and certification mechanisms? (Part II)

The GDPR already started to appear in CJEU’s soft case-law (AG Opinion in Manni)

CNIL just published the results of their GDPR public consultation: what’s in store for DPOs and data portability? (Part I)

Gabriela Zanfir Fortuna

The French Data Protection Authority, CNIL, made public this week the report of the public consultation it held between 16 and 19 July 2016 among professionals about the General Data Protection Regulation (GDPR). The public consultation gathered 540 replies from 225 contributors.

The CNIL focused on four main issues in the consultation:

  • the data protection officer;
  • the right to data portability;
  • the data protection impact assessments;
  • the certification mechanism.

These are also the four themes in the action plan of the Article 29 Working Party for 2016.

This post (Part I) will summarise the results and action plan for the first two themes, while the last two will be dealt with in a second post (Part II). [Disclaimer: all quotations are translated from French].

1) On the data protection officer

According to Article 37 GDPR, both the controller and the processor must designate a data protection officer where the processing is carried out by a public authority (1)(a), where their core activities consist of processing operations which require regular and systematic monitoring of data subjects on a large scale (1)(b) and where their core activities consist of processing sensitive data on a large scale (1)(c).

The report reveals that there are many more questions than answers or opinions about how Article 37 should be applied in practice. In fact, most of the contributions are questions from the contributors (see pages 2 to 4). They raise interesting points, such as:

  • What is considered to be a conflict of interest – who will not be able to be appointed?
  • Should the DPO be appointed before May 2018 (when the GDPR becomes applicable)?
  • Will the CNIL validate the mandatory or the optional designation of a DPO?
  • What exactly will be the role of the DPO in the initiative for and in the drafting of the data protection impact assessments?
  • What are the internal consequences if the recommendations of the DPO are not respected?
  • Is it possible that the DPO becomes liable under criminal law for how he/she monitors compliance with the GDPR?
  • Should the DPO be in charge of keeping the register of processing operations, and should the register be communicated to the public?
  • Should only the contact details of the DPO be published, or also his/her identity?
  • Must the obligations in the GDPR also be applied to the appointment of a DPO made voluntarily (outside the three scenarios in Article 37(1))?
  • Can a DPO be, in fact, a team? Can a DPO be a legal person?
  • Are there any special conditions with regard to the DPO for small and medium enterprises?

The CNIL underlines that for this topic an important contribution was brought by large professional associations during discussions, in addition to the large number of replies received online.

In fact, according to the report, the CNIL acknowledges “the big expectations of professional associations and federations to receive clarifications with regard to the function of the DPO, as they want to prepare as soon as possible and in a sustainable way for the new obligations” (p. 5).

As for future steps, the CNIL recalls that the Article 29 Working Party will publish Guidelines to help controllers in a practical manner, according to the 2016 action plan. (There’s not much left of 2016, so hopefully we’ll see the Guidelines soon!). The CNIL announces they will also launch some national communication campaigns and they will intensify the training sessions and workshops with the current CILs (Correspondants Informatique et Libertés – a role similar to that of a DPO).

2) On the right to data portability


Article 20 GDPR provides that the data subject has the right to receive a copy of their data in a structured, commonly used and machine-readable format and has the right to transmit those data to another controller only if the processing is based on consent or on a contract.

First, the CNIL notes that there was “a very strong participation of the private sector submitting opinions or queries regarding the right to data portability, with particular interest in the field of application of the new right, the expenses its implementation will require and its consequences on competition” (p. 6).

According to the report, the right to data portability is perceived as an instrument that allows regaining the trust of persons in the processing of their personal data, bringing more transparency and more control over processing operations (p. 6).

On the other hand, the organisations that replied to the public consultation are concerned about the additional investments they will need to make to implement this right. They are also concerned about (p. 6):

  • “the risk of creating an imbalance in competition between European and American companies, as European companies are directly under the obligation to comply with this right, whereas American companies may try to circumvent the rules”. My comment here would be that they should not be concerned about that, because if they target the same European public to offer services, American companies will also be under a direct obligation to comply with this right.
  • “the immediate cost of implementing this right (for instance, the development of automatic means to extract data from databases), which cannot be charged to the individuals, but which will be a part of the management costs and will increase the costs for the services”.
  • “the level of responsibility if the data are mishandled or if the data handed over to the person are not up to date”.

The respondents to the public consultation seem to be a good resource for technical options to use in terms of the format needed to transfer data. Respondents argued in favor of open source formats, which will make reusing the data easier and which will be cheaper compared to proprietary solutions. Another suggested solution is the development of Application Program Interfaces (APIs) based on open standards, without a specific licence key. This way the persons will be able to use the tools of their choice.
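The formats the respondents favour are not exotic: a structured, commonly used, machine-readable export in the sense of Article 20 GDPR can be as simple as a documented JSON document that another controller can re-import losslessly. As a minimal illustrative sketch (the field names and structure below are hypothetical, not taken from the CNIL report):

```python
import json

def export_portable(profile):
    """Serialise a user's data into a structured, commonly used,
    machine-readable open format (JSON) for a portability request."""
    return json.dumps(profile, indent=2, sort_keys=True, ensure_ascii=False)

# Hypothetical user record held by the original controller.
profile = {
    "user_id": "u-42",
    "preferences": {"newsletter": True},
    "orders": [{"id": 1, "item": "book"}],
}

blob = export_portable(profile)
restored = json.loads(blob)  # a receiving controller can parse it back
```

Because the format is open and self-describing, the receiving side needs no proprietary licence or tooling – which is exactly the argument the respondents made for open formats and open-standard APIs over proprietary solutions.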

One of the needs that emerged from the consultation was to clarify whether the data subject to the right to portability must be raw data, or whether transferring a “summary” of the data would suffice. Another question was whether the data could be requested by a competing company with a mandate from the data subject. There were also questions regarding the interplay between the right to data portability and the right of access, and about how data security could be ensured for the transfer of the “ported” data.

In the concluding part, the CNIL acknowledges that two trends could already be seen within the replies: on the one hand, companies tend to want to limit as much as possible the applicability of the right to data portability, while on the other hand, the representatives of the civil society are looking to encourage persons to take their data in their own hands and to reinvent their use (p. 10).

According to the report, the Technology Subgroup of the Article 29 Working Party is currently drafting guidelines with regard to the right to data portability. “They will clarify the field of application of this right, taking into account all the questions raised by the participants in the consultation, and they will also detail ways to reply to portability requests”, according to the report (p. 10).

***

Find what you’re reading useful? Consider supporting pdpecho.

Click HERE for Part II of this post.

A look at political psychological targeting, EU data protection law and the US elections

Cambridge Analytica, a company that uses “data modeling and psychographic profiling” (according to its website), is credited with having decisively contributed to the outcome of the presidential election in the U.S. They did so by using “a hyper-targeted psychological approach” allowing them to see trends among voters that no one else saw and thus to model the speech of the candidate to resonate with those trends. According to Mashable, the same company also assisted the Leave.EU campaign that led to Brexit.

How do they do it?

“We collect up to 5,000 data points on over 220 million Americans, and use more than 100 data variables to model target audience groups and predict the behavior of like-minded people” (my emphasis), states their website (for comparison, the US has a population of 324 million). They further explain that “when you go beneath the surface and learn what people really care about you can create fully integrated engagement strategies that connect with every person at the individual level” (my emphasis).

According to Mashable, the company “uses a psychological approach to polling, harvesting billions of data from social media, credit card histories, voting records, consumer data, purchase history, supermarket loyalty schemes, phone calls, field operatives, Facebook surveys and TV watching habits”. This data “is bought or licensed from brokers or sourced from social media”.

(For a person who dedicated their professional life to personal data protection this sounds chilling.)

Legal implications

Under US privacy law this kind of practice seems to have no legal implications, as it doesn’t involve processing by any authority of the state, it’s not a matter of consumer protection and it doesn’t seem to fall, prima facie, under any piece of the piecemeal legislation dealing with personal data in the U.S. (please correct me if I’m wrong).

Under EU data protection law, this practice would raise a series of serious questions (see below), without even getting into the debate of whether this sort of intimate profiling would also breach the right to private life as protected by Article 7 of the EU Charter of Fundamental Rights and Article 8 of the European Convention on Human Rights (the right to personal data protection and the right to private life are protected separately in the EU legal order). Put simply, the right to data protection enshrines the “rules of the road” (safeguards) for data that is being processed on a lawful ground, while the right to private life protects the inner private sphere of a person altogether, meaning that it can prohibit unjustified interferences in the person’s private life. This post will only look at mass psychological profiling from the data protection perspective.

Does EU data protection law apply to the political profilers targeting US voters?

But why would EU data protection law even be applicable to a company creating profiles of 220 million Americans? Surprisingly, EU data protection law could indeed be relevant in this case, if it turns out that the company carrying out the profiling is based in the UK (London-based), as several websites claim in their articles (here, here and here).

Under Article 4(1)(a) of Directive 95/46, the national provisions adopted pursuant to the directive shall apply “where the processing is carried out in the context of the activities of an establishment of the controller on the territory of the Member State”. Therefore, the territorial application of Directive 95/46 is triggered by the place of establishment of the controller. Moreover, Recital 18 of the Directive’s Preamble explains that “in order to ensure that individuals are not deprived of the protection to which they are entitled under this Directive, any processing of personal data in the Community (EU – n.) must be carried out in accordance with the law of one of the Member States” and that “in this connection, processing carried out under the responsibility of a controller who is established in a Member State should be governed by the law of that State” (see also CJEU Case C-230/14 Weltimmo, paras. 24, 25, 26).

There are, therefore, no exceptions to applying EU data protection rules to any processing of personal data that is carried out under the responsibility of a controller established in a Member State. Does it matter that the data subjects are not European citizens, or that they may not even be physically located within Europe? The answer is probably in the negative. Directive 95/46 provides that the data subjects it protects are “identified or identifiable natural persons“, without differentiating between them based on their nationality. Neither does the Directive link its application to any territorial factor concerning the data subjects. Moreover, according to Article 8 of the EU Charter of Fundamental Rights, “everyone has the right to the protection of personal data concerning him or her”.

I must emphasise here that the Court of Justice of the EU is the only authority that can interpret EU law in a binding manner and that, until the Court decides how to interpret EU law in a specific case, we can only engage in argumentative exercises. If the interpretation proposed above were found to have some merit, it would indeed be somewhat ironic to have the data of 220 million Americans protected by EU data protection rules.

What safeguards do persons have against psychological profiling for political purposes?

This kind of psychological profiling for political purposes would raise a number of serious questions. First of all, there is the question of whether this processing operation involves processing of “special categories of data”. According to Article 8(1) of Directive 95/46, “Member States shall prohibit the processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, and the processing of data concerning health or sex life.” There are several exceptions to this prohibition, of which only two would conceivably be applicable to this kind of profiling:

  • if the data subject has given his explicit consent to the processing of those data (letter a) or
  • the processing relates to data which are manifestly made public by the data subject (letter e).

In order for this kind of psychological profiling to be lawful, the controller must either obtain explicit consent to process all the data points used, for every person profiled, or use only those data points that were manifestly made public by each person.

Moreover, under Article 15(1) of Directive 95/46, the person has the right “not to be subject to a decision which produces legal effects concerning him or significantly affects him and which is based solely on automated processing of data intended to evaluate certain personal aspects relating to him, such as his performance at work, creditworthiness, reliability, conduct, etc.”. It remains, of course, a matter of interpretation to what extent psychological profiling for political purposes produces legal effects or significantly affects the person.

Another problem concerns the obligation of the controller to inform every person concerned that this kind of profiling is taking place (Articles 10 and 11 of Directive 95/46) and to give them details about the identity of the controller, the purposes of the processing and all the personal data that is being processed. In addition, the person should be informed that he or she has the right to ask for a copy of the data the controller holds about him or her and the right to ask for the erasure of that data if it was processed unlawfully (Article 12 of Directive 95/46).

Significantly, the person has the right to opt out of a processing operation, at any time, without giving reasons, if that data is being processed for the purposes of direct marketing (Article 14(b) of Directive 95/46). For instance, in the UK, the supervisory authority – the Information Commissioner’s Office – issued Guidance for political campaigns in 2014 and gave the example of “a telephone call which seeks an individual’s opinions in order to use that data to identify those people likely to support the political party or referendum campaign at a future date in order to target them with marketing” as constituting direct marketing.

Some thoughts

  • The analysis of how EU data protection law is relevant for this kind of profiling would be even more pertinent if it were made under the General Data Protection Regulation, which will become applicable on 25 May 2018 and which has a special provision on profiling.
  • The biggest fine ever issued by the supervisory authority in the UK, £350,000, was imposed this year. Under the GDPR, breaches of data protection rules will lead to fines of up to 20 million euro or 4% of the controller’s global annual turnover for the previous year, whichever is higher.
  • If any company based in the UK used this kind of psychological profiling and micro-targeting for the Brexit campaign, that processing operation would undoubtedly fall under the rules of EU data protection law. The same is true of any analytics company that provides these services to political parties anywhere in the EU using personal data of EU persons. Perhaps this is a good time to revisit the discussion we had at CPDP2016 on political behavioural targeting (who would have thought the topic would gain so much momentum this year?)
  • I wonder if data protection rules should be the only “wall (?)” between this sort of targeted-political-message-generating campaign profiling and the outcome of democratic elections.
  • Talking about ethics, data protection and big data together is becoming more urgent every day.

***

Find what you’re reading useful? Consider supporting pdpecho.

Analysis of the AG Opinion in the “PNR Canada” Case: unlocking an “unprecedented and delicate” matter

AG Mengozzi delivered his Opinion in the EU-Canada PNR case (Opinion 1/15) on 8 September 2016. While his conclusions clearly indicate that, in part, the current form of the agreement between Canada and the EU “on the transfer and processing of Passenger Name Record data” is not compliant with EU primary law – in particular with Articles 7, 8 and 52(1) of the Charter[1] and Article 16(2) TFEU[2] – the AG seems to accept that PNR schemes in general (involving indiscriminate targeting, profiling, preemptive policing) are compatible with fundamental rights in the EU.

In summary, it seems to me that the AG’s message is: “if you do it unambiguously and transparently, under independent supervision, and without sensitive data, you can process PNR data of all travellers, creating profiles and targeting persons matching patterns of suspicious behaviour”.

This is problematic for the effectiveness of the right to the protection of personal data and the right to respect for private life. Even though the AG agrees that the scrutiny of an international agreement such as the EU-Canada PNR Agreement should not be looser than that of an ordinary adequacy decision or that of an EU Directive, and considers that both Schrems and Digital Rights Ireland should apply in this case, he does not apply in all instances the rigorous scrutiny the Court used in those two landmark judgments. One significant way in which he does this is by enriching the ‘strict necessity test’ so that it comprises a “fair balance” criterion and an “equivalent effectiveness” threshold (see Section 5).

On the other hand, AG Mengozzi is quite strict about the safeguards he sees as essential in order to make PNR agreements such as the one in this case compatible with fundamental rights in the EU.

Data protection authorities have warned time and again that PNR schemes are not strictly necessary to fight terrorism and serious transnational crimes – they are too invasive and their effectiveness has not yet been proven. The European Data Protection Supervisor – the independent advisor of the EU institutions on all legislation concerning the processing of personal data – has issued a long series of Opinions on PNR schemes, be they in the form of international agreements on data transfers, adequacy decisions or EU legislation, always questioning their necessity and proportionality[3]. In the latest Opinion in this series, on the EU PNR Directive, the EDPS clearly states that the non-targeted and bulk collection and processing of data under the PNR scheme “amount to a measure of general surveillance” (§63) and that, in the absence of appropriate and unambiguous evidence that such a scheme is necessary, the PNR scheme is not compliant with Articles 7, 8 and 52 of the Charter, Article 16 TFEU and Article 8 ECHR (§64).

The Article 29 Working Party also has a long tradition of questioning the very idea of a PNR system. A good reflection of this is Opinion 7/2010, where the WP states that “the usefulness of large-scale profiling on the basis of passenger data must be questioned thoroughly, based on both scientific elements and recent studies” (p. 4) and declares that it is not satisfied with the evidence for the necessity of such systems.

The European Parliament suspended the procedure to conclude the Agreement and decided to use one of its new powers granted by the Treaty of Lisbon, asking the CJEU to issue an Opinion on the compliance of the Agreement with EU primary law (the TFEU and the Charter).

Having the CJEU finally look at PNR schemes is a matter of great interest for all EU travellers, and not only for them – especially at a time like this, when it feels as if surveillance is being served to the people by states all over the world, from liberal democracies to authoritarian regimes, as an acceptable social norm.

General remarks: first-timers and wide implications

The AG acknowledges in the introductory part of the Opinion that the questions this case brought before the Court are “unprecedented and delicate” (§5). In fact, the AG observes later on in the Opinion that the “methods” applied to PNR data, once transferred, in order to identify individuals on the basis of patterns of behavior of concern are not at all provided for in the agreement and “seem to be entirely at the discretion of the Canadian authorities” (§164). This is why the AG states that one of the greatest difficulties of this case is that it “entails ascertaining … not merely what the agreement envisaged makes provision for, but also, and above all, what it has failed to make provision for” (§164).

The AG also makes it clear in the beginning of the Opinion that the outcome of this case has implications on the other “PNR” international agreements the EU concluded with Australia and the US and on the EU PNR Directive (§4). A straightforward example of a possible impact on these other international agreements, beyond analyzing their content, is the finding that the legal basis on which they were adopted is incomplete (they must be also based on Article 16 TFEU) and wrong (Article 82(1)(d) TFEU on judicial cooperation is incompatible as legal basis with PNR agreements).

The implications are even wider than the AG acknowledged. For instance, a legal instrument that could be impacted is the EU-US Umbrella Agreement – another international agreement on transfers of personal data from the EU to the US in the law enforcement area, which has both similarities and differences compared to the PNR agreements. In addition, an immediately affected legal process will be the negotiations that the European Commission is currently undertaking with Mexico for a PNR Agreement.

Even if it is not an international agreement, the adequacy decision based on the EU-US Privacy Shield deal could be impacted as well, especially with regard to the findings on the independence of the supervisory authority in the third country where data are transferred (See Section 6 for more on this topic).

Finally, the AG also mentions that this case allows the Court to “break the ice” in two matters:

  • It will examine for the first time the scope of Article 16(2) TFEU (§6) and
  • It will rule for the first time on the compatibility of a draft international agreement with the fundamental rights enshrined in the Charter, and more particularly with those in Article 7 and Article 8 (§7).

Therefore, the complexity and novelty of this case are considerable. And they are also a good opportunity for the CJEU to create solid precedents in such delicate matters.

I structured this post around the main ideas I found notable to look at and summarise, after reading the 328-paragraph-long Opinion. In order to make it easier to read, I’ve split it into six Sections, which you can find by following the links below.

  1. De-mystifying Article 16 TFEU: yes, it is an appropriate legal basis for international agreements on transfers of personal data
  2. A look at the surface: it is not an adequacy decision, but it establishes adequacy
  3. An interference of “a not insignificant gravity”: systematic, transforming all passengers into potential suspects and amounting to preemptive policing
  4. Innovative thinking: Article 8(2) + Article 52(1) = conditions for justification of interference with Article 8(1)
  5. The awkward two-level necessity test that convinced the AG the PNR scheme is acceptable
  6. The list of reasons why the Agreement is incompatible with the Charter and the Treaty

……………………………………………………….

[1] Article 7 – the right to respect for private life, Article 8 – the right to the protection of personal data, Article 52(1) – limitations of the exercise of fundamental rights.

[2] With regard to the obligation to have independent supervision of processing of personal data.

[3] See the latest one, Opinion 5/2015 on the EU PNR Directive and see the Opinion on the EU-Canada draft agreement.

***


Section 1. De-mystifying Article 16 TFEU: yes, it is an appropriate legal basis for concluding international agreements on transfers of personal data

(Section 1 of the Analysis of the AG Opinion in the “PNR Canada” Case: unlocking an “unprecedented and delicate” matter)

Currently, the Council decision adopted for concluding the EU-Canada PNR agreement rests on two legal bases: Article 82(1)(d) TFEU – on judicial cooperation in criminal matters within the Union[1] and Article 87(2)(a) TFEU – on police cooperation in criminal matters within the Union[2], in conjunction with Articles 218(5) and 218(6)(a) TFEU – procedure to negotiate international agreements. In his Opinion on the EU-Canada PNR Agreement  in 2013, the European Data Protection Supervisor questioned the choice of the legal basis and recommended that the proposal be based on Article 16 TFEU “as a comprehensive legal basis”, in conjunction with the Articles on the procedure to conclude international agreements, considering that:

According to Article 1 of the Agreement, its purpose is to set out the conditions for the transfer and use of PNR data in order to, on the one hand, “ensure the security and safety of the public” and, on the other hand, “prescribe the means by which the data shall be protected”. In addition, the vast majority of provisions of the Agreement relate to the latter objective, i.e. the protection of personal data, including data security and integrity. (EDPS Opinion on EU-Canada PNR, §8).

The European Parliament asked the Court in its request for an Opinion if the police cooperation and judicial cooperation articles are an appropriate legal basis, or if the act should be based on Article 16 TFEU.

1. Why it matters to have a correct legal basis

As the AG acknowledges, the choice of the appropriate legal basis for concluding an international agreement has “constitutional significance” (§40). “The use of an incorrect legal basis is therefore apt to invalidate the act concluding the agreement and thus to vitiate the European Union’s consent to be bound by that agreement” (§40). Therefore, an act adopted on the wrong legal basis can be invalidated by the Court.

First of all, the AG recalled the settled case-law of the Court that the choice of legal basis for an EU measure “must rest on objective factors amenable to judicial review, which include the purpose and the content of that measure” (§61). He also recalled that if the measure pursues a twofold purpose, which can be differentiated into a predominant and an incidental purpose, “the act must be based on a single legal basis, namely, that required by the main or predominant purpose or component” (§61). The Court accepts only as an exception that an act may be founded on various legal bases corresponding to the number of objectives, if those are “inseparably linked, without one being incidental in relation to the other” (§62).

2. Are the two objectives of the Agreement inseparable?

The AG identifies the two objectives of the agreement – combating terrorism and other serious transnational crimes, and respecting private life and the protection of personal data – and he struggles to argue that the agreement “pursues two objectives and has two components that are inseparable” (§78), finding it difficult “to determine which of those objectives prevails over the other” (§79).

In my view, it is not difficult to identify the protection of personal data as the predominant purpose (think of causa proxima in legal theory) and the fight against terrorism as the incidental purpose (think of causa remota in legal theory).

In the Agreement, according to Article 1, “the Parties set out the conditions for the transfer and use of PNR data to ensure the security and safety of the public and prescribe the means by which the data is protected”. In other words, first and foremost, the Agreement sets out rules for transferring and using PNR data, including by prescribing the means by which the data is protected (causa proxima). This is done to ultimately ensure the security and safety of the public (causa remota).

This conclusion is reinforced by the content of the Agreement, which manifestly contains rules mainly relating to the processing of personal data – Article 2 (Definitions), Article 3 (Use of PNR data), Article 5 (Adequacy) and the Chapter titled “Safeguards applicable to the use of PNR data”, comprising Articles 7 to 21 – while the last nine articles concern “implementing and final provisions” of a technical nature. It is also reinforced by the fact that the transfer of PNR data on the EU side is made by private companies and by the fact that, contrary to what the AG argues, the Agreement itself does not establish an obligation to transfer data.

The AG explains that “it is incorrect to claim that the agreement envisaged lays down no obligation for the airlines to transfer the PNR data to the Canadian competent authority” (§92). While he acknowledges that it is true that Article 4(1) of the Agreement states that the Union is to ensure only that air carriers “are not prevented” from transferring PNR data to the Canadian competent authority, he interprets that Article “in conjunction with Articles 5, 20 and 21 of the Agreement” in the sense that “air carriers are entitled and in practice required to provide the Canadian competent authority systematically with access to the PNR data for the purposes defined in Article 3 of the agreement envisaged” (§92).

In fact, Article 5 of the Agreement establishes that the Canadian Competent Authority “is deemed to ensure” an adequate level of data protection (so, indeed, air carriers would not be prevented from transferring data because of data protection concerns), Article 20 obliges the air carriers to use the “push method” when they transfer data, and Article 21 sets out rules on the frequency of the Canadian Competent Authority’s requests for PNR data. While it is true that the last two articles set out rules for how the data should be transferred, neither contains a positive obligation for the air carriers to transfer the data.

Therefore, it seems clear that the purpose of PNR arrangements like the one in the present case is in fact to make sure that EU data protection law does not prevent air carriers from sending travellers’ data to the authorities of third countries systematically, in bulk and without ex ante control.

As the AG points out, “if Article 16 TFEU were taken as the sole legal basis of the act concluding the agreement envisaged, that would alter the status of the Kingdom of Denmark, Ireland and the United Kingdom of Great Britain and Northern Ireland, as those Member States would then be directly and automatically bound by the agreement, contrary to Article 29 of the agreement envisaged” (§51). This would happen because the Agreement would no longer be placed under the former third pillar (law enforcement, police and judicial cooperation), which would deprive Denmark, Ireland and the UK of the right to opt out of it. Therefore, the Agreement would automatically apply to all EU Member States. However, this argument should not play a role in deciding which is the appropriate legal basis, as it is not linked to the purpose or the content of the Agreement at all.

Nevertheless, the AG established that the purposes of fighting crime and respecting data protection rights are inseparable. This is in any case a valuable further step, considering that the Council and the Commission completely excluded Article 16 TFEU from the legal bases. So which are the appropriate legal bases the AG recommends?

3. The “judicial cooperation” Article, found to be irrelevant

The AG finds that “as currently drafted, the agreement envisaged does not really seem to contribute to facilitating cooperation between the judicial or equivalent authorities of the Member States” (§108), within the meaning of Article 82(1)(d) TFEU. He sees as incidental the possibility for judicial authorities of Canada to send in particular cases PNR data to judicial authorities in the EU, which would further contribute to judicial cooperation within the EU.

Interestingly, the AG mentions that this conclusion is not affected by the fact that the Council decisions concluding the PNR Agreements with the US and Australia are also based on Article 82(1)(d). He recalls that “the legal basis used for the adoption of other Union measures that might display similar characteristics is irrelevant” (§109).

However, the fact remains that if Article 82(1)(d) is not a proper legal basis for the act concluding the EU-Canada PNR Agreement, it is most probably not a proper legal basis for the other EU acts concluding PNR Agreements.

4. The “police cooperation” Article, found to be relevant

Even though he found that the agreement does not in fact facilitate judicial cooperation within the Union, the AG considers that, on the other hand, it does facilitate police cooperation within the Union. To this end, he builds his argumentation mainly on Article 6 of the Agreement, which is the only provision referring to “Police and judicial cooperation”.

Indeed, as recalled in §105, “under Article 6(2) of the agreement envisaged Canada is required, at the request of, among others, the police or a judicial authority of a Member State of the Union, to share, in specific cases, PNR data or analytical information containing PNR data obtained under the agreement envisaged in order to prevent or detect ‘within the European Union’ a terrorist offence or serious transnational crime.”

However, what the AG does not refer to in his analysis is the last sentence of Article 6(2) of the Agreement, which states that Canada shall make this information available “in accordance with agreements and arrangements on law enforcement, judicial cooperation, or information sharing, between Canada and Europol, Eurojust or that Member State”. Therefore, sharing PNR data obtained by Canada from air carriers under the conditions set out in the EU-Canada PNR Agreement with Europol, Eurojust or a specific Member State will be done in accordance with separate agreements. In conclusion, there are entirely different agreements whose purpose is the sharing of information to ensure both police and judicial cooperation between Canada and the competent authorities of the EU, and they apply to the sharing of PNR data as well.

Finally, the AG considers that Article 87(2)(a) is indeed properly set out as a legal basis of the act concluding the agreement envisaged, but he also states that, in his view, it is “insufficient to enable the Union to conclude that agreement”. Therefore, he proposes that the act concluding the Agreement also be based on Article 16(2) TFEU.

This conclusion prompts a much-expected first substantive analysis of the content of Article 16(2) TFEU in an act of the Court of Justice since the entry into force of the Lisbon Treaty in 2009.

5. Relevance of Article 16(2) TFEU to serve as legal basis for concluding the EU-Canada PNR Agreement

 The AG recalls that “the content of the agreement envisaged supports that [data protection – my addition] objective, in particular the terms in the chapter on ‘Safeguards applicable to the processing of PNR data’, consisting of Articles 7 to 21 of the agreement envisaged” (§113). Therefore, he concludes that, in his view, “action taken by the Union must necessarily be based … on the first subparagraph of Article 16(2) TFEU, which, it will be recalled, confers on the Parliament and the Council the task of laying down the rules relating to the protection of individuals with regard to the processing of personal data by, inter alia, the Member States when carrying out activities which fall within the scope of application of EU law and the rules relating to the free movement of such data” (§114).

The AG further develops the three main principles that underlie this approach.

Firstly, he recalls that the EU is competent to conclude international agreements in the field of data protection (Article 216(1) TFEU in conjunction with Article 16 TFEU). In addition, “there is no doubt that the terms of the agreement envisaged must be characterized as ‘rules’ relating to the protection of the data of natural persons, within the meaning of the first subparagraph of Article 16(1) TFEU, and intended to bind the contracting parties” (§115). (Note: considering that Article 16(1) does not have subparagraphs, this is probably a transcription error and the reference should have been either to the first subparagraph of Article 16(2) or simply to Article 16(1).)

Secondly, the AG adds that the first subparagraph of Article 16(2) “is intended to constitute the legal basis for all rules adopted at EU level relating to the protection of individuals with regard to the processing of their personal data, including the rules coming within the framework of the adoption of measures relating to the provisions of the FEU Treaty on police and judicial cooperation in criminal matters” (§116). He explains thus why Article 16 TFEU is relevant even if the act concluding the Agreement would also be based on an Article providing for police cooperation.

Thirdly, and most importantly, the AG clearly states that Article 16(2) cannot be considered irrelevant for the agreement merely because the protective measures which can be adopted under that Article relate to the processing of data by authorities of the Member States and not, as in this instance, to the transfer of data previously obtained by private entities (the air carriers) to a third country (§118). This is a key interpretation because, indeed, the ad litteram wording of Article 16 is restrictive – it refers to the Union putting in place rules regarding the processing of personal data by:

  • Union institutions, bodies, offices and agencies and
  • By the Member States when carrying out activities which fall within the scope of Union law.

Applying Article 16 ad litteram would mean that the Union does not have the competence to regulate how private entities process data. As the AG convincingly explains, “to put a strictly literal interpretation on the new legal basis constituted by the first subparagraph of Article 16(2) TFEU would be tantamount to splitting up the system for the protection of personal data. Such an interpretation would run counter to the intention of the High Contracting Parties to create, in principle, a single legal basis expressly authorising the EU to adopt rules relating to the protection of the personal data of natural persons. It would therefore represent a step backwards from the preceding scheme based on the Treaty provisions relating to the internal market, which would be difficult to explain. That strictly literal interpretation of Article 16 TFEU would thus have the consequence of depriving that provision of a large part of its practical effect” (§119).

 The AG concludes that the answer to the question about the legal basis is that “in the light of the objectives and the components of the agreement envisaged, which are inseparably linked, the act concluding that agreement must in my view be based on the first subparagraph of Article 16(2) TFEU and Article 87(2)(a) TFEU as its substantive legal bases” (§120).

Before going through the analysis of the compliance of the Agreement with Articles 7 and 8 of the Charter, it’s worth having a look at one of the fundamental issues raised by the Agreement, but which, unfortunately, was only looked at briefly and with no consequence.

 

……………………………………………………….

[1] “The European Parliament and the Council, acting in accordance with the ordinary legislative procedure, shall adopt measures to:

(d) facilitate cooperation between judicial or equivalent authorities of the Member States in relation to proceedings in criminal matters and the enforcement of decisions.”

[2] “1. The Union shall establish police cooperation involving all the Member States’ competent authorities, including police, customs and other specialised law enforcement services in relation to the prevention, detection and investigation of criminal offences.

2. For the purposes of paragraph 1, the European Parliament and the Council, acting in accordance with the ordinary legislative procedure, may establish measures concerning:

(a) the collection, storage, processing, analysis and exchange of relevant information.”

Section 2. A look at the surface: it is not an adequacy decision, but it establishes adequacy

(Section 2 of the Analysis of the AG Opinion in the “PNR Canada” Case: unlocking an “unprecedented and delicate” matter)

One of the fundamental issues concerning agreements such as the one in the present case is how these agreements relate to the concept of an “adequacy finding” for the purposes of transfers of personal data from the EU to third countries.

While it is clear from their nature that they are not unilateral acts issued by the European Commission establishing that a third country or the authorities of a third country ensure an adequate level of protection (as was the Decision invalidated by the Schrems judgment), in essence these agreements have the same effect as adequacy decisions: they establish a presumption that the legal system at the receiving end of a data transfer from the EU ensures an adequate level of data protection, thus eliminating impediments to transfers based on concerns that the data are not properly protected at the receiving end.

While the process leading to an adequacy decision by the Commission is long and involves a thorough analysis of the legal system of the third country in order to ascertain that it provides an essentially equivalent level of protection in theory and in practice, the conclusion of an international agreement involves a high-level negotiation and commitments by the third country that it will ensure appropriate protection. It is more difficult to ascertain and control a posteriori whether this indeed happens in practice. Moreover, if the commitments made by the third country in the Agreement are not sufficient, a clause establishing that transfers to that country are deemed to comply with EU data protection law may very well be considered to breach Article 8(1) of the Charter. The CJEU stated in Schrems that the requirements for ensuring lawful international transfers of personal data stem from Article 8(1) of the Charter and the general obligation enshrined therein “to protect personal data” (§71-§72 of Schrems).

These issues are extremely challenging, and the current proceedings would be a very good opportunity to address them. However, the AG touches on this question only marginally, and only to argue that data protection is not the predominant purpose of the Agreement and to argue in favour of a strict review of the limitations that the provisions of the Agreement bring to the exercise of Article 8 of the Charter.

First, in §93, he states that “the object of the agreement envisaged cannot principally be treated as equivalent to an adequacy decision, comparable to the decision which the Commission had adopted under the 2006 Agreement”. He continues by arguing that “both the aim and the content of the agreement envisaged show, on the contrary, that that agreement is intended to reconcile the two objectives which it pursues and that those objectives are inseparably linked” (i.e. – data protection and fight against terrorism) (§93).

However, about a hundred paragraphs later, after he recalls the finding in §93 that “the agreement envisaged cannot be reduced to a decision finding that the Canadian competent authority guarantees an adequate level of protection” (§203), he recognizes that “Article 5 of the agreement envisaged does indeed provide that, subject to compliance with the terms of that agreement, the Canadian Competent Authority is to be deemed to provide an adequate level of protection, within the meaning of relevant Union data protection law, for the processing and use of PNR data” (§203).

Moreover, in the same paragraph, the AG even adds that “the contracting parties’ intention is indeed to ensure that the high level of personal data protection achieved in the Union may be guaranteed when the PNR data is transferred to Canada” (§203).

The arguments above follow the AG’s finding, in paragraph 200, that the provisions of the agreement should be subject to a strict review by the Court regarding their compliance with the requirements resulting also from “the adequacy of the level of protection of the fundamental rights guaranteed in the Union when Canada processes and uses the PNR data pursuant to the agreement envisaged”.

This analysis seems to me contradictory – both when comparing §93 with §203, and when comparing statements within §203 itself. In any case, the consequences of the intention to establish adequacy through an international agreement are not analysed further. The only conclusion the AG draws after identifying the parties’ underlying intention in concluding this agreement is that “I see no reason why the Court should not carry out a strict review of compliance with the principle of proportionality” (§203). Moreover, he expands on this argument by referring to the Schrems case and the findings therein concerning “essential equivalence” and how the means ensuring this equivalence must be “effective in practice” (§204).

Hopefully, the Court will carry out a more in-depth analysis of this issue in its final Opinion.

Section 3. An interference of “a not insignificant gravity”: systematic, transforming all passengers into potential suspects and amounting to preemptive policing

(Section 3 of the Analysis of the AG Opinion in the “PNR Canada” Case: unlocking an “unprecedented and delicate” matter)

In order to answer the first question raised by the Parliament in the proceedings before the Court – whether the Agreement complies with EU primary law, and in particular with Articles 7 and 8 of the Charter – AG Mengozzi follows the classical test: is there an interference?[1] And if so, is the interference justified?[2]

Analyzing separately Articles 7 and 8 of the Charter, still a challenge

Even though the Court has recently started to analyze the rights protected by Article 7 of the Charter (respect for private life) and by Article 8 (protection of personal data) separately – see the judgments in DRI and Schrems – the AG again seems to hesitate between the two rights. He starts his analysis of whether there is an interference with the two rights (§170) by recalling the Court’s older case-law, which stated that the right to respect for private life and the right to the protection of personal data are “closely connected” (Schecke, §47; ASNEF, §41).

First, he finds that the PNR data “touches on the area of the privacy, indeed intimacy, of persons and indisputably relates to one or more identified or identifiable individual or individuals” (§170). Thus, in the same sentence, the AG brings PNR data within the scope of both Article 7 and Article 8 of the Charter. He then identifies the different treatments of the data under the terms of the Agreement (§170):

– systematic transfer of PNR data to the Canadian public authorities,

– access to that data,

– the use of that data,

– its retention for a period of five years by those public authorities,

– its subsequent transfer to other public authorities, including those of third countries.

The AG states that all of the above are “operations which fall within the scope of the fundamental right to respect for private and family life guaranteed by Article 7 of the Charter and to the ‘closely connected’ but nonetheless distinct right to protection of personal data guaranteed by Article 8(1) of the Charter and constitute an interference with those fundamental rights” (§170).

Therefore, the AG does not differentiate here between what constitutes interference with the right to respect for private life and what constitutes interference with the right to the protection of personal data.

However, in the following paragraph, the AG does make such a differentiation, but only because he restates the findings of the Court in Digital Rights Ireland, even if this partly repeats some of the findings in §170: “the obligation to retain that data, required by the public authorities, and subsequent access of the competent national authorities to data relating to a person’s private life also constitutes in itself an interference with the rights guaranteed by Article 7 of the Charter (he refers here to §34 and §35 of DRI in a footnote). Likewise, an EU act prescribing any form of processing of personal data constitutes an interference with the fundamental right, laid down in Article 8 of the Charter, to protection of such data (he refers here to §29 and §36 of DRI)” (§171).

These two paragraphs offer little clarity, especially considering that §170 in fact refers to an interference only with the first paragraph of Article 8, and not with Article 8 as a whole (see also Section 4 of this analysis for additional comments prompted by this differentiation).

What is certain is that there is indeed an interference with both rights. The AG further notes the seriousness of that interference, indicating that he is fully aware of its severity:

“The fact nonetheless remains that the interference constituted by the agreement envisaged is of a considerable size and a not insignificant gravity. It systematically affects all passengers flying between Canada and the Union, that is to say, several tens of millions of persons a year. Furthermore, as most of the interested parties have confirmed, no one can fail to be aware that the transfer of voluminous quantities of personal data of air passengers, which includes sensitive data, requiring, by definition, automated processing, and the retention of that data for a period of five years, is intended to permit a comparison, which will be retroactive where appropriate, of that data with pre-established patterns of behaviour that is ‘at risk’ or ‘of concern’, in connection with terrorist activities and/or serious transnational crime, in order to identify persons not hitherto known to the police or not suspected. Those characteristics, apparently inherent in the PNR scheme put in place by the agreement envisaged, are capable of giving the unfortunate impression that all the passengers concerned are transformed into potential suspects” (§176).

Even though at this stage the AG acknowledges the severity of the interference of PNR schemes with fundamental rights, he deems it to be justified by necessity (see Section 5 of this analysis).

Finally, it is also worth noting that the AG found that the procedures for collecting the data come within the competence of the air carriers, “which, in this regard, must act in compliance with the relevant national provisions and with EU law” (§178). He concludes that “the collection of the PNR data therefore does not constitute a processing of personal data entailing an interference with the fundamental rights guaranteed by Articles 7 and 8 of the Charter that results from the agreement envisaged itself. In the light of the limited power of the Court in the context of the opinion procedure, that operation will therefore not form the subject matter of the following developments” (§179).


……………………………………………………..

[1] Dealt with in this section.

[2] Dealt with in Sections 4 and 5 of this analysis.