EU Commission’s leaked plan for the data economy: new rules for IoT liability and sharing “non-personal data”

It seems that it’s the season of EU leaks on internet and digital policy. One day after the draft new e-Privacy regulation was leaked (to Politico), another document appeared online (published by Euractiv) before its adoption and release – a Communication from the European Commission on “Building a European data economy”.

It announces at least two revisions of existing legal acts: the Database Copyright Directive (96/9) and the Product Liability Directive (85/374). New legislative measures may also be needed to achieve the objectives announced in the draft Communication. However, the Commission is not explicit about this and leaves much of the decision-making until after the results of wide stakeholder and public consultations have been processed.

The common thread of most of the policy areas covered by the Communication is “non-personal data”. The Commission starts from the premise that while the GDPR allows for the free movement of personal data within the EU, there are currently no common rules among Member States for sharing, accessing and transferring “non-personal data”. Moreover, the Commission notes that the number of national measures requiring data localisation is growing.

“The issue of the free movement of data concerns all types of data: enterprises and actors in the data economy deal with a mixture of personal and non-personal data, machine generated or created by individuals, and data flows and data sets regularly combine these different types of data”, according to the draft Communication.

What is truly challenging, the draft adds, is that “enterprises and actors in the data economy will be dealing with a mixture of personal and non-personal data; data flows and datasets will regularly combine both. Any policy measure must take account of this economic reality”.

If you are wondering what is meant by “non-personal data”, the draft Communication offers some guidance. For instance, it mentions that “personal data can be turned into non-personal data through the process of anonymisation” and that “the bulk of machine-generated data are not personal data”. Anonymisation and de-identification techniques will therefore gain even more importance.
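To make the distinction concrete, here is a minimal sketch, in Python, of the difference between pseudonymisation and anonymisation; the record and field names are invented for illustration. Under the GDPR (Recital 26), pseudonymised data – which can still be linked back to a person with additional information, such as a key – remain personal data; only data rendered truly anonymous fall outside its scope.

```python
import hashlib
import secrets

record = {"name": "Jane Doe", "email": "jane@example.com",
          "city": "Utrecht", "engine_temp_c": 87.4}

# Pseudonymisation: direct identifiers are replaced with salted hashes.
# Whoever holds the salt can re-create the tokens and link records,
# so the result is still personal data under the GDPR.
salt = secrets.token_hex(16)
pseudonymised = dict(record)
for field in ("name", "email"):
    pseudonymised[field] = hashlib.sha256(
        (salt + record[field]).encode()).hexdigest()

# Anonymisation (simplified): identifiers are dropped entirely and
# quasi-identifiers are generalised so the record can no longer be
# tied to an individual. Only then would the data be "non-personal".
anonymised = {
    "region": "NL-UT",  # generalised from the exact city
    "engine_temp_c": round(record["engine_temp_c"]),
}

print(pseudonymised)
print(anonymised)
```

In practice, robust anonymisation is much harder than this toy example suggests; re-identification from combined datasets is precisely why the boundary between personal and non-personal data is contested.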

While the GDPR covers how personal data are used in the EU, the proposals to be made on the basis of this Communication envisage the use of all other data.

So what does the Commission propose?

Several objectives are announced, most of them dealing with the free flow of and access to “non-personal data”, while another objective looks at reforming liability rules to accommodate algorithms, Artificial Intelligence and the Internet of Things.

Free flow of and access to non-personal data

  • According to the draft Communication, any Member State action affecting data storage or processing should be guided by a ‘principle of free movement of data within the EU’.
  • Broader use of open, well-documented Application Programming Interfaces (APIs) could be considered, through technical guidance, including identifying and spreading best practice for companies and public sector bodies (a hypothetical sketch of such an API follows this list).
  • The Commission could issue guidance based on the Trade Secrets Directive, copyright legislation and the Database Directive on how data control rights should be addressed in contracts. The Commission intends to launch the review of the Database Directive in 2017.
  • Access for public interest purposes – public authorities could be granted access to data where this would be in the general interest and would considerably improve the functioning of the public sector, for example access for statistical offices to business data or the optimization of traffic management systems on the basis of real-time data from private vehicles.
  • Selling and acquiring databases could be regulated. “Access against remuneration”: a framework based on fair, non-discriminatory terms could be developed for data holders, such as manufacturers, service providers or other parties, to provide access to the data they hold against remuneration. The Communication is not clear whether this proposal could also cover personal data. In any case, on several occasions throughout the draft Communication, it is mentioned or implied that the GDPR takes precedence over any new rules that would impact the protection of personal data.
  • A data producer’s right to use and licence the use of data could be introduced; by “data producer”, the Commission means “the owner or long-term user of the device”. This approach would “open the possibility for users to exploit their data and thereby contribute to unlocking machine-generated data”.
  • Developing further rights to data portability: building on the GDPR data portability right and on the proposed rules on contracts for the supply of digital content, further rights to portability of non-personal data could be introduced. The portability initiatives would be accompanied by sector-specific experiments on standards, involving multi-stakeholder collaboration among standard setters, industry, the technical community and public authorities.
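To ground the API point in the list above, here is a hypothetical sketch of what an “open, well-documented” data-sharing endpoint could look like. Python and Flask are used purely as illustration; the route, field names and data are invented, and nothing here is prescribed by the draft Communication.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Invented machine-generated, non-personal sensor readings.
SENSOR_READINGS = [
    {"sensor_id": "press-07", "timestamp": "2017-01-10T09:00:00Z", "bar": 2.31},
    {"sensor_id": "press-07", "timestamp": "2017-01-10T09:05:00Z", "bar": 2.28},
]

@app.route("/v1/sensor-readings", methods=["GET"])
def sensor_readings():
    """Return all readings as JSON.

    The 'well-documented' part is the published contract: URL, method,
    field names and units, so that third parties can reuse the data
    without bespoke access agreements.
    """
    return jsonify({"readings": SENSOR_READINGS})

if __name__ == "__main__":
    app.run(port=5000)
```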

Rethinking liability rules for the IoT and AI era

Even though Artificial Intelligence is not mentioned as such in the draft Communication, it is clear that the scenario of algorithms making decisions is also envisaged by the announced objective to reform product liability rules, alongside IoT. As the draft Communication recalls, the Product Liability Directive currently establishes the principle of strict liability, i.e. liability without fault: where a defective product causes damage to a consumer, the manufacturer may be liable even without negligence or fault on its part. The current rules are addressed only to the producer, always require a defect, and require that the causality between the defect and the damage be proven.

The Commission proposes two approaches, which will be subject to consultation:

  • “Risk-generating or risk-management approaches: liability would be assigned to the market players generating a major risk for others and benefitting from the relevant device, product or service or to those which are best placed to minimize or avoid the realization of the risk.”
  • “Voluntary or mandatory insurance schemes: they would compensate the parties who suffered the damage; this approach would need to provide legal protection to investments made by business while reassuring victims regarding fair compensation or appropriate insurance in case of damage.”

“Connected and automated driving” – used as test case

The Commission intends to test all the proposed legal solutions, after engaging in wide consultations, in a real life scenario and proposes “connected and automated driving” as the test case.

Finally, read all of these objectives and proposals bearing in mind that they come from a draft document that was leaked to Euractiv. It is possible that by the time this Communication is adopted and published (and there is no indication as to when it will officially appear), its content will have been altered.

***

Find what you’re reading useful? Please consider supporting pdpecho.

A million dollar question, literally: Can DPAs fine a controller directly on the basis of the GDPR, or do they need to wait for national laws?

by Gabriela Zanfir-Fortuna

The need to discuss the legal effect of the GDPR emerged because some opinions circulating in the privacy bubble claim that it will take at least a couple of years after the GDPR becomes applicable in 2018 before it has de facto legal effect at national level. The main argument for this thesis is that national parliaments of the Member States will need to take action in one way or another, or that national governments will need to issue executive orders to grant new powers to supervisory authorities, including the power to fine.

This post will bring forward some facts emerging from EU primary law and from the case-law of the Court of Justice of the EU (CJEU) that need to be taken into account before talking about such a de facto grace period.

The conclusion is that, just like all EU regulations, the GDPR is directly applicable and has immediate effect from the date it becomes applicable according to its publication in the EU Official Journal (in this case, 25 May 2018), with no other national measures being required to give it effect in the Member States (not even translations at national level). While it is true that it contains provisions that give a margin of appreciation to Member States if they wish to intervene, most of the articles are sufficiently clear, detailed and straightforward to allow direct application, if need be (for instance, if a Member State is late in adjusting and adapting its national data protection law).

1) EU regulations enjoy “direct applicability”: the rule is that they are “immediately applicable” and they don’t need national transposition

First and foremost, it is a fact emerging from the EU treaties that EU Regulations enjoy direct applicability, which means that once they become applicable they do not need to be transposed into national law.

This rule is set out in the second paragraph of Article 288 of the Treaty on the Functioning of the European Union (TFEU), which states that:

“A regulation shall have general application. It shall be binding in its entirety and directly applicable in all Member States.”

On the contrary, according to the third paragraph of Article 288 TFEU, directives “shall be binding, as to the result to be achieved, upon each Member State to which it is addressed, but shall leave to the national authorities the choice of form and methods.”

Therefore, as the CJEU explained in settled case-law, “by virtue of the very nature of regulations and of their function in the system of sources of Community law, the provisions of those regulations generally have immediate effect in the national legal systems without it being necessary for the national authorities to adopt measures of application” (see Case C-278/02 Handlbauer, 2004, §25 and Case 93/71 Leonesio, 1972, §5) and in addition they also “operate to confer rights on individuals which the national courts have a duty to protect” (Case C-70/15 Lebek, 2016, §51).

However, the CJEU also ruled that “some of their provisions may nonetheless necessitate, for their implementation, the adoption of measures of application by the Member States” (Case C-278/02 Handlbauer, 2004, §26; Case C-403/98 Monte Arcosu, 2001, §26). But this is not the case for sufficiently clear and precise provisions, where Member States don’t enjoy any margin of manoeuvre. For instance, the Court found in Handlbauer that “this is not the case as regards Article 3(1) of Regulation No 2988/95 which, by fixing the limitation period for proceedings at four years as from the time when the irregularity is committed, leaves the Member States no discretion nor does it require them to adopt implementation measures” (§27).

Therefore, whenever an EU regulation leaves the Member States no discretion, nor does it require them to adopt implementation measures, the provisions of that regulation are directly and immediately applicable as they are.

2) EU regulations’ direct applicability does not depend on any national measure (not even a translation published in national official journals)

The CJEU explained as far back as 1973 that for EU regulations to take effect in national legal systems of Member States there is not even the need to have their texts translated and published in the national official journals.

Asked whether the provisions of a Regulation can be “introduced into the legal order of Member States by internal measures reproducing the contents of Community provisions in such a way that the subject-matter is brought under national law”, the Court replied that “the direct application of a Regulation means that its entry into force and its application in favour of or against those subject to it are independent of any measure of reception into national law” (Case 34/73 Variola, 1973, §9 and §10). AG Kokott explained that such measures include “any publicity by the Member States” (Opinion in Case C-161/06 Skoma-Lux, §54), in an Opinion that was substantially upheld by the Court in a judgment stating that the publication of a regulation in the Official Journal of the EU in an official language of a Member State is the only condition to give it effect and direct applicability in that Member State (Judgment in Case C-161/06).

The Court concluded in Variola that “a legislative measure under national law which reproduces the text of a directly applicable rule of Community law cannot in any way affect such direct applicability, or the Court’s jurisdiction under the Treaty” (operative part of the judgment). The Court also explained in Variola that “by virtue of the obligations arising from the Treaty and assumed on ratification, Member States are under a duty not to obstruct the direct applicability inherent in Regulations and other rules of Community law. Strict compliance with this obligation is an indispensable condition of simultaneous and uniform application of Community Regulations throughout the Community” (Case 34/73 Variola, 1973, §10).

3) National authorities could impose administrative penalties directly on the basis of a provision of a Regulation, where necessary 

The Court dealt with the question of national authorities imposing administrative fines directly on the basis of the provisions of an EU regulation in Case C-367/09 Belgisch Interventie- en Restitutiebureau, on the interpretation of provisions of Regulation 2988/95.

After recalling its case-law on the direct applicability of EU regulations (§32), including the exception that some provisions of a Regulation necessitate, for their implementation, the adoption of measures of application (§33), the CJEU found that in that specific case national authorities could not impose fines directly on the basis of Articles 5 and 7 of Regulation 2988/95 because “those provisions merely lay down general rules for supervision and penalties for the purpose of safeguarding the EU’s financial interests (…). In particular, those provisions do not specify which of the penalties listed in Article 5 of Regulation No 2988/95 should be applied in the case of an irregularity detrimental to the EU’s financial interests nor the category of operators on whom such penalties are to be imposed in such cases” (§36).

Therefore, the Court did not question the possibility for a national authority to impose fines directly on the legal basis provided by a regulation. The CJEU went directly to analysing the content of the relevant provisions and found that fines could not be imposed because of their general character, which required additional measures to be adopted both at Member State and at EU level (had the provisions been clearer, the authorities could have issued fines directly on the basis of the regulation).

One look at Article 83 GDPR and one can easily tell that this is not the case for that provision – it is clear who imposes fines, for what, against whom, on what criteria, and what the maximum amount for each category of fines is. Neither is it the case for Article 58 on the powers of supervisory authorities. Article 83 GDPR allows Member States some discretion only if they wish to provide specific rules for fining public authorities (paragraph 7) and only if their legal system does not provide for administrative fines – in the latter case, states are allowed to apply Article 83 in such a manner that the fine is initiated by the competent supervisory authority and imposed by competent national courts (paragraph 9).

4) Conclusion: beware of the GDPR from day 1

The GDPR, like all EU regulations, is directly applicable and has immediate effect in the legal order of Member States by virtue of its publication in the Official Journal of the EU and the conditions of applicability in time expressed therein, no additional national measures being required to give it effect.

While there are provisions that give Member States a margin of appreciation and a discretion to implement national measures, most of the provisions are sufficiently clear and precise to be applied as they are.

Of course there will be national data protection laws that will specify additional rules to the GDPR, giving effect to that margin of appreciation. But the national laws that complement an EU regulation, such as the GDPR, are valid only as long as “they do not obstruct its direct applicability and do not conceal its [EU] nature, and if they specify that a discretion granted to them by that regulation is being exercised, provided that they adhere to the parameters laid down under it” (CJEU, Case C‑316/10 Danske Svineproducenter v Justitsministeriet, §41).

As always, here is the fine print (or the caveat) whenever we are discussing the interpretation of EU law: only the CJEU has the authority to interpret EU law in a binding manner.

(Note: The author is grateful to dr. Mihaela Mazilu-Babel, who provided support with preliminary research for this post)

***

Find what you’re reading useful? Please consider supporting pdpecho.

What’s new in research: Georgetown Law Technology Review, human rights and encryption, and data protection proof free-trade agreements (open access)

I’m starting this week’s “What’s new in research” post with some good news:

  • There is a new technology law journal in town – the Georgetown Law Technology Review, which was just launched. It provides full access to its articles, notes and comments. “Few issues are of greater need for careful attention today than the intersection of law and technology”, writes EPIC’s Marc Rotenberg, welcoming the new Review.
  • Tilburg Institute for Law, Technology and Society (TILT) launched its open call for fellowship applications for the 2017-2018 academic year. “This programme is for internationally renowned senior scholars who wish to spend the 2017-2018 academic year, or a semester, in residence at TILT as part of its multi-disciplinary research team to work on some of the most interesting, challenging and urgent issues relating to emerging and disruptive technologies.” I spent three months at TILT in 2012, as a visiting researcher, during my PhD studies. I highly recommend this experience – it’s one of the best environments there are to develop your research in the field of data protection/privacy.

As for the weekend reads proposed this week, they tackle hot topics: human rights and encryption from a global perspective, international trade agreements and data protection from the EU law perspective, and newsworthiness and the protection of privacy in the US.

1. “Human rights and encryption”, by Wolfgang Schulz and Joris van Hoboken, published by UNESCO.

“This study focuses on the availability and use of a technology of particular significance in the field of information and communication: encryption, or more broadly cryptography. Over the last decades, encryption has proven uniquely suitable to be used in the digital environments. It has been widely deployed by a variety of actors to ensure protection of information and communication for commercial, personal and public interests. From a human rights perspective, there is a growing recognition that the availability and deployment of encryption by relevant actors is a necessary ingredient for realizing a free and open internet. Specifically, encryption can support free expression, anonymity, access to information, private communication and privacy. Therefore, limitations on encryption need to be carefully scrutinized. This study addresses the relevance of encryption to human rights in the media and communications field, and the legality of interferences, and it offers recommendations for state practice and other stakeholders.”

2. “Trade and Privacy: Complicated Bedfellows? How to Achieve Data Protection-Proof Free Trade Agreements”, by Kristina Irion, Svetlana Yakovleva and Marija Bartl, a study commissioned by the European Consumer Organisation/Bureau Européen des Unions de Consommateurs (BEUC), the Center for Digital Democracy (CDD), the Transatlantic Consumer Dialogue (TACD) and European Digital Rights (EDRi).

“This independent study assesses how EU standards on privacy and data protection are safeguarded from liberalisation by existing free trade agreements (the General Agreement of Trade in Services (GATS) and the Comprehensive Economic and Trade Agreement (CETA)) and those that are currently under negotiation (the Trans-atlantic Trade and Investment Partnership (TTIP) and the Trade in Services Agreement (TiSA)). Based on the premise that the EU does not negotiate its privacy and data protection standards, the study clarifies safeguards and risks in respectively the EU legal order and international trade law. In the context of the highly-charged discourse surrounding the new generation free trade agreements under negotiation, this study applies legal methods in order to derive nuanced conclusions about the preservation of the EU’s right to regulate privacy and the protection of personal data.”

3. “Making News: Balancing Newsworthiness and Privacy in the Age of Algorithms”, by Erin C. Carroll, published by the Georgetown University Law Center.

“In deciding privacy lawsuits against media defendants, courts have for decades deferred to the media. They have given it wide berth to determine what is newsworthy and so, what is protected under the First Amendment. And in doing so, they have often spoken reverently of the editorial process and journalistic decision-making.

Yet, in just the last several years, news production and consumption has changed dramatically. As we get more of our news from digital and social media sites, the role of information gatekeeper is shifting from journalists to computer engineers, programmers, and app designers. The algorithms that the latter write and that underlie Facebook, Twitter, Instagram, and other platforms are not only influencing what we read but are prompting journalists to approach their craft differently.

While the Restatement (Second) of Torts says that a glance at any morning newspaper can confirm what qualifies as newsworthy, this article argues that the modern-day corollary (which might involve a glance at a Facebook News Feed) is not true. If we want to meaningfully balance privacy and First Amendment rights, then courts should not be so quick to defer to the press in privacy tort cases, especially given that courts’ assumptions about how the press makes newsworthiness decisions may no longer be accurate. This article offers several suggestions for making better-reasoned decisions in privacy cases against the press.”

Enjoy the reads and have a nice weekend!

***

Find what you’re reading useful? Please consider supporting pdpecho.

Greek judges asked the CJEU if they should dismiss evidence gathered under the national law that transposed the invalidated Data Retention Directive

Here is a new case at the Court of Justice of the EU that the data protection world will be following closely, as it addresses questions about the practical effects of the invalidation of the Data Retention Directive.


Case C-475/16 K. (yes, like those Kafka characters) concerns criminal proceedings against K. before Greek courts, which apparently involve evidence gathered under the Greek national law that transposed the now-invalidated Data Retention Directive. The Directive was invalidated in its entirety by the CJEU in 2014, after the Court found in its Digital Rights Ireland judgment that the provisions of the Directive breached Articles 7 (right to respect for private life) and 8 (right to the protection of personal data) of the Charter of Fundamental Rights.

In August, the Greek judges sent the CJEU a big set of questions for a preliminary ruling (17 questions). Among them, there are a couple of very interesting ones, because they deal with the effects in practice of the invalidation of an EU Directive and with what happens to the national laws of the Member States that transposed it.

For instance, the national judge asks whether national courts are obliged not to apply legislative measures transposing the annulled Directive and whether this obligation also means that they must dismiss evidence obtained as a consequence of those legislative measures (Question 3). The national judge also wants to know if maintaining the national law that transposes an invalidated Directive constitutes an obstacle to the establishment and functioning of the internal market (Question 16).

Another question raised by the national judge is whether the national legislation that transposed the annulled Data Retention Directive and that remained in force at national level after the annulment is still considered as falling under the scope of EU law (Question 4). The answer to this question is important because the EU Charter and the supremacy of EU law do not apply to situations that fall outside the scope of EU law.

The Greek judge didn’t miss the opportunity to also ask about the effect on the national law transposing the Data Retention Directive of the fact that this Directive was also enacted to implement a harmonised framework at the European level under Article 15(1) of the ePrivacy Directive (Question 5). The question is whether this fact is enough to bring the surviving national data retention laws under the scope of EU law.

As long as the Charter is considered applicable to the facts of the case, the national judge further wants to know whether a national law that complies only partly with the criteria set out in the Digital Rights Ireland decision still breaches Articles 7 and 8 of the Charter because it does not comply with all of them (Question 13). For instance, the national judge considers that the national law does not comply with the requirement that the persons whose data are retained must be at least indirectly in a situation which is liable to give rise to criminal prosecutions (para 58 DRI), but that it complies with the requirement that the national law must contain substantive and procedural conditions for the access of competent authorities to the retained data and objective criteria by which the number of persons authorised to access these data is limited to what is strictly necessary (paras 61, 62 DRI).

Lastly, it will be also interesting to see whether the Court decides to address the issue of what “serious crime” means in the context of limiting the exercise of fundamental rights (Questions 10 and 11).

If you would like to delve into some of these topics, have a look at the AG Opinion in the Tele2 Sverige case, published on 19 July 2016. The judgment in that case is due on 21 December 2016. Also, have a look at this analysis of the Opinion.

As for a quick “what to expect” in the K. case from my side, here it is:

  • the CJEU will seriously re-organise the 17 questions and regroup them in 4 to 5 topics, also clarifying that it only deals with the interpretation of EU law, not national law or facts in national proceedings;
  • the national laws transposing the Data Retention Directive will probably be considered as being in the field of EU law – as they also regulate within the ambit of the ePrivacy Directive;
  • the Court will restate the criteria in DRI and probably clarify that all criteria must be complied with, no exceptions, in order for national measures to comply with the Charter;
  • the CJEU will probably not give indications to the national courts on whether they should admit or dismiss evidence collected on the basis of national law that does not comply with EU law – it’s too specific and the Court is ‘in the business’ of interpreting EU law; the best case scenario, which is possible, is that the Court will give some guidance on the obligations of Member States (and hopefully their authorities) regarding the effects of their transposing national laws when the relevant EU secondary law is annulled;
  • as for what “serious crime” means in the context of limiting fundamental rights, let’s see about that. Probably the Court will give useful guidance.

***

Find what you’re reading useful? Please consider supporting pdpecho.

What’s new in research: full-access papers on machine learning with personal data, the ethics of Big Data as a public good

Today pdpecho inaugurates a weekly post curating recently published research articles, papers, studies and dissertations in the field of data protection and privacy that are available under an open access regime.

This week there are three recommended pieces for your weekend read. The first article, published by researchers from Queen Mary University of London and Cambridge University, provides an analysis of the impact of using machine learning to conduct profiling of individuals in the context of the EU General Data Protection Regulation.

The second article is the view of a researcher specialised in International Development, from the University of Amsterdam, on the new trend in humanitarian work to consider data as a public good, regardless of whether it is personal or not.

The last paper is a draft authored by a law student at Yale (published on SSRN), which explores an interesting phenomenon: how data brokers have begun to sell data products to individual consumers interested in tracking the activities of love interests, professional contacts, and other people of interest. The paper underlines that the US privacy law system lacks protection for individuals whose data are sold in this scenario and proposes a solution.

1) Machine Learning with Personal Data (by Dimitra Kamarinou, Christopher Millard, Jatinder Singh)

“This paper provides an analysis of the impact of using machine learning to conduct profiling of individuals in the context of the EU General Data Protection Regulation.

We look at what profiling means and at the right that data subjects have not to be subject to decisions based solely on automated processing, including profiling, which produce legal effects concerning them or significantly affect them. We also look at data subjects’ right to be informed about the existence of automated decision-making, including profiling, and their right to receive meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing.

The purpose of this paper is to explore the application of relevant data protection rights and obligations to machine learning, including implications for the development and deployment of machine learning systems and the ways in which personal data are collected and used. In particular, we consider what compliance with the first data protection principle of lawful, fair, and transparent processing means in the context of using machine learning for profiling purposes. We ask whether automated processing utilising machine learning, including for profiling purposes, might in fact offer benefits and not merely present challenges in relation to fair and lawful processing.”

The paper was published as “Queen Mary School of Law Legal Studies Research Paper No. 247/2016”.

2) The Ethics of Big Data as a Public Good: Which Public? Whose Good? (by Linnet Taylor)

“International development and humanitarian organizations are increasingly calling for digital data to be treated as a public good because of its value in supplementing scarce national statistics and informing interventions, including in emergencies. In response to this claim, a ‘responsible data’ movement has evolved to discuss guidelines and frameworks that will establish ethical principles for data sharing. However, this movement is not gaining traction with those who hold the highest-value data, particularly mobile network operators who are proving reluctant to make data collected in low- and middle-income countries accessible through intermediaries.

This paper evaluates how the argument for ‘data as a public good’ fits with the corporate reality of big data, exploring existing models for data sharing. I draw on the idea of corporate data as an ecosystem involving often conflicting rights, duties and claims, in comparison to the utilitarian claim that data’s humanitarian value makes it imperative to share them. I assess the power dynamics implied by the idea of data as a public good, and how differing incentives lead actors to adopt particular ethical positions with regard to the use of data.”

This article is part of the themed issue “The ethical impact of data science” in Philosophical Transactions of the Royal Society A.

3) What Happens When an Acquaintance Buys Your Data?: A New Privacy Harm in the Age of Data Brokers (by Theodore Rostow)

“Privacy scholarship to date has failed to consider a new development in the commercial privacy landscape. Data brokers have begun to sell data products to individual consumers interested in tracking the activities of love interests, professional contacts, and other people of interest. This practice creates an avenue for a new type of privacy harm — “insider control” — which privacy scholarship has yet to recognize.

U.S. privacy laws fail to protect consumers from the possibility of insider control. Apart from two noteworthy frameworks that might offer paths forward, none of the viable reforms offered by privacy scholars would meaningfully limit consumers’ vulnerability. This Note proposes changes to existing privacy doctrines in order to reduce consumers’ exposure to this new harm.”

This paper was published as a draft on SSRN. According to SSRN, the final version will be published in the 34th volume of the Yale Journal on Regulation.

***

Find what you’re reading useful? Please consider supporting pdpecho.

Even if post-Brexit UK adopts the GDPR, it will be left without its “heart”

Gabriela Zanfir Fortuna


There has lately been a wave of optimism among those looking for legal certainty that the GDPR will be adopted by the UK even after the country leaves the European Union. This wave was prompted by a declaration of the British Secretary of State, Karen Bradley, at the end of October, when she stated before a Committee of the Parliament that “We will be members of the EU in 2018 and therefore it would be expected and quite normal for us to opt into the GDPR and then look later at how best we might be able to help British business with data protection while maintaining high levels of protection for members of the public”. The Information Commissioner of the UK, Elizabeth Denham, welcomed the news. On the other hand, as Amberhawk explained in detail, this will not mean that the UK will automatically be considered as ensuring an adequate level of protection.

The truth is that as long as the UK is still a Member State of the EU, it can’t opt in or, for that matter, opt out of regulations (other than the ones subject to the exemptions negotiated by the UK when it entered the Union – but this is not the case for the GDPR). They are “binding in their entirety” and “directly applicable”, according to Article 288 of the Treaty on the Functioning of the EU. So, yes, quite normally, if the UK is still a Member State of the EU on 25 May 2018, then the GDPR will start applying in the UK just as it will be applying in Estonia or France.

The fate of the GDPR after Brexit becomes effective will be as uncertain as the fate of all other EU legislative acts transposed in the UK or directly applicable in the UK. But let’s imagine the GDPR will remain national law after Brexit, in one form or another. If this happens, it is likely that it will take on a life of its own, departing from harmonised application throughout the EU. First and foremost, the GDPR in the UK will not be applied in the light of the Charter of Fundamental Rights of the EU, and especially its Article 8 – the right to the protection of personal data. The Charter played an extraordinary role in the strengthening of data protection in the EU after it became binding, in 2009, being invoked by the Court of Justice of the EU in its landmark judgments – Google v Spain, Digital Rights Ireland and Schrems.

The Court held as far back as 2003 that “the provisions of Directive 95/46, in so far as they govern the processing of personal data liable to infringe fundamental freedoms, in particular the right to privacy, must necessarily be interpreted in the light of fundamental rights” (Österreichischer Rundfunk, para 68). This principle was repeated in most of the following cases interpreting Directive 95/46 and other relevant secondary law for this field, perhaps with the most notable results in Digital Rights Ireland and Schrems. 

See, for instance:

“As far as concerns the rules relating to the security and protection of data retained by providers of publicly available electronic communications services or of public communications networks, it must be held that Directive 2006/24 does not provide for sufficient safeguards, as required by Article 8 of the Charter, to ensure effective protection of the data retained against the risk of abuse and against any unlawful access and use of that data” (Digital Rights Ireland, para. 66).

“As regards the level of protection of fundamental rights and freedoms that is guaranteed within the European Union, EU legislation involving interference with the fundamental rights guaranteed by Articles 7 and 8 of the Charter must, according to the Court’s settled case-law, lay down clear and precise rules governing the scope and application of a measure and imposing minimum safeguards, so that the persons whose personal data is concerned have sufficient guarantees enabling their data to be effectively protected against the risk of abuse and against any unlawful access and use of that data. The need for such safeguards is all the greater where personal data is subjected to automatic processing and where there is a significant risk of unlawful access to that data” (Schrems, para. 91).

Applying data protection law outside the spectrum of fundamental rights will most likely not ensure sufficient protection to the person. While the UK will still remain under the legal effect of the European Convention on Human Rights and its Article 8 – respect for private life – this falls far short of the specific protection ensured to personal data by Article 8 of the Charter as interpreted and applied by the CJEU.

Not only will the Charter not be binding on the UK post-Brexit, but the Court of Justice of the EU will no longer have jurisdiction over the UK (unless some sort of spectacular agreement is negotiated for Brexit). Moreover, EU law will not enjoy supremacy over national law, as is the case right now. This means that British data protection law will be able to depart from the European standard (the GDPR) to the extent desired by the legislature. For instance, there will be nothing standing in the way of the British legislature adopting permissive exemptions from the rights of the data subject, pursuant to Article 23 GDPR.

So when I mentioned in the title that the GDPR in post-Brexit UK will in any case be left without its “heart”, I was referring to its application and interpretation in the light of the Charter of Fundamental Rights of the EU.

***

Find what you’re reading useful? Please consider supporting pdpecho.

Interested in the GDPR? See the latest posts:

CNIL just published the results of their GDPR public consultation: what’s in store for DPOs and data portability? (Part I)

CNIL’s public consultation on the GDPR: what’s in store for Data Protection Impact Assessments and certification mechanisms? (Part II)

The GDPR already started to appear in CJEU’s soft case-law (AG Opinion in Manni)

CNIL’s public consultation on the GDPR: what’s in store for Data Protection Impact Assessments and certification mechanisms? (Part II)

Gabriela Zanfir Fortuna

The French Data Protection Authority, CNIL, made public last week the report of the public consultation it held between 16 June and 19 July 2016 among professionals about the General Data Protection Regulation (GDPR). The public consultation gathered 540 replies from 225 contributors.

The CNIL focused on four main issues in the consultation:

  • the data protection officer;
  • the right to data portability;
  • the data protection impact assessments;
  • the certification mechanism.

These are also the four themes in the action plan of the Article 29 Working Party for 2016.

This post summarises the results and action plan for the last two themes. If you want to read about the results on the data protection officer and the right to data portability, check out Part I of this post. [Disclaimer: all quotations are translated from French].

3) On data protection impact assessments (DPIAs)

Article 35 GDPR obliges data controllers to carry out an assessment of the impact of the envisaged processing operations on the protection of personal data prior to the processing, if it is likely to result in a high risk to the rights and freedoms of natural persons, taking into account the nature, scope, context and purposes of the processing, and in particular where that processing uses new technologies. According to Article 35(4), the supervisory authorities must make public a list of the kinds of processing operations which are subject to this requirement.

Article 35(3) provides that there are three cases where DPIAs must be conducted:

a) a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing (including profiling);

b) processing on a large scale of sensitive data (e.g. health data, data disclosing race, political opinions etc.);

c) a systematic monitoring of a publicly accessible area on a large scale.
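Read together, the three cases work as alternative triggers: any one of them makes a DPIA compulsory. As a toy illustration only (the legal tests are obviously not reducible to booleans, and Article 35(1) can require a DPIA in further high-risk cases), the logic could be sketched as follows; the parameter names are invented:

```python
def dpia_required(extensive_automated_evaluation: bool,
                  large_scale_sensitive_data: bool,
                  large_scale_public_monitoring: bool) -> bool:
    """Rough mapping of the three Article 35(3) GDPR triggers.

    A single trigger suffices; this ignores the general 'likely to
    result in a high risk' test of Article 35(1), which can apply
    independently of these three cases.
    """
    return (extensive_automated_evaluation
            or large_scale_sensitive_data
            or large_scale_public_monitoring)

# Example: large-scale monitoring of a public square via CCTV.
print(dpia_required(False, False, True))  # True -> DPIA required
```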

According to the report, the DPIA emerges as a dynamic compliance tool, which contributes to maintaining data security, reducing the risks of processing, determining suitable safeguards, preventing legal deficiencies and better implementing Privacy by Design and Privacy by Default (p. 17). Participants deemed it a “new and useful tool”.

There were three main categories of questions raised by the participants in the public consultation:

  • When do controllers have to conduct a DPIA?
  • How to conduct a DPIA?
  • Who does what within the work for a DPIA?

The respondents requested the supervisory authority to be active in helping them prepare for DPIAs – “to clarify everything that is unclear, to involve companies [in discussions], to provide criteria and examples” and to help harmonise the criteria at European level (p. 14).

Several particular cases were brought up by the respondents, such as processing of HR data, processing of data by websites, processing of data by public administration or by hospitals. These scenarios raised questions such as: does the term “large scale” only refer to Big Data? Does it refer to the volume of data that will be processed or to the number of people whose data will be processed? Are “new technologies” all the technologies that are used for the first time by a controller? Is behavioural advertising “profiling” in the sense of the GDPR? (p. 14).

The participants also wanted to know whether a DPIA should be conducted for processing operations already in place that would fall under one of the “compulsory” cases requiring a DPIA.

As for the methodological approach, the respondents asked for a simple method. They also referred to existing tools that could be used, such as ISO 29134 and EBIOS. In any case, they suggested that the method should be tested with controllers and should be harmonised at European level. There were also questions about whether professional associations could create their own DPIA methodologies for their sectors of activity (p. 15).

The conclusion of the CNIL was that the contributions to the public consultation showed a great need for clarification, but also revealed “interesting ideas” for the implementation of the DPIA requirements, which will be taken into account. The most difficult points revealed are the criteria to be taken into account when deciding if a DPIA must be conducted, the harmonisation of methodologies at European level and the prior consultation of supervisory authorities (p. 17).

The immediate action plan refers to guidance from the Article 29 Working Party on DPIAs and on what constitutes “high risk”, which will provide interpretations of vague requirements. The CNIL also aims to take some steps itself, such as updating its current guidance on Privacy Impact Assessments.

4) On the certification mechanism

Article 42 of the GDPR provides that Member States, DPAs, the European Data Protection Board and the European Commission shall “encourage” the establishment of data protection certification mechanisms and of data protection seals and marks, for the purpose of demonstrating that processing operations by controllers and processors comply with the Regulation. Article 42(3) clarifies that certification is voluntary and must be available via a transparent process.

Surprisingly, the “certification” part of the public consultation was the one that produced more plain suggestions than questions, compared to the other three themes, as is apparent from the report. On the other hand, the contributions seem to be smaller in volume, given that this is indeed a novel topic for the data protection world.

One of the questions dealt with in the consultation was: who should issue certifications/labels? The respondents preferred the option of a certification issued at European level and, only in the absence of such a possibility, a certification issued at national level that should be mutually recognised. They also underlined that the coexistence of certifications issued by DPAs and certifications issued by certification bodies will be difficult. Participants in the consultation suggested that the drafting of standards should be carried out by regulators in consultation with companies and the future evaluators, with a view to homogenising the practices of the different certification bodies (p. 11).

To the question of what should be certified or labelled with priority, the respondents provided a list of suggestions (p. 11):

  • online products and services processing health data;
  • the solutions to monitor/surveil databases;
  • the services provided by the state;
  • anonymisation techniques;
  • search engines;
  • social media platforms.

As to which are the specific needs of small and medium enterprises, the replies referred to support for filing the requests for certification, the need of reduced costs and the need of a simple methodology (p. 12).

Another topic discussed was how to withdraw a label or a certification in case of misconduct – proposals ranged from creating an “alarm system” to signal non-compliance with the certification, to an effective withdrawal after an adversarial procedure with formal notice to the certification body, which could propose a corrective plan during the procedure (p. 12).

Finally, the point that certification under Article 42 GDPR should essentially focus on data protection and not data security was also raised (p. 13).

The report does not contain an action plan for certification.

***

Find what you’re reading useful? Please consider supporting pdpecho.

CNIL just published the results of their GDPR public consultation: what’s in store for DPOs and data portability? (Part I)

Gabriela Zanfir Fortuna

The French Data Protection Authority, CNIL, made public this week the report of the public consultation it held between 16 June and 19 July 2016 among professionals about the General Data Protection Regulation (GDPR). The public consultation gathered 540 replies from 225 contributors.

The CNIL focused on four main issues in the consultation:

  • the data protection officer;
  • the right to data portability;
  • the data protection impact assessments;
  • the certification mechanism.

These are also the four themes in the action plan of the Article 29 Working Party for 2016.

This post (Part I) will summarise the results and action plan for the first two themes, while the last two will be dealt with in a second post (Part II). [Disclaimer: all quotations are translated from French].

1) On the data protection officer

According to Article 37 GDPR, both the controller and the processor must designate a data protection officer where the processing is carried out by a public authority (1)(a), where their core activities consist of processing operations which require regular and systematic monitoring of data subjects on a large scale (1)(b) and where their core activities consist of processing sensitive data on a large scale (1)(c).

The report reveals that there are many more questions than answers or opinions about how Article 37 should be applied in practice. In fact, most of the contributions are questions from the contributors (see pages 2 to 4). They raise interesting points, such as:

  • What is considered to be a conflict of interest – who will not be able to be appointed?
  • Should the DPO be appointed before May 2018 (when GDPR becomes applicable)?
  • Will the CNIL validate the mandatory or the optional designation of a DPO?
  • What exactly will the role of the DPO be in the initiative for, and in the drafting of, data protection impact assessments?
  • What are the internal consequences if the recommendations of the DPO are not respected?
  • Is it possible that the DPO becomes liable under Criminal law for how he/she monitors compliance with the GDPR?
  • Should the DPO be in charge of keeping the register of processing operations, and should the register be communicated to the public?
  • Should only the contact details of the DPO be published, or also his/her identity?
  • Must the obligations in the GDPR also be applied to a DPO designated voluntarily (outside the three scenarios in Article 37(1))?
  • Can a DPO be, in fact, a team? Can a DPO be a legal person?
  • Are there any special conditions with regard to the DPO for small and medium enterprises?

The CNIL underlines that for this topic an important contribution was brought by large professional associations during discussions, in addition to the large number of replies received online.

In fact, according to the report, the CNIL acknowledges “the big expectations of professional associations and federations to receive clarifications with regard to the function of the DPO, as they want to prepare as soon as possible and in a sustainable way for the new obligations” (p. 5).

As for future steps, the CNIL recalls that the Article 29 Working Party will publish Guidelines to help controllers in a practical manner, according to the 2016 action plan. (There’s not much left of 2016, so hopefully we’ll see the Guidelines soon!). The CNIL announces they will also launch some national communication campaigns and they will intensify the training sessions and workshops with the current CILs (Correspondants Informatique et Libertés – a role similar to that of a DPO).

2) On the right to data portability


Article 20 GDPR provides that the data subject has the right to receive a copy of their data in a structured, commonly used and machine-readable format and has the right to transmit those data to another controller only if the processing is based on consent or on a contract.

First, the CNIL notes that there was “a very strong participation of the private sector submitting opinions or queries regarding the right to data portability, being interested especially in the scope of application of the new right, the expenses its application will require and its consequences on competition” (p. 6).

According to the report, the right to data portability is perceived as an instrument that allows regaining the trust of persons regarding the processing of their personal data, bringing more transparency and more control over processing operations (p. 6).

On another hand, the organisations that replied to the public consultation are concerned about the additional investments they will need to make to implement this right. They are also concerned about (p. 6):

  • “the risk of creating an imbalance in competition between European and American companies, as European companies are directly under the obligation to comply with this right, whereas American companies may try to circumvent the rules”. My comment here would be that they should not be concerned about that, because if they target the same European public to offer services, American companies will also be under a direct obligation to comply with this right.
  • “the immediate cost of implementing this right (for instance, the development of automatic means to extract data from databases), which cannot be charged to the individuals, but which will be a part of the management costs and will increase the costs for the services”.
  • “the level of responsibility if the data are mishandled or if the data handed over to the person are not up to date”.

The respondents to the public consultation seem to be a good resource for technical options regarding the format needed to transfer data. Respondents argued in favour of open source formats, which will make reusing the data easier and which will be cheaper compared to proprietary solutions. Another suggested solution is the development of Application Programming Interfaces (APIs) based on open standards, without a specific licence key. This way, persons will be able to use the tools of their choice.
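To make the respondents’ suggestion concrete, here is a minimal sketch of a portability export, assuming JSON as the open, machine-readable format; the GDPR itself does not mandate any particular format, and the field names below are invented:

```python
import json

def export_user_data(user_record: dict) -> str:
    """Serialise a user's data in a structured, commonly used,
    machine-readable format -- the three criteria of Article 20 GDPR."""
    return json.dumps(user_record, indent=2, sort_keys=True)

profile = {
    "user_id": "u-1842",
    "playlists": [{"name": "running", "tracks": 42}],
    "preferences": {"language": "fr", "newsletter": True},
}

# The export can be handed to the data subject or, with their mandate,
# transmitted directly to another controller.
print(export_user_data(profile))
```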

One of the needs that emerged from the consultation was to clarify whether the data that are subject to the right to portability must be raw data, or whether transferring a “summary” of the data would suffice. Another question was whether the data could be asked for by a competing company, with a mandate from the data subject. There were also questions regarding the interplay of the right to data portability and the right of access, or asking how could data security be ensured for the transfer of the “ported” data.

In the concluding part, the CNIL acknowledges that two trends could already be seen within the replies: on the one hand, companies tend to want to limit as much as possible the applicability of the right to data portability, while on the other hand, the representatives of the civil society are looking to encourage persons to take their data in their own hands and to reinvent their use (p. 10).

According to the report, the Technology Subgroup of the Article 29 Working Party is currently drafting guidelines with regard to the right to data portability. “They will clarify the field of application of this right, taking into account all the questions raised by the participants to the consultation, and they will also detail ways to reply to portability requests”, according to the report (p. 10).

***

Find what you’re reading useful? Consider supporting pdpecho.

Click HERE for Part II of this post.

A look at political psychological targeting, EU data protection law and the US elections

Cambridge Analytica, a company that uses “data modeling and psychographic profiling” (according to its website), is credited with having decisively contributed to the outcome of the presidential election in the U.S. They did so by using “a hyper-targeted psychological approach” allowing them to see trends among voters that no one else saw and thus to model the speech of the candidate to resonate with those trends. According to Mashable, the same company also assisted the Leave.EU campaign that led to Brexit.

How do they do it?

“We collect up to 5,000 data points on over 220 million Americans, and use more than 100 data variables to model target audience groups and predict the behavior of like-minded people” (my emphasis), states their website (for comparison, the total US population is about 324 million). They further explain that “when you go beneath the surface and learn what people really care about you can create fully integrated engagement strategies that connect with every person at the individual level” (my emphasis).

According to Mashable, the company “uses a psychological approach to polling, harvesting billions of data from social media, credit card histories, voting records, consumer data, purchase history, supermarket loyalty schemes, phone calls, field operatives, Facebook surveys and TV watching habits“. This data “is bought or licensed from brokers or sourced from social media”.

(For a person who dedicated their professional life to personal data protection this sounds chilling.)

Legal implications

Under US privacy law this kind of practice seems to have no legal implications: it does not involve processing by any state authority, it is not a matter of consumer protection, and it does not seem to fall, prima facie, under any piece of the piecemeal legislation dealing with personal data in the U.S. (please correct me if I’m wrong).

Under EU data protection law, this practice would raise a series of serious questions (see below), without even getting into the debate of whether this sort of intimate profiling would also breach the right to private life as protected by Article 7 of the EU Charter of Fundamental Rights and Article 8 of the European Convention on Human Rights (the right to personal data protection and the right to private life are protected separately in the EU legal order). Put simply, the right to data protection enshrines the “rules of the road” (safeguards) for data that is being processed on a lawful ground, while the right to private life protects the inner private sphere of a person altogether, meaning that it can prohibit unjustified interferences with a person’s private life. This post will only look at mass psychological profiling from the data protection perspective.

Does EU data protection law apply to the political profilers targeting US voters?

But why would EU data protection law even be applicable to a company creating profiles of 220 million Americans? Surprisingly, EU data protection law could indeed be relevant in this case, if it turns out that the company carrying out the profiling is established in the UK (it is reportedly London-based), as several websites claim in their articles (here, here and here).

Under Article 4(1)(a) of Directive 95/46, the national provisions adopted pursuant to the directive shall apply “where the processing is carried out in the context of the activities of an establishment of the controller on the territory of the Member State”. Therefore, the territorial application of Directive 95/46 is triggered by the place of establishment of the controller. Moreover, Recital 18 of the Directive’s Preamble explains that “in order to ensure that individuals are not deprived of the protection to which they are entitled under this Directive, any processing of personal data in the Community (EU – n.) must be carried out in accordance with the law of one of the Member States” and that “in this connection, processing carried out under the responsibility of a controller who is established in a Member State should be governed by the law of that State” (see also CJEU Case C-230/14 Weltimmo, paras. 24, 25 and 26).

There are, therefore, no exceptions to applying EU data protection rules to any processing of personal data carried out under the responsibility of a controller established in a Member State. Does it matter that the data subjects are not European citizens, or that they may not even be physically located within Europe? Probably not. Directive 95/46 provides that the data subjects it protects are “identified or identifiable natural persons”, without differentiating them based on their nationality. Nor does the Directive link its application to any territorial factor concerning the data subjects. Moreover, according to Article 8 of the EU Charter of Fundamental Rights, “everyone has the right to the protection of personal data concerning him or her”.

I must emphasise here that the Court of Justice of the EU is the only authority that can interpret EU law in a binding manner, and that until the Court decides how to interpret EU law in a specific case, we can only engage in argumentative exercises. If the interpretation proposed above were found to have some merit, it would indeed be somewhat ironic to have the data of 220 million Americans protected by EU data protection rules.

What safeguards do persons have against psychological profiling for political purposes?

This kind of psychological profiling for political purposes would raise a number of serious questions. First of all, there is the question of whether this processing operation involves processing of “special categories of data”. According to Article 8(1) of Directive 95/46, “Member States shall prohibit the processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, and the processing of data concerning health or sex life.” There are several exceptions to this prohibition, of which only two would conceivably be applicable to this kind of profiling:

  • if the data subject has given his explicit consent to the processing of those data (letter a) or
  • the processing relates to data which are manifestly made public by the data subject (letter e).

In order for this kind of psychological profiling to be lawful, the controller must either obtain explicit consent to process all the data points used, for every person profiled, or use only those data points that were manifestly made public by the person concerned.

Moreover, under Article 15(1) of Directive 95/46, the person has the right “not to be subject to a decision which produces legal effects concerning him or significantly affects him and which is based solely on automated processing of data intended to evaluate certain personal aspects relating to him, such as his performance at work, creditworthiness, reliability, conduct, etc.”. It remains open to interpretation to what extent psychological profiling for political purposes produces legal effects or significantly affects the person.

Another problem concerns the obligation of the controller to inform every person concerned that this kind of profiling is taking place (Articles 10 and 11 of Directive 95/46) and to give them details about the identity of the controller, the purposes of the processing and all the personal data that is being processed. In addition, the person should be informed that he or she has the right to ask for a copy of the data the controller holds about him or her and the right to ask for the erasure of that data if it was processed unlawfully (Article 12 of Directive 95/46).

Significantly, the person has the right to opt out of a processing operation, at any time and without giving reasons, if the data is being processed for the purposes of direct marketing (Article 14(b) of Directive 95/46). For instance, in the UK, the supervisory authority (the Information Commissioner’s Office) issued Guidance for political campaigns in 2014, giving the example of “a telephone call which seeks an individual’s opinions in order to use that data to identify those people likely to support the political party or referendum campaign at a future date in order to target them with marketing” as constituting direct marketing.

Some thoughts

  • The analysis of how EU data protection law is relevant for this kind of profiling would be more poignant if it were made under the General Data Protection Regulation, which will become applicable on 25 May 2018 and which contains a specific provision on profiling.
  • The biggest fine ever issued by the UK supervisory authority, £350,000, dates from this year. Under the GDPR, breaches of data protection rules will lead to fines of up to 20 million euro or 4% of the controller’s global annual turnover for the previous year, whichever is higher (a toy calculation of this ceiling follows this list).
  • If any company based in the UK used this kind of psychological profiling and micro-targeting for the Brexit campaign, that processing operation would undoubtedly fall under EU data protection law. The same holds true for any analytics company that provides these services to political parties anywhere in the EU using the personal data of EU persons. Perhaps this is a good time to revisit the discussion we had at CPDP2016 on political behavioural targeting (who would have thought the topic would gain so much momentum this year?).
  • I wonder if data protection rules should be the only “wall (?)” between this sort of targeted-political-message-generating campaign profiling and the outcome of democratic elections.
  • Talking about ethics, data protection and big data together is becoming more urgent every day.
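
For illustration, here is a toy calculation (in Python, with invented turnover figures) of the GDPR ceiling quoted above: up to 20 million euro or 4% of the controller’s global annual turnover for the previous year, whichever is higher.

```python
# Toy calculation of the GDPR fine ceiling: EUR 20 million or 4% of
# global annual turnover for the previous year, whichever is higher.
def gdpr_max_fine_eur(global_annual_turnover_eur):
    return max(20_000_000, 0.04 * global_annual_turnover_eur)

print(gdpr_max_fine_eur(100_000_000))    # 20000000 (4% is only 4M, so the 20M floor applies)
print(gdpr_max_fine_eur(2_000_000_000))  # 80000000.0 (4% of 2 billion exceeds 20M)
```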

***

Find what you’re reading useful? Consider supporting pdpecho.

Fresh EU data protection compliance guidance for mobile apps, from the EDPS

The European Data Protection Supervisor adopted this week “Guidelines on the protection of personal data processed by mobile applications provided by European Union institutions”.

While the guidelines are addressed to the EU bodies that provide mobile apps to interact with citizens (considering the mandate of the EDPS is to supervise how EU bodies process data), the guidance is just as valuable to all controllers processing data via mobile apps.

The Guidelines acknowledge that “mobile applications use the specific functions of smart mobile devices like portability, variety of sensors (camera, microphone, location detector…) and increase their functionality to provide great value to their users. However, their use entails specific data protection risks due to the easiness of collecting great quantities of personal data and a potential lack of data protection safeguards.”

Managing consent

One of the most difficult data protection issues faced by controllers that process personal data through mobile apps is complying with the consent requirements. The Guidelines provide valuable guidance on how to obtain valid consent (see paragraphs 25 to 29):

  • Adequately inform users and obtain their consent before installing any application on the user’s smart mobile device.
  • Users have to be given the option to change their wishes and revoke their decision at any time.
  • Consent needs to be collected before any reading or storing of information from/onto the smart mobile device is done.
  • An essential element of consent is the information provided to the user. The type and accuracy of the information provided needs to be such as to put users in control of the data on their smart mobile device to protect their own privacy.
  • The consent should be specific (highlighting the type of data collected), expressed through an active choice, and freely given (users should be given the opportunity to make a real choice).
  • The apps must provide users with real choices on personal data processing: the mobile application must ask for granular consent for every category of personal data it processes and every relevant use. If the OS does not allow a granular choice, the mobile application itself must implement it (see the sketch after this list).
  • The mobile application must feature functionalities to revoke users’ consent for each category of personal data processed and each relevant use. The mobile application must also provide functionalities to delete users’ personal data where appropriate.
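
As an illustration of the last two requirements (granular consent per data category and use, plus revocation), here is a minimal sketch in Python; the class, category and purpose names are hypothetical, not something prescribed by the Guidelines.

```python
# Minimal sketch of granular, revocable consent: one decision per
# (category of personal data, purpose) pair, revocable at any time.
from datetime import datetime, timezone

class ConsentManager:
    def __init__(self):
        # (category, purpose) -> {"granted": bool, "timestamp": datetime}
        self._decisions = {}

    def grant(self, category, purpose):
        self._decisions[(category, purpose)] = {
            "granted": True, "timestamp": datetime.now(timezone.utc)}

    def revoke(self, category, purpose):
        self._decisions[(category, purpose)] = {
            "granted": False, "timestamp": datetime.now(timezone.utc)}

    def is_allowed(self, category, purpose):
        # No recorded decision means no consent: nothing may be read from
        # or stored on the device before the user actively opts in.
        decision = self._decisions.get((category, purpose))
        return bool(decision and decision["granted"])

consents = ConsentManager()
consents.grant("location", "route_planning")
print(consents.is_allowed("location", "route_planning"))  # True
consents.revoke("location", "route_planning")
print(consents.is_allowed("location", "route_planning"))  # False
print(consents.is_allowed("contacts", "advertising"))     # False (never granted)
```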

The Guidelines invite each controller to “analyse the compliance of its intended processing before implementing the mobile application during the feasibility check, business case design or an equivalent early definition stage of the project”. The controller “should take decisions on the design and operation of the planned mobile application based on an information security risk assessment”.

Other recommendations concern:

  • data minimisation – “the mobile application must collect only those data that are strictly necessary to perform the lawful functionalities as identified and planned”.
  • third party components or services – “Assess the data processing features of a third party component or of a third party service before integrating it into a mobile application”.
  • security of processing – “Apply appropriate information security risk management to the development, distribution and operation of mobile applications” (paragraphs 38 to 41).
  • secure development, operation and testing – “The EU institution should have documented secure development policies and processes for mobile applications, including operation and security testing procedures following best practices”.
  • vulnerability management – “Adopt and implement a vulnerability management process appropriate to the development and distribution of mobile applications” (paragraphs 47 to 51).
  • protection of personal data in transit and at rest – “Personal data needs to be protected when stored in the smart mobile device, e.g. through effective encryption of the personal data” (a minimal sketch follows below).
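
As a small illustration of protecting personal data at rest, here is a sketch using Fernet symmetric encryption from the widely used Python “cryptography” package; key management (e.g. keeping the key in the platform keystore rather than next to the data) is deliberately left out here, and is the hard part in practice.

```python
# Sketch of encrypting personal data before it is written to device storage.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in a real app, store this in the OS keystore
fernet = Fernet(key)

personal_data = b'{"name": "Jane Doe", "location": "51.50,-0.12"}'
stored_blob = fernet.encrypt(personal_data)   # what actually gets persisted
recovered = fernet.decrypt(stored_blob)       # only possible with the key
assert recovered == personal_data
```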

***

Find what you’re reading useful? Consider supporting pdpecho.