Tag Archives: big data

What’s new in research: networks of control built on digital tracking, new models of internet governance

pdpEcho is kicking off 2017 with a brief catalogue of interesting recently published research that sets the tone for the new year.

First, Wolfie Christl and Sarah Spiekermann‘s report on “Networks of Control”, published last month, is a must-read for anyone who wants to understand how the digital economy runs on the streams of data we all generate; it also reflects on the ethical implications of this economic model and proposes new models that would try to keep the surveillance society at bay. Second, a new report of the Global Commission on Internet Governance explores the governance gaps created by global governance structures developed in the analog age. Third, the US National Academy of Sciences recently published a report with concrete proposals on how to reconcile the use of different public and private sources of data for government statistics with privacy and confidentiality. Last, a volume by Angela Daly, recently published by Hart Publishing, explores how EU competition law, sector-specific regulation, data protection and human rights law could tackle concentrations of power for the benefit of users.

 

  1. “Networks of control. A Report on Corporate Surveillance, Digital Tracking, Big Data & Privacy”, by Wolfie Christl, Sarah Spiekermann  [OPEN ACCESS]

“Around the same time as Apple introduced its first smartphone and Facebook reached 30 million users in 2007, online advertisers started to use individual-level data to profile and target users individually (Deighton and Johnson 2013, p. 45). Less than ten years later, ubiquitous and real-time corporate surveillance has become a “convenient byproduct of ordinary daily transactions and interactions” (De Zwart et al 2014, p. 746). We have entered a surveillance society as David Lyon foresaw it already in the early 1990s; a society in which the practices of “social sorting”, the permanent monitoring and classification of the whole population through information technology and software algorithms, have silently become an everyday reality” (p. 118).

One of the realities we need to take into account when assessing this phenomenon is that “Opting out of digital tracking becomes increasingly difficult. Individuals can hardly avoid consenting to data collection without opting out of much of modern life. In addition, persons who don’t participate in data collection, who don’t have social networking accounts or too thin credit reports, could be judged as “suspicious” and “too risky” in advance” (p. 129).

The authors of the report explain that the title “Networks of Control” is justified “by the fact that there is not one single corporate entity that by itself controls today’s data flows. Many companies co-operate at a large scale to complete their profiles about us through various networks they have built up” (p. 7). They also explain that they want to close a gap created by the fact that “the full degree and scale of personal data collection, use and – in particular – abuse has not been scrutinized closely enough”, despite the fact that “media and special interest groups are aware of these developments for a while now” (p. 7).

What I found valuable in the study’s approach is that it also brings forward a topic rarely discussed when analysing Big Data and digital tracking: the attempt of such practices to change behaviour at scale. “Data richness is increasingly used to correct us or incentivize us to correct ourselves. It is used to “nudge” us to act differently. As a result of this continued nudging, influencing and incentivation, our autonomy suffers” (p. 7).

A chapter authored by Professor Sarah Spiekermann explores the ethical implications of the networks of control. She applies three ethical normative theories to personal data markets: “The Utilitarian calculus, which is the original philosophy underlying modern economics (Mill 1863/1987). The Kantian duty perspective, which has been a cornerstone for what we historically call “The Enlightenment” (Kant 1784/2009), and finally Virtue Ethics, an approach to life that originates in Aristotle’s thinking about human flourishing and has seen considerable revival over the past 30 years (MacIntyre 1984)” (p. 131).

Methodologically, the report is based on “a systematic literature review and analysis of hundreds of documents and builds on previous research by scholars in various disciplines such as computer science, information technology, data security, economics, marketing, law, media studies, sociology and surveillance studies” (p. 10).

2. Global Commission on Internet Governance “Corporate Accountability for a Free and Open Internet”, by Rebecca MacKinnon, Nathalie Maréchal and Priya Kumar  [OPEN ACCESS]

The report shows that “as of July 2016, more than 3.4 billion people were estimated to have joined the global population of Internet users, a population with fastest one-year growth in India (a stunning 30 percent) followed by strong double digit growth in an assortment of countries across Africa (Internet Live Stats 2016a; 2016b)” (p. 1).

“Yet the world’s newest users have less freedom to speak their minds, gain access to information or organize around civil, political and religious interests than those who first logged on to the Internet five years ago” (p. 1).

Within this framework, the report explores the fact that “ICT sector companies have played a prominent role in Internet governance organizations, mechanisms and processes over the past two decades. Companies in other sectors also play an expanding role in global governance. Multinational companies wield more power than many governments over not only digital information flows but also the global flow of goods, services and labour: one-third of world trade is between corporations, and another third is intra-firm, between subsidiaries of the same multinational enterprise” (p. 5).

The authors also look at the tensions between governments and global companies over requests to access data, weaken encryption and facilitate censorship in ways that contravene international human rights standards.

3. “Innovations in Federal Statistics: Combining Data Sources While Protecting Privacy”, by National Academy of Sciences [OPEN ACCESS]. 

The tension between privacy on the one hand and statistical data and censuses on the other compelled the German Constitutional Court to create, in the 1980s, “the right to informational self-determination”. Could statistics bring a significant reform of this sort to the US? Never say never.

According to epic.org, the US National Academy of Sciences recently published a report that examines how disparate federal data sources can be used for policy research while protecting privacy.

The study shows that in the decentralised US statistical system there are 13 agencies whose mission is primarily the creation and dissemination of statistics, and more than 100 agencies that engage in statistical activities. Stronger coordination and collaboration are needed to enable access to, and evaluation of, administrative and private-sector data sources for federal statistics. For this purpose, the report advises that “a new entity or an existing entity should be designated to facilitate secure access to data for statistical purposes to enhance the quality of federal statistics. Privacy protections would have to be fundamental to the mission of this entity“. Moreover, “the data for which it has responsibility would need to have legal protections for confidentiality and be protected using the strongest privacy protocols offered to personally identifiable information while permitting statistical use”.

One of the conclusions of the report is that “Federal statistical agencies should adopt modern database, cryptography, privacy-preserving and privacy-enhancing technologies”. 

4. Private Power, Online Information Flows and EU Law. Mind The Gap, by Angela Daly, Hart Publishing [50 pounds]

“This monograph examines how European Union law and regulation address concentrations of private economic power which impede free information flows on the Internet to the detriment of Internet users’ autonomy. In particular, competition law, sector specific regulation (if it exists), data protection and human rights law are considered and assessed to the extent they can tackle such concentrations of power for the benefit of users.

Using a series of illustrative case studies, of Internet provision, search, mobile devices and app stores, and the cloud, the work demonstrates the gaps that currently exist in EU law and regulation. It is argued that these gaps exist due, in part, to current overarching trends guiding the regulation of economic power, namely neoliberalism, by which only the situation of market failure can invite ex ante rules, buoyed by the lobbying of regulators and legislators by those in possession of such economic power to achieve outcomes which favour their businesses.

Given this systemic, and extra-legal, nature of the reasons as to why the gaps exist, solutions from outside the system are proposed at the end of each case study. This study will appeal to EU competition lawyers and media lawyers.”

Enjoy the read! (Unless the reform of the EU e-Privacy rules is taking much of your time these days – in this case, bookmark the reports of interest and save them for later).
***
Enjoy what you are reading? Consider supporting pdpEcho

A look at political psychological targeting, EU data protection law and the US elections

Cambridge Analytica, a company that uses “data modeling and psychographic profiling” (according to its website), is credited with having decisively contributed to the outcome of the presidential election in the U.S. It did so by using “a hyper-targeted psychological approach” that allowed it to see trends among voters no one else saw, and thus to tailor the candidate’s speech to resonate with those trends. According to Mashable, the same company also assisted the Leave.EU campaign that led to Brexit.

How do they do it?

“We collect up to 5,000 data points on over 220 million Americans, and use more than 100 data variables to model target audience groups and predict the behavior of like-minded people” (my emphasis), states their website (for comparison, the US has a population of 324 million). They further explain that “when you go beneath the surface and learn what people really care about you can create fully integrated engagement strategies that connect with every person at the individual level” (my emphasis).
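As an illustration only (the data, feature names and similarity threshold below are invented, and this is not Cambridge Analytica's actual method), a "like-minded people" model can be sketched as a similarity search over per-person feature vectors:

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sqrt(sum(a * a for a in u))
    nv = sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def like_minded(target, voters, threshold=0.9):
    """Return ids of voters whose feature vectors resemble the target's."""
    return [vid for vid, vec in voters.items()
            if cosine(target, vec) >= threshold]

# Hypothetical, tiny feature vectors (real systems reportedly use ~100
# variables per person, drawn from thousands of collected data points).
voters = {
    "v1": [1.0, 0.9, 0.1],
    "v2": [0.1, 0.2, 1.0],
    "v3": [0.9, 1.0, 0.2],
}
print(like_minded([1.0, 1.0, 0.1], voters))  # → ['v1', 'v3']
```

Scaled to hundreds of millions of profiles, this is the basic mechanism by which a campaign message can be matched "with every person at the individual level".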

According to Mashable, the company “uses a psychological approach to polling, harvesting billions of data from social media, credit card histories, voting records, consumer data, purchase history, supermarket loyalty schemes, phone calls, field operatives, Facebook surveys and TV watching habits“. This data “is bought or licensed from brokers or sourced from social media”.

(For a person who dedicated their professional life to personal data protection this sounds chilling.)

Legal implications

Under US privacy law this kind of practice seems to have no legal implications: it doesn’t involve processing by any state authority, it isn’t a matter of consumer protection, and it doesn’t seem to fall, prima facie, under any of the piecemeal legislation dealing with personal data in the U.S. (please correct me if I’m wrong).

Under EU data protection law, this practice would raise a series of serious questions (see below), without even getting into the debate of whether this sort of intimate profiling would also breach the right to private life as protected by Article 7 of the EU Charter of Fundamental Rights and Article 8 of the European Convention on Human Rights (the right to personal data protection and the right to private life are protected separately in the EU legal order). Put simply, the right to data protection enshrines the “rules of the road” (safeguards) for data that is being processed on a lawful ground, while the right to private life protects the person’s inner private sphere altogether, meaning that it can prohibit unjustified interferences with the person’s private life. This post will only look at mass psychological profiling from the data protection perspective.

Does EU data protection law apply to the political profilers targeting US voters?

But why would EU data protection law even be applicable to a company creating profiles of 220 million Americans? Surprisingly, EU data protection law could indeed be relevant in this case, if it turns out that the company carrying out the profiling is based in the UK (London-based), as several websites claim in their articles (here, here and here).

Under Article 4(1)(a) of Directive 95/46, the national provisions adopted pursuant to the directive shall apply “where the processing is carried out in the context of the activities of an establishment of the controller on the territory of the Member State“. Therefore, the territorial application of Directive 95/46 is triggered by the place of establishment of the controller.  Moreover, Recital 18 of the Directive’s Preamble explains that “in order to ensure that individuals are not deprived of the protection to which they are entitled under this Directive, any processing of personal data in the Community (EU – n.) must be carried out in accordance with the law of one of the Member States” and that “in this connection, processing carried out under the responsibility of a controller who is established in a Member State should be governed by the law of that State” (see also CJEU Case C-230/14 Weltimmo, paras. 24, 25, 26).

There are, therefore, no exceptions to applying EU data protection rules to any processing of personal data that is carried out under the responsibility of a controller established in a Member State. Is it relevant here whether the data subjects are not European citizens, and whether they would not even be physically located within Europe? The answer is probably in the negative. Directive 95/46 provides that the data subjects it protects are “identified or identifiable natural persons“, without differentiating them based on their nationality. Neither does the Directive link its application to any territorial factor concerning the data subjects. Moreover, according to Article 8 of the EU Charter of Fundamental Rights, “everyone has the right to the protection of personal data concerning him or her”.

I must emphasise here that the Court of Justice of the EU is the only authority that can interpret EU law in a binding manner and that until the Court decides how to interpret EU law in a specific case, we can only engage in argumentative exercises. If the interpretation proposed above were found to have some merit, it would indeed be somewhat ironic to have the data of 220 million Americans protected by EU data protection rules.

What safeguards do persons have against psychological profiling for political purposes?

This kind of psychological profiling for political purposes would raise a number of serious questions. First of all, there is the question of whether this processing operation involves processing of “special categories of data”. According to Article 8(1) of Directive 95/46, “Member States shall prohibit the processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, and the processing of data concerning health or sex life.” There are several exceptions to this prohibition, of which only two would conceivably be applicable to this kind of profiling:

  • if the data subject has given his explicit consent to the processing of those data (letter a) or
  • the processing relates to data which are manifestly made public by the data subject (letter e).

For this kind of psychological profiling to be lawful, the controller must either obtain explicit consent to process all the data points used, for every person profiled, or use only those data points that were manifestly made public by the person.
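This either/or test can be sketched as a simple filter over a controller's data inventory. The record structure and field names below are hypothetical, invented purely to illustrate the Article 8(2)(a)/(e) logic, not taken from any real compliance tool:

```python
def usable_data_points(data_points):
    """Keep only those special-category data points covered by one of the
    two Article 8(2) exceptions discussed above: explicit consent
    (letter a) or data manifestly made public by the data subject
    (letter e)."""
    return [
        p for p in data_points
        if p.get("explicit_consent") or p.get("manifestly_public")
    ]

# Hypothetical inventory for one profiled person.
points = [
    {"name": "political_opinion", "explicit_consent": True,  "manifestly_public": False},
    {"name": "religious_belief",  "explicit_consent": False, "manifestly_public": False},
    {"name": "public_tweet_view", "explicit_consent": False, "manifestly_public": True},
]
print([p["name"] for p in usable_data_points(points)])
# → ['political_opinion', 'public_tweet_view']
```

Everything that fails both tests, like the religious-belief data point here, would have to be excluded from the profiling altogether.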

Moreover, under Article 15(1) of Directive 95/46, the person has the right “not to be subject to a decision which produces legal effects concerning him or significantly affects him and which is based solely on automated processing of data intended to evaluate certain personal aspects relating to him, such as his performance at work, creditworthiness, reliability, conduct, etc.”. It remains to be interpreted, of course, to what extent psychological profiling for political purposes produces legal effects or significantly affects the person.

Another problem concerns the obligation of the controller to inform every person concerned that this kind of profiling is taking place (Articles 10 and 11 of Directive 95/46) and to give them details about the identity of the controller, the purposes of the processing and all the personal data that is being processed. In addition, the person should be informed that he or she has the right to ask for a copy of the data the controller holds about him or her and the right to ask for the erasure of that data if it was processed unlawfully (Article 12 of Directive 95/46).

Significantly, the person has the right to opt out of a processing operation, at any time, without giving reasons, if the data is being processed for the purposes of direct marketing (Article 14(b) of Directive 95/46). For instance, in the UK, the supervisory authority (the Information Commissioner’s Office) issued Guidance for political campaigns in 2014 and gave the example of “a telephone call which seeks an individual’s opinions in order to use that data to identify those people likely to support the political party or referendum campaign at a future date in order to target them with marketing” as constituting direct marketing.

Some thoughts

  • The analysis of how EU data protection law is relevant for this kind of profiling would be more poignant if it were made under the General Data Protection Regulation, which becomes applicable on 25 May 2018 and which has a special provision on profiling.
  • The biggest fine ever issued by the supervisory authority in the UK, £350,000, dates from this year. Under the GDPR, breaches of data protection rules can lead to fines of up to 20 million euro or 4% of the controller’s global annual turnover for the previous year, whichever is higher.
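The “whichever is higher” cap on GDPR fines is just a maximum of two quantities, which can be spelled out as a one-line calculation (the turnover figures are invented examples):

```python
def gdpr_max_fine(global_annual_turnover_eur):
    """Upper bound of a GDPR fine for the most serious breaches:
    EUR 20 million or 4% of the previous year's global annual
    turnover, whichever is higher."""
    return max(20_000_000, 0.04 * global_annual_turnover_eur)

# For a company with EUR 100m turnover, 4% is only 4m, so the 20m floor applies.
print(gdpr_max_fine(100_000_000))    # → 20000000
# For a EUR 2bn company, 4% (80m) exceeds the 20m floor.
print(gdpr_max_fine(2_000_000_000))  # → 80000000.0
```

Either way, the ceiling dwarfs the £350,000 record fine mentioned above.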
  • If any company based in the UK used this kind of psychological profiling and micro-targeting for the Brexit campaign, that processing operation would undoubtedly fall under the rules of EU data protection law. The same holds for any analytics company that provides these services to political parties anywhere in the EU using the personal data of EU persons. Perhaps this is a good time to revisit the discussion we had at CPDP2016 on political behavioural targeting (who would have thought the topic would gain so much momentum this year?).
  • I wonder if data protection rules should be the only “wall (?)” between this sort of targeted-political-message-generating campaign profiling and the outcome of democratic elections.
  • Talking about ethics, data protection and big data together is becoming more urgent every day.

***

Find what you’re reading useful? Consider supporting pdpecho.

FastCompany.com: “DO YOU KNOW WHERE YOU’LL BE 285 DAYS FROM NOW AT 2 P.M.? THESE DATA-MASTERS DO”

The world of big data never ceases to amaze us. Apparently, data scientists are now able even to predict the future. This is a story published on FastCompany.com about Sadilek and Krumm’s paper titled “Far Out: Predicting Long-Term Human Mobility”:

“Would you like to know how crowded your drive to the beach will be in three weeks? Or where your ex will be on a Friday night next month so that you can avoid him?

Adam Sadilek, formerly of Microsoft, and John Krumm, a principal researcher at Microsoft, were inspired by the question of predicting where people would be in the future and even led off with the query, “Where are you going to be 285 days from now at 2PM?” in their paper, Far Out: Predicting Long-Term Human Mobility.
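A drastically simplified sketch of such a predictor (the actual paper uses a far more sophisticated decomposition of repeating daily mobility patterns, not this naive approach) would simply remember the most frequent location for each weekday-and-hour slot:

```python
from collections import Counter, defaultdict

class MobilityPredictor:
    """Toy long-term mobility predictor: for each (weekday, hour) slot,
    predict the historically most frequent location cell. A real system
    would model periodic structure across days, weeks and seasons."""

    def __init__(self):
        self.slots = defaultdict(Counter)

    def observe(self, weekday, hour, cell):
        """Record one historical sighting in a location cell."""
        self.slots[(weekday, hour)][cell] += 1

    def predict(self, weekday, hour):
        """Most frequent cell for that slot, or None if never observed."""
        counter = self.slots.get((weekday, hour))
        return counter.most_common(1)[0][0] if counter else None

# Invented example: ten Fridays at 2PM at the office, one at the beach.
model = MobilityPredictor()
for _ in range(10):
    model.observe(4, 14, "office")
model.observe(4, 14, "beach")
print(model.predict(4, 14))  # → office
```

The unsettling point of the paper is that even months-ahead questions like "where will you be at 2PM, 285 days from now?" become answerable once enough of this location history has been collected.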

Read the whole story HERE.

Harvard Business Review: The Value of Big Data Isn’t the Data

Kristian J. Hammond wrote an interesting piece on the real value of big data in the Harvard Business Review blog:

“It is clear that a new age is upon us. Evidence-based decision-making (aka Big Data) is not just the latest fad, it’s the future of how we are going to guide and grow business. But let’s be very clear: There is a huge distinction to be made between “evidence” and “data.” The former is the end game for understanding where your business has been and where it needs to go. The latter is the instrument that lets us get to that end game. Data itself isn’t the solution. It’s just part of the path to that solution.

The confusion here is understandable. In an effort to move from the Wild West world of shoot-from-the-hip decision making to a more evidence-based model, companies realized that they would need data. As a result, organizations started metering and monitoring every aspect of their businesses. Sales, manufacturing, shipping, costs and whatever else could be captured were all tracked and turned into well-controlled (or not so well-controlled) data.

I would argue that what you want and what you need is to turn that data into a story. A story explains the data rather than just exposing it or displaying it. A narrative that gives you context to today’s numbers by exploring the trends and comparisons that you need in order to make sense of it all. The belief that Artificial Intelligence can support the generation of natural language reporting from data is what drove me to help found our company, Narrative Science. I fundamentally believe that a machine can tackle and succeed at freeing insight from data to provide the last mile in making big data useful, and this belief was the driver in building out a technology platform that makes it real.”

READ THE WHOLE ARTICLE HERE.

EU committee approves sharing of public data

According to thetelegraph.co.uk, the EU council’s so-called ‘Coreper’ committee backed Commission plans to open up public sector information such as geographical data and statistics to be re-used for any purpose across Europe.

European leaders hope this will inject a much needed €40bn into the crisis-hit Eurozone as business and developers can use the data at a low cost to create products.

Neelie Kroes, Vice-President of the European Commission, said: “Opening up public data means opening up business opportunities, creating jobs and building communities.”

The UK was praised for the availability of its data. It is hoped that other countries in the 27-nation bloc can follow suit, and produce apps like Commuter, which provided live updates to London public transport users using such data.

The Coreper committee’s latest endorsement is part of the drive to update the 2003 Public Sector Information Directive, which was introduced to make it easier for businesses to access and re-use government-held information.

Read the whole story HERE.

govhealthit.com: Q&A – Privacy activism in the age of Big Data

Read a very useful interview with Deborah Peel, MD, founder of the group Patient Privacy Rights, on govhealthit.com.

We chose an interesting sample:

Q: Your website cites a number of fairly nefarious and invasive scenarios: “If a school or university learns your child has ADHD or is being treated for depression, they may deny admission. If a boss knows you take Xanax or Zoloft, they may reconsider your promotion.” Wouldn’t both of those practices be illegal?

A: Of course it’s all illegal, using people’s information against them. But there’s no way that the poor employee can even know until later that something’s happened. For example, I can’t tell you how many stories psychiatrists hear about where somebody’s been out for two weeks for depression. They go back in, they’re assigned to a completely new job, and they end up quitting. How are they going to ever prove or know who looked at their records when there is no chain of custody? That’s the other thing that electronic records can prove: you can’t move them, you can’t open them, you can’t see them, without there being a transaction.

[Q&A: Health org’s don’t protect patient data for reasons dating ‘back to the industrial revolution’]

One of the things that we have lobbied for is a chain of custody and accounting of disclosures. And we did get that into the HITECH Act. You’re supposed to be able to get a chain of some disclosure, three years of all disclosures of electronic data from your EHR. They don’t even have the rules yet for how we can get disclosures of electronic health records — not from pharmacies, not from labs, not from insurers, not from all the other clearinghouses. What we really need is a chain of custody for all of health data, wherever it is. Because we don’t even know that, there’s no way to prove harm. One of our major projects right now is we’re working really hard with Harvard and Latanya Sweeney to raise the funds to build a data map. We do not even know how many entities have our information or what they’re doing with it. So how can we weigh risks and benefits, when we have institutional control of information, not patient control?

CPDP2013 Programme • Thursday 24 January 2013


CPDP2013 Panels at Grande Halle

8.45 Big Data: Big Promises, Big Challenges

co-organised by INRIA and CPDP

hosted by Daniel Le Métayer (INRIA) & Marieke De Goede (University of Amsterdam)

panel John Boswell, SAS (US), Toon Calders, Eindhoven University of Technology (NL), Stephane Grumbach, Inria (FR), Omer Tene, College of Management School of Law, Rishon Le Zion (ISR)

Big Data promises a great deal: it is inscribed with the potential to transform science, governance, business and society as a whole. But the quest for predictability at the core of Big Data also raises many questions related to determinism, discrimination, manipulation and conformism, to cite a few.

This panel will address the following issues:

  • What are the main benefits and risks associated with Big Data and the integration of large, diverse datasets?
  • What critical social, moral and legal problems are raised by Big Data and what could be the way forward to minimize the risks while not compromising the benefits?
  • Is the current philosophy of data protection in Europe compatible with Big Data or is it deeply called into question?

10.15 Coffee break

10.30 Balancing of Fundamental Rights in Online Copyright Enforcement

co-organised by IViR and CPDP

hosted by Serge Gutwirth, Vrije Universiteit Brussel (BE) & Nico van Eijk, University of Amsterdam (NL)

panel Fabienne Brison, VUB & Hoyng Monegier LLP (BE), Malcolm Hutty, EuroISPA (BE), Marietje Schaake, Member of European Parliament – ALDE (NL), Wendy Seltzer, World Wide Web Consortium & Chillingeffects.org (US)

There is an ongoing trend towards stricter enforcement of copyright on the internet. Is copyright enforcement possible without infringing fundamental rights?

11.45 DPAs and The Challenges of Cooperation

hosted by Charles Raab (University of Edinburgh) & Ivan Szekely (CEU)

panel Alexander Dix, Berlin Commissioner for Data Protection and Freedom of Information (DE), Hielke Hijmans, EDPS (EU), David Smith, Office of the Information Commissioner (UK), Kush Wadwha, Trilateral Consulting (UK)

Under the proposed European Data Protection Regulation, the data protection authorities of Member States are expected to co-operate with each other and with the Commission, and to achieve consistency in their activities. What are the prospects for this, and what has been their previous experience with joint activities and mutual assistance? The speakers in this panel are well-qualified to consider these and related questions in this subject, which is of great importance to the future of data protection.

13.00 Lunch

14.00 Data Protection: Redress Mechanisms and Their Use

co-organised by the EU Agency for Fundamental Rights and CPDP

hosted by Christopher Docksey (EDPS)

panel Ian Brown, Oxford Internet Institute (UK), Niraj Nathwani, EU Agency for Fundamental Rights (EU), Grzegorz Sibiga, Helsinki Foundation for Human Rights (PL), Eric Töpfer, Deutsches Institut für Menschenrechte (DE)

Earlier studies, including special Eurobarometer surveys and the report of the European Union Agency for Fundamental Rights (FRA) on data protection authorities, highlighted that redress mechanisms in the area of data protection are available but little used. FRA will undertake legal and social fieldwork research on Member States’ redress mechanisms in this area to offer insights into why they are so little used.

The panel will address the following issues:

  • usage of redress mechanisms in the area of data protection in the EU Member States;
  • barriers and incentives for using and applying particular redress mechanisms;
  • observations on need to improve accessibility and effectiveness of redress mechanisms;
  • observations concerning independence and resources of data protection authorities.

15.15 Coffee break

15.30 The Business Perspective on Data Protection Regulation

hosted by Christoph Luykx (Intel) & Rosa Barcelo (DG Connect – EC)

panel Frederico Etro, Università Ca’ Foscari (IT), speakers from IT firms

The panel aims to present an overview of the main issues on the horizon for ICT firms, given global legislative and regulatory developments such as the review of the EU data protection directive, as well as market innovations and growing complexities.

16.45 Data Protection Legislation and Start-Up Companies

hosted by Erik Valgaeren (Stibbe)

panel Yves Baudechon, Social Lab Group (BE), Harri Koponen, Rovio (FI), speakers from start-ups, a Member of the European Parliament [tbc]

The European internet start-up economy is growing fast. Businesses are starting and growing across the European Union, led by creative, driven people. More often than not, however, innovation relies on the processing of personal data. While there is no doubt that it is inspiring to see this level of impact in Europe, innovation needs the right regulatory environment in order to thrive. This panel seeks to dissect the apparent contradiction between the perception of bureaucratic burden in an increasingly citizen-focused data protection framework and the need for innovation-friendly regulation.

18.30 2013 International Champion of Freedom Award, and Cocktail offered by EPIC (till 20.00)

 

CPDP2013 Panels at Petite Halle

10.15 Coffee break

 

10.30 EU fight against botnets: a honeypot for personal data? (till 13.00)

co-organised by JRC-Institute for the Protection and Security of the Citizen and CPDP

hosted by Laurent Beslay, European Commission, JRC – Institute for the Protection and Security of the Citizen (EC)

panel Alberto Escudero-Pascual, IT46.se (IT), Eric Freyssinet, Cybercrime Division of Gendarmerie Nationale (FR), Corrado Leita, Symantec, Jean-Christophe Le Toquin, ACDC EU project Microsoft (FR), Pasquale Stirparo, JRC-Institute for the Protection and Security of the Citizen (EU).

How can legally admissible evidence for taking down botnets be offered while complying with the EU data protection regulatory framework, which imposes stringent safeguards on the confidentiality of personal communications and their related traffic data? Through an interactive discussion between the speakers and the participants, innovative solutions for detecting, measuring, analysing, mitigating and eliminating botnets, taking into account the principle of privacy by design, will be presented and debated.

13.00 Lunch

14.00 Onlife Manifesto – Being Human and Making Society in the Digital Age: Privacy in Light of Hannah Arendt

co-organised by DG Connect, CRIDS and CPDP

hosted by Luciano Floridi, University of Hertfordshire & University of Oxford (UK) & Michael Friedewald, Fraunhofer ISI (DE)

panel Charles Ess, University of Oslo (NO), Luciano Floridi, University of Hertfordshire & University of Oxford (UK), Claire Lobet-Maris, University of Namur – CRIDS (BE)

For Hannah Arendt, politics emerge from the plurality and the public space is the space lying between us, where each of us can experience freedom. “While all aspects of the human condition are somehow related to politics, this plurality is specifically the condition – not only the conditio sine qua non, but the conditio per quam – of all political life” (The Human Condition).

This panel will focus on what matters for the public space, and in particular:

  • the questions raised by the computing era and the current regulation of privacy;
  • the means needed to reinvigorate the sense of plurality;
  • the responses of the “Onlife Manifesto” produced by an interdisciplinary group of experts.

The speakers are members of the scientific group leading the conceptual work of the "Onlife Initiative". This initiative is part of the Digital Futures project, initiated by DG Connect: Nicole Dewandre, DG Connect, European Commission (EU).

15.00 Coffee break

15.30 Surveillance and Criminal Law

co-organised by EU PF7 project IRISS and CPDP

hosted by Antonella Galetta (VUB) & Gary T. Marx (MIT)

panel Eric Metcalfe, Monkton Chambers, Representative from the NGO Liberty [tbc] (UK), Representative from the company Omniperception [tbc] (UK), John Vervaele, University of Utrecht (NL).

This panel will look at how surveillance systems are operated in our everyday life and for law enforcement purposes. In particular, it will focus on the presumption of innocence and the impact of surveillance on fundamental rights. It will deal with these issues broadly as well as looking at the more specific contexts of the use and deployment of surveillance measures in pre-trial and post-trial settings. Specific surveillance technologies and practices will be examined, such as CCTV and electronic monitoring systems. These issues will be dealt with from a comparative perspective, considering the European and US experiences.

The main topics of discussion will be:

  • What are the effects of surveillance on the presumption of innocence?
  • How are the impacts of surveillance on the presumption of innocence countered by legislation and case law?
  • How are surveillance systems deployed in prisons and within the criminal justice system?
  • How is surveillance operated beyond regimes of custody?

16.45 Surveillance, Democracy and the State (till 18.00)

co-organised by EU PF7 project IRISS and CPDP

hosted by Reinhard Kreissl (IRKS) & Chiara Fonio (Università Cattolica del Sacro Cuore)

panel Roger Clarke, Xamax Consultancy Ltd (AUS), Ben Hayes, Statewatch (UK), Clive Norris, University of Sheffield (UK), Rowena Rodrigues, Trilateral Research and Consulting (UK)

Surveillance and democracy are intimately intertwined. Every modern polity has developed an elaborate system for identifying constituencies and monitoring and controlling the population. Modern surveillance practices as a means of governance are introduced with a double justification. On the one hand, a surveillance regime is required for the distribution of entitlements such as social welfare payments, providing the data for social planning and the allocation of resources. On the other hand, large-scale surveillance is supposed to help identify predators, criminals and terrorists. In addition, the private sector plays an increasing role, both as an accomplice in state surveillance and by forming its own nucleus of surveillance.

The panel will address the following issues in particular:

  • The non-reciprocal nature of visibility in contemporary “democratic” surveillance societies. Should the watchers be as transparent as the citizens they surveil?
  • Is the democratization of surveillance technologies possible? If yes, to what extent?
  • Why are the legitimacy and social cost of surveillance technologies so often overlooked?
  • Do modern surveillance technologies require new ethical thinking?

CPDP2013 at La Cave

10.15 Coffee break

10.30 Anti-Discrimination by Design in Social Data Mining

co-organised by the EU PF7 project MODAP and CPDP

hosted by Dino Pedreschi (University of Pisa) & Rosamunde Van Brakel (Vrije Universiteit Brussel)

panel Raphaël Gellert, Vrije Universiteit Brussel (BE), Stan Matwin, Dalhousie University (CA), Salvatore Ruggieri, University of Pisa (IT), Tal Zarsky, Faculty of Law, University of Haifa (ISR)

Social data are at the heart of the idea of a knowledge society, in which decisions can be taken on the basis of the knowledge contained in these data. Mining technologies enable the extraction of profiles that are useful for screening people, for instance when searching for those with a certain behavior. Profiles are useful in many contexts, from criminal investigation to marketing, from genetic screening to website personalization. They help categorize people on the basis of their personal and intimate information. Unfortunately, this categorization may lead to unfair discrimination against protected groups. It is obvious that discrimination jeopardizes trust; inscribing non-discrimination into knowledge discovery technology by design is therefore becoming indispensable.
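As a hypothetical illustration of the kind of safeguard discussed here, a discrimination-aware mining pipeline might audit a classifier's decisions with the "80% rule" disparate impact test. The groups, decisions and threshold below are invented for the example, not taken from the panel:

```python
# Sketch of a disparate impact audit: compare favourable-outcome rates
# across groups; a ratio below 0.8 is the conventional red flag.

def disparate_impact(decisions):
    """decisions: list of (group, favourable: bool) pairs.
    Returns the ratio between the least- and most-favoured groups'
    favourable-outcome rates (1.0 = perfect parity)."""
    totals, favourable = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        favourable[group] = favourable.get(group, 0) + (1 if ok else 0)
    rates = [favourable[g] / totals[g] for g in totals]
    return min(rates) / max(rates)

# Invented screening decisions for two demographic groups "A" and "B".
sample = [("A", True), ("A", True), ("A", False), ("A", True),
          ("B", True), ("B", False), ("B", False), ("B", False)]
ratio = disparate_impact(sample)
# Group A: 3/4 favourable; group B: 1/4 -> ratio 0.25/0.75, below 0.8
```

A by-design approach would run such a check inside the mining process itself, rejecting or repairing models that fail it rather than auditing them after deployment.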

11.45 Engineering privacy-aware systems and services

co-organised by the EU PF7 project NESSOS and CPDP

hosted by Fabio Martinelli (CNR Pisa)

panel Fabio Martinelli, CNR-Pisa (IT), J. Lopez, University of Malaga (ES), M. Clavel, Imdea, J. Cuellar, Siemens.

This panel aims to examine the concept of privacy by design from several perspectives. Indeed, most current engineering approaches consider security only at the technological level, failing to capture the high-level requirements of trust or privacy. The panel will discuss privacy-enhancing mechanisms for future internet services, in particular for mobile devices.

13.00 Lunch

14.00 Privacy by Design in Big Data and Social Data Mining

co-organised by the EU PF7 project MODAP and CPDP

hosted by Fosca Giannotti (University of Pisa)

panel Elena Ferrari, University of Insubria (IT), Roberto Lattanzi, Italian Data Protection Authority (IT), Manolis Terrovitis, Institute for the Management of Information Systems (GR), Tal Zarsky, Faculty of Law, University of Haifa (IS)

One of the most fascinating challenges of our time is understanding the complexity of the global interconnected society. Big data, originating from the digital breadcrumbs of human activities, promise to let us scrutinize the ground truth of individual and collective behavior. However, the big data revolution is in its infancy, and there are many barriers to setting the power of big data free for social mining, so that scientists, and in prospect everybody, can access these knowledge opportunities. One of the most important barriers is the right of each individual to privacy, i.e., the right to protect one's personal sphere against uncontrolled intrusions. The key question is: how can the right to access collective knowledge and the right to individual privacy co-exist?
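One baseline privacy-by-design safeguard for releasing social mining data is k-anonymity: every combination of quasi-identifier values must be shared by at least k records, so no individual stands out. The records and attribute names below are invented for illustration; this is a minimal sketch, not any panelist's method:

```python
# Sketch of a k-anonymity check over a table of (already generalized) records.
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """records: list of dicts; quasi_identifiers: keys an attacker could
    link to external data. True if every quasi-identifier combination
    occurs at least k times."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())

# Invented, pre-generalized records (ages bucketed, ZIP codes masked).
rows = [
    {"age": "30-39", "zip": "100**", "diagnosis": "flu"},
    {"age": "30-39", "zip": "100**", "diagnosis": "cold"},
    {"age": "40-49", "zip": "101**", "diagnosis": "flu"},
]
print(is_k_anonymous(rows, ["age", "zip"], 2))  # False: the 40-49 group has one row
```

In practice the publisher would generalize or suppress values until the check passes for the chosen k, trading analytical detail against re-identification risk.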

15.00 Coffee break

15.30 Academic Papers Session (till 18.00)

hosted by Ronald Leenes (Tilburg University) & Jean-Pierre Nordvik (JRC) (tbc)

Session 1: Empirical research and case: privacy attitudes, concerns and responses • Papers

  • The cost of using Facebook: Assigning value to different aspects of privacy protection on social network sites, Wouter Steijn (Tilburg University, NL)
  • “All my mates have got it, so it must be okay”: Constructing a Richer Understanding of Privacy Concerns, Anthony Morton (University College London, UK)
  • The Rise of African SIM Registration: Mobility, Identity, Surveillance & Resistance, Kevin Donovan (University of Cape Town, RSA) and Aaron Martin (London School of Economics and Political Science, UK)
  • Personal Data Protection in Malaysia; Different Principles. Different Approaches, Noriswadi Ismail (Quotient Consulting, Malaysia)

Session 2: Data protection concepts, regulation, and reform • Papers

  • The proposed data protection regulation and international data transfer in cloud transformations: a kaleidoscopic view, Iheanyi Nwankwo, Corrales Marcelo and Nikolaus Forgó (Leibniz Universität Hannover, DE)
  • The Impact of Data Protection on the Interests of Individuals: The Data Protection Premium?, Orla Lynskey (London School of Economics, UK)
  • Forgetting about consent. Why the focus should be on “suitable safeguards” in data protection law, Gabriela Zanfir (University of Craiova, RO)
  • Realizing the Complexity of Data Protection, Marion Albers (Hamburg University, DE)

 

CPDP2013 side events second day

20.00 Open Roundtable: No Free Lunch on Social Media at De Markten

organised with deBuren, Les Halles, EMSOC and CPDP

hosted by Jo Pierson, iMinds-SMIT Vrije Universiteit Brussel (BE) and Dominique Deckmyn, De Standaard (BE)

panel Colin J. Bennett, University of Victoria (CA), Rob Heyman, iMinds-SMIT Vrije Universiteit Brussel (BE), Alain Heureux, IAB (EU), Bruno Schröder, Microsoft Belux (BE)

Social media seem to challenge users’ privacy, as the platforms value openness, connecting, and sharing with others. They have become the focal point of privacy discussions, as EU regulation and consumer organisations advocate for privacy.

In a world of ever-increasing connectivity, online advertising has become a key source of income for a wide range of online services and a crucial factor in the growth and expansion of the Internet economy. For digital advertising to continue to grow, it needs the right set of rules. Trust in emerging technologies also needs to be encouraged, so that consumers feel comfortable using them. How much privacy do users expect on platforms designed to share information?

In this debate we wish to dissect the privacy definitions proposed by social media platforms, advertisers and consumer representatives. We especially wish to focus on the trade-off made on all social media: users are offered free access to the platforms, but in the end advertisers pay for this free lunch.

In this context, solutions that offer users online anonymity, bypassing the current web 2.0 business models, will also be examined. How can the ecosystem work and be sustainable? We want to discuss whether a perfect fit exists for the three stakeholders: users, social media platforms and advertisers.

Registration is required at http://emsoc.be/wec_events/no-free-lunch-on-social-media-2/?ac=1T

Obama administration to spend $200 million on the research and development of "Big Data"

Aiming to make the most of the fast-growing volume of digital data, the Obama Administration today announced a “Big Data Research and Development Initiative.”

Highlights of the press release:

  • the initiative will improve “our ability to extract knowledge and insights from large and complex collections of digital data”
  • “In the same way that past Federal investments in information-technology R&D led to dramatic advances in supercomputing and the creation of the Internet, the initiative we are launching today promises to transform our ability to use Big Data for scientific discovery, environmental and biomedical research, education, and national security,” said Dr. John P. Holdren, Assistant to the President and Director of the White House Office of Science and Technology Policy.
  • the purpose of the Big Data Research and Development Initiative is to:

– Advance state-of-the-art core technologies needed to collect, store, preserve, manage, analyze, and share huge quantities of data;

– Harness these technologies to accelerate the pace of discovery in science and engineering, strengthen our national security, and transform teaching and learning; and

– Expand the workforce needed to develop and use Big Data technologies.

  • several concrete projects will be developed by the National Science Foundation, the Department of Defense, the National Institutes of Health, the Department of Energy, and the US Geological Survey.

Read the whole press release HERE.