The state of play on the European data protection reform package
10:53am | 24 October 2012 | by Raegan MacDonald
Recently, Access attended a two-day hearing at the European Parliament that brought together parliamentarians from EU member states to discuss the Data Protection Reform Package (a proposed Regulation and Directive).
The overarching theme of the conference was building “trust,” which is particularly important for ensuring consumer protections and fostering commerce in the online environment. As Vice-President of the Commission Viviane Reding pointed out, only 1% of Europeans had internet access when the current Directive was passed in 1995, so the framework must be strengthened and adapted to suit the realities of ubiquitous computing, and be “future proof” to withstand more technological changes. This will be no easy task.
Speakers included representatives from the Commission, the European Parliament, civil society, academia, and the United States. The hearing attempted to touch on the many aspects of the reform package, which consists of two pieces of legislation -- the proposed General Data Protection Regulation (GDPR) and the Data Protection Directive (which will deal primarily with the processing of data for law enforcement purposes). Like many events that try to synthesise a vast range of complex topics into two days, there was a lot of “buzzword bingo”. Overall, however, it was productive to hear the main concerns and positions of the MPs from individual member states.
What follows is a brief overview of some of the issues discussed that are of particular relevance to Access’ work and members.
Right to be Forgotten
The now infamous Right to be Forgotten (Article 17 of the proposed GDPR) received a great deal of attention from national parliamentarians. Broadly, there are two areas of criticism: one concerns the potential for this right to infringe on free expression, and the other the possibility of imposing intermediary liability (by making service providers responsible for the deletion of content). The name itself is slightly misleading, and it is unclear how this right could realistically be applied to the online environment without detrimental effects on free speech. However, it is worth pointing out that the principle itself builds on rights that already exist in European data protection law. The primary purpose of Article 17 is to enable people to have data removed that they have shared about themselves; it does not cover data published by others.
Additionally, Article 80 of the proposed Regulation directs member states to provide exemptions and derogations to protect freedom of expression. The trick will be to ensure that the Regulation’s exceptions are expanded to cover research, (all) media, and artistic and literary expression. As individual member states sometimes interpret the fundamental right to freedom of expression differently, there remains some skepticism as to how this will guarantee that the right to privacy and the right to expression do not conflict.
Making controllers (or services) liable for the enforcement of this right is another dangerous and unwanted potential result of the Right to be Forgotten. Holding online services liable for the availability of content over which they have no control can lead to measures (e.g., blocking, de-indexing) that infringe on freedom of expression, and could also lead to the implementation of monitoring technologies that greatly erode privacy -- which would be, needless to say, a perversion of what this right intends to do.
Privacy by Design / by Default
Another addition to the data protection framework is data protection by design and by default, found in Article 23 of the GDPR. The concept of privacy by design is simple -- data controllers, in both the private and public sectors, should “bake” privacy into the technological architecture of products and services as well as into organisational policies. This “end to end” privacy approach includes, but is certainly not limited to, making use of privacy enhancing technologies (PETs) and carrying out privacy impact assessments where necessary.
This concept was also discussed in moderate detail, particularly in Session IV on “data controllers and processors in the private and employment sector.” Notably, Managing Director at Facebook (and former MEP) Erika Mann explained that while privacy by design is indeed a novel idea, it does not lend itself to application to social networks. This is not the first time we’ve heard statements like this from large companies, but the line of argumentation is flawed and frankly dilutes the quality of data protection debates by pitting sharing and privacy against each other.
While it is true that people join social networks like Facebook to share, that does not imply that they do not also value their privacy. Having your information shared with anyone (including advertisers) without your knowledge is completely different from making conscious choices about sharing details about yourself with your friends and family.
It is no secret that tension exists between the practices of large US companies -- such as Facebook and Google -- and European data protection regulators. However, touching on the sensitive divide in privacy approaches between the two sides of the Atlantic, Marc Rotenberg, President of the Electronic Privacy Information Center (EPIC), reminded attendees that in principle we all want the same things: updated and relevant privacy laws, with enforceable legal rights; the honouring of privacy policies and companies held accountable for their actions; transparency on data processing; privacy by design and PETs routinely adopted; special protections for children; and data subjects in control of their data. These are the elements required to build trust.
He also indicated support for a strong international framework, and sees the GDPR as a potential first step towards this goal. We couldn’t agree more.
Public pressure abounds: How can the EU make this work and not kill innovation?
It’s well known that the Commission has been under pressure from the US Chamber of Commerce to weaken the proposed Regulation. The Chamber seems particularly concerned with aspects such as the heightened fines that can be imposed, “privacy by design,” and mandatory data breach notifications. Adam Schlosser, the US Chamber of Commerce’s Senior Manager for Global Regulatory Cooperation, said in an interview with TechWeekEurope that the Chamber has been engaged on the GDPR since March, with a task force of 50 working on it.
However, one of the greatest misconceptions about strengthening user privacy is that it’s bad for the economy. Commissioner Reding has pointed out that the Regulation will save EU companies 2.3 billion Euros (3 billion USD) per year in administrative costs.
MEP Alexander Alvaro (Germany, ALDE) floated the idea of the GDPR becoming part of a global standard on data protection. However, despite new efforts to increase privacy -- such as the Consumer Privacy Bill of Rights -- the United States is still a long way from concrete legislation regulating privacy. Data protection is also becoming a meaningful criterion for supporting and ensuring trust in international trade. Many emerging economies, such as India, are eager to be certified as data-secure by Europe to enable greater bilateral trade.
Consumer research shows that, even in the US, when consumers have trust -- which is best fostered when they know what data is being collected, by whom, and what’s being done with it -- they are more likely to participate in the e-economy. Strong data protection laws, then, are good not only for individual rights but for the economy. This is what builds trust; this is what the future looks like.