What the EU is getting wrong about the Internet of Things

At the end of January, a couple of us at Access Now attended the Computers, Privacy, and Data Protection (CPDP) conference in Brussels, Belgium. The theme was “The Internet of Bodies,” and the topic du jour was everything connected to the Internet of Things, with discussions ranging from its potentially positive impact on society to downright dark and dystopian visions of a world where humans lose autonomy to machines.

But is there cause to worry?

At one of the panels, “Information and Law Enforcement,” Jan Philipp Albrecht, Member of the European Parliament for the German Greens, highlighted the inadequacy of current legal frameworks for dealing with the real threats that a connected IoT world brings. He argued that there are types of data to which law enforcement should NEVER have access; that states should never demand data retention from the IoT; and that as a (still) liberal society, we need to revisit our values and take a critical look at how we treat privacy as this technology spreads.

This created some buzz on the panel and in the audience. Policy makers have begun to address the concerns raised by the IoT, but that effort is largely focused on cybersecurity for vital infrastructure — like preventing massive data breaches or botnet attacks — and not necessarily on ensuring better protection for individuals. In the EU, we have one of the strongest data privacy frameworks worldwide, the General Data Protection Regulation (GDPR). It is intended to protect our personal data, but it does not apply when information is sought in a criminal investigation; in the EU, that area is covered by the Police Directive and national legislation in member states. In the world of IoT, this distinction could be problematic for human rights. States may push ahead with over-broad legislation and crime-prediction data analysis, and unless we are careful, we could end up living in the digital panopticon that civil society has so long cautioned against.

The issues that Albrecht raises suggest that it may be time to build a wall between the state (law enforcement or otherwise) and IoT-generated and curated data about us, before it all finds its way into the cloud. A wall that no warrant, or direct request to a company, can cross. When we turn our personal thoughts, habits, and needs into data, those data should be ours alone to hold and handle, much as they would be if they existed only in our own minds.

Innovation and other buzzwords

The EU Commission has an Internet of Things unit, housed within DG Connect as part of its Digital Single Market implementation strategy. The overarching intention of this unit is to strengthen trust in the digital ecosystem, but in practice its focus is on creating a stable market for the IoT and making sure that EU policy serves IoT innovation and research. Looking at the outcomes document from an IoT Privacy and Security Workshop or the Commission’s working document for the advancement of the IoT in Europe, known as the Digitising European Industry framework (DEI), it would be easy to forget about IoT privacy threats. In fact, policy makers appear to consider the biggest obstacles to a thriving IoT environment to be building the capacity of networks to handle a large number of connected devices, and the fragmented approaches to getting that done.

To be fair, the DEI does mention privacy when it identifies creating a human-centered IoT as a key challenge:

“Risk of users being forced to compliance and data sharing instead of developing a human-centred IoT where users can trust that the IoT systems around them operate according to understood principles and guarantees for their integrity, privacy and security.”

However, this apparent emphasis on privacy is likely a smokescreen meant to build users’ trust in the utility of Europe’s IoT services; the true end-game is to serve the market rather than to benefit and protect European users.

This is in stark contrast to the approach taken under the current e-Privacy reform, which seeks to move away from a service-based approach in order to benefit and serve users and protect their privacy rights. This strategy of taking credible action to build users’ trust in the online environment is indispensable for the success of the digital economy.

The other piece of this puzzle for the European Commission is certification and trust labeling for the IoT — which could be key for ensuring the security of the infamously insecure world of connected things. This is still under development. It is partially driven by a need to secure critical infrastructure (as required under the Network and Information Security Directive) and partially by similar initiatives under the European Union’s Cybersecurity Strategy. If done right, certification could help ensure better security practices, although it won’t solve every problem (in fact it can create an artificial ceiling for security), and does nothing to restrict the growing use of data-crunching technologies that are increasingly intertwined with our physical interactions.

Proceed with human rights!

That data crunching is a problem for human rights. Even if we achieve better security for the Internet of Things, that does not guarantee better privacy for us. There is a narrative across the EU that although privacy and data protection must be respected, they simply cannot come at the cost of innovation and research. That idea is itself flawed. There is no better time than right now to question the world being built around us, and as civil society, it is our duty to ask the hard questions and make sure innovators have good answers before they proceed.

The framework for human rights in an IoT world will not survive if it is anchored solely in cybersecurity policies aimed at protecting infrastructure. Moreover, as most techies will tell you, no amount of certification or standardization can fully guarantee the security of a system. To guarantee human rights in the Internet of Things, our key focus must be on the users — giving them real protections and safeguards (for more on this, read our blog post on human rights safeguards for the IoT). The European Commission should be more ambitious when it comes to regulating the IoT market, rerouting its focus from the market to the users.