Image source: Alec Perkins
In September the EU Commission published its draft proposal for an improved export controls regime, seeking to modernise and simplify the existing system to limit the spread of technology for online surveillance. This “dual-use” technology has increasingly been employed to violate human rights around the world, but regulating it requires a delicate balance. As the term dual-use implies, technology in this category has fully legitimate uses, as well as potentially harmful ones. The same technologies that underlie surveillance systems are used for security research.
Previously, similar proposals have raised concerns globally, with many experts asserting that the lack of precision in the definitions used in export control legislation would have a chilling effect on digital security research. Computer scientists in particular are often averse to government restrictions on the export of technology because, historically, restrictions on cryptography have harmed computer security. Those fears have been exacerbated by some of the recent debate regarding the scope of controls on “intrusion software” in the Wassenaar Arrangement, but the need to limit the export of surveillance technology remains. The EU Commission hopes to make the framework for these controls more functional.
Overall, it’s a good start
The proposal aims to address some of the concerns cited above by introducing a “catch-all” mechanism intended to either flag or green-light technology based on specific characteristics. In practice, this means that a whole category of technology could be expedited for export, while other, previously unregulated technologies might now fall under the umbrella of export control and require an export license. The proposed catch-all mechanism is a unique tool, and it could become essential in regulations that are technology-neutral and adaptable to future developments in technology and human rights.
All right, but if you had to nitpick?
Well, for the catch-all to work, it relies on companies conducting due diligence and potentially submitting their technology for “voluntary” review by the export controls authority when that technology has not been explicitly regulated. This is as counter-intuitive as it sounds. It puts a lot of trust in, and responsibility upon, companies’ internal compliance programs. These programs would be hard to monitor; after all, companies don’t have the same transparency requirements that governments do.
There is also room for more transparency in reporting on technology assessment through the catch-all. It would be incredibly valuable for civil society, academics, and NGOs alike to gain insight into which technologies are being waved through and which are flagged as requiring a license for export. The draft legislation specifies a 30-day internal period for consultation between export authorities, but there is no mechanism for insight or input into the process by anyone else. That is something that should be remedied going forward.
Notably, encryption technology has been put on the list of EUGEAs (EU general export authorisations). Theoretically, this means that encryption technology would not carry a heavy regulatory burden, since companies would not be required to apply for approval each time there is an update or an internal transfer. However, in reality, it’s likely that any product intended for export that incorporates encryption functionality will still be put through the wringer. That leaves room for improvement.
There are also lingering concerns about definitions. The proposal uses terms such as “terrorism”, “human rights violations”, and “serious human rights violations” without much-needed context, an issue that the Commission promises to fix once the guidelines in the proposal are further developed. Some could, would, and have argued that it is a little too late in the process to have left this issue unaddressed. Right now the guidelines as such have questionable legal force and the catch-all process is voluntary. This could prove to be yet another example of privatised enforcement measures that put the rule of law at risk.
Going forward, we need more collaboration
Ultimately, this proposal puts a great deal of responsibility on the export control authorities, who will need to ensure that controls are narrowly applied, limiting the global spread of equipment, software, and technology designed for surveillance, without chilling security research or hobbling the basic functionality of systems. Legislators must therefore put emphasis on consulting with technologists, industry, and civil society to make sure companies take responsibility for respecting human rights, while maintaining and promoting internet security across the board.
This conversation is well underway, with the next stakeholder meeting taking place today, on 12 December 2016. We are in the room together with other civil society groups to advocate for human rights and strong security, and to ensure that any new regime is implemented with adequate transparency and accountability. As always, we will keep you updated as the regulation moves forward.