Data protection in the United States: Where do we go from here?

Earlier this month, Facebook founder and CEO Mark Zuckerberg testified before the U.S. Congress to answer questions about the Cambridge Analytica scandal and how his company operates. Questioning during the hearings revealed that what happened is only the tip of the iceberg in a broader data privacy crisis for companies like Facebook that rely on harvesting and manipulating user data. If media attention to the hearings in the U.S. and globally is any indication, there is now strong momentum behind changing the law to stop another “Cambridge Analytica” from happening. Many of the questions from members of Congress at the hearing were quite pointed. Nevertheless, it is not yet clear what lawmakers are actually prepared to do.

Whether in the U.S. or elsewhere, we advocate for comprehensive, user-centered data protection, and for laws and regulations that meet international human rights standards (see Creating a Data Protection Framework: A Do’s and Don’ts Guide for Lawmakers, built on our experiences working with lawmakers on Europe’s General Data Protection Regulation, or GDPR). Our mission is to defend and extend the digital rights of users at risk, and privacy is a cornerstone of human rights in the digital age. Data protection is of critical importance to privacy, free expression, and a healthy, functioning democracy. However, getting there entails different approaches in different countries, and each piece of legislation must be evaluated on its merits.

In this post, we look at the context for reform in the U.S., provide a brief analysis of current proposals before Congress for data protection, and outline the provisions that we would like to see in proposals going forward.

Why is the path unclear?

Many countries and regions have passed laws to protect people’s data, and the European Union even recognizes data protection as a human right. The United States has not done anything of the kind, instead favoring an approach that has led to repeated, catastrophic data breaches and privacy violations with little or no recourse for users.  

The U.S. has taken a “sectoral” approach to privacy, meaning there is a patchwork of laws that give us some limited protections for certain types of data, like our health data or student data. But there is no blanket protection from unchecked data collection, misuse, manipulation, or abuse.

In fact, the only real limitation on what companies like Facebook or Cambridge Analytica can do with our data is the Federal Trade Commission (FTC) Act’s prohibition on “unfair or deceptive acts or practices,” as enforced by the FTC. The agency conducts investigations and enters into consent decrees with entities believed to have violated this provision. These decrees are important, but it’s abundantly clear that they are not effective. After all, Facebook was already subject to a consent decree dating back to 2011, and it did little, if anything, to prevent the Cambridge Analytica misuse of data. Notably, experts have called attention to deficiencies in the independent audits (or “assessments,” depending on which official FTC statement or document you read) that are frequently required under consent decrees:

“These audits, as a practical matter, are often the only ‘tooth’ in FTC orders to protect consumer privacy. They are critically important to accomplishing the agency’s privacy mission. As such, a failure to attend to their robust enforcement can have unintended consequences, and arguably, provide consumers with a false sense of security.”

The Cambridge Analytica scandal highlights many of the problems with the U.S. approach. We generally have very little understanding of how much data companies like Facebook collect about us, let alone of the profiles they create or the inferences they draw from that data. Further, these companies are under little obligation to provide meaningful information about whether and how our profiles are purchased, analyzed, or transmitted. Terms of service are often long and complicated, yet provide little useful detail. For example, many companies have a lengthy, impenetrable policy with provisions that let them share undefined personal information with undisclosed “third parties,” including “vendors” and/or “business partners.” It is this lack of restrictions on processing our personal data that creates an environment where Cambridge Analytica can manipulate the huge cache of data it has procured to design and serve misleading advertisements.

In short, the status quo in the U.S. does not protect people, and as such, it is not sustainable. So we are starting to see members of Congress introduce or revive pieces of data protection legislation. Some of these proposals are regressive and may only serve to further entrench the prevailing business model that rewards unchecked data collection, manipulation, and exploitation in the dark. Others are a solid starting point for a conversation about what is necessary to provide the data protection people in the U.S. and around the world desperately need.

Here is our analysis of current federal data protection proposals (PDF). As you can see, there is no magic bullet, and none of the proposals now in play do all the things we need to prevent what happened with Cambridge Analytica or adequately protect people.

When do the trumpets cheer? Here’s what we need to protect users

As we note above, Access Now released a guide for lawmakers on creating rights-respecting data protection regulations, based on our experiences working with lawmakers in Europe on the GDPR, which comes into force next month. Building on our work in Brussels and globally, we have created a list of items that should be included in a truly comprehensive, federal approach to data protection. These are the elements necessary to fully protect people, in the U.S. and elsewhere, in our increasingly connected world.

First, a comprehensive set of data protection laws should apply equally to any entity that collects, uses, or manipulates information about people, whether public or private. It should not preempt stronger protections that already exist in federal or state law, nor prevent the creation of new ones. It should also be forward looking, contemplating the wealth of information that will become available through the Internet of Things. And it should support the growth of business models that are not built on the collection and exploitation of massive amounts of sensitive data.

The law should provide the following set of rights:

User rights

  • Right of access
  • Right to erasure
  • Right to rectification
  • Right to explanation
  • Right to portability
  • Right to object

Ideally, it should create and fund:

Government programs and investments

  • Creation of a grant program for companies investing in privacy-protective business models and practices, including any model not based around user data;
  • Commitment to the protection of digital security, including encryption, and investment in research and development to explore best methods for protecting user data;
  • Creation of a board to develop security best practices for Internet of Things devices (Representative Lieu and other members of Congress have already introduced a bill that would take this approach);
  • Investment in companies that explore and develop systems for greater interoperability of edge providers;
  • Research into the harms of data breaches of non-financial personal data and potential redress mechanisms to respond to those harms; and
  • Establishment of an independent data protection commission with authority and resources to monitor implementation, conduct investigations, and sanction entities in case of data protection violations.

It should require:

Obligations on all entities processing data

  • Limitation of data processing to specific, enumerated legal bases (meaningful, opt-in consent, execution of a contract, or necessity under law), with heightened protections for the most sensitive data;
  • Affirmative obligation to issue timely notification to users when, and to whom, data are transferred, eliminating the shadow internet industries built around user data by creating a connection back to the person;
  • A blanket public data breach notification for all breaches, with individualized notice required in cases of potential harm, including emotional harm;
  • Prohibition on the use of algorithms to arbitrarily discriminate, including against marginalized communities and communities of color; and
  • Further prohibition on mandatory arbitration clauses for users.

Once more, with feeling

From Europe to India, advocates around the world are fighting for better data protection, and winning. However, this is not the first rodeo for groups fighting here in Washington, DC. The U.S. has taken steps toward data protection legislation in the past, but, despite these efforts, it has never remained a high priority for members of Congress. Yet as people continue to embrace the digital world in their everyday lives, and more sensitive information moves into the custody of applications, platforms, and other third parties, the need has only grown more critical, and we can’t afford to wait any longer. We urge Congress to pay attention to the interactions between Facebook, Kogan, and Cambridge Analytica, not to mention the increasingly egregious data breaches and abuses, and to act now to plug the gaping holes in U.S. law. It may already be too late to provide redress for present harms, but we can help safeguard the future.