No more deceptive designs: recommendations for U.S. lawmakers 

Digital products, services, and platforms have become an integral part of our lives, shaping our experiences and interactions. The design of these digital interfaces plays a crucial role in establishing trust and facilitating seamless user experiences. However, as technology evolves and new generations embrace digital tools, more of us are falling victim to manipulative tactics embedded in these interfaces and aimed at influencing our decisions. These tactics, commonly known as “deceptive designs” and previously as “dark patterns,” erode our autonomy and control.

While the term defies any single definition, a “dark pattern” can be understood as a user interface design technique in a product or service that aims to influence a person’s decisions, often against their own best interests. The term “dark patterns,” coined in 2010, has recently come under scrutiny for associating “dark” with “bad,” and multiple actors, including the originator of the phrase, have adopted “deceptive design” as an alternative.

The consequences of deceptive designs extend beyond mere inconvenience. They give rise to significant human and civil rights violations, compromising individual freedoms and privacy. For instance, many services and platforms coerce individuals into sharing their personal information even when it is not necessary to provide the service. The companies behind websites and applications, particularly those with e-commerce elements, often employ deceptive designs to nudge people into spending more money than they intend. Others use intricate menu structures and convoluted navigation processes that waste people’s time in order to boost “engagement.” Consent requests are often just as confusing: more often than not, people are forced to give consent to access a service, and there is no easy way to revoke that consent, even if the option is implied in a company’s privacy policy or terms of service. Moreover, when people recognize they have been trapped by these deceptive designs, they may persist with their “choice” due to shame, embarrassment, or frustration.

To address this issue, Access Now has prepared a set of principles to guide U.S. lawmakers seeking to protect their constituents from deceptive designs. We strongly encourage policymakers to prioritize legislative and policy proposals that safeguard individual autonomy and protect users’ data, preventing exploitation and abuse.

Design should empower individuals to make informed decisions and exert control over their digital experiences. By advocating for transparency, ethical design practices, and robust regulatory measures, lawmakers can foster a healthy digital ecosystem that respects people’s autonomy and dignity and upholds fundamental human rights.

As an organization committed to defending and extending the digital rights of individuals and communities at risk, we are prepared to help lawmakers make informed decisions about deceptive design policy. If you have any suggestions or questions regarding these recommendations, please contact Willmary Escoto at [email protected] and Sage Cheng at [email protected].

OUR RECOMMENDATIONS

When drafting legislation to combat deceptive interface designs, lawmakers should consider the following recommendations.

1. Foster transparency with regard to privacy

Ensure relevant entities, such as companies or organizations, design their products and services with their most vulnerable users in mind. This includes providing privacy policies that are clear, transparent, comprehensible, and easily accessible, rather than burdensome to read or drafted in complex jargon, especially when the information is aimed at children. Entities should therefore use plain language, accommodate a wide range of digital literacy, and establish mechanisms for individuals to report violations and seek remedies. Consent processes should be clear, unambiguous, and free from elements designed to coerce or manipulate. Entities should obtain consent through opt-ins rather than pre-selected checkboxes, and individuals should have easy ways to opt out.
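
To make the opt-in principle concrete, the sketch below (in TypeScript, with hypothetical names) shows one way a consent model can default every purpose to “not granted” and make withdrawal exactly as simple as granting. It is a minimal illustration of the principle, not a prescribed implementation.

```typescript
// Minimal sketch of an opt-in consent model (hypothetical names).
// Every purpose defaults to "not granted"; nothing is pre-selected,
// and withdrawing consent mirrors how it was given.

interface ConsentOption {
  purpose: string;   // e.g. "marketing emails"
  granted: boolean;  // must start false: no pre-selected checkboxes
  grantedAt?: Date;
}

function createConsentForm(purposes: string[]): ConsentOption[] {
  // All options start unchecked; the person must actively opt in.
  return purposes.map((purpose) => ({ purpose, granted: false }));
}

function grantConsent(options: ConsentOption[], purpose: string): void {
  const option = options.find((o) => o.purpose === purpose);
  if (option) {
    option.granted = true;
    option.grantedAt = new Date();
  }
}

function revokeConsent(options: ConsentOption[], purpose: string): void {
  // Revocation is one explicit action, with no extra hurdles.
  const option = options.find((o) => o.purpose === purpose);
  if (option) {
    option.granted = false;
    option.grantedAt = undefined;
  }
}

// Usage: access to the core service never depends on these optional purposes.
const form = createConsentForm(["marketing emails", "third-party analytics"]);
grantConsent(form, "marketing emails");
revokeConsent(form, "marketing emails");
```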

2. Enforce accessibility standards

Further build on web accessibility laws and policies to ensure deceptive interface designs do not create barriers for people with disabilities, senior citizens, and disadvantaged communities. This includes guidelines for clear language, visual displays, color contrast, and other accessibility considerations. Entities should create symmetrical designs, treat options equally, and avoid privileging one option over another through color, font, or styling. Legislation should also ensure options are displayed consistently across different environments and devices.
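
As a hypothetical illustration of “treating options equally,” the sketch below renders an “Accept” and a “Decline” choice from a single shared style definition, so neither can be visually privileged through color, size, or emphasis. The style values are placeholders; the point is the symmetry, not the specific design.

```typescript
// Sketch: both choices share one style definition, so "Accept" and
// "Decline" render with identical color, font, and size (placeholder values).

interface ChoiceStyle {
  fontSize: string;
  fontWeight: string;
  background: string;
  color: string;
}

const sharedStyle: ChoiceStyle = {
  fontSize: "16px",
  fontWeight: "400",
  background: "#e0e0e0",
  color: "#1a1a1a",
};

function renderChoice(label: "Accept" | "Decline"): string {
  // Same style object for both options: no visual discrimination.
  const s = sharedStyle;
  return (
    `<button style="font-size:${s.fontSize};font-weight:${s.fontWeight};` +
    `background:${s.background};color:${s.color}">${label}</button>`
  );
}

console.log(renderChoice("Accept"));
console.log(renderChoice("Decline"));
```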

3. Require data minimization

Encourage data minimization by requiring entities to collect and retain only the minimum data necessary for legitimate purposes. Minimizing the amount of data companies collect reduces the risks associated with excessive collection and storage, and it is one of the best, most rights-respecting ways to prevent data and privacy violations. Personal data should only be collected for specific, legitimate purposes and should not be used or disclosed for incompatible ones. Entities should clearly state the purpose of data collection and obtain informed consent for any additional uses.
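
As a hypothetical sketch of what purpose-bound data minimization can look like in practice, the TypeScript example below ties each collected field to a declared purpose and discards anything not needed for that purpose before the data is stored. The field names and purposes are illustrative assumptions, not requirements.

```typescript
// Sketch of purpose-bound data minimization (hypothetical fields and purposes).
// Only the fields needed for the declared purpose are kept; everything else
// submitted by a form or API call is discarded before storage.

type Purpose = "order_fulfillment" | "account_login";

// Fields considered necessary for each purpose.
const requiredFields: Record<Purpose, string[]> = {
  order_fulfillment: ["name", "shipping_address", "email"],
  account_login: ["email"],
};

function minimize(
  submitted: Record<string, string>,
  purpose: Purpose
): Record<string, string> {
  const keep = new Set(requiredFields[purpose]);
  return Object.fromEntries(
    Object.entries(submitted).filter(([field]) => keep.has(field))
  );
}

// Usage: the phone number and birth date are dropped because the declared
// purpose does not require them.
const stored = minimize(
  {
    name: "A. Person",
    shipping_address: "123 Example St",
    email: "a@example.com",
    phone: "555-0100",
    birth_date: "1990-01-01",
  },
  "order_fulfillment"
);
console.log(stored);
```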

4. Continue engaging with the public and civil society to improve digital literacy and awareness

Allocate resources to educate the public about deceptive interface designs and how to identify and report them. This can be done through public campaigns, educational materials, and partnerships with consumer advocacy organizations. Lawmakers should also collaborate with and gather insight and expertise from industry stakeholders, consumer advocates, and experts in user experience design and digital ethics to create more effective legislation and industry guidelines.

5. Mandate privacy-by-design and data security principles

Incorporate privacy-by-design principles into legislation to require entities to proactively consider and integrate privacy protections into their systems, products, and services. This can include conducting privacy impact assessments, implementing appropriate technical and organizational measures to safeguard personal data, and promoting privacy-enhancing technologies such as encryption and anonymization. Regular audits and assessments should be conducted to evaluate compliance with privacy-by-design principles.
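
To illustrate one of the privacy-enhancing techniques mentioned above, the sketch below pseudonymizes a user identifier with Node.js’s built-in crypto module before the record reaches an analytics store, so the raw email address is never retained there. It is a simplified example of the kind of technical measure an audit might look for, under assumed names, not a complete privacy-by-design program.

```typescript
// Sketch: pseudonymizing an identifier before storage (Node.js crypto).
// The raw email never reaches the analytics store; only a keyed hash does.

import { createHmac, randomBytes } from "crypto";

// In practice the key would live in a secrets manager, held separately
// from the analytics data, not in application code.
const pseudonymizationKey = randomBytes(32);

function pseudonymize(identifier: string): string {
  return createHmac("sha256", pseudonymizationKey)
    .update(identifier)
    .digest("hex");
}

interface AnalyticsEvent {
  userRef: string;   // pseudonym, not the raw identifier
  action: string;
  timestamp: string;
}

function recordEvent(email: string, action: string): AnalyticsEvent {
  return {
    userRef: pseudonymize(email),
    action,
    timestamp: new Date().toISOString(),
  };
}

console.log(recordEvent("a@example.com", "newsletter_signup"));
```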