With the growing number of privacy laws and regulations that require consumer consent before collecting or processing data, some enterprises have turned to tricking consumers into giving consent through the use of dark patterns, which are practices that deceive users into acting in ways they did not intend.1 Dark patterns are often built into confusing user interfaces, leaving users with little understanding of what they are agreeing to. Privacy professionals who can identify and eliminate dark patterns can help their enterprises become digital trust leaders.
What Do Privacy Dark Patterns Look Like?
User interface design is a common way in which privacy dark patterns manifest, and most people have encountered them in their daily lives. A common example is a cookie banner with a bright, vibrant accept-all button next to a reject button that is greyed out, making it look as though it cannot be clicked, or, worse, with no reject button at all. These sites often do allow users to opt out of cookie tracking, but rejecting cookies is more complex and time-consuming than simply accepting them all. Frustrated or confused users may accept all cookies simply because of a poor interface.
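A dark-pattern-free alternative gives both choices equal prominence and equal effort. The following is a minimal, hypothetical TypeScript sketch (the element names, class name and callback are illustrative assumptions, not a prescribed implementation) of a cookie banner in which rejecting is exactly as easy as accepting:

```typescript
// Minimal sketch of a non-deceptive cookie banner (hypothetical markup and
// class names): "Accept all" and "Reject all" are built identically, so
// neither choice is visually or procedurally privileged.
function renderConsentBanner(onChoice: (acceptAll: boolean) => void): HTMLElement {
  const banner = document.createElement("div");
  banner.setAttribute("role", "dialog");
  banner.textContent = "We use cookies for analytics. Choose an option:";

  // Both buttons share the same element type, styling and one-click behavior.
  for (const [label, acceptAll] of [["Accept all", true], ["Reject all", false]] as const) {
    const button = document.createElement("button");
    button.textContent = label;
    button.className = "consent-button"; // identical styling for both choices
    button.addEventListener("click", () => onChoice(acceptAll));
    banner.appendChild(button);
  }
  return banner;
}
```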
Privacy dark patterns may also appear as difficult-to-find privacy settings. Confusing menus that obscure how to change privacy preferences can make it nearly impossible for users to see their current settings or opt out of undesired ones.
Many dark patterns enable enterprises to collect unnecessary information. This can be seen on websites that require users to create an account before viewing any products sold online. Although it may be necessary to collect a purchaser’s name, shipping address and billing information once someone buys a product, collecting that information before a data subject makes a purchase is a dark pattern because it gathers information that is not needed.
Privacy dark patterns may also take the form of lengthy, jargon-filled privacy notices. The average website user may not have the time to read these notices, and those who do may find it incredibly challenging to understand what the notices are stating. At the other extreme, another common privacy dark pattern is an easy-to-understand privacy notice or banner that is manipulative, such as requiring a user to click a button that says, “I hate saving money” to decline providing an email address in exchange for a coupon when accessing a website.
So What?
Recent privacy laws and regulations have emphasized the importance of consent when collecting and processing data. Consent gained through a dark pattern may not protect enterprises from fines and other penalties. In the United States, the California Privacy Rights Act (CPRA) and the Colorado Privacy Act (CPA) do not allow the use of dark patterns to gain consent.2
Even if an enterprise is not subject to the CPRA, the CPA or any other privacy law or regulation, there could still be significant consequences for using dark patterns. In January 2022, attorneys general of several US states sued Google for using dark patterns to collect location data.3
But more important than any fines or penalties is the effect privacy dark patterns have on digital trust. Digital trust is the confidence in the integrity of the relationships, interactions and transactions among providers and consumers within an associated digital ecosystem. An enterprise that relies on tricking its data subjects to conduct processing activities and earn a profit is irreparably tarnishing its integrity and the confidence consumers may have had in it. In contrast, an enterprise that pursues freely given consent can foster trust and gain a competitive advantage.
It is worth noting that dark patterns disproportionately affect certain groups. Specifically, those with lower incomes, those whose first language is not that of the site they are visiting, those from cultures that differ from the provider’s and those with less digital literacy are more susceptible to, and more likely to be harmed by, privacy dark patterns.4 Enterprises that leverage dark patterns may inadvertently perpetuate digital inequality.
How to Combat Privacy Dark Patterns
One of the best ways to prevent privacy dark patterns is to practice privacy by design (PbD), which refers to a systems engineering approach that considers privacy issues from the beginning of product, service, business practice and physical infrastructure development.5 PbD is based on 7 privacy principles:
- Proactive not reactive; preventative not remedial
- Privacy as the default setting
- Privacy embedded into design
- Full functionality: positive-sum, not zero-sum
- End-to-end security: full life cycle protection
- Visibility and transparency: keep it open
- Respect for user privacy: keep it user-centric6
Adhering to these principles can help enterprises design products and services that protect users’ privacy and eliminate the harm of dark patterns. Notably, not all dark patterns are intentional; some enterprises may be unaware that they have privacy dark patterns in place. An enterprise that practices PbD can limit the privacy harm resulting from inadvertent privacy dark patterns. For example, if a social networking site has a complicated process for changing the visibility of a user’s birthday but defaults to not sharing the birthday, the privacy harm to data subjects is reduced.
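The birthday example illustrates the “privacy as the default setting” principle. The following is a minimal TypeScript sketch, assuming a hypothetical profile-settings model, of how privacy-protective defaults limit harm even when the settings screen itself is hard to navigate:

```typescript
// Hypothetical profile-settings model: no sharing occurs until the user
// actively opts in, so a confusing settings menu causes less privacy harm.
interface ProfilePrivacySettings {
  shareBirthday: boolean;
  shareEmail: boolean;
  profileSearchable: boolean;
}

// "Privacy as the default setting": every sharing option starts disabled.
const defaultPrivacySettings: ProfilePrivacySettings = {
  shareBirthday: false,
  shareEmail: false,
  profileSearchable: false,
};

// Sharing only happens after an explicit, affirmative choice by the user.
function createProfileSettings(
  overrides: Partial<ProfilePrivacySettings> = {}
): ProfilePrivacySettings {
  return { ...defaultPrivacySettings, ...overrides };
}
```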
To effectively and comprehensively prevent privacy dark patterns, privacy professionals must work with other departments within the enterprise to identify and remediate any dark patterns, including:
- User experience (UX)/user interface—A well-designed user interface can go a long way toward empowering data subjects to identify their privacy preferences. A good interface allows users to accept or reject a particular data use/collection easily and quickly. A privacy-preserving UX design may provide privacy notices at specific points in the user journey (e.g., a social media application confirming the audience of an update when a user clicks “post”; a minimal sketch of this just-in-time approach follows this list).
- Legal—New privacy laws and regulations may prohibit consent collected through dark patterns. Legal teams likely have knowledge of laws, regulations and enforcement actions that explain what constitutes a dark pattern.
- Communications—The way data collection/processing practices are shared with data subjects should be easy to understand. Though the legal department may require specific verbiage in privacy notices, it is worthwhile to post a simpler, easier-to-understand version as well, and communications teams can help privacy professionals draft these simplified notices.
- Web—Web development teams can work alongside UX and privacy professionals to ensure that privacy preferences are easy to find and that websites are configured in privacy-preserving ways.
- Customer service—Customer service teams should be equipped to address data subject questions or requests related to customer privacy preferences. The absence of dark patterns can also help reduce customer complaints due to improper data collection/processing.
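The just-in-time notice mentioned in the UX/user interface item above can be sketched as follows. This is an illustrative assumption of how such a flow might be structured, not a prescribed API; the types and function names are hypothetical:

```typescript
// Minimal sketch of a just-in-time privacy notice: before publishing, the
// interface restates who will see the post and lets the user confirm or cancel.
type Audience = "public" | "friends" | "only-me";

interface PostDraft {
  text: string;
  audience: Audience;
}

async function confirmAndPublish(
  draft: PostDraft,
  confirmAudience: (audience: Audience) => Promise<boolean>,
  publish: (draft: PostDraft) => Promise<void>
): Promise<void> {
  // Surface the current audience at the moment of posting, not buried in a menu.
  const confirmed = await confirmAudience(draft.audience);
  if (confirmed) {
    await publish(draft);
  }
  // If the user declines, nothing is shared.
}
```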
It is also important for enterprises to know the consumers they are serving. For example, in what language are consumers most comfortable? From what cultures do most users come? Do a majority of consumers have health conditions, such as colorblindness, that could affect the way they interact with the interface?7 Are consumers mostly using mobile devices or laptops/desktops to access the product? Considering accessibility and listening to user feedback can reduce the dark patterns customers may experience.
Dark Patterns: Pervasive Across Enterprises
Dark patterns can appear in areas of the organization outside of privacy. For example, automatically renewing subscriptions without warning customers and making it difficult to cancel auto-renewals are dark patterns. Privacy professionals have the opportunity to be digital trust leaders in their enterprises. Eliminating privacy dark patterns can improve an enterprise’s reputation and help the enterprise identify dark patterns in other areas of the organization.
Conclusion
Consent is a must-have for processing and collecting personal data, but if that consent is not freely given, there could be serious consequences. Enterprises can suffer reputational damage and potentially have to pay hefty fines, while the privacy harms a data subject experiences could be devastating. Privacy professionals who can break down organizational silos, prioritize data subjects and practice PbD can eliminate or reduce the harm caused by privacy dark patterns. Enterprises that are transparent and fairly gain consumer consent to collect and process data will be leaders in digital trust.
Endnotes
1 Zhu, C.; “Dark Patterns—A New Frontier in Privacy Regulation,” Reuters, 29 July 2021
2 Stauss, D.; S. Weber; “How Do the CPRA, CPA & VCDPA Treat Dark Patterns?” Husch Blackwell, 16 March 2022
3 DeGeurin, M.; “Google Illegally Used Dark Patterns to Trick Users Into Handing Over Location Data, State AGs Say,” Gizmodo, 24 January 2022
4 Nguyen, S.; J. McNealy; “The Impact of Dark Patterns on Communities of Color,” Data & Society: Points, 17 May 2021
5 De la Torre, L.; “What Is ‘Privacy by Design’ (PbD)?,” 7 March 2019
6 Cavoukian, A.; “Operationalizing Privacy by Design: A Guide to Implementing Strong Privacy Practices,” December 2012
7 For more information on web accessibility, see W3C Web Access Initiative, “Making the Web Accessible”
Safia Kazi, CIPT
Is a privacy professional practices principal at ISACA®. In this role, Kazi focuses on the development of ISACA’s privacy-related resources, including books, white papers and review manuals. She has worked at ISACA for 8 years, previously working on the ISACA® Journal and developing the award-winning ISACA Podcast. In 2021, she was a recipient of the AM&P Network’s Emerging Leader award, which recognizes innovative association publishing professionals under the age of 35.