Everyone celebrates the holiday season differently, but one experience we all shared this December was the flood of emails from seemingly every company we’ve ever interacted with (Dashlane among them). The reason? The California Consumer Privacy Act (CCPA) went into effect on January 1, 2020, prompting companies that do business online (everyone) to update their terms of service and promise their entire email database more transparency on data collection and storage practices.
The last time corporate lawyers flooded our inboxes was spring 2018, shortly before the European Union’s General Data Protection Regulation (GDPR) went into effect. There’s a lot to explore when it comes to GDPR and CCPA, from criticisms that the laws do not go far enough to rein in Big Tech to their potential to become de facto global standards. Those topics are already well covered.
What’s less discussed is how these regulations have added to the internet’s growing user experience (UX) problem, and how they have entrenched privacy as something one must opt into rather than what it is: an inherent right of internet users everywhere. Both outcomes carry greater consequences than the overall enforceability of the laws.
The Broken UX of the Internet
In the business and tech world, GDPR needs no introduction. In many ways, GDPR deserves credit for putting consumer data privacy on the map; it forced businesses to be more thoughtful about what customer information they collect and store.
But for most internet users, the most memorable change GDPR brought into their lives was that every website they visited now featured a large popup window at the bottom of the page, asking them to consent to a plethora of tracking permissions.
Most popup windows look the same: a wall of text that we glance at but often don’t read, a solid colored button that says something along the lines of “I accept,” and in a small font to the side, a link to go read more about alternatives to the readily available “I accept” button.
These popup windows implement the same tried-and-true design principles used to tempt you into clicking a “download now” or “buy” button: a brightly colored button with high contrast against the website’s background. Nearly every website on the internet uses this design, which complies with the letter of GDPR but fails its spirit of giving customers control over what happens to their data.
Instead, this popup draws inspiration from what are known in the design world as dark patterns: misleading or deceptive UI/UX decisions that aim to get users to do things they don’t really want to do. Dark patterns are the force behind endless scrolling on social media, streaming services that immediately jump to the next episode to keep you on the couch for hours, and the difficulty of finding exactly where to discontinue a free trial.
In the case of GDPR, when we mindlessly click “accept all,” we forfeit our right to decide how that company collects, stores, and transmits our data. Convenience wins, and as a result anyone who isn’t a privacy crusader will likely sign away all the rights lawmakers spent so much time figuring out how to protect. By overlooking the power of design, GDPR allowed a simple popup window to undermine part of a law that took six years to develop.
Privacy as a Choice, Not a Right
Dark patterns have been around since before the internet, so why haven’t regulators ensured these tricks wouldn’t undo years of legislative work? The primary reason is that privacy and freedom from tracking are still not rights automatically granted to people who use the internet. In the world of GDPR and CCPA, privacy is a destination that can only be reached by traversing thousands of pages of dark pattern designs and novel-length privacy policies stuffed with intentionally inaccessible legalese. And for those willing to make the effort, GDPR doesn’t even ensure that businesses provide a true alternative to the eye-catching “accept all” button.
This has made achieving a semblance of digital privacy feel overwhelming, feeding growing apathy around the issue. It already appears the California bill will do little to energize the masses. CCPA went into effect January 1, 2020 with such a vague definition of “selling” user data that Facebook and other large firms have already found loopholes. A revised draft of the bill is expected in mid-2020, and years of litigation are anticipated.
Legislate for the End User
Convenience dictates everything online, and momentum around privacy is no different. For most, it’s simply too much work to protect personal data. Solving this dissonance should be a focus of regulators around the world.
Ease of use and great design are essential to a successful product. Big Tech figured it out years ago, and until privacy champions in both the public and private sector approach the privacy crisis with the same focus on user experience, we will continue to see years of well-intentioned legislation have unintended and sometimes detrimental results. Privacy is essential to our freedom of expression and our healthy democracies, and with each law passed that fails to drive meaningful change, more and more people give up on the possibility of having any privacy online.
But there’s reason for hope – around the world we are seeing laws enacted and proposed with a focus on data portability (the ability to take your data off one platform and move it to another of your choosing) and interoperability (an open exchange of information between platforms, often through a unified open standard), which will spur competition and innovation. Digital delegation would also allow services such as Dashlane to handle more of the nuances of privacy on users’ behalf, protections people would take advantage of if a more convenient mechanism existed.
Many critics of sweeping privacy legislation fear handing too much power to central government. History tells us they are justifiably cautious – privacy requires some freedom and autonomy from government. Rather than heavy regulations, laws around digital delegation, data portability, and data interoperability incentivize businesses to help people control their own data rather than hoard it for themselves. This approach would also encourage diverse business models and customer- and UX-friendly innovation, and it would remove the onus on major blocs such as the EU or the US to be the sole arbiters of privacy.