There is a theoretical elegance to a fully transparent, informed, granular consent approach in consumer privacy — but is it really viable?
Consent is good in theory. It’s a way for an individual to explicitly permit or deny particular types of data processing or sharing by a service. But in practice, consent alone falls short on many fronts.
Consent fatigue is real.
Consent fatigue sets in when a user grows so weary of frequent, complex consent requests that they click past them to avoid the time suck and brain drain. The best example of consent fatigue is privacy notices. Back in 2008, researchers estimated that the average US user would encounter 1,462 privacy policies a year and would need around 76 work days to read them in full. Fourteen years on, privacy policies are a mainstay of our digital lives. Pew Research reports that 81 percent of adults are asked at least monthly to agree to a privacy policy, but only 9 percent always read the policy before agreeing to the terms and conditions.
Can we rely on ‘informed consent’ if the informed part is largely missing?
Consent doesn’t scale well from the consumer’s perspective.
Most people visit a small number of sites frequently and a large number of sites infrequently. And most don’t want a blanket setting for all the sites and services they use. A global opt-in or opt-out isn’t necessarily viable for the way many of us interact online, where we may want different choices for different types of websites or services. Telling each site individually what we do and don’t want is cumbersome and resource-intensive. Circle back to consent fatigue.
And while the Global Privacy Control (GPC) reduces the effort a user must expend to exercise some data sharing choices consistently across multiple websites, each website or service must still be set up to react appropriately when it receives a GPC directive from the user’s app or browser. We’re still better off with the GPC than without it, but without regulatory enforcement this limitation keeps it from being a panacea.
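To make the limitation concrete, here is a minimal sketch, assuming a Node.js server written in TypeScript, of what “reacting appropriately” involves. The GPC spec delivers the signal as a `Sec-GPC: 1` request header (and exposes it to scripts as `navigator.globalPrivacyControl`), but it is entirely up to the receiving site to check for it and change its behavior; the response logic below is a hypothetical stand-in for whatever “do not sell or share” means for a real service.

```ts
// Minimal sketch of a server honoring the GPC signal, using only
// Node's built-in http module.
import { createServer, IncomingMessage, ServerResponse } from "http";

// The GPC spec sends the signal as the request header `Sec-GPC: 1`.
// (Node lowercases incoming header names.)
function hasGpcSignal(req: IncomingMessage): boolean {
  return req.headers["sec-gpc"] === "1";
}

createServer((req: IncomingMessage, res: ServerResponse) => {
  // This is the step GPC cannot force: the site itself must treat the
  // signal as a do-not-sell/do-not-share directive and act on it.
  const thirdPartySharing = !hasGpcSignal(req);
  res.setHeader("Content-Type", "application/json");
  res.end(JSON.stringify({ thirdPartySharing }));
}).listen(8080);
```

Nothing compels a site to ship even this small check, which is exactly where regulatory enforcement comes in.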
Consent has an unbalanced effort-to-benefit ratio.
Expanding on our last point, web tech expert Robin Berjon says: “All that consent does is that it offloads privacy labor to the user, and only under very specific conditions does it increase the user’s autonomy.”
“There’s a reason why data brokers love consent so much and won’t ever stop talking about ‘transparency and choice’: it works in favor of whichever party has the time to invest in creating a consent funnel, and that’s never individuals.”
Of course, some services are trying to comply with regulations in the absence of an industry standard for consent notice design, and this inconsistency is yet another weakness of a privacy model that relies wholly on consent.
Consent is easily manipulated.
Robin Berjon weighs in on this point too: “Digital self-determination in the face of privacy issues is highly manipulable and that’s even without dark patterns.” Add in dark patterns by services capitalizing on fatigued consumers and suddenly users are giving up far more data and privacy than they intended.
An EU study into cookie consent notices reveals this:
“… the majority are placed at the bottom of the screen (58%); not blocking the interaction with the website (93%); and offering no options other than a confirmation button that does not do anything (86%). So no choice at all then.
A majority also try to nudge users towards consenting (57%) — such as by using ‘dark pattern’ techniques like using a color to highlight the ‘agree’ button (which if clicked accepts privacy-unfriendly defaults) vs displaying a much less visible link to ‘more options’ so that pro-privacy choices are buried off screen.
And while they found that nearly all cookie notices (92%) contained a link to the site’s privacy policy, only a third (39%) mention the specific purpose of the data collection or who can access the data (21%).”
It should come as no surprise that big platforms like Facebook and Google are masters at using dark patterns for consent. They design the process to make privacy choices difficult to understand and set. Most consumers twiddle all the dials and think they’ve enabled every possible privacy protection, when in fact they have not.
What’s more, the guidance from big platforms on how to make good privacy choices is often hard to digest and intentionally skewed in their favor. They want to keep the data tap turned on full.
Consent can be convoluted.
Even when companies are trying to do the right thing, they tend to offer consumers privacy choices, and explanations of their implications, that are difficult to grasp. In the GDPR, for example, consent is framed in terms of criteria: informed, transparent, timely, contextual, granular. For the consumer, the regulations can almost add to the convolution.
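To see why those criteria pile up quickly, here is a hypothetical sketch of the bookkeeping that granular consent implies. None of these type or field names come from the GDPR itself; they simply illustrate that every processing purpose needs its own informed, timestamped, and withdrawable record.

```ts
// Hypothetical model: one consent record per user per purpose.
type Purpose = "analytics" | "marketing_email" | "third_party_sharing";

interface ConsentRecord {
  userId: string;
  purpose: Purpose;      // granular: one purpose per record
  noticeVersion: string; // informed: which notice text the user actually saw
  givenAt: Date;         // timely: when the choice was made
  withdrawnAt?: Date;    // withdrawal must be as easy as giving consent
}

// On the consent basis, processing is lawful only with a live,
// purpose-specific record.
function mayProcess(
  records: ConsentRecord[],
  userId: string,
  purpose: Purpose
): boolean {
  return records.some(
    (r) => r.userId === userId && r.purpose === purpose && r.withdrawnAt === undefined
  );
}
```

Multiply that by every service a consumer uses and the scale problem reappears.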
Consent is only one (limited) tool in the toolkit.
Berjon reminds us: “The answer to problems caused by consent isn’t more consent, automated consent, or consent that covers more processing. It’s to use consent where and when it is the right tool to increase people’s autonomy, and to stop pretending that you can bang nails in with it in all the other cases.”
He notes: “Lindsey Barrett summarized the situation well when she described notice and choice as ‘a method of privacy regulation which promises transparency and agency but delivers neither’.”
So, what’s the outlook for consent?
Despite growing concern over consent alone as a mechanism for protecting personal data, no one is taking it off the table just yet. Consent is one of six lawful bases for processing personal data under Article 6 of the GDPR, and it features in other data protection regulations too.
In looking to tame the beast, IAPP contributor Robert Gellman wonders whether it’s “possible that consent can play a meaningful and fair role in consumer privacy if an independent third party sets the terms of consent rather than a business desperate to have consumers agree to its terms.”
He says: “Perhaps a truly neutral third party standing between the consumer and the business seeking choices involving privacy might produce a better and fairer outcome.”
In exploring the notion of a third-party consent mechanism, Gellman cites the US Do Not Call Registry and Apple’s App Tracking Transparency (ATT) feature as two data points suggesting a viable role for third parties: “In both cases, a third party sets the terms and the methodology for consumer choice. In the one case, Apple set the terms. In the other case, the FTC did. This is not what happens when a website unilaterally sets the terms of consent. A one-sided approach is why consent does nothing for consumers today.”
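As a thought experiment, the pattern Gellman describes might look something like the sketch below, modeled loosely on how telemarketers must scrub their call lists against the Do Not Call Registry. The registry endpoint and payload shape are invented for illustration, not a real API; the point is that the terms of consent live with a neutral party, and the business merely reads them.

```ts
// Hypothetical sketch: before contacting a consumer, the business
// checks an opt-out list maintained by a neutral third party.
async function fetchRegistryOptOuts(registryUrl: string): Promise<Set<string>> {
  const res = await fetch(registryUrl);
  const numbers: string[] = await res.json(); // e.g. ["+15551234567", ...]
  return new Set(numbers);
}

async function mayContact(phoneNumber: string): Promise<boolean> {
  // The consumer set their choice once, with the registry; the business
  // does not get to design the consent funnel.
  const optOuts = await fetchRegistryOptOuts("https://registry.example/opt-outs");
  return !optOuts.has(phoneNumber);
}
```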
Our position on consent
Since Anonyome Labs started in 2014, we have held that while nothing is wrong with consent in theory, the consumer needs more.
We believe that if regulations aren’t toothy on enforcement, big companies will do the wrong thing anyway. Regulations alone aren’t going to protect consumers from companies that flout the rules, or from those beyond reach (outside the jurisdiction but doing business within it).
Consent is unnecessary when dealing with companies whose interests are aligned with the consumers’. But if interests aren’t aligned, a company may work hard to make it difficult for the user to withhold or withdraw consent.
Our consumer app MySudo steps in when the consumer can’t make consent choices in their favor, or when doing so is all too hard. MySudo is a useful tool because it sidesteps the need for consent where consent falls short, and keeps data sharing in check where the user wants more control over their personal information.
It’s unlikely the privacy problem will ever be fully solved. Tools like MySudo and the capabilities in Sudo Platform, our business toolkit for rapidly deploying branded privacy and cybersecurity solutions, will always have a place, and their role will only grow more important.