It’s been a little over a year since the European Union’s General Data Protection Regulation (GDPR) came into effect, and its impact was felt almost immediately. First, there was the flurry of emails seeking users’ consent to the collection and use of their data. Since then, there’s also been an increase in the number of sites that invite the user to consent to tracking by clicking “Yes to everything,” or to reject it through the laborious process of clicking “No” for each individual category. (Some non-EU sites simply broadcast “if we think you’re visiting from the EU, we can’t let you access our content.”) There was also the headline-grabbing €50 million fine imposed on Google by the French supervisory authority.
In its summary of the year, the EU Data Protection Board (EDPB) reported an increase in the number of complaints received under GDPR, compared to the previous year, and a “perceived rise in awareness about data protection rights among individuals.” Users are more informed and want more control over the collection and use of their personal data.
They’re probably irritated by the current crop of consent panels, and either ignore, bypass, or click through them as fast as possible – undermining the concept of informed, freely-given consent. They’re limited in the signals they can send about consent, and what they do signal may be meaningless. And if their access is blocked because of the geographic location of their IP address, they aren’t sending any consent signals at all. Whatever the motivation for this kind of blocking, it leads to a “fragmentation” of the Internet, in which information is freely available to some people, but inaccessible to others just because of where they seem to be located.
Nevertheless, because people are better-informed than they were, they are more motivated to complain.
If individuals’ complaints prove justified, organizations that collect personal data face the prospect of much bigger financial penalties than before for data protection offenses. The risk of penalties is independent of geographical location. It applies across national and jurisdictional boundaries, and therefore in contexts where the idea of “personal data” could have widely different cultural interpretations. When data controllers are faced with risks arising from laws outside their own jurisdiction, they are likely to need ways of setting themselves a high benchmark that reduces their exposure to compliance-related and reputational risk.
In short, everyone ends up relying on better behavior by data controllers.
If “improving behavior” means setting a higher bar than legal compliance, it takes us into the realm of ethics – a daunting prospect for the average business. So we wanted to develop something more approachable and practical: the Policy Brief on Responsible Data Handling. The policy brief looks at the issue from the data controller’s perspective, and identifies three principles to help them decide how to collect and process personal data in a responsible way: Transparency, Fairness, and Respect.
We developed each of these principles into specific guidelines. For example:
- If what you are doing with personal data comes as a surprise to the individual, you probably shouldn’t be doing it. If you can’t, or don’t, explain the uses you make of personal data, you’re probably failing on transparency.
- If what you do with personal data means you get the benefit, but the risk is offloaded onto the individual, your product or service probably hasn’t been designed with fairness as a key objective. Similarly, if you lock users in to your platform by making it impossible for them to retrieve their data and move it elsewhere, you’re failing on fairness.
- If you share personal data with third parties but don’t check that they treat it properly, you may be failing to respect the individual and their rights and interests.
The Policy Brief on Responsible Data Handling includes more examples for each principle and a short list of recommendations for policymakers and data controllers, whether private or public sector. Thanks to GDPR, we know that people want more control over their data. The policy brief is a step towards protecting privacy and building trust in the Internet itself. If you have comments or suggestions about how to continue that process, please let us know.