Saving time, or committing a crime? Article 22 of the GDPR

The increasingly prevalent use of automated processing to save time when interacting with customers, employees or the general public might look like a win-win, but due consideration is often not given to ensuring that automation complies with Article 22.

The right to a private life is enshrined not only in law but in society as well. As the digital world has evolved, applying this concept to online life has become a complicated matter. AI and Big Data technology have muddied the waters where privacy is concerned. In response, regulators have included Article 22 in the GDPR to specifically address the more complex aspects of digital privacy, namely the use of automation and profiling based on personal data and metadata.

“The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”
Article 22, GDPR

A quick reminder of what Article 22 is about

Article 22 gives a data subject the right not to be subject to a decision based solely on automated processing where that decision has a legal or similarly significant effect on them. If the decision is based on special categories of data, even stricter requirements apply, even where the data subject has given consent.
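As a rough illustration only, a team triaging its own processes might encode those conditions as a simple pre-check. The structure, field names and logic below are hypothetical assumptions for the sketch, not legal advice and not part of any FORTYEIGHT product:

```typescript
// Hypothetical pre-check for whether a process is likely to fall under Article 22.
// Field names and logic are illustrative assumptions, not legal advice.

interface AutomatedProcess {
  name: string;
  solelyAutomated: boolean;          // no meaningful human involvement in the decision
  legalOrSignificantEffect: boolean; // e.g. hiring, credit, access to work
  usesSpecialCategoryData: boolean;  // health, ethnicity, beliefs, etc.
  hasExplicitConsentOrLegalBasis: boolean;
  hasSafeguards: boolean;            // human review on request, right to contest, etc.
}

function article22Review(p: AutomatedProcess): string[] {
  const flags: string[] = [];

  if (p.solelyAutomated && p.legalOrSignificantEffect) {
    flags.push(`${p.name}: in scope of Article 22, confirm an exception applies`);
    if (!p.hasSafeguards) {
      flags.push(`${p.name}: missing safeguards such as human intervention and the right to contest`);
    }
    if (p.usesSpecialCategoryData && !p.hasExplicitConsentOrLegalBasis) {
      flags.push(`${p.name}: special category data used without a valid basis`);
    }
  }
  return flags;
}

// Example: an automated CV screening step with no human review.
console.log(article22Review({
  name: "CV screening",
  solelyAutomated: true,
  legalOrSignificantEffect: true,
  usesSpecialCategoryData: false,
  hasExplicitConsentOrLegalBasis: false,
  hasSafeguards: false,
}));
```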

The impacts of the online data superstructure built on the internet, coupled with smart tools that interpret and make decisions based on this data, have created a perfect privacy storm.

The result is that anyone building innovative data-powered experiences is left shaking their head, wondering how to use data effectively while still meeting the constraints of Article 22.

We believe the principles of transparency and agency hold the key to providing a solution to this contentious issue for consumers and businesses alike.

Article 22 non-compliance leads to big fines

Non-compliance with Article 22 is resulting in large penalties. At the upper end, GDPR fines can reach €20 million or 4% of annual global turnover, whichever is higher, and two recent cases show Article 22 being enforced in practice:

“Robo firing” of Uber drivers

A court ordered Uber to reinstate six drivers and pay $100,000 in compensation after the company’s algorithms mistakenly flagged them for fraud. The court cited Article 22 of the GDPR, stating that Uber’s decision was “based solely on automated processing, including profiling”.

Food delivery services and Article 22

Deliveroo was fined $3 million for the misuse of data processing algorithms that led to workers being discriminated against. Foodinho was also fined $3.1 million for the use of algorithms that violated workers’ rights and denied them work opportunities. In both cases, Article 22 of the GDPR was cited.

Both examples show how the use of algorithms can result in a black box: the cause cannot be proven, and the effect is often only seen once enough individuals have been harmed in the same way. Giving each individual real-time transparency, in line with the protections Article 22 envisages, would resolve this.

Examples of processes affected by Article 22

As more AI-enabled algorithms are applied to processes, the opportunities for misuse increase, often with no way to prove cause. Just a few examples of the types of processes where Article 22 comes into play are:

·   Automated CV reviews when hiring

·   Image recognition in augmented reality apps

·   Advertising profiling technology

·   eCommerce 'recommended items'

·   Credit card application decisions 

Cause, effect, and decisions: Article 22 meets the real world

Technology and the organisations operating it have come to lean heavily on PrivacyTech to achieve their objectives, hoping that by aggregating, anonymising, minimising or encrypting data they will avoid regulatory risk and signal their ‘privacy-centric’ values to the masses. However, what these ever more extreme privacy-preserving tactics fail to address is that the individual from whom the data came is left with no way to prove that an algorithm may have treated them unfairly, with bias or in a discriminatory manner. Without evidence that a person's individual data was used, or of the effect suffered by that individual, many of the rights and protections afforded under the GDPR become moot.

“The value is not in the data itself, but the context and intelligence derived from that data. If the data subject has no visibility on that resulting intelligence, how then can they protect themself from its (mis)use.”
Nicholas Oliver, FORTYEIGHT

The principle of causality, and the consequences for an individual when cause and effect are not correctly connected, underpin Article 22 compliance. But proving the link between cause and effect can be difficult, especially when PrivacyTech is involved.

A flat tyre is an example of an effect whose cause must be carefully investigated to establish an effective fix. Similarly, automated, AI-enabled data processing can produce a decision that obscures or even misconstrues its cause. Complying with Article 22 requires an understanding of the root of any inference drawn from data, so that a data subject's right to redress is protected.

The key to understanding the true cause of an effect lies in granular observations. When Uber and Deliveroo were sanctioned under Article 22, they were unable to provide the observations (the cause) that would explain the decisions that negatively impacted their workers. In terms of data processing, observations can reveal mistakes, allowing a consumer to right wrongs.

This can be done using a 'Transparency-as-a-Service' (TaaS) platform, like FORTYEIGHT.
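To make "granular observations" concrete, here is a minimal sketch of how an observation could be recorded alongside the automated decision it caused. The structure is an assumption for illustration only, not FORTYEIGHT's actual data model:

```typescript
// Illustrative (assumed) structure linking an automated decision to the
// observations (the cause) that produced it, so the data subject can
// inspect and challenge it later.

interface Observation {
  id: string;
  source: string;          // e.g. "fraud-model", "order-history"
  statement: string;       // what was inferred, in plain language
  observedAt: string;      // ISO 8601 timestamp
}

interface AutomatedDecision {
  id: string;
  subjectId: string;       // the person the decision is about
  outcome: string;         // e.g. "account deactivated"
  decidedAt: string;
  basedOn: Observation[];  // the granular cause, kept alongside the effect
}

// A deactivation decision that can be traced back to the observation behind it.
const decision: AutomatedDecision = {
  id: "dec-001",
  subjectId: "driver-42",
  outcome: "account deactivated",
  decidedAt: "2021-04-13T09:00:00Z",
  basedOn: [{
    id: "obs-17",
    source: "fraud-model",
    statement: "Two trips accepted from overlapping locations",
    observedAt: "2021-04-12T22:10:00Z",
  }],
};

console.log(decision.basedOn.map((o) => o.statement));
```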

Removing the mask of Article 22 to reveal the cause

Transparency-as-a-Service (TaaS) serves as a “pane of glass” between the consumer and the company, providing the framework for Article 22 compliance. FORTYEIGHT delivers TaaS via an API that reaches out across the web to provide context and visibility of data use. This means the consumer gets to see any data-led observations made about them.
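FORTYEIGHT's actual endpoints are not documented here, so the calls below are purely a hypothetical sketch of what requesting and disputing observations through a TaaS-style API might look like; the URL, routes and payloads are assumptions:

```typescript
// Hypothetical TaaS client: the base URL, routes and payloads are invented
// for illustration and do not describe FORTYEIGHT's real API.

const BASE_URL = "https://api.example-taas.com/v1";

// Fetch the data-led observations a company holds about a data subject.
async function getObservations(subjectId: string, token: string) {
  const res = await fetch(`${BASE_URL}/subjects/${subjectId}/observations`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}

// Dispute an observation the data subject believes is wrong: the "agency"
// half of transparency and agency.
async function disputeObservation(observationId: string, reason: string, token: string) {
  const res = await fetch(`${BASE_URL}/observations/${observationId}/disputes`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ reason }),
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}
```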

TaaS solutions such as FORTYEIGHT provide the rails for the informed consumer. From this informed position, consumers have the tools to redress misinformation and right data wrongs. In other words, FORTYEIGHT gives a data subject the agency to challenge and correct inaccurate decisions.

The power of transparency and agency

Poorly applied algorithmic data processing and profiling can result in massive fines under Article 22 of the GDPR. FORTYEIGHT works to protect consumers and businesses from poor, inaccurate decisions and from non-compliance with the GDPR. Collated observations on data decisions give people the tools to make changes and show businesses the underlying cause of those decisions. Applying more transparency to the decisions that result from data processing helps to build not only trusted relationships but also better, more accurate products.

Learn more about the value of digital transparency and book a demo with our team today.

Nicholas Oliver
Founder
