GDPR

Profiling and automated decision-making under GDPR

Published on 5th Sep 2018

Profiling and automated decision-making (or ADM) are two areas of the GDPR that have caused a fair degree of confusion for businesses, often accompanied by a perception of negativity and an assumption that the law significantly restricts most forms of computer-led analysis of data subjects and their activities. Not necessarily so.

In keeping with the general flavour of the GDPR, the law has undoubtedly tightened and places greater burdens and requirements on businesses wishing to carry out profiling or ADM activities. However, there is still plenty of opportunity for those willing to understand the detail of the law and, more generally, to align their business models with the core themes of the GDPR.

What is profiling?

Profiling is the automated processing of personal data to evaluate certain personal aspects of an individual, such as their behaviour, preferences, interests or economic situation.

Typical applications of profiling include online behavioural advertising (such as targeted online ads based on browsing behaviour), credit scoring as part of a mortgage or finance application, and the use of artificial intelligence and machine learning (for example, in Internet of Things applications).

Profiling is all about evaluation, not decisions; it's an important distinction. Profiling could form part of an automated decision-making activity, but on its own it culminates in intelligence and opportunity, not computer-led decisions about an individual.

Profiling: key considerations

1. Comply with the transparency obligations of the GDPR

Profiling personal data is a processing activity and is therefore caught by the transparency obligations under the GDPR; your organisation's privacy notice (or other means of notifying individuals) should set this out.

2. Have a lawful basis for processing (and it's not all about consent!)

European Data Protection Board (EDPB) guidance suggests it's unlikely organisations will be able to rely on performance of a contract as the lawful basis for processing, and therefore the two most common lawful bases are:

a. consent – although you will need to show that the individual knows what they are consenting to, so that they can make an informed choice, and more generally that you meet the consent requirements of the GDPR (which set a relatively high bar).

b. legitimate interest – very much an option for many profiling activities. This will require a legitimate interests assessment to be conducted beforehand and particular thought needs to be given to the detail and comprehensiveness of the profile, the impact of the profiling and the safeguards in place to ensure fairness and non-discrimination. Make sure that your assessment is honest, and that the risk outcomes are realistic.

3. Take account of data subject rights

Individuals have the right to object to profiling under the GDPR, and this therefore needs to be brought to their attention clearly and separately from other information. Your organisation should have a process in place to handle such objections, particularly where the objection relates to profiling for direct marketing purposes, where the right to object is absolute.

What is automated decision-making?

You guessed it: a machine makes a decision about an individual. To be more precise, it's a decision which must:

  • be conducted solely by automated means (i.e. no human intervention); and
  • have a legal or similarly significant effect on an individual.

The first limb is fairly straightforward: if meaningful human intervention is involved (for example, a person considering the results of the automated process before applying them to an individual), then the activity will not qualify as automated decision-making. Conversely, if a human merely inputs the data but the decision itself is automated, the activity could still be considered automated decision-making.

The second limb of the test is a bit more complicated: although a "legal effect" is fairly easy to define, i.e. something which affects an individual's legal status or rights (for example, entitlement to housing or disability benefits), what constitutes a "similarly significant effect" is more nebulous.

There are obvious examples of a "similarly significant effect", such as the automatic refusal of an online credit application or e-recruiting practices with no human intervention (such as using psychometric testing to filter out candidates). Guidance points to effects that significantly alter the circumstances, behaviour or choices of individuals, have a prolonged or permanent impact, or, at the most extreme, lead to the exclusion of or discrimination against individuals.

ADM: key considerations

If you are conducting ADM, then in addition to the above key considerations for profiling, your organisation needs to:

1. Understand your grounds for processing

Current EDPB guidance says that organisations cannot conduct ADM unless it is:

  • necessary for entering into, or the performance of, a contract between the individual and the data controller (for instance, a loan application between a bank and a borrower which requires an automatically generated credit score). This is often an option and/or standard practice;
  • authorised by EU or Member State law (for example, a bank undertakes profiling to identify fraud to comply with its regulatory obligations). This is usually fairly sector or industry specific; or
  • based on the individual's explicit consent (a freely given, specific, informed and unambiguous indication of the individual's wishes, given by an express statement such as a written or emailed confirmation). This ground is not always a business favourite, but is increasingly accepted (by both businesses and individuals) for ADM activities.

2. Carry out a Data Protection Impact Assessment (DPIA)

The ICO considers ADM to be high-risk processing, so you need to carry out a DPIA to assess the risks to individuals and the ways to mitigate those risks.

3. Tell individuals you're conducting automated decision-making

This includes providing meaningful information about the logic involved and the likely consequences for individuals. This is usually done through an organisation's privacy notice or just-in-time notifications.

4. Tell individuals about their rights for a review of the automated decision

If an individual is unhappy with the outcome of an automated decision, they can ask for it to be reviewed (you will need an appeals process in place involving an employee with the authority to reverse the decision if necessary). ICO guidance recommends that this is explained at the point the decision is provided.

This includes explaining how and why the decision was reached, being able to verify the results, and providing an audit trail showing the key decision points that formed the basis for the decision.

Individuals also have the right of access in respect of ADM, including profiling.

5. Consider additional hurdles for special categories of data

If your organisation conducts ADM in respect of special categories of data (such as health records), there are additional requirements to consider: predominantly, the need for the individual's explicit consent, or the ability to demonstrate that the processing is necessary for reasons of substantial public interest, as the lawful basis for processing.

How we can help

As this note hopefully clarifies, profiling and ADM are not off limits and remain opportunities to explore.

We regularly advise clients on profiling and ADM issues in a wide variety of contexts, with a view to helping them undertake these activities in a compliant way. If you would like to discuss this or any other issue relating to the GDPR, please contact one of our specialists below, or your usual Osborne Clarke contact.

* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
