
CJEU Delivers Important Decision on Automated Decision-making Under the GDPR

The Court of Justice of the European Union ("CJEU") recently delivered an important decision on what constitutes "automated individual decision-making" under Article 22 GDPR. In the SCHUFA case (Case C-634/21), the CJEU ruled that a German credit reference agency engages in "automated individual decision-making" within the meaning of Article 22 GDPR when it creates credit repayment probability scores ("credit scores") as a result of automated processing, in circumstances where third party lenders "draw strongly" on those scores to determine whether to establish, implement or terminate contracts with individuals.

The decision shows that the obligation to comply with Article 22 GDPR falls on a credit reference agency, rather than just on the third party lender who makes the ultimate decision about the loan application, in circumstances where the latter relies strongly on such automated credit scores. This reflects a broad interpretation of Article 22 GDPR.

It means that any service provider, not just a credit reference agency, that provides critical automated decision-making support to a third party may be caught by Article 22 GDPR and will need to ensure compliance with it. For example, it could affect service providers whose automated decision-making services are relied upon by third party organisations for decisions relating to recruitment, healthcare or insurance. However, it appears that the judgment is specific to situations where a service provider's input is "strongly" relied upon by a third party to make a decision, and that decision has a legal effect or other similarly significant effect on an individual. Its impact should therefore be limited to those specific circumstances.

Background

Article 22(1) GDPR prohibits organisations from making solely automated decisions which have a legal or other similarly significant effect on individuals. Such solely automated decision-making is permitted only in the circumstances set out in Article 22(2) GDPR, namely: (i) with the individual's explicit consent; (ii) where necessary to enter into, or for the performance of, a contract with the individual; or (iii) where authorised under EU or Member State law. In addition, certain safeguards must be implemented to protect individuals' rights and freedoms in respect of such decision-making, including ensuring that individuals can obtain human intervention, express their views, and contest the decision made about them.

Furthermore, Article 15(1)(h) GDPR forms part of the 'right of access' under the GDPR and provides a data subject with the right to obtain from the controller information about automated decision-making, including its logic and its consequences.

The Facts

SCHUFA is a German credit reference agency which assigns credit scores to people (in particular consumers) based on the probability that they will repay a loan. SCHUFA calculates a person's credit score by analysing their characteristics and behaviour. SCHUFA provides its customers (such as banks) with access to a person's credit score. Banks then use that credit score to decide whether to offer that person a loan, and on what conditions. 

A complainant, OQ, was refused a loan by a bank due to a low credit score that SCHUFA had supplied to the bank about her. OQ submitted an access and erasure request to SCHUFA. While SCHUFA informed OQ of her credit score, it refused to disclose certain information concerning how the score is calculated, on the ground that it was a trade secret. SCHUFA also highlighted that it limits itself to sending credit scores to its customers (i.e. banks), and that it is those customers who make the actual lending decisions.

OQ subsequently lodged a complaint with the competent data protection authority, the HBDI, requesting that it order SCHUFA to comply with her request for information and erasure. The HBDI rejected the application, and OQ appealed that decision before the Administrative Court of Wiesbaden, Germany. That court stayed the proceedings and asked the CJEU whether the establishment of a credit score constitutes "automated individual decision-making" within the meaning of Article 22 GDPR. In particular, the CJEU had to consider whether SCHUFA made the "decision" about OQ, or whether it merely carried out profiling, with the third party bank making the "decision" when refusing to provide OQ with the loan.

The referring court doubted the argument that Article 22 GDPR does not apply to the activities of companies such as SCHUFA, in light of its finding that a poor automated credit score leads "in almost all cases" to the third party bank refusing to provide a loan, and the fact that Article 22 GDPR "precisely aims to protect people against the risks linked to decisions based on automation."

CJEU Decision

Whether the automated score constitutes "automated individual decision-making" under Article 22(1) GDPR

The CJEU rejected SCHUFA's claim that it was only engaged in preparatory acts and that any decisions were taken by the third party bank. Instead, the CJEU held that SCHUFA itself is engaging in "automated individual decision-making" within the meaning of Article 22 GDPR, when it creates the credit scores as a result of automated processing, and when lenders draw strongly on these scores to establish, implement or terminate contracts.

The CJEU noted that three conditions must be met for processing to constitute "automated individual decision-making" under Article 22 GDPR: (i) a decision must be made; (ii) it must be based solely on automated processing, including profiling; and (iii) it must produce legal effects concerning the individual or otherwise have a similarly significant effect on the individual. According to the CJEU, all of these conditions were met in the present case.

In particular, the CJEU adopted a broad interpretation of the term "decision", finding that it is capable of encompassing "a number of acts which may affect the data subject in many ways", and includes the calculation of a credit score. This broad interpretation is supported by Recital 71 GDPR. The CJEU also noted that the calculation of the credit score would have significant effects on the individual to whom it relates. Based on the factual conclusions of the referring court, the CJEU noted that a low credit score results in a bank rejecting the loan application "in almost all cases."

The CJEU's rationale for finding that SCHUFA engages in "automated decision-making" was that a more restrictive interpretation of Article 22 GDPR, under which the establishment of the credit score is merely a preparatory act and only the act taken by the third party is a "decision" under Article 22(1) GDPR, would create a lacuna in legal protection and risk circumventing Article 22 GDPR. In that scenario, the establishment of a credit score would evade the specific requirements of Article 22(2) to (4) GDPR, despite the fact that such a procedure is based on automated processing and significantly affects the data subject, where the action of the third party to whom that score is transmitted draws strongly on it.

Furthermore, the data subject would not be able to claim, from the credit reference agency which determines the score concerning him or her, a right of access to the information referred to in Article 15(1)(h) GDPR, given the absence of automated decision-making by that company. In addition, assuming the act adopted by the third party did fall within the scope of Article 22(1) GDPR, that third party would still not be able to provide the specific information because it generally does not have it.

Comment

The judgment confirms that any solely automated credit scoring by a credit reference agency in relation to a person must take place in accordance with Article 22 GDPR, in circumstances where a third party "draws strongly" on the credit score to determine whether to "establish, implement or terminate a contractual relationship" with that person. If a credit reference agency or other similar provider issues a score that is not relied on heavily by the third party making the end decision – for example, because lenders attach significant weight to other factors – then it is arguable that the issuing of the score would not be covered by Article 22 GDPR.

The judgment will have implications beyond credit scoring, affecting sectors such as recruitment, healthcare and insurance, where AI decision-making is frequently relied upon. It serves as a reminder of the importance of developing an AI governance framework that respects organisations' obligations under the GDPR, as well as under the proposed AI Act. Service providers offering automated decision-making services that are relied upon by third party organisations for decision-making purposes will inevitably be concerned by the potential ramifications of this decision. In many cases, such service providers have likely assumed that their customers will bear responsibility for any legal compliance risks associated with decisions taken using the service provider's input. However, this judgment shows that such service providers may be caught by Article 22 GDPR, at least in circumstances where their customer "strongly" relies on their input when making a decision and the ultimate decision by the customer has a legal or similarly significant effect on an individual.

Contact Us

For more information, please contact any member of our Technology and Innovation Group or your usual Matheson contact.