EU Advocate General balances data protection rights against trade secrets in algorithmic credit scoring case
On 16 March 2023, Advocate General Pikamäe issued his opinion in cases C-26/22, C-64/22 and C-634/21, all involving the credit-scoring agency SCHUFA. This blog post focuses on the third case, in which the AG elaborates on the balancing exercise between trade secrets and data subject rights. The case gives the CJEU an opportunity to rule on the scope of the data subject's right to obtain meaningful information when their personal data is processed through profiling systems.
Case C-634/21 concerns a German citizen whose loan application was refused. The refusal was based, among other things, on a credit-scoring report provided by SCHUFA, a private company specialised in providing its clients with information on creditworthiness. The German citizen exercised his GDPR-based right of access against SCHUFA to gain more insight into why the credit had not been granted. SCHUFA provided him with his credit score and the basic principles underlying its calculation, but it did not explain how the different parameters were taken into account in that calculation, arguing that the calculation mechanism is protected as a trade secret. But how far does trade secret protection reach as a ground for refusing a data subject access request?
Data protection and trade secrets
A credit-scoring agency's calculation method will most probably be protected by one or more trade secrets. The GDPR states only in its recitals that data subject rights should not adversely affect trade secrets and other intellectual property rights, but this does not mean that trade secrets can be invoked to refuse every access request. AG Pikamäe confirms that a trade secret cannot serve as an absolute ground for refusal and that at least a minimal amount of information should be provided to the data subject. However, the AG notes that the GDPR's requirement that this information be provided in a concise, transparent, intelligible and easily accessible form, using clear and plain language, severely limits the access right in relation to a (complex) credit-scoring algorithm. Controllers cannot overload the individual with information that is too complex and technical, as this would only have a perverse effect on transparency. In practice, these various interests will have to be balanced, always taking into account the purpose of the right of access: to enable the data subject to understand the processing activities and thus to effectively exercise their other data subject rights, such as the rights to rectification, restriction or erasure of their personal data.
AG Pikamäe advises that, in any case, the data subject should receive detailed information on the method used to calculate the score and the reasons that led to the result, irrespective of the trade secret. The parameters and their respective weight in the result should also be communicated, so that the data subject can effectively challenge the decision.
Data protection and algorithmic decisions: a right to explanation (?)
The GDPR contains a prohibition in principle on decisions based solely on automated processing, including profiling, which produce legal or similar effects (art. 22). The German citizen claimed that this prohibition had been breached, since the financial institution refused the credit automatically on the basis of the score provided by SCHUFA, which amounted to an automated individual decision producing legal or similar effects.
The AG confirmed that the automated establishment of a credit score constitutes such a decision based solely on automated processing, including profiling, producing legal or similar effects, if that score is transmitted to a financial institution which, in accordance with a consistent practice, draws strongly on it in its decision whether or not to grant credit.
A (doctrinal) sub-topic of automated individual decision-making under the GDPR is the right to explanation, which has been a hot topic among data protection academics in recent years. The debate essentially concerns whether a data subject is entitled to an explanation from the data controller of how an algorithm works when it processes personal data in an automated way leading to legal or similar effects. This right is not explicitly enshrined in the GDPR, but some argue that it can be derived from a combined reading of several of its provisions (art. 13, 14, 15 and 22 and recital 71).
According to some authors, the right to explanation is an essential consequence of the principles of transparency and accountability under the GDPR, whereas others contend that this right is nowhere to be found in the text of the regulation. Even though most of the doctrine today agrees that there is a limited right to explanation, the CJEU now has the opportunity in this case to clarify the extent of the data subject's rights in cases of profiling.