CJEU confirms “right of explanation” in battle between trade secrets and algorithmic transparency

On 27 February 2025, the Court of Justice of the European Union (CJEU) delivered an important decision on algorithmic transparency. It confirmed the existence of a “right of explanation” in cases of automated decision-making, and it introduced a right for courts and authorities to obtain information protected by trade secrets so that they can balance those interests against the data subject’s rights. The impact on organisations that use AI systems to make automated decisions about individuals will be significant. In this article, we discuss the judgment, its consequences and the challenges it raises for businesses.

Algorithmic transparency – right of explanation

In the case at hand (CJEU 27 February 2025, no. C-203/22, Dun & Bradstreet Austria), a credit scoring agency issued a negative credit score for a person seeking to enter into a mobile telephone contract with a provider. This led to the telecom company refusing to enter into an agreement with that person. Given that this is an automated individual decision with legal or similar effects, the data subject has the right under the GDPR to request and receive “meaningful information about the logic involved” (art. 15(1)(h) GDPR).

The referring Austrian court had appointed an expert who took the view that the scoring agency should divulge, inter alia, the mathematical formula, the values attributed to the data subject and the precise intervals within which the same value is attributed to different data. The question, however, is whether such technical information actually helps the data subject understand the algorithm.

In its judgment, the CJEU ruled that “meaningful information about the logic involved” means relevant information, provided in a concise, transparent, intelligible and easily accessible form, on the procedure and principles actually applied in order to use, by automated means, the personal data concerning that person with a view to obtaining a specific result, such as a credit profile. These criteria echo the GDPR’s existing transparency standard (art. 12(1) GDPR). The CJEU goes further, however, specifying that the data subject must also be informed of the significance and the envisaged consequences of the processing at issue, ideally illustrated with real, tangible examples.

The CJEU’s decision shows that the content and the format of the explanation go hand in hand, and that there is both a lower and an upper limit to the information to be shared with the data subject:

  • Lower limit: the information should at least allow the data subject to sufficiently understand the principles and the process by which the algorithm works, in order to verify the lawfulness of the processing and to be able to exercise his or her other rights (right to rectification, right to be forgotten, etc.). The CJEU states that the complexity of the algorithm does not relieve the data controller of the obligation to “simplify” it into layman’s terms. For profiling such as credit scoring, the CJEU added that a national court may deem it sufficient to inform the data subject only of the extent to which a variation in the personal data taken into account would have led to a different result (a counterfactual explanation; see the sketch after this list).
  • Upper limit: the data controller cannot bury the data subject in complex mathematical formulas and technical documentation. The CJEU requires the controller to find simple ways to inform the data subject of the rationale behind, or the criteria relied on in reaching, the automated decision. This is clearly illustrated by the CJEU’s formulation on the content of the information (“procedure and principles”) and its format (“concise, transparent, intelligible and easily accessible”).
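
To make the counterfactual idea concrete, here is a minimal sketch in Python. The scoring model, its weights and the decision threshold are entirely invented for this illustration and do not reflect any real agency’s algorithm; the point is only to show how a controller could report the extent to which a variation in one input would have changed the outcome, without disclosing the full model.

```python
# Hypothetical illustration of a counterfactual-style explanation for a
# simple credit score. The model, weights and threshold are invented for
# this sketch and do not reflect any real scoring agency's algorithm.

THRESHOLD = 600  # hypothetical minimum score for a positive decision

# A deliberately simple, transparent stand-in for a proprietary model.
WEIGHTS = {
    "years_at_current_address": 15,
    "open_credit_lines": -20,
    "on_time_payment_ratio": 400,
}
BASE_SCORE = 300


def score(applicant: dict) -> float:
    """Compute a toy credit score as a weighted sum of the inputs."""
    return BASE_SCORE + sum(WEIGHTS[k] * applicant[k] for k in WEIGHTS)


def counterfactual_report(applicant: dict) -> list[str]:
    """For each input, state what variation would have flipped the decision."""
    gap = THRESHOLD - score(applicant)
    lines = []
    for feature, weight in WEIGHTS.items():
        delta = gap / weight  # change in this input that closes the gap
        lines.append(
            f"If '{feature}' had been {applicant[feature] + delta:.2f} "
            f"instead of {applicant[feature]:.2f}, the score would have "
            f"reached the threshold of {THRESHOLD}."
        )
    return lines


applicant = {
    "years_at_current_address": 2,
    "open_credit_lines": 4,
    "on_time_payment_ratio": 0.80,
}
print(f"Score: {score(applicant):.0f} (threshold: {THRESHOLD})")
for line in counterfactual_report(applicant):
    print(line)
```

For the toy applicant above (score 570, threshold 600), the report would state, for instance, that a payment ratio of 0.88 instead of 0.80 would have sufficed. A real model is of course not linear, but the same style of disclosure can be produced by probing any model with varied inputs.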

We expect companies engaged in automated decision-making, such as credit scoring, algorithmic pricing and CV filtering, to face more and more questions about how their algorithm works and why it reached a certain decision. These companies will have to “translate” the complexities of their algorithm into layman’s terms using whatever means are suitable, which may mean developing infographics and similar documents built specifically for that purpose; a simple sketch of what such a translation could look like follows below. As most of these companies rely on third-party service providers, they should ensure that the applicable agreements provide for some degree of cooperation to secure the appropriate level of transparency.
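
As a rough illustration of such a “translation”, the sketch below renders the main factors behind a hypothetical scoring decision as plain-language sentences, ranked by their impact. The factor names, point values and phrasing are all invented for this example; a real implementation would be built around the controller’s actual model and reviewed for legal accuracy.

```python
# Hypothetical sketch: turning a model's factor contributions into a
# concise, intelligible explanation, in the spirit of the art. 12(1) GDPR
# transparency standard. All factor names and values are invented.

CONTRIBUTIONS = {
    # factor -> (points contributed to this applicant's score, plain label)
    "on_time_payment_ratio": (+320, "your history of on-time payments"),
    "open_credit_lines": (-80, "the number of credit lines you hold"),
    "years_at_current_address": (+30, "how long you have lived at your address"),
}


def plain_language_explanation(decision: str) -> str:
    """Rank factors by absolute impact and phrase them as simple sentences."""
    ranked = sorted(CONTRIBUTIONS.items(), key=lambda kv: -abs(kv[1][0]))
    sentences = [f"The decision was: {decision}."]
    for _, (points, label) in ranked:
        direction = "raised" if points > 0 else "lowered"
        sentences.append(
            f"{label.capitalize()} {direction} your score by {abs(points)} points."
        )
    return " ".join(sentences)


print(plain_language_explanation("credit refused"))
```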

Transparency vs. trade secrets: who decides?

When the data subject submitted her access request, the credit scoring agency refused to provide further information on the ground that it constituted a trade secret. In line with its past case law, however, the CJEU ruled that countervailing interests (such as trade secrets, intellectual property rights or third-party personal data) cannot be invoked as a blanket ground for refusal. A balance must therefore be struck.

The CJEU had ruled in an earlier judgment that the right of access of one data subject may sometimes require the disclosure of another data subject’s personal data. It had also ruled that a national court may require the controller to share that third party’s personal data with the court, so that the court can decide whether those data should be (partially or fully) shared with the requesting data subject.

In the present judgment, the CJEU broadens that earlier case law to all situations where sharing the information with the data subject is “likely to result in an infringement of the rights and freedoms of others”. In other words, a national court or data protection authority may now require a controller to share such “allegedly protected information” with it, so that it can decide whether (parts of) a trade secret should be divulged to the requesting data subject. The goal is to allow the authority or court to assess concretely which information is relevant to the data subject’s access request.

Next steps and challenges

This decision is not a win for credit scoring agencies, or for any company making automated individual decisions by processing personal data. We have identified three issues and challenges arising from this new competence of courts and data protection authorities to request information protected by a trade secret:

  • A first issue is that most data protection authorities do not have the statutory competence to determine whether something is a trade secret. If that question is disputed in proceedings before a data protection authority, the authority will have to refer it to the competent courts.
  • A second issue is that it is still unclear how much information falls under the “allegedly protected information”. In the first public commentary following the decision, some authors state that the full trade secret must be disclosed, while others suggest a stricter approach. Given the risks involved in disclosing such sensitive information, we would recommend that companies find alternative ways of explaining the process and principles behind their algorithms, rather than divulging information protected by trade secrets. If anything, this judgment is an incentive for companies to find ways around their own trade secrets when explaining the logic behind their proprietary algorithms. A company that invokes trade secret protection will end up in an uncertain situation, depending on what the courts and authorities deem reasonable to share with the claimant.
  • A third issue concerns security: companies invest large sums of money to protect their trade secrets, ensuring that the IT infrastructure on which those secrets are stored is highly secure in order to prevent theft and corporate espionage. The validity of their trade secret depends on it. Courts and data protection authorities do not have the same budget for data security, so there is a real risk that their IT systems become a new target for malicious actors seeking such valuable information. Companies would therefore be well advised to enter into a dialogue with the authority on how to share the protected information in a safe and secure manner; a minimal technical sketch of one possible approach follows this list. Inspiration can be found in the virtual data rooms currently used for due diligence in M&A transactions.
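
As one possible ingredient of such a secure exchange, the sketch below (using the Python cryptography library) encrypts a document with a one-off symmetric key and wraps that key under the authority’s public key, so that only the authority can read it in transit. It assumes the authority publishes an RSA public key, which is not a given in practice; the sketch is illustrative only and no substitute for an agreed, audited data-room procedure.

```python
# Minimal sketch of sharing protected information confidentially:
# encrypt the document with a fresh symmetric key, then encrypt that key
# under the authority's public key (hybrid encryption). Illustrative only.

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# In reality the authority's key would be obtained out of band; here we
# generate a throwaway key pair so the sketch is self-contained.
authority_private = rsa.generate_private_key(public_exponent=65537, key_size=3072)
authority_public = authority_private.public_key()

trade_secret = b"description of the scoring model's protected internals"

# 1. Encrypt the document with a fresh symmetric key.
session_key = Fernet.generate_key()
ciphertext = Fernet(session_key).encrypt(trade_secret)

# 2. Encrypt (wrap) the symmetric key under the authority's public key.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = authority_public.encrypt(session_key, oaep)

# Only `ciphertext` and `wrapped_key` leave the company. The authority
# unwraps the key with its private key and decrypts the document:
recovered_key = authority_private.decrypt(wrapped_key, oaep)
assert Fernet(recovered_key).decrypt(ciphertext) == trade_secret
```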

See also our past blog post on the topic of algorithmic transparency and trade secrets.