
The General Equal Treatment Act and the protection against discrimination by algorithmic decision-making systems

- Fact sheet on the legal opinion -

Authors: Prof Dr iur Indra Spiecker genannt Döhmann, LL.M. (Georgetown University),
Prof Dr iur Emanuel V. Towfigh, commissioned by the Federal Anti-Discrimination Agency
Year of publication: 2023

Brief overview

The use of algorithmic decision-making systems (ADM systems) poses new challenges to anti-discrimination law. Self-learning ADM systems are also known as AI (artificial intelligence). ADM systems entail a high potential for discrimination, yet many instances of such discrimination go unnoticed. The legal opinion explores to what extent the General Equal Treatment Act (AGG) is fit for purpose in addressing discrimination by ADM systems; the draft of the EU’s AI Regulation was also included in the assessment. The opinion discusses the gaps in the protection the AGG offers to those experiencing discrimination by ADM systems and sets out how to close these gaps and strengthen law enforcement in this area.

Main results

ADM systems entail a significant potential for discrimination

  • ADM systems promise to deliver objective decisions that are unbiased by personal views and attitudes. In reality, however, the potential of ADM systems to cause discrimination is significant.
  • ADM systems work with the data they are provided and make their decisions by assigning group characteristics. From an anti-discrimination law perspective, however, it is precisely this practice of assigning individuals to specific groups that is problematic.
  • The quality of decisions made by ADM systems heavily depends on the data fed into the system. In general, neither the users nor the target groups of ADM systems are able to verify whether these data are correct, quality-assured or suitable for the intended use at all.
  • Even if only one discriminatory data set is fed into an ADM system, the system’s decisions can no longer be non-discriminatory. In such a case, the discrimination is reproduced and perpetuated by the system.
  • ADM systems are especially susceptible to what is known as proxy discrimination, a form of discrimination that is particularly hard to detect. It operates through characteristics that appear neutral but strongly correlate with the protected characteristics. This way, existing anti-discrimination requirements can – intentionally or unintentionally – be evaded (see the illustrative sketch after this list).
  • Due to the lack of transparency of ADM systems and their decision-making processes, discrimination by these systems often remains undetected.
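
The proxy mechanism described above can be made concrete with a minimal Python sketch on synthetic data. Everything in it is a hypothetical assumption made for illustration: the feature names (group, postcode), the 90 per cent correlation and the biased approval rates are invented, not taken from the legal opinion. The sketch shows that a decision rule which never sees the protected attribute still reproduces the historical bias, because the seemingly neutral postcode correlates strongly with group membership.

    import random
    from collections import defaultdict

    random.seed(42)

    def make_applicant():
        # Hypothetical protected attribute; the decision rule below never sees it.
        group = random.choice(["A", "B"])
        # Proxy feature: the postcode correlates strongly (90 %) with the group.
        if random.random() < 0.9:
            postcode = random.randint(0, 4) if group == "A" else random.randint(5, 9)
        else:
            postcode = random.randint(5, 9) if group == "A" else random.randint(0, 4)
        # Biased historical outcome: past decisions favoured group A.
        approved = random.random() < (0.7 if group == "A" else 0.3)
        return group, postcode, approved

    history = [make_applicant() for _ in range(10_000)]

    # "Training": approval frequency per postcode, computed from the proxy alone.
    stats = defaultdict(lambda: [0, 0])  # postcode -> [approvals, total]
    for group, postcode, approved in history:
        stats[postcode][0] += approved
        stats[postcode][1] += 1
    rate = {pc: a / n for pc, (a, n) in stats.items()}

    def decide(postcode):
        # Uses only the "neutral" postcode, yet inherits the historical bias.
        return rate.get(postcode, 0.0) >= 0.5

    # Fresh applicants: approval rates still diverge sharply between the groups.
    outcomes = defaultdict(lambda: [0, 0])  # group -> [approvals, total]
    for group, postcode, _ in (make_applicant() for _ in range(10_000)):
        outcomes[group][0] += decide(postcode)
        outcomes[group][1] += 1
    for group, (ok, n) in sorted(outcomes.items()):
        print(f"group {group}: approval rate {ok / n:.0%}")

On a typical run, the rule approves roughly nine out of ten applicants from group A but only about one in ten from group B, even though it only ever evaluated postcodes. This is precisely the evasion of anti-discrimination requirements that proxy discrimination enables, and it illustrates why such cases are so hard to detect from the outside.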

Discrimination by ADM systems is not sufficiently covered by the AGG

  • The provisions of the AGG are not designed to cover the specific circumstances around discrimination caused by algorithmic decision-making.
  • The AGG differentiates between direct and indirect discrimination. In cases of discrimination by ADM systems, however, assigning the incident to either of these categories often proves impossible.
  • ADM systems work by forming groups and assigning individuals to them; in doing so, they establish correlations. Discrimination based on such correlations is not covered by the AGG, whose protection refers to individual characteristics.
  • The AGG lacks requirements to provide and disclose information which would offer insights into the data used and the way the respective system works.
  • The AGG cannot address the problem posed by overlapping responsibilities regarding ADM systems. The service providers or system developers behind the users of ADM systems cannot currently be held accountable under the AGG.

The AGG’s deficits in law enforcement prevent effective legal protection

  • The known protection gaps in the AGG would have an even more severe effect in cases of discrimination by ADM systems, preventing effective legal protection.
  • The limited reversal of the burden of proof under Section 22 of the AGG is barely of any use when it comes to discrimination by ADM systems. To collect the evidence required for this reversal of the burden of proof to apply in an instance of discrimination by algorithms, it is generally necessary to know how the respective ADM system works. Those affected by the discrimination usually do not have that knowledge.
  • The AGG does not provide sufficient options to support those affected by discrimination. This issue has been known for a long time now and hampers the enforcement of anti-discrimination law. In the context of discrimination by ADM systems, the impact of this deficit is particularly severe.

Options for action

The authors propose changes to the law that could help improve protection from discrimination by ADM systems.

These include:

  • A fundamental realignment of the AGG regarding the role played by the Federal Anti-Discrimination Agency (ADS).
  • Comprehensive rights for the ADS to obtain information from the users of ADM systems, as well as inclusion of the ADS in the scope of the AI Regulation so that it can exercise the investigation and information rights laid down there.
  • The right for the ADS to take collective legal action. This would enable prosecution of systemic violations of discrimination bans involving the use of ADM systems.
  • Setting up an independent arbitration service at the ADS and stipulating rules for an arbitration procedure in the AGG. If those affected request an arbitration procedure, participation should be mandatory for the users of ADM systems.
  • Giving anti-discrimination organisations the right to take representative action to better support those affected by discrimination in asserting their rights and reduce structural inequalities.
  • Including a characteristic in Section 1 of the AGG that covers discrimination based on correlations. This way, the AGG could also capture the way ADM systems function.
  • Expanding the circle of addressees of the AGG to include ADM system developers and service providers, so that the different responsibilities involved in the use of these systems are covered.
  • Amending the provision on the justification of indirect discrimination in Section 3 (2) of the AGG to stipulate that, when the appropriateness of using an ADM system is assessed, the system must be shown to be discrimination-proof.
  • Adjusting the interpretation of the reversal of the burden of proof to also apply in cases of discrimination by ADM systems.
