Risks of discrimination by algorithms

Project description

Digital information and communication technologies now permeate almost all areas of life and generate extensive volumes of data, which can also be used for automated decision-making about individuals by means of advanced methods of data processing and analysis.

In this context, rules for social interaction, behavioral steering, and decision-making are increasingly embedded in software, technically implemented, and automatically enforced. On the one hand, this may help prevent human errors and biases in decision-making. On the other hand, it may increase the risk of discrimination against protected groups, and new types of discrimination may emerge.

The research project collected and analyzed known cases of discrimination involving algorithms, data sets, operators, or users in order to identify the causes, contexts, types, and dimensions of discrimination and discrimination risks. This included questions relating to self-determination, justice, fairness, responsibility, and data protection.

Based on these findings, options for action were derived and discussed, inter alia in comparison with existing legal frameworks. The project was funded by the Federal Anti-Discrimination Agency for a period of one year.


Orwat, C.
Künstliche Intelligenz, Diskriminierungsrisiken und Auswirkungen auf die Menschenwürde [Artificial Intelligence, Discrimination Risks, and Effects on Human Dignity]
2024. KIT Gründerschmiede Community Congress (2024), Karlsruhe, Germany, March 21, 2024
Orwat, C.
Diskriminierungsrisiken durch Verwendung von Algorithmen [Discrimination Risks through the Use of Algorithms]
2020. Expert discussion "Diskriminierungsrisiken durch Verwendung von Algorithmen – Interventionsmöglichkeiten, Schutzlücken sowie die Rolle von Antidiskriminierungsstellen" [Discrimination Risks through the Use of Algorithms – Intervention Options, Protection Gaps, and the Role of Anti-Discrimination Agencies] of the Antidiskriminierungsstelle des Bundes (Federal Anti-Discrimination Agency) (2019), Berlin, Germany, September 16, 2019


Dr. Carsten Orwat
Karlsruhe Institute of Technology (KIT)
Institute for Technology Assessment and Systems Analysis (ITAS)
P.O. Box 3640
76021 Karlsruhe

Tel.: +49 721 608-26116