
Risks of discrimination by algorithms
Project team:

Carsten Orwat (project leader); Reinhard Heil, Oliver Siemoneit

Funding:

Federal Anti-Discrimination Agency

Start date:

2018

End date:

2018

Research area:

Innovation processes and impacts of technology

Project description

Digital information and communication technologies now permeate almost all areas of life and generate extensive volumes of data, which can also be used, with advanced methods of data processing and analysis, for automated decision making about individuals.

In this context, rules for social interactions, behavioral steering, and decision making are increasingly embedded in software, technically implemented, and automatically enforced. On the one hand, this may help to prevent human errors and biases in decision making. On the other hand, it may increase the risks of discrimination against protected groups, and new types of discrimination may emerge.

The research project aims to investigate and analyze cases of discrimination caused by algorithms, data sets, owners, or users in order to elaborate on the reasons, contexts, types, and dimensions of discrimination and its potential risks. This also includes questions of self-determination, justice, fairness, responsibility, and the relation to data protection.

Based on this, the project will discuss options for reducing the risks of discrimination, in particular against the background of the existing legal framework. The project is funded by the Federal Anti-Discrimination Agency for a period of one year and started on 1 January 2018.

Contact

Dr. Carsten Orwat
Karlsruhe Institute of Technology (KIT)
Institute for Technology Assessment and Systems Analysis (ITAS)
P.O. Box 3640
76021 Karlsruhe
Germany

Tel.: +49 721 608-26116
E-Mail