When technology reads our emotion
Recognizing affect and emotions is considered a deeply human capacity. But what happens if technology, too, can detect whether we are sad, stressed, or overwhelmed – and is even expected to respond to it? These scenarios are becoming increasingly realistic. They are made possible by affective computing, a technology that draws on artificial intelligence to infer emotional states from data. Applications of this emerging technology analyze facial expressions, voice, or behavior and attempt to deduce how people are feeling. Other applications support users in coping with anxiety, pain, or stress.
What happens when emotions become measurable and controllable?
This opens up new possibilities: for instance, in medical applications that intervene in emotional processes to treat depression and chronic pain, or in everyday life, where people monitor themselves in real time via smartphone. At the same time, this raises important questions. Who determines what is considered “normal” or “unusual”? What happens when machines evaluate emotions differently than we do? And how does our approach to emotions change when they are increasingly understood as measurable and controllable?
This is where the ShiftAffect research project, funded by the Federal Ministry of Research, Technology, and Space, comes in. It examines how technologies trained to analyze emotional states are deployed in our daily lives and what consequences this may have. The project involves not only researchers but also the people who develop or use such technologies.
Joint development of future scenarios
The ITAS project team deliberately brings these different perspectives together. Developers, healthcare professionals, and affected individuals are involved through interviews and workshops. Together, they develop possible future scenarios to think through potential applications and their implications in a systematic way.
The project aims to create a nuanced understanding of how these technologies affect real-life situations and what societal consequences they entail. Podcasts, short videos, and an interactive exhibition at the end of the project will also contribute to this effort. (22.04.2026)