‘This time with feeling?’ Assessing EU data governance implications of out of home appraisal based emotional AI
The boundaries of personal space and borders of bodily integrity are being tested by deployments of emotional artificial intelligence (EAI) in private and public spaces. By means of sensing, seeing and machine learning of facial expressions, voice, gaze, gestures and a range of physiological signals (heart rate, skin conductivity, skin temperature, muscle activity, body temperature, respiration and other bio-signals), the goal is to make interior emotional life machine-readable for personal, commercial and security objectives.
In this paper, we focus on computer vision and face-based analytics to consider the nature, method and development of facial coding, the potential demise of existing approaches, and the rise of even more invasive methods. Criticisms of facial coding have long existed, but recent scholarship and industrial development signal a lack of confidence in ‘basic emotions’ and a turn to appraisal-based accounts of emotion. This inevitably entails use of data about internal physiological and experiential contexts, but also factors external to an individual. To explore this, the paper asks and answers the following question: with regard to deployment in out-of-home situations, what are the legal and privacy implications of appraisal-based emotion capture?