On Affective Computing

You interact with artificial intelligence systems every day, but you may not always know it. Some of these systems actively seek to understand your beliefs and emotions. When you are most receptive, they try to persuade you about what to believe or how to behave. These practices raise concerns about human agency and democracy.

Affective Computing, also called ‘emotion recognition,’ is the development of artificial intelligence systems on your smartphone, computer, and wearable devices that aim to recognize, interpret, process, and simulate human affects (although with questionable validity). These systems analyze your words and your behavioral biometric data (e.g., facial expressions, eye movements, voice quality, sweat, brainwaves).
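
To make the mechanism concrete, here is a minimal sketch of the simplest form of affect inference from a user’s words: lexicon-based sentiment scoring. The word list and scores below are invented for illustration; real systems use trained models over far richer signals, with the validity caveats noted above.

```python
# Minimal sketch of lexicon-based sentiment scoring, the simplest form of
# affect inference from a user's words. The lexicon is invented for
# illustration; production systems use trained models, not hand-made lists.

AFFECT_LEXICON = {
    "love": 1.0, "great": 0.8, "happy": 0.9,
    "hate": -1.0, "awful": -0.8, "sad": -0.7,
}

def sentiment_score(text: str) -> float:
    """Average affect score of recognized words; 0.0 means neutral/unknown."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    scores = [AFFECT_LEXICON[w] for w in words if w in AFFECT_LEXICON]
    return sum(scores) / len(scores) if scores else 0.0

print(sentiment_score("I love this game, it is great!"))  # 0.9   (positive)
print(sentiment_score("An awful, sad experience"))        # -0.75 (negative)
```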

Affective Computing could be harmful, especially in high-stakes situations in which an AI system makes decisions that affect a person’s physical or mental health or financial situation. During the pandemic, Americans are even more vulnerable to manipulation: they are spending far more time working and playing games on the internet, and mental health issues are on the rise.

Sentiment analysis; augmented, virtual, and extended reality systems; smart televisions; brain-computer interfaces; and other sensors may be used to make inferences about video game players’ emotional states, yet the validity of these inferences is scientifically questionable. Although emotion recognition technologies are available for use in entertainment and gaming environments, it is unclear to what extent this practice occurs today, how it is expected to evolve over the next several years, and what the human rights implications are. How does emotion recognition in entertainment and gaming present unique challenges to players’ rights to privacy, freedom of expression, and other human rights?
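
The sketch below illustrates the kind of emotional-state inference a game might attempt from wearable biometrics. The thresholds and labels are invented; the same physiological signals can indicate excitement, stress, or exercise, which is precisely why these inferences are scientifically shaky.

```python
# Sketch of biometric arousal inference of the kind a game might attempt.
# Thresholds are invented for illustration. Elevated heart rate and skin
# conductance are ambiguous: excitement, stress, and exertion all look alike.

from dataclasses import dataclass

@dataclass
class BiometricSample:
    heart_rate_bpm: float        # e.g., from a smartwatch
    skin_conductance_us: float   # electrodermal activity, in microsiemens

def infer_arousal(sample: BiometricSample) -> str:
    """Crude two-signal arousal guess; illustrative only."""
    if sample.heart_rate_bpm > 100 and sample.skin_conductance_us > 8.0:
        return "high arousal"    # excited? stressed? the signal cannot say
    if sample.heart_rate_bpm < 70 and sample.skin_conductance_us < 3.0:
        return "low arousal"
    return "indeterminate"

sample = BiometricSample(heart_rate_bpm=110, skin_conductance_us=9.5)
print(infer_arousal(sample))  # "high arousal"
```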

Senator Gillibrand’s well-crafted S.3300, the Data Protection Act of 2020, proposes to create a federal agency to protect individuals’ privacy rights. A few amendments could help it gain bipartisan support and address the concerns outlined above. The bill restricts “any processing of biometric data for the purpose of uniquely identifying an individual”; I recommend reframing this as an acceptable practice when done by a Federal, State, or local government agency for national security purposes. S.3300 should also restrict the collection and use of behavioral biometrics for Affective Computing in marketing and entertainment. Finally, the bill restricts “the use of personal data of children or other vulnerable individuals”; this could be implemented with an on-device ID that restricts downloads by protected groups, as sketched below.

Senators Merkley and Sanders’ S.4400 does an excellent job of outlining the different types of biometric and genetic data, and Senator Booker’s S.2689 demonstrates how behavioral biometrics can re-entrench pre-existing racial biases. I recommend that Senators Gillibrand, Merkley, Sanders, and Booker work together to further develop S.3300.
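
Here is a minimal sketch of the on-device gate suggested above: a device profile flag for protected users (e.g., children) that blocks downloads of apps declaring affect-sensing features. All names and fields are hypothetical, not drawn from the bill text.

```python
# Hypothetical sketch of an on-device ID gate for protected users.
# A device profile flag blocks downloads of apps that self-declare
# Affective Computing features, much like a permission declaration.

from dataclasses import dataclass

@dataclass
class DeviceProfile:
    user_is_protected: bool   # e.g., set during child-account setup

@dataclass
class AppManifest:
    name: str
    uses_affective_computing: bool   # self-declared by the developer

def download_allowed(profile: DeviceProfile, app: AppManifest) -> bool:
    """Block affect-sensing apps on devices registered to protected users."""
    return not (profile.user_is_protected and app.uses_affective_computing)

child_device = DeviceProfile(user_is_protected=True)
game = AppManifest(name="MoodQuest", uses_affective_computing=True)
print(download_allowed(child_device, game))  # False: download blocked
```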
