Values in Emotion Artificial Intelligence Hiring Services

Research Assistant

Prof. Andalibi’s Lab, University of Michigan School of Information

Team

Kat Roemmich (PhD Researcher)

Tillie Rosenberg (Research Assistant)

Nazanin Andalibi (Research Advisor)

Timeline

May 2021 - January 2022

Tools Used

NVivo

Overleaf

Context

Emotion AI (EAI) is increasingly present in the high-stakes context of hiring, with the potential to shape the future of work and the workforce. EAI technologies use affective computing and artificial intelligence techniques to sense, learn about, and interact with human emotional life.

Research

We conducted a qualitative content analysis of the public-facing websites of EAI hiring services. After querying 13K unique vendors across identified categories, we applied inclusion/exclusion criteria, used NVivo to inductively code the relevant vendors under various categories, and developed a research codebook to support the final analysis. From this process, we identified the organizational problems that EAI hiring services claim to solve and surfaced the values embedded in the desired EAI uses these services promote.
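To make the screening step concrete, here is a minimal sketch of how keyword-based inclusion/exclusion criteria could narrow a large vendor list before manual coding. The vendor records, keyword lists, and field names below are hypothetical and only illustrative; in the actual project, screening and all coding were done by the research team in NVivo.

```python
# Hypothetical illustration of the screening step: filter a vendor list down to
# candidates for manual coding using simple inclusion/exclusion keyword criteria.
# The vendor records and keyword lists are invented for demonstration only.

INCLUSION_TERMS = {"emotion ai", "affective computing", "emotion recognition"}
EXCLUSION_TERMS = {"out of business", "not for hiring"}

vendors = [
    {"name": "VendorA", "site_text": "We use emotion AI to screen candidates."},
    {"name": "VendorB", "site_text": "General HR software, no emotion analysis."},
    {"name": "VendorC", "site_text": "Affective computing platform, not for hiring."},
]

def passes_screen(site_text: str) -> bool:
    """Keep a vendor if its site mentions an inclusion term and no exclusion term."""
    text = site_text.lower()
    return any(term in text for term in INCLUSION_TERMS) and not any(
        term in text for term in EXCLUSION_TERMS
    )

to_code = [v["name"] for v in vendors if passes_screen(v["site_text"])]
print(to_code)  # ['VendorA'] -- only VendorA would move on to manual coding
```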

Findings

EAI hiring services market their technologies as technosolutions to three purported organizational hiring problems: 

  1. Hiring (in)accuracy 

  2. Hiring (mis)fit 

  3. Hiring (in)authenticity

Discussion

  • How EAI hiring services rely on unfair and deceptive mechanisms that exclude and exploit job candidates through the pseudoscientific creation, extraction, and commodification of candidates' affective value.

  • How EAI hiring services’ claims reveal the core values underpinning their stated desired uses: techno-omnipresence, techno-omnipotence, and techno-omniscience.

  • The implications for fairness, ethics, and policy in EAI-enabled hiring within the US policy landscape.

What I Learned

📌 Being *human-centered* is at the core of the work I want to do

Before this research project, I had only thought about how technologies can be innovative and ethical. Of course whatever I build will be cool! But what does being human-centered actually mean? Which users am I trying to advocate for as a product builder? Through my research findings, I learned that even though Emotion AI hiring services are cool technology and market themselves as solving hiring challenges, many don’t actually center their products on the users who matter most: job applicants.

✏️ Qualitative research provides depth to the story I’m trying to uncover as a researcher

Numbers can support a hypothesis, but so can words. Quantitative research may reveal that Emotion AI hiring services don’t actually enable the best hiring decisions, but not how or why. Qualitative research can answer those questions.