NGOs send letter to Zoom on Emotion Analysis Software
"Letter to Zoom on Emotion Analysis Software", 10 May 2022
We are writing to you as a group of organizations concerned about Zoom’s exploration of emotion tracking software. Zoom claims to care about the happiness and security of its users, but this invasive technology says otherwise.
This move to mine users for emotional data points, based on the false idea that AI can track and analyze human emotions, is a violation of privacy and human rights. Zoom must halt plans to advance this feature.
In particular, we are concerned that this technology is:
- Based on pseudoscience: Experts admit that emotion analysis does not work.
- Discriminatory: Emotion AI, like facial recognition, is inherently biased.
- Manipulative: ...Offering emotion analysis to monitor users and mine them for profit will further expand this manipulative practice.
- Punitive: This flawed technology could be dangerous for students, workers, and other users if their employers, academic institutions, or other authorities decide to discipline them for “expressing the wrong emotions” based on the determinations of this AI technology.
- A data security risk: Harvesting deeply personal data could make any entity that deploys this tech a target for snooping government authorities and malicious hackers.