Fight for the Future, together with 27 other human rights organizations, has written a letter to Zoom, requesting that it cease developing emotion monitoring software.
Emotion monitoring software uses artificial intelligence to track a person’s emotional and facial responses during a video call. The letter calls it “…intrusive technology… [and] a breach of privacy and human rights.”
The letter lays out the five main reasons Fight for the Future and the other groups are worried: the technology is a data security risk, discriminatory, manipulative, punitive, and based on pseudoscience.
According to Protocol, one goal of emotion monitoring software is to give a sales team data on whether their pitch is landing with a customer.
Knowing a client’s “emotional state,” a team could adjust its approach on the fly. Another goal is to monitor students’ emotional states, gauge their engagement, and use that information to improve online courses.
The groups are also worried about companies spying on consumers in order to influence them, or penalizing them for displaying “negative feelings.”
On top of that, it is not clear the technology even works.
According to research from the Association for Psychological Science, the face does not always reflect a person’s feelings, because people can readily mask their emotions.