Meio & Mensagem

Why emotion recognition technologies should be banned


A study by AI Now, a research institute that analyzes Artificial Intelligence and its development and application in the world today, warns that the technology is not ready to be used to recognize human emotions and calls for a ban on its use for that purpose.

December 20, 2019 - 7:43 a.m.



Below is a technical text summarizing the findings of the AI Now study.


There’s little scientific basis for emotion recognition technology, so it should be banned from use in decisions that affect people’s lives, says research institute AI Now in its annual report.

A booming market: Despite the lack of evidence that machines can work out how we’re feeling, emotion recognition is estimated to be at least a $20 billion market, and it’s growing rapidly. The technology is currently being used to assess job applicants and people suspected of crimes.

Further problems: There’s also evidence emotion recognition can amplify race and gender disparities. Regulators should step in to heavily restrict its use, and until then, AI companies should stop deploying it, AI Now said.

Other concerns: In its report, AI Now called for governments and businesses to stop using facial recognition technology for sensitive applications until the risks have been studied properly, and attacked the AI industry for its “systemic racism, misogyny, and lack of diversity.” It also called for mandatory disclosure of the AI industry’s environmental impact.