Emotion-detecting tech should be restricted by law

A leading research center has called for new laws to restrict the use of emotion detection technologies.

The AI Now Institute says the field is “built on very fragile foundations”.

Despite this, systems are on sale to help vet job seekers, test criminal suspects for signs of deception, and set insurance prices.

It wants such software to be banned from use in important decisions that affect people’s lives and/or determine their access to opportunities.

The U.S.-based organization has found support in the UK from the founder of a company developing its own emotional-response technologies – but he cautioned that any restrictions would need to be nuanced enough not to hinder all work being done in the area.

Constantly evolving

AI Now refers to the technology by its formal name, affect recognition, in its annual report.

It indicates that the sector is going through a period of significant growth and could already be worth as much as $20 billion (£15.3 billion).

“It claims to read, if you will, our inner emotional states by interpreting the micro-expressions on our face, the tone of our voice or even the way we walk,” said co-founder Professor Kate Crawford.

“It is used everywhere, from how you hire the perfect employee, to assessing patient pain, to tracking students who seem to be paying attention in class.”

“At the same time as these technologies are being deployed, a large number of studies show that there is… no substantial evidence that people have this consistent relationship between the emotion you feel and the way your face looks.”

Professor Crawford suggested that part of the problem was that some companies were basing their software on the work of Paul Ekman, a psychologist who proposed in the 1960s that there were only six basic emotions expressed via the face.

But, she added, subsequent studies had shown that there was far greater variability, both in terms of the number of emotional states and the way people expressed them.

“It changes across cultures, across situations, and even across a single day,” she said.

Micro-expressions

AI Now gives several examples of companies selling emotion detection products, some of which have already responded.

Oxygen Forensics was cited for offering emotion-detecting software to the police, but it defended its efforts.

“The ability to detect emotions, such as anger, stress or anxiety, provides law enforcement with additional information when pursuing a large-scale investigation,” said its director of operations, Lee Reiber.

“Ultimately, we believe that responsible application of this technology will be a factor in making the world a safer place.”

Another example was HireVue, which sells AI-based video tools to recommend which candidates a company should interview.

It uses third-party algorithms to detect “emotional engagement” in the candidates’ micro-expressions to help it make its choices.


“Many candidates have benefited from HireVue’s technology, which helps to remove the very significant human bias in the current hiring process,” spokeswoman Kim Paone told Reuters news agency.

Cogito, which has developed voice-analysis algorithms for call center staff to help them detect when customers are becoming distressed, was also mentioned.

“Before emotion detection can be used to make automated decisions, the industry needs more evidence that machines can in fact effectively and consistently detect human emotions,” its chief executive, Joshua Feast, told the BBC.

“What can be done today is to assess the behaviors that relate to certain emotions, and provide that intelligence to a human to help them make a more informed decision. For tomorrow and the future, it is for all practitioners and leaders in the field to collaborate, research and develop solutions that help foster a deeper mutual understanding, which will eventually lead to more connected relationships with each other – not in spite of technology, but because of it.”

The BBC also asked some of the other named companies to comment, but got no response.

Context needed

Emteq – a Brighton-based firm trying to integrate emotion-sensing technology into virtual-reality headsets – was not among those flagged as being of concern.

Its founder said that while current AI systems could recognize different facial expressions, it was not easy to infer what the subject’s underlying emotional state was.

“You have to understand the context in which emotional expression takes place,” said Charles Nduka.

“For example, a person might frown not because they are angry, but because they are concentrating, or because the sun is shining brightly and they are trying to shield their eyes. Context is essential, and it is something you cannot get just by looking at computer-vision mapping of the face.”

He too believed that the use of the technology should be regulated.


But he cautioned that, in doing so, legislators should take care not to restrict the work that he and others were doing to try to use emotion-detection software in the medical field.

“If things are going to be banned, it is very important that people do not throw the baby out with the bathwater,” he said.