Warning on use of emotion analysis technology

The Information Commissioner’s Office (ICO) will issue guidance on biometric technologies and their use in spring 2023. In the meantime, it is warning organisations to be wary of the risks of emotion analysis technology before putting it into use.

The ICO is concerned that organisations are making critical decisions about people without appreciating that there is no scientific evidence that biometric technologies said to analyse emotions actually work.

Biometric information is based on physical and behavioural characteristics, such as facial movements or heartbeats.

Emotional analysis technologies process data such as gaze tracking, sentiment analysis, facial movements, gait analysis, heartbeats, facial expressions and skin moisture.

Examples include monitoring the physical health of workers by offering wearable screening tools or using visual and behavioural methods including body position, speech, eyes and head movements to register students for exams.

The ICO said in a statement: “Emotion analysis relies on collecting, storing and processing a range of personal data, including subconscious behavioural or emotional responses, and in some cases, special category data.

“This kind of data is far more risky than traditional biometric technologies that are used to verify or identify a person.

“The inability of algorithms that are not sufficiently developed to detect emotional cues means there’s a risk of systemic bias, inaccuracy and even discrimination.”

ICO deputy commissioner Stephen Bonner said in a statement: “As it stands, we are yet to see any emotion AI [artificial intelligence] technology develop in a way that satisfies data protection requirements and have more general questions about proportionality, fairness and transparency in this area.”

Researchers at Cambridge University recently raised similar concerns about some of the claims made for AI image-analysis systems used to assess a job candidate’s personality. They suggest that some uses of AI in recruitment are little better than an “automated pseudoscience” similar to discredited beliefs that personality can be deduced from facial features or skull shape.

Dr Eleanor Drage, a co-author of the report from Cambridge’s Centre for Gender Studies, said it is a dangerous example of “technosolutionism”: turning to technology to provide quick fixes for deep-rooted discrimination issues that instead require investment and changes to company culture.

Bonner told the BBC that the ICO was warning companies: “If you go and buy this technology without any evidence that it’s actually working and then there’s harm for individuals, we’re going to step in.”

(See AI tools in recruitment: An ‘automated pseudoscience’ and not to be trusted?, 13 October 2022 on recruiter.co.uk)
