Wed 01 May 2024

Potential pitfalls of using biometric recognition equipment in employment

The Information Commissioner's Office has recently issued guidance on biometric data.

In an age where technology shapes our daily lives more than ever, biometric recognition devices have become increasingly prevalent. The benefits are clear: enhanced security combined with convenience, with no need to remember passwords (or risk losing them) when all that is required is a fingerprint.

But the use of such technology gives rise to legal risks and compliance issues, as recent media coverage of challenges brought by employees to the use of biometric recognition devices at work shows. In the first example, the Information Commissioner's Office ("ICO") issued enforcement notices against an employer that was using facial recognition technology ("FRT") and fingerprint scanning to monitor employee attendance. Such technology can also lead to employment tribunal claims, as seen recently in a claim brought against Uber Eats challenging the use of facial recognition AI software alleged to be racially discriminatory.

ICO enforcement action

The ICO enforcement notices were issued following an investigation that found the biometric data of more than 2,000 employees had been unlawfully processed by Serco Leisure, Serco Jersey and seven associated community leisure trusts. They had failed to demonstrate why FRT and fingerprint scanning were necessary or proportionate for monitoring attendance when less intrusive alternatives, such as ID cards or fobs, were available. Employees had not been offered any alternative to having their faces and fingers scanned to clock in and out of their place of work, and the scanning had been presented as a requirement in order to get paid. According to the UK Information Commissioner, Serco Leisure had not fully considered the risks before introducing the technology, "prioritising business interests over its employees' privacy". There was no way for staff to opt out of the system, and employees were put in a position where they felt they had to hand over their biometric data. This was "neither fair nor proportionate under data protection law".

Employment tribunal

In March, an employment tribunal claim for indirect race discrimination, harassment and victimisation brought by an Uber Eats driver settled before being heard by the tribunal. Uber Eats had introduced facial recognition software for drivers to verify their identity when using the Uber Eats app, a prerequisite to a driver accessing work. The claimant was suspended from the app following a failed facial recognition check and a subsequent automated process. He argued that the FRT placed people who are black, non-white or of African descent at a disadvantage because they were less likely to pass the recognition check. There were also concerns about Uber Eats' failure to make the claimant aware of the processes that led to his suspension from the app, and its failure to provide any route to challenge the decision. The claim was supported by both the App Drivers and Couriers Union and the Equality and Human Rights Commission. The case settled for an undisclosed financial sum. Uber Eats says the ID check is to keep everyone who uses the app safe, and that it includes "robust human review".

What should employers learn from this?

The primary message is to be aware of the risks attached to this type of technology, and of the potentially serious consequences of a breach of data protection laws. Risks for employers include significant fines under the UK GDPR, particularly if a breach of security in relation to biometric information were to arise, which could potentially lead to identity theft. The ICO can issue fines of up to £17.5 million or 4% of global annual turnover, whichever is the greater. There is also the potential for significant reputational damage. Before using biometric recognition technology, employers need to identify both a lawful basis for collecting and processing the data it produces and, because of the sensitive nature of biometric data, a special category condition. They also need to ensure that what they are doing is both necessary and proportionate, striking a balance between the needs of the business and the protection of employees' privacy, and a Data Protection Impact Assessment should always be carried out.

Recent ICO guidance aimed at organisations using, or considering using, biometric recognition systems covers topics such as what biometric recognition systems can be used for, how employers can demonstrate compliance with their data protection obligations, how biometric data can be processed lawfully (in many cases explicit consent will be required), and the risks employers need to consider when using this type of technology.

Technology such as this can seem attractive, and it certainly has benefits. However, employers must ensure compliance with data protection rules and the protection of employee privacy. For this reason, embedding "privacy by design" into any new project involving new technologies, so that GDPR issues are considered from the outset, is vital to both compliance and good employee relations.
