Migrants with a criminal record in the UK will soon be monitored using biometric smartwatches, which will require them to scan their faces up to five times per day to keep tabs on their movements.

Face biometric watches will be introduced as part of tougher measures to tackle illegal immigration and rising numbers of people travelling to the UK. However, the Home Office in a statement draws a distinction between asylum seekers, who will not be required to wear monitoring tags, and foreign-national offenders known to authorities.

The requirements imposed by the Home Office will include migrants having to wear an electronic ankle tag or smartwatch at all times, which will collect their personal information, including name, date of birth, nationality and photograph, and check it against national database systems. The data will then be stored for a period of six years by the Home Office.

Data collected from these individuals will be shared with relevant authorities and the police to further monitor their activities in the UK.

The Home Office has previously refreshed its legal migration and border control policy to tighten the requirements on migrants eligible to come to the UK, implementing a points-based migration system designed to deter criminals while still encouraging people from overseas to settle and drive the UK economy.

In May, British technology provider, Buddi Limited, won a tender to supply “non-fitted devices” as part of the Government’s Satellite Tracking Service.

The scheme was also responsible for the rollout of fitted tracking devices for migrant offenders awaiting deportation at a cost of £70 million.

The surveillance of foreign individuals is known to be a controversial issue which critics argue violates human rights laws and breeds discrimination and demographic bias.

Biometric bias occurs when an algorithm is unable to perform its programmed task accurately, such as automating a response based on whether the biological and behavioural characteristics of a person match an assigned profile. Studies have repeatedly found that some demographics experience poorer classification accuracy and matching performance with facial recognition technology.

According to the U.S. National Institute of Standards and Technology, people of African and Asian heritage experience false or misleading match results at rates 10 to 100 times higher than people of Caucasian origin, complicating procedures to find real perpetrators of fraud and crime.
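The kind of disparity NIST measures can be made concrete with a small sketch. The snippet below uses entirely invented scores and group names (not real NIST data or methodology) to show how a per-group false match rate is computed: the share of impostor comparisons a system wrongly accepts, broken down by demographic group.

```python
# Illustrative sketch with hypothetical numbers: quantifying demographic
# bias in a face-matching system as per-group false match rates (FMR).
# All figures and group labels below are invented for illustration.

from collections import defaultdict

# Each trial: (demographic_group, is_genuine_pair, match_score).
# A "false match" is an impostor pair the system wrongly accepts.
trials = [
    ("group_a", False, 0.91), ("group_a", False, 0.42), ("group_a", True, 0.97),
    ("group_b", False, 0.35), ("group_b", False, 0.28), ("group_b", True, 0.95),
]

THRESHOLD = 0.80  # scores at or above this count as a match


def false_match_rates(trials, threshold):
    """Return the false match rate per demographic group."""
    impostor = defaultdict(int)   # impostor comparisons seen per group
    false_hit = defaultdict(int)  # impostor comparisons wrongly accepted
    for group, genuine, score in trials:
        if not genuine:
            impostor[group] += 1
            if score >= threshold:
                false_hit[group] += 1
    return {g: false_hit[g] / impostor[g] for g in impostor}


rates = false_match_rates(trials, THRESHOLD)
# With these invented scores, group_a's FMR is 0.5 and group_b's is 0.0:
# a large gap of exactly this kind is what the NIST findings describe.
```

In real evaluations the same calculation is run over millions of comparisons, and the ratio between groups' false match rates is what reveals bias at a fixed decision threshold.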

Lucie Audibert, a lawyer and legal officer at Privacy International, commented: “Facial recognition is known to be an imperfect and dangerous technology that tends to discriminate against people of colour and marginalised communities. These ‘innovations’ in policing and surveillance are often driven by private companies, who profit from governments’ race towards total surveillance and control of populations”.

There are also concerns that the intrusive measures will have a negative impact on mental health and be counterproductive to deterring individuals with a criminal past from settling in the UK or continuing to commit crimes.

So far, 2,500 foreign criminals have been subject to electronic tagging and 10,000 have been deported.