Network Rail 'secretly used AI cameras to scan rail passengers' faces (2024)

A privacy row broke out today after it emerged that Network Rail has been secretly using Amazon's AI technology to monitor thousands of rail passengers at major stations across the UK.

Campaigners last night accused the Government-owned company of displaying a 'contempt for our rights' by secretly installing AI-powered surveillance at rail hubs across Britain.

It is feared that thousands of people have had their faces recorded by 'smart' CCTV cameras to establish their age, gender and emotions at London Waterloo and Euston stations, as well as Manchester Piccadilly, Leeds, Glasgow, Reading, Dawlish, Dawlish Warren and Marsden stations.

The scheme has been running for two years, with the data sent to Amazon Rekognition, according to a Freedom of Information response obtained by civil rights group Big Brother Watch.

Big Brother Watch last night warned that 'AI-powered surveillance could put all our privacy at risk' - adding that Network Rail had shown a 'contempt for our rights'.

Network Rail started the trials in 2022 with the aim of improving customer service and enhancing passenger safety (file image of London Liverpool Street station)

Cameras placed at ticket barriers at mainline railway stations across the country analyse customers' faces (file image)

The Information Commissioner's Office (ICO) previously warned companies against using the technology.

What to do if you think you have been filmed?

Under UK law, people have the right to request CCTV footage of themselves.

The individual needs to make a request to the owner of the CCTV system, either in writing or verbally.

However, the Network Rail scheme does not include facial recognition technology, which is used to identify a person, so individuals may have difficulty obtaining any footage.

Owners of CCTV cameras can also refuse to share any footage if other people can be seen in it.

Meanwhile, the Information Commissioner's Office (ICO) has urged organisations to assess the public risk before using such technology, and warned that any firms which do not act responsibly, pose a risk to vulnerable people or fail to meet ICO expectations will be investigated.

People who are unhappy about being filmed can complain to Network Rail first, giving it a chance to resolve any privacy-related issues, and then to the ICO if the matter remains unresolved.

ICO guidance states: 'You should give the organisation you're unhappy with a chance to sort things out before bringing your complaint to us.

'Many data protection complaints can be resolved quickly and easily with the organisation.'


The ICO also said the technologies are 'immature' and that 'they may not work yet, or indeed ever'.

Jake Hurfurt, Head of Research & Investigations at Big Brother Watch, said: 'Network Rail had no right to deploy discredited emotion recognition technology against unwitting commuters at some of Britain's biggest stations, and I have submitted a complaint to the Information Commissioner about this trial.

'It is alarming that as a public body it decided to roll out a large scale trial of Amazon-made AI surveillance in several stations with no public awareness, especially when Network Rail mixed safety tech in with pseudoscientific tools and suggested the data could be given to advertisers.

'Technology can have a role to play in making the railways safer, but there needs to be a robust public debate about the necessity and proportionality of tools used.

'AI-powered surveillance could put all our privacy at risk, especially if misused, and Network Rail's disregard of those concerns shows a contempt for our rights.'

Carissa Véliz, an associate professor in philosophy at the Institute for Ethics in AI at the University of Oxford, told Wired: 'Systems that do not identify people are better than those that do, but I do worry about a slippery slope.

'There is a very instinctive drive to expand surveillance. Human beings like seeing more, seeing further. But surveillance leads to control, and control to a loss of freedom that threatens liberal democracies'.

Speaking in 2022, the ICO's deputy commissioner Stephen Bonner said: 'Developments in the biometrics and emotion AI market are immature. They may not work yet, or indeed ever.

'While there are opportunities present, the risks are currently greater.

'At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgments about a person that are inaccurate and lead to discrimination.

'The only sustainable biometric deployments will be those that are fully functional, accountable and backed by science.

'As it stands, we are yet to see any emotion AI technology develop in a way that satisfies data protection requirements, and have more general questions about proportionality, fairness and transparency in this area.

Civil liberties group Big Brother Watch have raised privacy concerns about the Network Rail scheme and have submitted a complaint to the Information Commissioner's Office (ICO) (file image of Carlisle railway station)

London Euston is one of the stations where the cameras have been placed

'The ICO will continue to scrutinise the market, identifying stakeholders who are seeking to create or deploy these technologies, and explaining the importance of enhanced data privacy and compliance, whilst encouraging trust and confidence in how these systems work.'

AI researchers have also warned that using the technology to detect emotions is 'unreliable' and should be banned.

In the EU, such systems are banned or deemed 'high risk' under the Artificial Intelligence Act.

Gregory Butler, chief executive of Purple Transform, the company that built the system for Network Rail's trial, said that although the trial continued, the part looking at emotions and demographics had been short-lived.

The cameras were also part of a wider trial to use AI to tackle issues such as trespassing, overcrowding, bicycle theft and slippery floors.

A Network Rail spokesperson said: 'We take the security of the rail network extremely seriously and use a range of advanced technologies across our stations to protect passengers, our colleagues, and the railway infrastructure from crime and other threats.

'When we deploy technology, we work with the police and security services to ensure that we're taking proportionate action, and we always comply with the relevant legislation regarding the use of surveillance technologies.'

However, in a later statement, they said no analysis of emotions took place, adding: 'The purpose of the trial was never to gather emotion data, the capability was removed before the technology was trialled as there wasn’t a want or need for it.'

The system did send images for analysis by Amazon Rekognition software to record demographic details, such as a passenger’s gender and age range, but that part of the programme has now ended.
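Neither Network Rail nor Purple Transform has published the trial's code, but the demographic attributes described above match what Amazon Rekognition's publicly documented DetectFaces API returns. As a minimal sketch, assuming Python with boto3 and valid AWS credentials, a call of this kind could look like the following; the filename, region and output handling are illustrative assumptions, not details from the trial:

```python
# Hypothetical sketch only: this is not Network Rail's or Purple
# Transform's code. It shows how a single camera frame could be sent
# to Amazon Rekognition's DetectFaces API for the kind of demographic
# estimates described above (age range and gender, plus emotions).
import boto3

rekognition = boto3.client("rekognition", region_name="eu-west-2")  # assumed region

with open("barrier_frame.jpg", "rb") as f:  # assumed: one frame from a ticket-barrier camera
    image_bytes = f.read()

# Attributes=["ALL"] asks Rekognition to return age range, gender and
# emotion estimates in addition to the default face bounding boxes.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # an estimated range, e.g. {'Low': 25, 'High': 35}
    gender = face["Gender"]   # e.g. {'Value': 'Male', 'Confidence': 99.1}
    # Rekognition scores each candidate emotion; take the highest-confidence one.
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(age, gender["Value"], top_emotion["Type"])
```

Notably, DetectFaces returns only estimated attributes with confidence scores, never an identity, which is consistent with Network Rail's statement that the scheme did not use facial recognition.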

Amazon did not comment on the scheme.
