Network Rail 'secretly used Amazon AI cameras to scan rail passengers' faces at major stations including Waterloo and Euston to record their ages, genders and emotions' – how can YOU find out if you've been filmed?

  • Do YOU know if you’ve been filmed? Email richard.percival@mailonline.co.uk  

A privacy row broke out today after it emerged that Network Rail has been secretly using Amazon’s AI technology to monitor thousands of rail passengers at major stations across the UK. 

Campaigners last night accused the Government-owned company of displaying a ‘contempt for our rights’ by secretly installing AI-powered surveillance at rail hubs across Britain.

It is feared that thousands of people have had their faces recorded with ‘smart’ CCTV cameras to establish their age, gender and emotions at London Waterloo and Euston stations, as well as Manchester Piccadilly, Leeds, Glasgow, Reading, Dawlish, Dawlish Warren and Marsden stations.

The scheme has been taking place for two years, with the data sent to Amazon Rekognition, according to documents obtained through a Freedom of Information request by civil rights group Big Brother Watch. 

According to the documents, there are between five and seven AI cameras at each station.
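The documents say footage was fed to Amazon Rekognition, whose face-analysis service can return an estimated age range, a gender label and a ranked list of emotions for each detected face. As a purely illustrative sketch (not Network Rail’s actual pipeline), the snippet below shows the shape of a Rekognition-style `DetectFaces` response and how the three attributes reportedly logged could be pulled out of it; the sample values are invented, and a real call would require boto3, AWS credentials and an actual image.

```python
# Illustrative sample in the shape of Amazon Rekognition's DetectFaces
# response (Attributes=['ALL']). All values below are invented.
sample_response = {
    "FaceDetails": [
        {
            "AgeRange": {"Low": 25, "High": 35},
            "Gender": {"Value": "Female", "Confidence": 99.1},
            "Emotions": [
                {"Type": "CALM", "Confidence": 83.2},
                {"Type": "HAPPY", "Confidence": 9.6},
                {"Type": "CONFUSED", "Confidence": 2.1},
            ],
        }
    ]
}

def summarise_face(face: dict) -> dict:
    """Reduce one FaceDetails entry to the attributes the trial reportedly
    recorded: an age estimate, a gender label and the most confident emotion."""
    age = face["AgeRange"]
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    return {
        "age_range": (age["Low"], age["High"]),
        "gender": face["Gender"]["Value"],
        "emotion": top_emotion["Type"],
    }

for face in sample_response["FaceDetails"]:
    print(summarise_face(face))
```

Note that the ‘emotion’ here is simply whichever label the model scored highest — the critics quoted below argue that such scores are pseudoscientific, not that they are hard to compute.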

Big Brother Watch last night warned that ‘AI-powered surveillance could put all our privacy at risk’ – adding that Network Rail had shown a ‘contempt for our rights’.

Network Rail started the trials in 2022 with the aim of improving customer service and enhancing passenger safety (file image of London Liverpool Street station)

Cameras placed at ticket barriers at mainline railway stations across the country analyse customers’ faces (file image)

The Information Commissioner’s Office (ICO) previously warned companies against using the technology.

What to do if you think you have been filmed?

Under UK law, people have the right to request CCTV footage of themselves.

The individual needs to make a request to the owner of the CCTV system, either in writing or verbally.

However, the Network Rail scheme does not use facial recognition technology to identify individuals, so passengers may have difficulty obtaining any footage. 

Owners of CCTV cameras can also refuse to share any footage if other people can be seen in it.

Meanwhile, the Information Commissioner’s Office (ICO) has urged organisations to assess the public risk before using such technology, and warned that any firms which do not act responsibly, pose a risk to vulnerable people or fail to meet ICO expectations will be investigated.

People who are unhappy about being filmed can complain to Network Rail first, giving it a chance to resolve any privacy-related issues, and then to the ICO if the matter remains unresolved. 

ICO guidance states: ‘You should give the organisation you’re unhappy with a chance to sort things out before bringing your complaint to us. 

‘Many data protection complaints can be resolved quickly and easily with the organisation.’

The ICO has also said the technologies are ‘immature’ and that ‘they may not work yet, or indeed ever’. 

Jake Hurfurt, Head of Research & Investigations at Big Brother Watch, said: ‘Network Rail had no right to deploy discredited emotion recognition technology against unwitting commuters at some of Britain’s biggest stations, and I have submitted a complaint to the Information Commissioner about this trial. 

‘It is alarming that as a public body it decided to roll out a large scale trial of Amazon-made AI surveillance in several stations with no public awareness, especially when Network Rail mixed safety tech in with pseudoscientific tools and suggested the data could be given to advertisers.’

‘Technology can have a role to play in making the railways safer, but there needs to be a robust public debate about the necessity and proportionality of tools used. 

‘AI-powered surveillance could put all our privacy at risk, especially if misused, and Network Rail’s disregard of those concerns shows a contempt for our rights.’

Carissa Véliz, an associate professor in philosophy at the Institute for Ethics in AI at the University of Oxford, told Wired: ‘Systems that do not identify people are better than those that do, but I do worry about a slippery slope.

‘There is a very instinctive drive to expand surveillance. Human beings like seeing more, seeing further. But surveillance leads to control, and control to a loss of freedom that threatens liberal democracies’. 

The ICO’s deputy commissioner Stephen Bonner said: ‘Developments in the biometrics and emotion AI market are immature. They may not work yet, or indeed ever.’

‘While there are opportunities present, the risks are currently greater.

‘At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgments about a person that are inaccurate and lead to discrimination.

‘The only sustainable biometric deployments will be those that are fully functional, accountable and backed by science.

‘As it stands, we are yet to see any emotion AI technology develop in a way that satisfies data protection requirements, and have more general questions about proportionality, fairness and transparency in this area.

Civil liberties group Big Brother Watch have raised privacy concerns about the Network Rail scheme and have submitted a complaint to the Information Commissioner's Office (ICO) (file image of Carlisle railway station)

London Euston is one of the stations where the cameras have been placed

‘The ICO will continue to scrutinise the market, identifying stakeholders who are seeking to create or deploy these technologies, and explaining the importance of enhanced data privacy and compliance, whilst encouraging trust and confidence in how these systems work.’ 

AI researchers have also warned that using the technology to detect emotions is ‘unreliable’ and should be banned. 

In the EU such systems are banned or deemed ‘high risk’ under the Artificial Intelligence Act.

Gregory Butler, chief executive of Purple Transform, which ran the trial for Network Rail, said that although the trial was continuing, the part analysing emotions and demographics had been short-lived. 

Network Rail has refused to answer questions about the scheme but in a statement, a spokesperson said: ‘We take the security of the rail network extremely seriously and use a range of advanced technologies across our stations to protect passengers, our colleagues, and the railway infrastructure from crime and other threats.

‘When we deploy technology, we work with the police and security services to ensure that we’re taking proportionate action, and we always comply with the relevant legislation regarding the use of surveillance technologies.’
