Cameras using AI software have been used to surveil thousands of train travellers in the UK without proper disclosure. A report released following a freedom of information (FOI) request details how an image recognition system was deployed for testing at eight stations around the UK. Major stations such as London Waterloo and Manchester Piccadilly were part of the trial, which, according to the report, aimed to improve security in and around railway hubs.
The AI cameras learn to recognise objects and actions and have been touted as a way to help staff monitor and resolve issues such as overcrowding, theft and anti-social behaviour. But the Amazon “Rekognition” technology, which analyses images captured as passengers pass by, also makes it possible to profile customer demographics by age, gender and even emotion.
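For a sense of what that demographic analysis involves: Rekognition’s publicly documented DetectFaces call returns per-face age-range, gender and emotion estimates. The Python sketch below uses the standard boto3 client to show the shape of that output; the frame file name is hypothetical, and this illustrates the public API, not the trial’s actual pipeline.

```python
# Illustrative sketch of Amazon Rekognition's DetectFaces API, which returns
# age, gender and emotion estimates per detected face. This is NOT the
# trial's actual pipeline; the input file name is hypothetical.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("platform_frame.jpg", "rb") as f:  # hypothetical captured frame
    image_bytes = f.read()

# Attributes=['ALL'] requests the full set of facial attributes,
# including AgeRange, Gender and Emotions.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]                           # e.g. {'Low': 25, 'High': 35}
    gender = face["Gender"]["Value"]                 # estimate with a confidence score
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(f"age {age['Low']}-{age['High']}, {gender}, "
          f"dominant emotion: {top_emotion['Type']}")
```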
Big Brother Watch, the civil liberties group behind the FOI request, has criticised the extent of the cameras’ use and the lack of public awareness of the trial.
“The rollout and normalization of AI surveillance in these public spaces, without much consultation and conversation, is quite a concerning step,” said Jake Hurfurt, the group’s head of research and investigations, as reported by Wired.
Between five and seven cameras or sensors were deployed at each station, and at Leeds up to 350 cameras were linked to the system. The report lists various potential uses for the technology, ranging from detecting overcrowding, slippery floors and bike theft, to flagging dangerous behaviour such as running, skateboarding and even smoking in and around platforms, to identifying people at risk of suicide.
Gregory Butler, CEO of data firm Purple Transform, which is partnering with Network Rail on the trials, said: “AI helps human operators, who cannot monitor all cameras continuously, to assess and address safety risks and issues promptly.”
During the trial, the systems picked up five trespassing incidents deemed serious at two different locations. Two of them involved people collecting balls from the tracks, one remaining on a high-speed line for more than five minutes.
But, according to Hurfurt, the tech’s ability to “analyze for emotion” such as “happy, sad, and angry” is one of the most “concerning” aspects of the trials, with the report suggesting commercial rather than health and safety uses for the data. The authors noted that the emotion analysis and the gathering of passenger “satisfaction” data could be used to “maximise advertising and retail revenue.” This is despite a 2022 warning from the Information Commissioner that such technology was immature and could be discriminatory.
In response to criticism about the “alarming” scope of the trial, a Network Rail spokesperson said: “When we deploy technology, we work with the police and security services to ensure that we’re taking proportionate action, and we always comply with the relevant legislation regarding the use of surveillance technologies.”
Meanwhile, Hurfurt agreed that “Technology can have a role to play in making the railways safer,” but added that “there needs to be a robust public debate about the necessity and proportionality of tools used.”
The use of AI camera technology is becoming more widespread: at the Paris Olympics, it will be used to surveil crowds for potential weapons and bomb threats.