Amazon-powered AI cameras used to detect emotions of unsuspecting UK train passengers

Network Rail did not respond to questions about the trials sent by WIRED, including questions about the current state of AI use, emotion recognition, and privacy concerns.

A Network Rail spokesperson says: “We take the security of the rail network very seriously and use a range of advanced technologies to protect passengers at our stations, our colleagues and the railway infrastructure from crime and other threats. When we use technology, we work closely with the police and security services to ensure we are taking proportionate action, and we always comply with relevant legislation regarding the use of surveillance technologies.”

It is unclear how widely the emotion recognition analysis was deployed, with the documents at times saying the use case “should be looked at more carefully” and reports from stations saying it was “impossible to verify accuracy.” However, Gregory Butler, CEO of data analytics and computer vision company Purple Transform, which is working with Network Rail on the trials, says the capability was turned off during the test and no images were stored while it was active.

Network Rail’s documents about the AI trials describe a number of use cases, including the potential for cameras to send automated alerts to staff when they detect certain behaviour. The systems do not use controversial face recognition technology, whose purpose is to match people’s identities against those stored in a database.

“The main benefit is that it allows for rapid detection of trespassing incidents,” says Butler. He adds that his firm’s analytics system, SiYtE, is being used at 18 locations, including train stations and along the tracks. In the past month, there have been five serious trespassing incidents at two locations, Butler says, including a teenager collecting balls from the tracks and “a man who spent more than five minutes collecting golf balls along the high-speed line.”

At Leeds railway station, one of the busiest outside London, Butler says there are 350 CCTV cameras connected to the SiYtE platform. “Analytics are being used to measure the flow of people and identify issues such as overcrowding on the platform and, of course, trespassing – where the technology can filter out track staff through their PPE uniforms,” he says. “The AI helps human operators, who cannot constantly monitor all the cameras, to quickly assess and resolve safety risks and issues.”

Network Rail documents claim cameras used at Reading station helped police speed up a bike theft investigation by identifying bikes in the footage. “It was established that, although the analytics could not be confident in detecting theft, they could detect a person with a bike,” the files say. They also say new air quality sensors used in the trials could save staff the time spent on manual checks. One AI use case draws on data from the sensors to detect “sweaty” floors that have become slippery due to condensation, and alerts staff when they need to be cleaned.

While the documents detail some elements of the trials, privacy experts say they are concerned about the overall lack of transparency and debate about the use of AI in public spaces. In a document prepared to assess data protection issues with the system, Big Brother Watch’s Hurfurt says, there appears to be a “disapproving attitude” toward those with privacy concerns. One question asks: “Will some people object to this or find it intrusive?” An employee writes: “Usually, no, but for some people it’s just not worth it.”

At the same time, similar AI surveillance systems that use the technology to monitor crowds are increasingly being deployed around the world. During the Paris Olympic Games in France later this year, AI video surveillance will monitor thousands of people, attempting to detect crowd surges, the use of weapons, and abandoned objects.

“Systems that don’t identify people are better than systems that do, but I worry about a slippery slope,” says Carissa Veliz, an associate professor of psychology at the Institute for Ethics in AI at the University of Oxford. She points to similar AI trials on the London Underground that initially blurred the faces of people who might have been dodging fares, but later changed approach, unblurring the photos and keeping the images for longer than initially planned.

“There’s a very innate tendency to expand surveillance,” Veliz says. “Humans like to see more, to see farther. But surveillance leads to control, and control leads to a loss of freedom, which is a threat to liberal democracies.”
