New York City intends to use AI-powered subway cameras to predict crime and dangerous threats before they occur.
New York's Metropolitan Transportation Authority (MTA) has announced that it is exploring the use of artificial intelligence systems to prevent crime and dangerous behavior on subway platforms.
Using artificial intelligence to prevent crime in the New York subway
Michael Kemper, the MTA's chief security officer, said at a safety committee meeting:
“We are exploring and testing technologies such as artificial intelligence so that we can identify potential warning signs of trouble or concerning behavior on subway platforms.”
He explained that if a person acts irrationally or suspiciously, the system could automatically issue an alert and dispatch security personnel or police to the scene before an incident occurs.
Emphasizing that the aim of the technology is to identify and counter threats in advance rather than merely record incidents after the fact, Kemper added that “AI is the future” and said the MTA is currently working with several technology companies to find the most suitable solution for the New York subway system.
However, no details were given about the companies involved, the exact applications of the AI, or the types of behavior to be identified.
MTA spokesman Aaron Donovan told Gothamist that the technology does not use facial recognition and focuses on identifying suspicious behavior rather than identifying individuals.
This is not the first time the MTA has turned to artificial intelligence. In 2023, the agency announced that it had used AI-based surveillance software to track people entering the subway without paying the fare. That technology collects detailed information about when, where, and how these violations occur.
Recently, the British Ministry of Justice has also been developing an algorithm reminiscent of the film Minority Report, which aims to identify people considered likely to commit murder. The project, initially known as the “homicide prediction project,” uses data from the police and other government agencies, and has raised serious concerns about privacy violations and the entrenchment of structural discrimination in the justice system.
RCO NEWS