TOKYO – East Japan Railway Co. (JR East)'s temporary use of a system allowing its security cameras to recognize faces, including those of people released from prison, has caused a stir. While the system could contribute to public safety and facilitate criminal investigations, it raises privacy concerns and fears that it could have a chilling effect on society.
In July, ahead of the Tokyo 2020 Olympic and Paralympic Games, JR East announced that it had installed security cameras capable of detecting suspicious behavior at major train stations and other locations, and that, as necessary, the company would in some cases inspect people's personal belongings.
But the company did not disclose that the system would also be programmed to detect named individuals wanted by police on suspicion of crimes, as well as individuals on full or temporary release from prison for terrorist acts and other major incidents affecting JR East or its passengers.
Information on those released from prison was obtained from the Public Prosecutor's Office under the crime victims notification system. If the installed cameras captured a person matching a registered individual, the AI would automatically detect them. However, this information was ultimately removed from the system before the registration phase.
In any case, the detection of people exhibiting suspicious behavior and people wanted by police is already underway. Of the 8,350 cameras installed at major stations and elsewhere, the company has not disclosed details such as how many are equipped with the AI system or who operates it.
How to ensure safety on public transport has become an urgent question. In August, 10 passengers were injured, some seriously, in a knife attack on an Odakyu Line train in Tokyo. Damage prevention and reduction plans compiled by the Ministry of Land, Infrastructure, Transport and Tourism in September included efforts such as enhancing security camera functions to detect suspicious individuals and objects. Despite this, in October, 17 more passengers were injured in an attack on a Keio Line train in Tokyo. No definitive measures guaranteeing both convenience and safety have been found.
Even so, railway companies' expectations of the technology are by no means low. On September 21, JR East announced that the scope of people subject to its initial crime prevention measures using facial recognition security cameras would be limited. But the company also said, "We may revisit this in response to changes in social conditions."
The operator of the Tokaido Shinkansen, Central Japan Railway Co. (JR Central), said of such measures: "We intend to continue careful study while taking privacy and other issues into account."
These security cameras convert the characteristics of people's faces into data and compare that information with facial information already on record. The system can identify an individual standing far away without that person realizing it. But because facial recognition can also be used to track people's interactions with others, their movements, and their purchase histories, there are concerns that information such as whether a person has been released from prison could infringe on human rights and have a chilling effect on society.
But the Act on the Protection of Personal Information does not require an individual's consent for the acquisition of data describing their facial features. Still, the government's Personal Information Protection Commission, which oversees the law's implementation, states on its website that it believes "there must be notifications and announcements about the purpose of use" when security cameras with facial recognition capabilities are installed.
Is JR East's response appropriate? Regarding the notices JR East posts at stations saying "Security system (facial recognition system) in operation," Professor Taro Komukai, an information law specialist at Chuo University, told the Mainichi Shimbun: "The company must have a clearly defined objective, such as 'We can detect people who have caused problems at stations.'"
Regarding the information on those released or temporarily released from prison, the professor said, "If its use can be limited to a means of early prevention of incidents, to protect the lives of passengers and station facilities as company assets, then it may qualify as an unavoidable legal exception." But he added, "If it goes beyond that objective and starts sharing information with investigative institutions, then it becomes problematic."
In the United States and Europe, the acquisition of facial recognition data is subject to particularly strict rules. The EU's General Data Protection Regulation prohibits the use of data on a person's body, including facial features, without the person's consent. According to reports, retailers and other entities in Spain were fined this year for identifying and monitoring customers by matching biometric data against information on people released from prison.
(Japanese original by Ken Aoshima and Shotaro Kinoshita, Tokyo City News Department)