
Seoul rolls out AI-enabled CCTV cameras to stop suicides; privacy experts divided

By Christy Somos, CTVNews.ca Writer


    TORONTO, Ontario (CTV Network) — The Seoul Metropolitan Government has announced a new pilot project to address suicides on bridges spanning the Han River, the main waterway bisecting the city, by rolling out AI-enabled CCTV cameras that use deep learning to identify the “patterns” of people in crisis.

Currently, Seoul has CCTV operators working on three rotating shifts that cover 24 hours a day, seven days a week, at four different control centres in the Yeouido, Banpo, Ttukseom and Gwangnaru neighbourhoods on the river.

In a statement emailed to CTVNews.ca, the Seoul Metropolitan Government said it has been operating a CCTV surveillance and response system since 2012 on the Mapo and Seogang Bridges, which have the highest number of suicide attempts on the Han River. The bridges are monitored 24 hours a day, with operators “proactively responding to suicide attempts.”

From 2015 to 2018, the Seoul Metropolitan Government invested approximately CAD$10.3 million to expand its system to eight more bridges, installing 20 CCTV systems comprising three types of camera – fixed, rotating and thermographic – on all 10 bridges.

The system allows rescuers to monitor the bridges through 572 CCTV cameras in real time and arrive at the scene within four minutes, the statement said.

In a press release, the government explained that the new partnership between the Seoul Institute of Technology and the Seoul Fire & Disaster Headquarters (SFDH) is aimed at improving the current suicide detection system, and that the partnership has been analyzing data from the SFDH since April 2020.

The data includes dispatch reports, CCTV footage, data from bridge sensors, information from people who had previously attempted suicide, report history, phone calls and text messages.

The AI system would then send the CCTV footage it flags to a monitoring agent at the new integrated control centre, who would dispatch the relevant authorities.

“This system allows rapid responses to suicide attempts—both before or after an incident—and minimizes surveillance loopholes. Not only that, but it can also dramatically reduce warning errors thanks to AI’s ability to reflect environmental factors, including illumination levels and weather, as well as characteristics of Han River bridges such as wobbling caused by winds and traffic. As data accumulates, the accuracy of the system will increase further,” the release states.
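The release does not describe the model in technical detail, but a minimal Python sketch can illustrate the kind of pipeline it outlines: a deep-learning model assigns each CCTV clip a “crisis score,” environmental conditions such as lighting and wind raise the alert threshold to cut down on false alarms, and anything above the threshold is forwarded to a human monitoring agent. The Conditions class, the alert_threshold and review_clip functions, and every number below are hypothetical, chosen only for illustration.

    # Hypothetical sketch only: Seoul has not published its model or code.
    # A deep-learning model (not shown) would produce the crisis score for each clip;
    # environmental context adjusts the threshold, and flagged footage is passed on
    # to a human monitoring agent who decides whether to dispatch rescuers.
    from dataclasses import dataclass

    @dataclass
    class Conditions:
        illumination: float    # 0.0 (dark) to 1.0 (bright daylight)
        wind_speed_kmh: float  # wind causes bridge wobble, a known source of false alarms
        raining: bool

    def alert_threshold(cond: Conditions) -> float:
        """Raise the flagging threshold when conditions are known to produce noisy detections."""
        threshold = 0.80
        if cond.illumination < 0.3:
            threshold += 0.05  # low light
        if cond.wind_speed_kmh > 40:
            threshold += 0.05  # camera and bridge movement in high wind
        if cond.raining:
            threshold += 0.03
        return min(threshold, 0.95)

    def review_clip(clip_id: str, crisis_score: float, cond: Conditions) -> bool:
        """crisis_score stands in for the output of the deep-learning model."""
        flagged = crisis_score >= alert_threshold(cond)
        if flagged:
            # In the system described above, flagged footage goes to the integrated
            # control centre, where an agent decides whether to dispatch authorities.
            print(f"FLAG {clip_id}: score {crisis_score:.2f} -> send to monitoring agent")
        return flagged

    # Example: a windy night raises the threshold from 0.80 to 0.90.
    night_wind = Conditions(illumination=0.2, wind_speed_kmh=50, raining=False)
    review_clip("bridge-cam-07/22:14", 0.86, night_wind)  # below the raised threshold
    review_clip("bridge-cam-07/22:15", 0.93, night_wind)  # flagged for human review

As in the pilot the release describes, the software in this sketch only flags footage; the decision to send rescuers rests with the human agent watching the feed.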

According to the Seoul Metropolitan Government, every year approximately 486 people attempt suicide on bridges spanning the Han River and authorities are able to save 96.63 per cent.

While many would laud the purpose of such a high-tech solution to a social issue that is rampant in South Korea, which has the highest suicide rate among OECD countries, privacy experts remain divided on the merits of such a program and on whether it would be approved for use in Canada.

SURVEILLANCE INFRASTRUCTURE

South Korea’s surveillance infrastructure is vast, with more than 1.15 million CCTV cameras installed across the country, according to a 2019 Statista estimate – a number expected to grow.

In its emailed statement, the Seoul Metropolitan Government said that as of December 2020, the city was managing 75,431 CCTVs “installed for security, facility management and traffic safety.”

The country’s Personal Information Protection Act (PIPA) has strict compliance requirements for both the private and public sectors when it comes to identifying information. However, government agencies that require personal data for public interest purposes can collect and use data without obtaining consent, and for many Koreans, surveillance on such a scale is simply a part of daily life.

This played out in South Korea’s response to the COVID-19 pandemic, in which health authorities could zero in on contact tracing with astonishing precision thanks to the Infectious Diseases Control and Prevention Act.

Health authorities could conduct an investigation using CCTV, credit card transaction data and mobile phone tracking to warn close contacts of their exposure to someone with COVID-19, and to dispatch sterilization teams to disinfect areas the person had visited.

Emergency alert text messages were sent to all mobile phones through Korea’s cellular broadcasting service, which was also used by the government and health authorities to update citizens on new cases and issue warnings on COVID-19 hot spots.

In other countries, such as Canada, that kind of surveillance and data gathering would be blocked by more stringent data protection and privacy laws.

And while an AI-enabled CCTV camera may assist authorities in helping someone in crisis, privacy experts are divided on whether the end result justifies the means.

Former Ontario privacy commissioner Ann Cavoukian said that while it’s “admirable” the Seoul Metropolitan Government is addressing concerns about suicide, using AI-enabled CCTV is an “encroachment on privacy.”

“At the very least, the government should be providing signage and give notice to the public walking on these bridges that these new measures are in effect,” Cavoukian said in a telephone interview with CTVNews.ca. “There will be no privacy, and people should be made fully aware that when they walk on that bridge, their identity is being captured, along with what they’re doing.”

Cavoukian said she doesn’t believe a program like Seoul’s would be authorized for use in Canada, as “people don’t want to be tracked.”

“That’s what this will enable you to do, whether you do it intentionally or not,” she said. “Once it starts, that’s what will happen, and then law enforcement will use it…in the last two years concern for privacy has been on an all time high, so I don’t think it would be popular here personally.”

David Fraser, a privacy lawyer with McInnes Cooper in Halifax, N.S., disagrees, but says the program would have to be monitored closely.

“Using video surveillance technologies, particularly when coupled with something that resembles biometrics, more resembles artificial intelligence,” Fraser said in a telephone interview. “There’s a real good reason to be nervous about that and to scrutinise it very closely.”

However, Fraser said that “at the end of the day” Seoul’s pilot project is a “very interesting” application of AI technology that’s “clearly intended to reduce the loss of life and actually proactively save people’s lives.”

“Really, what could be the negative impacts or implications for individuals other than perhaps you have a pedestrian crossing the bridge with no intention to harm themselves and having an intervention,” Fraser said of the program. “I think I would be supportive of the project as long as the kind of diligence has also gone into making sure that the images and videos they’re collecting in order to feed their machine learning is appropriately protected, with appropriate controls.”

And while Canada does not have the same mass surveillance culture that South Korea has, Fraser pointed out that locations such as subway stations, train stations and bus stations, which are already equipped with CCTV cameras, could be candidates for a program like Seoul’s should Canada want to implement one.

“I think you would probably find the nearest subway station in Canada is covered in every square inch with CCTV cameras,” Fraser said. “If such a programme were to be proposed in Canada, those transit authorities are all public sector, and all subject to privacy laws that regulate the collection use and disclosure of personal information, which for some information includes video images.”

Fraser said any such program proposed in Canada would have to go through a rigorous process known as a “Privacy Impact Assessment,” which weighs the ethical and logical scope of any policy that could impact Canadians’ privacy – including collecting only the information that is needed, minimizing the collection of any data that is not, and supervising the analysis of that data.

‘FUNCTION CREEP’

The Seoul Metropolitan Government denied in its statement that there were privacy concerns associated with the project, saying it had been “provided with de-identified video data from the Seoul Fire & Disaster Headquarters for scientific research purposes only to extract ‘features’ for deep learning.” The statement reiterated “there is no possibility to raise concern [sic] because the video data is discarded after use according to security procedures on a monthly basis.”

Fraser likened Seoul’s pilot project to the phenomenon of people giving up private medical information in order to raise awareness or to assist in research to prevent “what happened to them happening to others.”

However, Fraser did echo Cavoukian’s concerns about the potential for the erosion of privacy, saying he was cautious of “function creep.”

“So cameras were probably put there in the first place for a specific purpose… including identifying people at risk of self-harm,” Fraser said. “But putting in any new technology, that changes the game.”

Fraser said ethical complications can arise when seemingly innocent ideas on expanding the mandate of the original system are put forward.

“What happens when somebody says, ‘Hey, we have a problem with pickpockets, so let’s tweak the model so we can identify people who are pickpocketing?’” Fraser said. “The tweaks seem incremental and before you know it, we have a surveillance system that’s identifying people and leading to law enforcement interventions – you just have to look at wellness checks and police interventions to see how often those end in really bad outcomes.”

Fraser said any project like Seoul’s has to be preceded by “carefully asking ourselves the appropriate questions.”

“I think we also need to always be doing a bit of a reality check when it comes to this — societally, how are we with this?”

——

With files from Reuters
