Government's Assessment of Social Media's Efforts to Detect Child Sexual Exploitation and Abuse in Livestreams
Ministry of Digital Development and Information
Summary
This question concerns Ms He Ting Ru’s inquiry into the Infocomm Media Development Authority’s assessment of social media platforms' capabilities to detect child sexual exploitation and abuse in livestreams and current risk-mitigation measures. Minister for Digital Development and Information Josephine Teo noted that the Broadcasting Act and Online Criminal Harms Act allow the Government to direct services to disable access to such material. Under the Code of Practice for Online Safety, designated services are required to implement proactive technologies to detect and swiftly remove exploitation content and protect users from grooming. These services must also submit annual safety reports to the Infocomm Media Development Authority detailing their efforts to combat harmful content. These reports are published alongside the Authority's assessments of each service's performance to ensure accountability and safety.
Transcript
31 Ms He Ting Ru asked the Minister for Digital Development and Information (a) what is Infocomm Media Development Authority's assessment of social media platforms' efforts and capabilities to detect child sexual exploitation and abuse in livestreams, which are known to be more difficult to detect than in uploaded material; and (b) what measures are being taken currently to address this risk.
Mrs Josephine Teo: The Government has put in place measures to address child sexual exploitation and abuse online. Under the Broadcasting Act and Online Criminal Harms Act, the Government can issue directions to social media services or internet service providers to disable Singapore users' access to child sexual exploitation and abuse material published online.
Under the Code of Practice for Online Safety – Social Media Services (Code), designated Social Media Services are required to minimise Singapore users' exposure to child sexual exploitation and abuse material on their services, including livestreams. The designated services must put in place technologies and processes to proactively detect and swiftly remove such material. They must also take steps to protect users from child exploitation and abuse activity, such as online grooming.
The Code additionally requires designated services to submit annual online safety reports to the Infocomm Media Development Authority (IMDA), detailing their measures to combat harmful content, including child sexual exploitation and abuse material, and to improve users' safety. The reports are published on IMDA's website, alongside IMDA's assessment of each designated service's performance.