Written Answer to Unanswered Oral Question

Government's Assessment of Social Media's Efforts to Detect Child Sexual Exploitation and Abuse in Livestreams

Transcript

31 Ms He Ting Ru asked the Minister for Digital Development and Information (a) what is Infocomm Media Development Authority's assessment of social media platforms' efforts and capabilities to detect child sexual exploitation and abuse in livestreams, which are known to be more difficult to detect than in uploaded material; and (b) what measures are being taken currently to address this risk.

Mrs Josephine Teo: The Government has put in place measures to address child sexual exploitation and abuse online. Under the Broadcasting Act and Online Criminal Harms Act, the Government can issue directions to social media services or internet service providers to disable Singapore users' access to child sexual exploitation and abuse material published online.

Under the Code of Practice for Online Safety – Social Media Services (Code), designated Social Media Services are required to minimise Singapore users' exposure to child sexual exploitation and abuse material on their services, including livestreams. These designated services must put in place technologies and processes to proactively detect and swiftly remove such material. They must also take steps to protect users from child sexual exploitation and abuse activity, such as online grooming.

The Code additionally requires designated services to submit annual online safety reports to the Infocomm Media Development Authority (IMDA), detailing their measures to combat harmful content, including child sexual exploitation and abuse material, and to improve user safety. The reports are published on IMDA's website, alongside IMDA's assessment of each designated service's performance.