Impact on Well-being of Children from Prolonged Exposure to Delayed Removal of Harmful Online Content
Ministry of Digital Development and Information
Summary
This question concerns inquiries by Mr Yip Hon Weng regarding the impact on children of prolonged exposure to harmful content caused by its delayed removal, and measures to ensure that social media platforms prioritise timely content moderation. Minister Josephine Teo stated that the Infocomm Media Development Authority (IMDA) requires Designated Social Media Services to implement safety measures under the Code of Practice for Online Safety. Recent assessments identified shortcomings on platforms such as X and HardwareZone, and found that most services took an average of five or more days to act on user reports. IMDA is engaging platforms to improve their responsiveness and is studying the use of age assurance technology to better protect young users. While authoritative findings on the impacts of prolonged exposure are currently unavailable, the Government will continue to monitor and review platforms' safety efforts.
Transcript
6 Mr Yip Hon Weng asked the Minister for Digital Development and Information (a) whether the Ministry can provide insights into the impact on the well-being of children of prolonged exposure caused by the delayed removal of harmful content by social media platforms; and (b) what targeted measures are being considered to ensure social media platforms prioritise the timely removal of content that poses significant risks to vulnerable users.
Mrs Josephine Teo: The Government seeks to protect vulnerable users, especially children, from harmful and age-inappropriate content on social media services. The Infocomm Media Development Authority's (IMDA's) Code of Practice for Online Safety requires social media services with significant reach or impact to put in place online safety measures, including differentiated measures for young users.
Six Designated Social Media Services (DSMSs) were assessed recently for the comprehensiveness and effectiveness of these measures. X and HardwareZone were found to have shortcomings in user safety measures for children. For example, the services' own community guidelines for children were frequently breached, resulting in children being more exposed to age-inappropriate content than was desirable. Even for DSMSs that did better in this area, such as Facebook and YouTube, children could still access some age-inappropriate content.
Beyond user safety features, DSMSs should improve the effectiveness and timeliness of their response to user reports. More often than not, content that violated the services' community guidelines was not removed even after users reported its presence. Most DSMSs also took an average of five days or more to act on these user reports.
These findings show that DSMSs need to step up efforts to protect users on their platforms. IMDA has engaged DSMSs to do so and will review their responses when their next annual online safety reports are due in June 2025. In addition, IMDA is studying how social media services should use age assurance technology to better protect children and youth from age-inappropriate content. As to the impact of prolonged exposure, the Ministry of Digital Development and Information does not have authoritative findings to share. Nonetheless, we will continue to monitor DSMSs' efforts to enhance online safety.