Written Answer to Unanswered Oral Question

Impact of Social Media Platforms' Algorithmic Content Recommendation Systems on Distribution of Scam Content to Users

Speakers

Mr Dennis Tan Lip Fong
Mr K Shanmugam, Coordinating Minister for National Security and Minister for Home Affairs

Summary

This question concerns whether the Government assesses how social media algorithms amplify scam content, and whether it will mandate that platforms display only content users explicitly follow or search for. Mr Dennis Tan Lip Fong asked about the impact of algorithmic recommendation systems on the distribution of organic and paid scam content to platform users. The Coordinating Minister for National Security and Minister for Home Affairs, Mr K Shanmugam, stated that under the Online Criminal Harms Act, the Police impose anti-scam requirements on platforms, such as advertiser verification and user reporting. He explained that holding platforms responsible for adjusting their own systems is more productive than having the Government directly assess their algorithms, and noted that the requirements will be further tightened. Finally, the Minister rejected restricting feeds to only followed content, noting that this could prevent legitimate advertising and may not stop the circulation of scam advertisements that feature content users had requested.

Transcript

21 Mr Dennis Tan Lip Fong asked the Coordinating Minister for National Security and Minister for Home Affairs (a) whether the Government has conducted assessments on how social media platforms' algorithmic recommendation systems for organic and paid content amplify the distribution of scam content to users; and (b) whether the Government is considering to mandate that social media platforms default to display only content that individuals explicitly follow or search for.

Mr K Shanmugam: Under the Online Criminal Harms Act, the Police impose ex ante anti-scam requirements on online platforms. This includes measures that reduce or remove the circulation of scam content on the platforms, such as advertiser verification and user reporting. We think that this approach is more productive, compared to trying to directly assess and require changes to the platforms' content distribution algorithms. The latter is not easy to do and platforms must be responsible to adjust their algorithms, systems or processes as necessary and as they deem appropriate, to comply with the Government's anti-scam requirements.

We intend to further tighten the anti-scam requirements on platforms and will announce the details in due course. We do not think it is reasonable to stipulate that platforms only display content that individuals explicitly follow or request for. This could prevent legitimate advertising activity and, in any case, may not stop the circulation of scam ads if the ads feature content that the user had requested for.