
Online Safety (Relief and Accountability) Bill

Bill Summary

  • Purpose: The Bill aims to provide victims of online harms with timely, effective, and accessible redress by establishing the Office of the Commissioner of Online Safety to direct the quick removal of harmful content, introducing statutory torts to allow victims to take civil action against perpetrators or platforms, and providing means to identify anonymous offenders.

  • Key Concerns raised by MPs: Members of Parliament raised concerns regarding the difficulty and high cost for victims to seek remedies through existing court processes, the slow response times from platforms in removing egregious content, and the challenges of identifying anonymous perpetrators. They also highlighted the severe mental and emotional impact of online harms—such as deepfakes, cyberbullying, doxxing, and mob behavior—particularly on women, girls, and youths.

  • Responses: Minister for Digital Development and Information Mrs Josephine Teo and Minister of State for Digital Development and Information Ms Rahayu Mahzam emphasized that the Bill adopts a victim-centric approach by empowering the Commissioner to act swiftly without long-drawn investigations and by defining 13 specific categories of harm to reduce legal ambiguity. They justified the legislation as a necessary step to set new norms for online behavior, hold platforms accountable, and provide a simplified alternative to complex litigation while drawing on international models like Australia's eSafety Commissioner.

Reading Status: 2nd Reading
Introduction — no debate


Transcripts

First Reading (15 October 2025)

"to provide persons affected by online harmful activity with timely redress through the office of the Commissioner of Online Safety and rights of action in court proceedings, to improve and promote online safety, to deter and prevent online harmful activity, to promote accountability and responsible and reasonable conduct in the online environment, to make amendments to certain other Acts for alignment with this Act, and to make related amendments to other Acts",

presented by the Minister for Digital Development and Information (Mrs Josephine Teo) read the First time; to be read a Second time on the next available Sitting of Parliament, and to be printed.


Second Reading (5 November 2025)

Order for Second Reading read.

1.07 pm

The Minister for Digital Development and Information (Mrs Josephine Teo): Mr Speaker, I move that the Bill be now read a Second time.

Sir, Ms He Ting Ru has filed a notice to move amendments to the Bill. They relate to the principles and outcomes of the Online Safety (Relief and Accountability) Bill. I therefore seek your consent to have the Second Reading of the Bill and the amendments debated together.

Mr Speaker: Please proceed.

Mrs Josephine Teo: Sir, the Online Safety (Relief and Accountability) Bill, or OSRA, is designed to protect victims of online harms. Unfortunately, we know of too many victims. In too many cases, we actually know the victims. We know their families and the pains they suffer. We know too that this is a very difficult problem. We want to take further action, but how? This Bill is our answer.

While the Bill is new, Members' concerns are not. As far back as 2014, when the Protection from Harassment Bill was debated, Ms Tin Pei Ling asked about removing offensive content and identifying perpetrators online. In 2019, we amended the Protection from Harassment Act (POHA) to cover online harassment. Members of Parliament (MPs), including Mr Christopher de Souza, Mr Patrick Tay, and Minister of State Rahayu Mahzam, then a first-term MP, asked if victims would be given sufficient assistance when they approached the courts for help.

In 2021, deep in the trenches of COVID-19, my Ministry set up the Sunlight Alliance for Action. It was a serious attempt to address online harms, especially those targeted at women and girls. Members like Senior Minister of State Sim Ann and Ms Hazlina Abdul Halim were actively involved.

Sunlight shone a spotlight on the prevalence of online harms and their severe impact on victims. Arising from this effort, a new organisation, SG Her Empowerment (SHE), was born. Subsequent studies by SHE showed that two in five online harm survivors experienced at least one severe form of impact. These include serious emotional, mental or physical consequences like depression or self-harm. Three in four victims of online harms chose not to voice their opinions online for fear of being targeted.

In 2022, Members debated the white paper on Singapore Women's Development. The Government made a clear commitment to tackle online harms. Many victims are women and girls, but men and boys have also suffered in silence. With your permission, Mr Speaker, may I ask the clerk to distribute handouts to Members?

Mr Speaker: Go ahead. [Handouts were distributed to hon Members. Please refer to Annex 1.]

Mrs Josephine Teo: Members may also access these materials through the SG Parl MP mobile app.

Members may refer to Handout 1. We can see that many victims add up to some very disturbing numbers. A Ministry of Digital Development and Information (MDDI) study revealed that more than four in five respondents had encountered harmful online content in the past year. This included content supporting illegal activities, sexual content and violent content. Many of these harms occur on platforms where people go for entertainment, such as Facebook, YouTube and Instagram.

In addition to POHA, the Government has made several moves to enhance online safety. In 2022, we amended the Broadcasting Act to require social media services to disable access to egregious content, including content advocating suicide or self-harm. In 2023, we introduced the Code of Practice for Online Safety. Designated social media services were required to put in place systems and processes to enhance online safety, especially for children and youths. These laws have helped in some measure to stem the tide of online harms. However, they were not a panacea, nor could they be. The impact of technology evolves and is often understood only with time. We have therefore made it a point to review our laws regularly.

Most recently, in January 2024, Parliament debated a Motion on Building an Inclusive and Safe Digital Society. The Motion and other Parliamentary Questions filed by MPs have contributed many useful ideas. They include calls by Mr Darryl David for victim-centric remediation; Ms Mariam Jaafar and Dr Wan Rizal for better platform accountability; Ms Tin Pei Ling, for more timely responses and respectful behaviours online; Ms Nadia Samdin, to lift the veil of online anonymity; Ms Hany Soh to address the use of deepfakes for extortion; Mr Yip Hon Weng, to firmly combat cyberbullying; and Mr Abdul Muhaimin, to deter hate speech.

Mr Speaker, these calls by Members have reinforced where we can do more to strengthen protection for victims of online harms. They align closely with multiple studies conducted over the years by the Government, academics and non-government organisations.

Let me share some important findings from these studies. The first important finding is that victims often want quick takedowns of online harms. A study by the Infocomm Media Development Authority (IMDA) showed that platforms take about five days or more to act on valid reports of online harms. Many valid reports are not even acted on. This is highly unsatisfactory to victims.

The second important finding is that victims are daunted by the existing channels for seeking remedies and in particular, they find court processes complex and expensive. The third important finding is that victims face difficulties seeking restitution. This is because they are often unable to confirm who was responsible for the harm. They continue to be in fear and cannot find closure.

Today, we are taking a decisive step to introduce legislation. It will give victims of online harms an avenue of timely, effective and accessible redress. OSRA aims to do this in three ways.

First, OSRA will establish the Office of the Commissioner of Online Safety. An agency, the Online Safety Commission (OSC), will be set up to support the Office of the Commissioner. The OSC will be empowered to issue directions to communicators of harmful content, administrators of groups or pages and platforms. These directions require the recipients to act quickly on the harmful content. The OSC will start by tackling the online harms that are most prevalent and that Singaporeans consider the most severe. Other harms will be dealt with progressively. Minister of State Rahayu will elaborate on how the OSC is intended to operate.

Second, OSRA will introduce statutory torts. This is a major improvement. It provides a legal basis for the victim to take action. Most importantly, it will allow victims to initiate legal proceedings against those responsible for causing or continuing the harm.

Sir, studies have shown that Singaporeans see online safety as a shared responsibility between technology companies, individuals, the Government, parents and schools. The proposed statutory torts will clarify the duties that users, administrators of groups and platforms owe to each other. It will significantly reduce ambiguity about what online behaviours are considered harmful and unlawful.

Third, OSRA provides victims with the means to identify the person who caused the harm. Not all victims want this, but for those who do, OSRA plugs a gap. Minister Edwin Tong will elaborate on this.

Sir, if Parliament passes OSRA, Singapore will be one of only a few countries worldwide to have an agency dedicated to helping victims of online harm. While we have few counterparts to learn from, this does not deter us. In setting up the OSC, we considered various models, such as mediation and dispute resolution.

Victims are at the very heart of the OSC's mission. But the OSC's operational model and its oversight process are designed to be reasonable and fair to all. This is balanced against the need to act expeditiously and reduce the possibility of victim re-traumatisation. We also need to minimise long-drawn investigations and appeal cases.

In designing our set-up for the Office of the Commissioner of Online Safety, we drew lessons from Australia's eSafety Commissioner. We learnt much from their generous sharing of experience to build a model that works for Singapore. For example, eSafety started small, focusing only on image-based child abuse material and cyberbullying of a child or young person under 18.

To put OSC on a firm footing, we will, similarly, also start off by tackling the most severe and prevalent harms. Minister of State Rahayu will share more on the phased implementation approach.

While we learnt important lessons from Australia's eSafety, I should point out, there are differences. For example, the Commissioner of Online Safety in Singapore will be empowered to deal with a wider set of harms. OSRA identifies 13 types of harms which affect Singaporeans from all walks of life.

The Commissioner can issue directions other than what is colloquially known as a "takedown". For instance, the Commissioner can issue a Right-of-Reply Direction that will allow the offending content to be seen alongside the victim's reply. The Commissioner can also seek the platform's cooperation to obtain identity information of perpetrators.

Sir, I cannot overstate the importance of the OSC being able to act swiftly and effectively. This victim-centric approach is the cornerstone of the OSC. After all, what Singaporeans have said they want most is a simple and effective way to protect themselves from online harms.

Importantly, it sends a signal that harms to victims inevitably become harms to society. Our collective well-being is compromised when those who are harmed are denied restitution. With online harms becoming more prevalent, our barometer for acceptable online behaviour has been steadily eroded. But just as we would not tolerate harmful behaviours in the physical world, we also cannot allow bad behaviours to become normalised online.

Through OSRA, we hope to avoid such an eventuality. In its place, we want to lay the foundations for our citizens' online interactions: by fostering trust in online spaces, we enable Singaporeans to participate safely and confidently in our digital society.

Sir, we will continue to work with all our stakeholders from technology companies to community partners to implement OSRA. We continue to welcome Members' views on how we can further improve online safety for Singaporeans.

Mr Speaker, may I have your permission to provide a summary in Mandarin, please?

(In Mandarin): [Please refer to Vernacular Speech.] Harmful content online has seriously affected our online environment and goes against our social values.

The Online Safety (Relief and Accountability) Bill focuses on three main areas to strengthen online safety and protect Singaporeans from online harm.

First, establishing the OSC will enable victims to receive help more quickly. When harmful content is encountered, the OSC can also issue directions requiring perpetrators or platforms to swiftly remove relevant content and stop its dissemination. This is precisely the outcome that most victims wish for. Second, the Bill also clearly lists out unlawful online behaviour, allowing victims to take civil action against the perpetrators and make perpetrators bear legal consequences for their actions. Finally, the OSC can help victims obtain the identity information of perpetrators. Although perpetrators often hide their identities, we are discussing with various platforms how to ensure that perpetrators cannot hide forever.

Overall, the purpose of the Bill is not only to provide timely assistance to victims, but also to help them seek justice. We also hope the Bill can serve as a deterrent, set new norms for online behaviour and create a safer, more positive online space.

(In English): Sir, our journey towards building a safer online world does not stop with OSRA. We are especially concerned about younger persons and plan to do more to ensure the usage of online services is age-appropriate. We know anecdotally and from studies that youths can easily bypass age restrictions online. They use false birthdays, spam email accounts or borrowed credentials for verification. When one person figures out how to do so, they teach their friends, too.

Such behaviours can cause more youths to be exposed to inappropriate content or persons. To moderate such risks, we have introduced age assurance measures on app stores. Hopefully, young users do not download age-inappropriate apps in the first place, but we know this is not enough. A more holistic approach is needed.

This means extending age assurance requirements to social media services which attract many young users. We will explore better protections on other platforms, such as online games, where children and youths spend much of their time.

In addition, we will look beyond content to address unwanted interactions and excessive time spent on apps. We will review the additional measures needed to better protect younger users online. As and when new technology arises, we will study them and update our safeguards, where required.

Mr Speaker, let me conclude. In the debate to follow, I suggest Members keep in mind three points.

One, OSRA introduces many new features in our online safety ecosystem. They are novel and meaningful. Nevertheless, there is much work ahead to ensure that these features can be effectively implemented.

Two, OSRA is designed to help victims in areas and ways that matter most to them. This was how we determined the types of harms, range of directions and thresholds of harms. We have also introduced safeguards to prevent overreach of the Commissioner and OSRA.

Three, these considerations should, therefore, inform further proposals and suggestions. We should be careful not to distract the Commissioner and the OSC, or impose more conditions which will impede victims from getting timely, effective and accessible relief.

With that, Mr Speaker, I look forward to the debate on this Bill. Sir, I seek to move.

Question proposed.

Mr Speaker: Minister of State Rahayu Mahzam.

1.26 pm

The Minister of State for Digital Development and Information (Ms Rahayu Mahzam): Mr Speaker, earlier, Minister Josephine Teo spoke about the genesis, the intent of the OSRA Bill and the need for more to be done to improve online safety. I will take Members through the Bill.

Let me first give an overview of the structure of the Bill. The Bill is split into 15 Parts. Part 1 consists of standard provisions, such as the purpose and interpretation sections. Part 2 establishes the Office of the Commissioner of Online Safety, or the Commissioner. Part 3 defines the categories of harm that will fall under the scope of the Bill. Part 4 governs the statutory reporting mechanism; and Part 5, the types of directions that may be issued by the Commissioner. Part 6 covers the investigative powers of the Commissioner. Part 7 governs the oversight mechanisms that the Commissioner will be subject to. Part 8 covers offences and enforcement. Part 9 consists of provisions relating to service of documents and powers to make regulations. Parts 10 to 13 establish the statutory torts and the damages that may be sought under the torts, which Minister Edwin Tong will speak on later. Part 14 deals with jurisdictional, procedural and miscellaneous matters, and Part 15 covers amendments made to other Acts.

In my speech, I will be focusing on Parts 2 to 8 of the Bill, and will explain how the Bill will help to define what behaviours are unacceptable in the online world.

Part 3 of the Bill will specify 13 categories of harm, with each category defined in legislation, to create a common understanding of the scope and definition of each type of harm. These definitions were developed in consultation with the public and platforms, and were guided by a Steering Committee. The Steering Committee was chaired by then-Minister for Law and Home Affairs, Mr K Shanmugam, and made up of Government agencies and experienced industry members.

Let me now explain each harm in turn.

First, online harassment, including sexual harassment. Online harassment is defined as the communication of online material that a reasonable person would conclude is threatening, abusive, insulting, sexual or indecent, and is likely to cause a person harassment, alarm, distress or humiliation.

Second, doxxing, which will cover the publication of a person's identity information that a reasonable person would conclude is likely to have been intended to cause harassment, alarm, distress or humiliation.

Third, online stalking, which will be defined as engaging in a course of online conduct that involves online acts or omissions associated with stalking, and that a reasonable person would conclude is likely to cause harassment, alarm, distress or humiliation to the other person.

Fourth, intimate image abuse or the non-consensual sharing of intimate images, a harm that can result in a severe and lasting impact on victims. This is defined as the communication of online material that contains an intimate image or recording of a person, without their consent, that a reasonable person would conclude is likely to cause that person harassment, alarm, distress or humiliation. This includes images or recordings that may have been altered or generated by artificial intelligence (AI) or any other means. An offer to sell or distribute, and advertisements of an intimate image or recording, are also included.

Fifth, image-based child abuse. Like intimate image abuse, this will include images or recordings that may have been altered or generated by AI or other means and will cover depictions of both physical and sexual child abuse.

Sixth, inauthentic material abuse. With the rapid development in AI technology and the advent of more powerful and easily accessible generative AI models, we have seen more deepfakes being used to cause harm to others. This harm is defined as the communication of inauthentic material that a reasonable person would conclude is likely to cause harassment, alarm, distress or humiliation because it is false or misleading.

The material is regarded to be inauthentic where it is an image, video or soundbite that has been manipulated or generated through digital means to create a false or misleading depiction of the victim's words, actions or conduct, and is realistic enough such that a reasonable person would believe that the victim did say such words or engage in such actions or conduct. This category includes not only express depictions but also implied depictions.

Seventh, online impersonation, which is defined as online activity where one person pretends to be another without their consent and would lead a reasonable person to believe that the activity is conducted by the second person. However, parody, satire or commentary which no reasonable person would believe is made by the victim, is not online impersonation.

The eighth category is online instigation of disproportionate harm. This harm seeks to address the issue of "mob behaviour" or cancel campaigns, where one individual calls for numerous others to pile on to another person, in response to the person's actions or speech, instigating disproportionate harm. This category of harm will not be applicable to statements that tend to rally or call the public to undertake core political activities, such as calling for others to vote for a candidate in a Presidential or Parliamentary election.

The ninth category is the non-consensual disclosure of private information. This harm is defined as the publication of any private information of a person without consent, and that a reasonable person would conclude was likely to cause that person harassment, alarm, distress or humiliation. The types of information that would be deemed as private would be context specific, and would depend on whether the information is already in the public domain. Whether the individual is of the view that the information is sensitive may also be taken into account by the Commissioner. The Minister for Digital Development and Information will also be able to prescribe certain types of information as "private information".

The tenth and eleventh categories are the incitement of enmity and the incitement of violence. Enmity refers to feelings of enmity, hatred or hostility against any group. Violence refers to unlawful force or unlawful violence. These harms are what we term "group harms", as they aim to tackle harmful online content that is likely to harm groups in Singapore. A group will be defined as a group of persons of any description and may include, for example, a group distinguished by race or religion.

Lastly, the twelfth and thirteenth categories are the publication of false material and the publication of statements harmful to reputation. The former will cover statements about a person that are false and that a reasonable person would conclude are likely to cause harm. The latter will cover statements that a reasonable person would conclude are likely to harm the reputation of the victim and likely to cause any other additional harm to the victim. These two categories of harm differ slightly from the others in terms of the remedies victims may seek. I will touch more on this later.

There have been suggestions to include two new categories, namely sexual grooming and the publication of online material that encourages or promotes suicide or acts of self-injury. These harms have not been included at this stage, because, firstly, these harms are already addressed through other measures in our broader online safety framework, through the Penal Code for sexual grooming, and the Broadcasting Act for online material that encourages or promotes suicide or acts of self-injury.

Secondly, the OSRA Bill is designed in a way that seeks to stop online harm from occurring, such as through the removal of content. This is made possible because the online harm is identified through a single, discrete post, like the 13 categories of harms I described earlier. In contrast, sexual grooming often occurs over the course of communication or exposure to various pieces of content, where each individual piece of content may not be harmful in and of itself.

Thirdly, OSRA is designed in a way that relies on user reports. Recourse can only be provided in cases where the victim recognises that they are victims. Unfortunately, it is often the case that victims of sexual grooming or consumers of self-harm and/or suicide content are not aware that they are victims. However, in such cases where these victims do come to the Commissioner for assistance, the Commissioner will work with the relevant agencies, including the Police, to provide the necessary assistance.

Members might observe that not all of these 13 categories are new and some of them are already being addressed by existing law. For example, online harassment is already being covered by POHA. The Penal Code as well as the Criminal Law (Miscellaneous Amendments) Bill, which was debated in Parliament earlier, both address intimate image abuse. Such content might also be covered as egregious content under the Broadcasting Act. The suggested additions to the categories of harm would also be covered by the Penal Code and the Broadcasting Act.

The group harms have also already been addressed by the Maintenance of Religious Harmony Act and the Maintenance of Racial Harmony Act, for groups distinguished by religion or race. The common law action of defamation, as supplemented by the Defamation Act, provides individuals with an avenue of recourse for defamatory statements or statements which harm a person's reputation. OSRA will complement these existing laws.

Some laws address harms that affect the public interest, such as the Maintenance of Religious Harmony Act, the Maintenance of Racial Harmony Act and the Broadcasting Act. Others provide individual remedies through legal action in Courts, such as POHA. However, victims have shared that seeking legal remedies is often a lengthy and expensive legal process, which often deters them from seeking recourse.

This can be seen from SHE's 2023 Online Harms Report, which showed that 28% of respondents who decided not to take legal action did so due to the cost. The same SHE report also found that most respondents preferred the swift and permanent removal of content over taking legal action. OSRA is intended to complement existing legislation by expanding the scope of harms covered, through including new harms, such as inauthentic material abuse and non-consensual disclosure of private information, and by allowing victims to seek recourse in a simple and timely manner.

We acknowledge that the Internet evolves rapidly and the online harms ecosystem may change drastically over short periods, with new harms emerging or with bad actors finding new ways to cause harm. Thus, OSRA has been designed in a way to allow us to adapt to changes in the online harms ecosystem.

The Minister for Digital Development and Information will be able to prescribe additional types of online harms. This power will be vital in ensuring that we address emerging harms as soon as possible. We will ensure that this power is used judiciously and for harms that are particularly egregious to individuals, to prevent the unchecked expansion of the scope of OSRA.

Part 2 of the Bill establishes the Office of the Commissioner of Online Safety. It will be responsible for administering the statutory reporting mechanism to provide timely relief for victims of online harms. The Commissioner of Online Safety will be appointed by the Minister for Digital Development and Information and will be supported by Deputy Commissioners and Assistant Commissioners.

While the Minister may provide broad guidance to the Commissioner, the Commissioner will be the final decision-maker for all cases. We will also establish a new agency, called the OSC, that will support the Office of the Commissioner of Online Safety. The Commissioner may delegate the exercise of all of, or any of, the functions or powers of the Commissioner to the officers of the OSC, to allow them to effectively support the Commissioner. This includes the power to issue directions, which I will speak more on later.

However, the Commissioner may not delegate the power of appointment or delegation, or the power to issue advisory guidelines. The OSC will be administratively supported by the IMDA.

As announced by Prime Minister Lawrence Wong during his speech at the Smart Nation 2.0 Launch in October 2024 and reiterated by Minister Josephine Teo earlier in March this year, the OSC will be set up in the first half of 2026. We have also considered the importance of transparency. To that end, the Commissioner will consider publishing regular reports for public awareness on online harms and the Commissioner's work, which may include information on aggregated caseloads, and anonymised information, insofar as these do not re-traumatise victims.

Let me now say more about the reporting process, which, as I shared earlier, is governed by Part 4 of the Bill. With your permission, Mr Speaker, may I ask the Clerk to distribute a set of handouts to the Members?

Mr Speaker: Please go ahead. [Handouts were distributed to hon Members. Please refer to Annex 2.]

Ms Rahayu Mahzam: Members may also access these materials through the MP@SGPARL mobile app. Members may refer to Handout 2 for an overview of the user journey with the OSC.

First, prior to submitting a report to the OSC, victims will, generally, be required to report the online harm to the platforms. While we are setting up the OSC to provide timely relief to victims of specified online harms, platforms remain the first port of call. Platforms must continue to play an active role and take responsibility for the safety of their users online. Where platforms fail to act on online harms within 24 hours, victims can then file a report to OSC.

The requirement to first report to platforms before reporting to the OSC will, however, not be required for certain egregious harms, such as intimate image abuse and image-based child abuse. This step will also not be required for doxxing. Victims of these harms can submit a report directly to the OSC.

Our view has always been that platforms should take responsibility for keeping their users safe online. This is evident from the approach we have taken under the Broadcasting Act. For example, under the Code of Practice for Online Safety – Social Media Services, designated social media services (DSMSs) with significant reach or impact in Singapore are already required to submit annual reports to be published on IMDA's website. These reports contain information on the DSMSs' measures to combat harmful and inappropriate content, and metrics, such as the number of reports received from users, as well as the response times to act on these reports.

There are other baseline requirements. To be eligible to submit a report, victims must be Singapore Citizens, Permanent Residents or have a prescribed connection to Singapore. We intend for this to cover foreigners who are residing in Singapore for the long-term at the onset.

Where victims are under the age of 18, the parent or guardian of the victim may also submit a report on their behalf. For example, earlier in November 2024, we saw deepfake nude photos of students at the Singapore Sports School being created and circulated by the student-athletes. In such a case, both the student victims and the parents of the victims depicted in the deepfakes would be able to submit a report to the OSC.

We have also considered situations where a victim strongly prefers that the report be filed for them. In such situations, victims will be able to authorise other persons to file reports on their behalf. This includes authorising employers or public agencies. For example, a hospital, if authorised, may file a report on behalf of a healthcare worker who is a victim of a specified online harm. In the spirit of the reporting mechanism, OSC will only assess reports submitted. OSC will not actively monitor and identify cases of online harms.

Reports will be submitted through the OSC's website, which will be designed with the user in mind. We are consulting other agencies and third-party organisations, such as SHE, in designing the website and the reporting form, to take into account user-centric language.

Accounting for victims' needs early is important, as survivors of online harms may, sometimes, avoid seeking external support to avoid re-traumatisation. The OSC's website will also provide victims and other persons with access to resources on online harms and advice on how to keep themselves safe online. This will include, for example, information on the different types of online harms and what to do when you have experienced them.

The OSC is also exploring partnerships with third-party organisations which victims may be referred to for support, such as counselling services or further resources. The details of this are still being worked out and we will provide more details in due course.

Each report will be assessed on its own merits and the OSC will be able to act on the face of the report submitted. This is to ensure that the OSC can move fast to address the online harm. Where necessary, such as where the information provided in the report is unclear, the Commissioner will be empowered, under Part 6, to conduct investigations, to better identify the relevant facts of each case. The OSC will also develop internal practices and ensure that case officers are trained to handle each case sensitively.

Let me turn now to directions, which are listed in Part 5 of the Bill.

The Commissioner will be empowered to issue directions to stop online harms from continuing to occur or to prevent further online harms from affecting the victim where there is reason to suspect that the online harm was conducted in respect of the victim or the victim group, as the case may be. The threshold for the issuance of directions is modelled after the threshold set for the Police to take protective action in the Criminal Procedure Code and the Online Criminal Harms Act (OCHA).

We had considered raising the threshold for the issuance of directions to "reasonable grounds to believe". However, we ultimately landed on "reason to suspect", to ensure that online harms can be stopped in a timely manner. We will be putting in place oversight mechanisms to ensure that OSC directions are only issued where appropriate. I will speak more on the oversight mechanism later.

The OSC may issue directions to three different parties: the communicator of the online harm, the administrator of a group or location where the online harm occurred, and the platform on which the online harm occurred.

Broadly, communicators and administrators may be issued directions that require them to remove specified material or disable a specified location, to restrain them from posting certain types of content or carrying out certain types of conduct or to require them to put up a victim's reply.

Administrators may also be issued other directions, such as directions that require them to put up a label to warn visitors that the online location has been subject to previous OSC directions or to restrict access by a Singapore account to the online location managed by them.

Platforms may be issued directions that require them to prevent end-users in Singapore from accessing specified content or online locations, to restrict interactions between an account and end-users in Singapore, to ban a Singapore account or to post a victim’s reply on its service.

As I mentioned earlier, two of the categories of harm – publication of false material and publication of statements harmful to reputation – differ slightly in terms of remedies that victims may receive. Generally, victims of these harms will only be able to seek a Right-of-Reply Direction to allow them to have their side of the story heard. We will be introducing new types of directions requiring recipients to act on content that can be distinguished by unique identifiers, such as usernames, keywords or hashtags.

The OSC may also issue what we consider as Enhanced Directions to certain prescribed online services. These Enhanced Directions impose additional requirements on the prescribed online service and may require the recipient to act on identical online harmful material, to take further steps to prevent online harm from occurring in the future or to reduce engagement of its users with a class of material. Members may refer to Handout 3 for the full list of directions that the OSC may issue, and Handout 4 for illustrations of online harmful activity and possible OSC directions.

In deciding whether to issue a direction and the type of direction to be issued for a case, the OSC may consider a basket of factors, including: the degree of the harm caused or likely to be caused; the number of persons harmed or likely to be harmed; the manner and circumstances in which the online harmful activity occurred; whether the conduct of the online harmful activity was reasonable, such as when a comment or post would be considered to be a "fair comment"; the likelihood of further online harmful activity being conducted; and whether the direction would be contrary to any public interest. These factors provide the OSC with much needed flexibility, to ensure that appropriate action is taken in every case.

The OSC will also publish guidelines detailing the factors that the OSC will consider in its decision-making process. Such guidelines will also include illustrative examples of when the OSC will or will not act.

In cases where the OSC is made aware of non-compliance with its directions, the OSC may take further escalatory actions. These actions include the issuance of orders following non-compliance, such as: access blocking orders, which may be issued to providers of Internet access services, to disable Singapore end-users' access to an online location; or app removal orders, which may be issued to providers of app distribution services, to remove the specified app from the Singapore app stores. As these orders may affect all users in Singapore since they will no longer be able to access the apps or websites, they may only be used after careful consideration.

Part 8 of the Bill provides for offences and enforcement. The directions and orders issued by the OSC are legally binding, and non-compliance with these directions and orders will be a criminal offence. Where there has been non-compliance, the Commissioner will be empowered to conduct further investigations, such as requesting information or documents from individuals. These are the same Part 6 investigative powers that the Commissioner may exercise when assessing reports in the first instance.

In developing the penalties under the OSRA Bill, we have referenced other existing legislation, such as the Broadcasting Act, OCHA and the Penal Code. For example, the penalty for non-compliance with directions will be a fine of up to $20,000, or imprisonment for a period not exceeding 12 months, or both, for individuals. Individuals will also be subject to a continuing fine of up to $2,000 for each day the offence continues after conviction. Entities will be subject to a fine of up to $500,000 and a continuing fine of up to $50,000 for each day the offence continues after conviction.

The provision of false information to the Commissioner or the OSC will also be an offence. Individuals found guilty of this offence will be subject to a fine of up to $20,000, or imprisonment for a period not exceeding 12 months, or both. Entities will be subject to a fine of up to $50,000.

In some cases, it may be more appropriate to focus on rehabilitating the perpetrator rather than prosecuting them for non-compliance with an OSC direction. To that end, the Commissioner may put in place an online harms remedial initiative. This initiative could include the completion of volunteer programmes by the perpetrator, which may be taken into account when considering prosecution for non-compliance with the OSC’s directions.

As I mentioned earlier, we will be putting in place oversight mechanisms, which will be governed by Part 7 of the Bill. These mechanisms ensure that the OSC will be able to move quickly and with confidence on the face of the information it has received.

Victims, recipients of directions or orders, and other prescribed persons will have access to a two-step appeal process. Eligible persons may first apply to the Commissioner to reconsider the OSC’s decision. Thereafter, eligible persons may appeal against the OSC's reconsidered decision to an independent appeal panel that will be appointed by the Minister for Digital Development and Information.

For the reconsideration process, the OSC will re-assess the relevant case afresh, and may also take into account any new information that may be presented by the relevant parties after the initial assessment was conducted. Applicants will be able to submit such new information to the OSC. When a decision has been made, the OSC will inform the applicant and other affected parties of their reconsidered decision. The OSC may affirm, revoke, vary or substitute any earlier decision, direction or order issued.

The appeal panel will consist of individuals from academia, society and industry, across different areas of expertise, and will focus on assessing whether a specified online harm had occurred and whether the reconsidered decision made by the OSC is proportionate and justifiable. Where the victim, recipient or prescribed person is dissatisfied with the OSC’s reconsidered decision, they may submit an application to the appeal panel to appeal against the OSC’s reconsidered decision. Similar to the reconsideration process, the appeal panel will be able to affirm, revoke, vary or substitute decisions of the OSC in relation to the issuance or non-issuance of a direction. The appeal panel will also be able to hear appeals on the issuance of an order following non-compliance.

Each individual will be given one chance to have their case heard by the independent appeal panel. Should the individual continue to be dissatisfied with the outcome of the appeal, they may seek judicial review.

The establishment of a new office and the statutory reporting mechanism is a monumental task, one that will require close collaboration across Government. We have to do the OSC right, so that we can do right by the victims. This is why we will be implementing the reporting mechanism in phases. What this means is that we will be bringing the first five harms I shared about earlier into force within the first six months after the OSC opens its doors. These are: intimate image abuse; image-based child abuse; online harassment, including sexual harassment; doxxing; and online stalking. The rest of the harms will follow progressively.

These five harms are prioritised for a reason. They represent the most prevalent and serious harms faced by Singapore users online. These are also the online harms that Singaporeans are the most concerned about. For example, the Institute of Policy Studies' 2025 Online Safety Study showed that targeted harassment was seen by Singaporeans as "highly harmful". Members may refer to Handouts 1 and 5 for a broader overview of recent survey results and studies relating to online harms. Phasing the rollout will allow us to better manage the OSC's caseload and to properly develop the necessary guidelines, frameworks and capabilities to ensure that the OSC's decisions are consistent and appropriate in all cases. Mr Speaker, allow me to say a few words in Malay.

(In Malay): [Please refer to Vernacular Speech.] The need to provide accessible and timely programmes and resources to victims of online harms is not new. Over the years, we have taken steps to educate and better empower our citizens to act against online dangers. For example, in 2021, we established the Sunlight Alliance for Action (AfA) to tackle online harms.

In the span of one year, AfA organised campaigns to raise public awareness about online harms and their impact. These campaigns also equipped youths to provide better support to their peers who were affected by online harms.

As co-chair of this AfA, I was informed by our partners about the experiences that many victims went through. This is what drives me, to this day, to build a safer digital space.

The Sunlight AfA also inspired many of its members to continue their noble efforts to help victims. One example is the establishment of SHECARES@SCWO, a collaboration between SHE and the Singapore Council of Women's Organisations (SCWO). It is Singapore's first one-stop support centre for victims of online harms.

Efforts to enhance online safety require the involvement of every level of society, and the Government is ready to step forward to strengthen these efforts. With the establishment of the OSC, we will be able to provide timely follow-up actions for victims, help stop harms as quickly as possible and build a more robust support ecosystem for victims.

(In English): Mr Speaker, the OSRA and the OSC are just one of the many steps that we are taking to enhance online safety. I am heartened to note that online safety is a matter that both sides of this House are passionate about, and I invite all Members of this House and the public to continue our conversations on how we can better enhance online safety.

Mr Speaker: Minister Edwin Tong.

1.59 pm

The Minister for Law (Mr Edwin Tong Chun Fai): Mr Speaker, Sir, today, we take an important step to make our online spaces safer and fairer. The Bill before this House clarifies duties, provides relief and also strengthens accountability across the online ecosystem to deal with online harms. My colleagues have earlier taken Members through the key aspects of the Bill, and I will focus on outlining the new Statutory Torts Framework, as well as the End-User Identification measures in this Bill.

Sir, let me start by reiterating why we saw a need for this Bill. The victims of online harms are not just statistics. They may be our children, classmates, colleagues or neighbours. Their confidence, studies and indeed their livelihoods can be shaken and seriously affected.

For one Primary 4 student, cyber-bullying reared its ugly head when she started using Instagram. She based her self-esteem on how many "likes" and "followers" she had. She would ask her friends to "like" her photos or "follow" her account to appear popular, and if a photo did not receive over 100 "likes", she would delete it.

She began receiving comments about her appearance and hurtful messages from those who were supposedly her friends. Her mental health worsened. She self-harmed and she was later diagnosed with Post-Traumatic Stress Disorder (PTSD) when she eventually sought help at the Institute of Mental Health (IMH). She was on medical leave from school for most of the year.

Sir, this is a sad story. But what is worse is that it is not an isolated case. With your permission, Mr Speaker, may I ask the Clerks to distribute a handout?

Mr Speaker: Please proceed. [Handouts were distributed to hon Members. Please refer to Annex 3.]

Mr Edwin Tong Chun Fai: Sir, Members may also access these materials through the MP@SGPARL app.

Sir, Handout 6 sets out several other accounts: students, women, working adults and their loved ones. Online harms affect not only individuals and their families, but also the confidence and the trust of our society as a whole.

In the SHE 2023 Study on Online Harms cited by Minister of State Rahayu a short while ago, 76% of respondents were not comfortable expressing their personal views on potentially controversial issues and topics online. Women, youths and minorities are especially vulnerable: 22% of female youths experienced sexual harassment compared to 14% across all respondents; 52% of respondents aged 15 to 24 years old personally experienced online harms compared to 38% across all respondents; 14% to 21% of respondents believed they experienced harms because of their identity, such as race and religion.

As a result, victims exit public life or choose to stay offline. Their voices, and their representation, are lost. And proper discourse in the community becomes weaker. If this continues, we will have a divided society, and a weakening of our social fabric and collective trust.

Sir, in contrast, most Singaporeans feel safe on our streets. About 98% of Singaporeans feel safe walking out alone, even at night. In the offline world, we are mindful of how we behave towards one another. We know not to threaten, harass or insult other people. And in our everyday interactions, there are social norms that we know to abide by, in public spaces, in schools and also in workplaces. Such norms allow us to function and thrive as a society.

But when it comes to the online world, these norms are not quite observed in the same way. The sense of safety we have offline does not translate into the online space. The SHE 2023 study found that 58% of respondents reported personally experiencing and/or knowing others who faced online harms.

We studied why this might be so. And in the course of our study, three issues stood out: first, fragmented standards and weak accountability in the online harms regulation space; second, economic incentives that unfortunately reward sensational content; and third, online anonymity that emboldens misbehaviour.

These three issues have very much shaped the thinking behind the construct and the framework of this current Bill. Let me elaborate that for Members.

Thus far, the development of the rules and norms of the Internet has been largely left to the tech companies. Left on their own, different platforms will apply different rules, largely shaped by their own interests.

And in the absence of any common set of enforceable norms, it is not easy to expect or enforce accountability. Instead, these differing standards will continue to exist and operate in a manner which allows wrongdoers to exploit the gaps. This lack of accountability is exacerbated by the structural features of the Internet. There is a misalignment between tech companies' profit motive and the need and desire to enhance online safety.

At the Online Harms Symposium co-organised by the Ministry of Law (MinLaw) and the Singapore Management University (SMU) Yong Pung How School of Law in 2023, former Facebook employee turned whistle-blower, Ms Frances Haugen, spoke about how tech companies do have solutions to address online harms, but implementing them will eat into their profits and hurt their bottom line. Social media platforms profit from the amplification of sensational and inflammatory content. Viral, harmful content draws engagement, and higher engagement draws eyeballs, so removing such content will impact revenue.

We have some basis to believe that online safety considerations might well yield to profit generation, if left unregulated. Today, tech companies have community standards which ostensibly address online harms. Members can see some examples of these standards which are set out in Handout 7. However, most platforms do not necessarily adhere to their own community standards. Indeed, in the IMDA study cited by Minister Josephine Teo a short while ago, over 50% of legitimate user complaints were not addressed in the first instance. And this, we believe, is far from ideal, as victims are reliant on platforms to stop online harms.

At present, there is no framework which can give redress to victims at the speed and scale at which harms happen online. While victims may try to seek relief in court, there is a limit to how fast court proceedings can move, and they may also be costly. And we all know that in these types of cases, speed of redress is crucial. Most victims of such harms do not want to have to seek relief through the court process.

It is also clear that online users behave on the Internet differently from the offline world. Perpetrators are emboldened to act with impunity especially when they can remain anonymous.

First, they experience what we call the "Online Disinhibition Effect", a term coined by Prof John Suler. Prof Suler explains that anonymous Internet users separate their online persona from their in-person identity and they do not feel responsible for their behaviour online.

Second, there is little that victims can do to hold anonymous users accountable, simply because they remain anonymous and perpetrators know that they are unlikely to get caught and do not fear consequences.

Overall, in general, anonymity fuels bad behaviour.

To address these issues, the Government has taken proactive steps. Minister Josephine Teo outlined the reforms that protect users. Minister of State Rahayu explained how the new OSC will provide timely relief to stop harm. Let me address the remaining mechanisms in this Bill for Members.

Sir, we start with the proposition that, in some cases, stopping the harm alone might not be enough. Victims might require additional recourse, for example, compensation. And to do that, they will also need to know who lies behind that anonymous social media handle.

The Bill therefore seeks to introduce a framework to close those gaps. It will introduce clear statutory duties for online actors, defining what responsible conduct and therefore, what the standard of duty that is expected of them, ought to be. It provides civil remedies for breaches of those duties, giving victims the right to seek justice in Court and it creates an additional avenue of relief, complementing the quick administrative recourse that can be obtained through the OSC, so that victims can choose the path that best suits their needs.

During the Online Harms Symposium that I referred to a short while ago, Ms Haugen likened Internet and social media regulation to the evolution of road safety, something that happened several decades ago. In the 1960s, US car manufacturers vigorously resisted safety reforms. But informed legislators persisted with greater regulation, coupled with a push, a very determined push from concerned citizens as well as investors.

Through legislative changes, safety was made a core design principle in the manufacturing of cars. And it has been estimated that between 1960 and 2012, over 600,000 lives were saved in the United States (US) as a result of this.

In a similar way, not exactly the same, but in a similar way, we hope that this Bill will make online safety a design principle of the online space upfront, and not just an afterthought.

Sir, let me now outline how the Statutory Torts Framework is intended to work.

At the outset, our focus and intent is to empower victims. It is a very victim-centric approach. If you look at the example today of a victim who tries to speak to the platforms, and at the statistics that Minister Josephine Teo cited earlier, it is very difficult, and victims are pretty much powerless today.

At the Government level, we have the Broadcasting Act, we have the Protection from Online Falsehoods and Manipulation Act (POFMA), we have OCHA that empower the Government to act in a variety of ways. OSRA provisions will empower private persons to obtain relief, and Members might, therefore, be aware that clause 4 makes it clear that a public agency cannot commence a claim under the Statutory Torts Framework. This is a private citizens' private remedy.

I will now explain the types of online harms covered by the statutory torts. This framework will cover the same categories of harms that the OSC will act on, the same categories that Minister of State Rahayu took you through earlier, with some exceptions and refinements for coherence.

Clauses 83 to 88 cover the following harmful online activities: intimate image abuse; image-based child abuse; online impersonation; inauthentic material abuse; online instigation of disproportionate harm; and incitement of violence.

Harassment, doxxing and stalking will continue to be dealt with under POHA for communicators, but the new statutory duties for these harms will extend to administrators and platforms under this Bill, since POHA does not cover them.

The online harms omitted from the statutory torts – namely, false material, statements harmful to reputation and non-consensual disclosure of private information – are already well covered under the existing legal framework: the laws on defamation, as well as on privacy and confidentiality.

This alignment ensures coherence in our legislation – no overlapping and no double remedy.

In this Bill, the statutory torts will also not cover Incitement of Enmity. We think it is unwise to encourage such matters – which can be potentially explosive, emotive and divisive – to be dealt with litigiously, in a courtroom and so it will be dealt with by the OSC.

The statutory torts will be implemented in phases, in coordination with the OSC, following the phased implementation of harms that Minister of State Rahayu sketched out earlier.

Second, Sir, the Bill assigns clear duties to the key actors in the online ecosystem, and they are the communicators, the administrators and the platforms.

Clauses 83 to 88 therefore impose duties on communicators not to make or share any communication which constitutes an online harm.

Clauses 90 and 91 impose two duties on administrators.

First, they must not develop or maintain an online location in any manner that facilitates or permits online harm to take place – with the intention or knowledge that online harm is likely to take place. This duty covers administrators that are complicit in the online harms, and Members might be aware of the infamous chatgroup "SG Nasi Lemak" previously. This will cover the administrators of such a chatgroup.

Second, when notified of harm, they must act reasonably – more specifically, they must take reasonable care to assess if there is harm and if so, to take reasonable steps to address it. Clause 94 therefore imposes a similar duty on platforms to act reasonably when notified of harm.

I want to make clear to Members that the duty to take "reasonable steps" does not require platforms to conduct constant surveillance and monitoring. Liability arises only when an administrator or platform fails to act reasonably after receiving proper notice. They are not liable if, through no fault of their own, they did not receive the notice sent by the victim.

In assessing reasonableness, the Court will consider the circumstances of the case, including the seriousness as well as the persistence of the harm.

Let me illustrate this: an administrator or platform that receives notice of a harmful post for the first time may act reasonably by simply removing that post, if that is the appropriate remedy under the framework. But if the same account repeatedly causes harm in the same way, simply taking down the post each time it is put up may no longer suffice. In such cases, taking stronger action, such as suspending or disabling the offending account, may be the reasonable steps required under clauses 91 and 94. That is what I meant when I said you assess the entire factual matrix and situation holistically.

What is “reasonable” will therefore depend on the facts. The Courts can take into account factors such as the nature of the conduct; the context in which it occurred; and the effect and impact on the victim.

For example, putting up a post or creating a website to whistle-blow on serious misconduct may well be "reasonable" if done for a legitimate purpose and in a proportionate manner, even if it might cause harassment, or might be considered as online instigation.

There are also safeguards to address concerns that administrators and platforms may be inundated with frivolous notices or notices with insufficient information. The Bill provides that the particulars which an online harm notice must contain are to be prescribed. We will set these out clearly in a prescribed form, so that the categories of information are known upfront. This ensures that only genuine, properly documented cases trigger the duty to act, and that administrators and platforms have enough information to take "reasonable steps".

Ultimately, it is the Court that will look at the facts of each case, weigh the totality of the evidence and decide whether a claim is made out and if so, what remedy should follow and against which party. These are all fact-sensitive judgments that reflect the diverse realities of online interaction. The Bill therefore avoids the use of fixed or rigid formulas, to allow the Court to develop the law incrementally, while at the same time, keeping the focus squarely on online safety and responsibility.

The Bill recognises that platforms and administrators need not proactively scan for all harms. They only have to act responsibly once notified. Taken together, these duties encourage vigilance without imposing impossible burdens. They reflect a simple idea embodied in many legal principles, which is: if you control the space, then you must play your part in keeping it safe.

Next, Sir, let me turn to remedies. If a victim successfully establishes a claim, the victim must have access to effective and fair remedies. Under clause 96, victims may seek damages that the Court finds just and equitable, and other heads of damages that the Minister may prescribe in regulations, such as compensation for loss of earnings or an account of profits where perpetrators benefited from the harm.

The intent is to ensure that victims are properly compensated and that wrongdoers are not allowed to benefit from their behaviour. For some harms, the victim's earning capacity or livelihood may be affected, and they should be compensated for loss of future earnings or loss of earning capacity, as the case may be. In other cases, such as where intimate images have been put online for sale and perpetrators profit from this harm, an account of profits may be ordered so that the wrongdoer does not get to retain the benefits of the harm caused.

The regulations under this Bill reduce victims' uncertainty as to what remedies they are entitled to. But ultimately, it is for the Court to decide on the appropriate orders, based on the facts of each case.

Clause 98 introduces the concept of enhanced damages and empowers the Court to award such damages where a communicator or administrator persists with their conduct despite notice. We think that enhanced damages should apply to those who are the root cause of the harm, such as recalcitrant communicators or administrators who create harmful websites or chatgroups. We have, therefore, excluded the platforms. This framework is intended to incentivise and drive reasonable compliance, in some cases as quickly as possible. The enhanced damages framework also compensates victims for any additional harm resulting from a failure to comply. Overall, we hope to send a strong enough message, with a deterrent impact on the actors in the online space.

Therefore, enhanced damages may be awarded to compensate the victim for additional harm caused by the refusal to stop the online harm, penalise the communicator or the administrator for bad conduct, and the Court will consider the overall justice of the situation, when assessing whether to impose enhanced damages.

In addition, clause 99 empowers the Court to issue injunctions, both interim and permanent, to stop harm swiftly. These injunctions operate independently of any direction from the OSC, giving victims complementary routes to relief. The OSC and the Courts operate independently of each other, and neither is bound by the decisions of the other. The OSC seeks to act quickly and takes the public interest into account in making its decisions. The Court decides any claim for statutory tort relief based on the applicable legal principles and a framework for remedies. Taken together, Sir, we believe that we have fashioned a suite of remedies that strikes the right balance: accountability for wrongdoers; fair recourse for victims; and flexibility for the courts.

The statutory torts are designed to change and strengthen norms – to make self-responsibility a default in our online spaces.

And I come back to the time when the first motor-safety laws were introduced in the 1960s. There was an American publication, Automotive News, which lamented the passing of these laws with the headline, and I quote: "Tough safety law strips auto industry of freedom". There was fear; there was resistance in the industry.

But with the passage of time, history has proved those laws right. They made cars safer, saved countless lives, changed attitudes and mindset, and re-shaped how the industry designed every vehicle thereafter.

We hope that this Bill, with a clear framework, can also have the effect of setting the right tone for online behaviour and shaping the mindsets and attitudes of both users and service providers. The statutory torts define what is acceptable and what is not. They will guide conduct not only through monetary damages, but through shared expectations made explicit.

Overall, our intention is that as our online norms mature, we will rely less and less on reports and lawsuits, because, like road safety, the law will have done what it set out to do, not just punish harm, but nurture the habits that prevent harm in the first place.

Sir, I move on now to address how the Bill handles anonymity. From time to time, anonymity can serve a good purpose – it allows users to speak freely, sometimes, obtain assistance and, on occasions, allows marginalised groups to speak up. But at the same time, it must not shield wrongdoing. And unfortunately, many online users abuse the power and privacy which online anonymity affords them.

I had earlier covered how online anonymity is a driver of online harms and leads to the Online Disinhibition Effect. Anonymity also exacerbates the impact of harm on victims.

First, victims may become more distrustful of those around them. They wonder who it is posting on their social media sites. They do not know if the perpetrator is, indeed, someone they might know. Second, victims will not be able to obtain legal recourse from perpetrators. By definition, they cannot commence legal proceedings, or enforce Court judgments, against an unknown person.

There are mechanisms currently available in Court, such as pre-action discovery and non-party discovery, which can be used to obtain information about the identity of wrongdoers. But victims will still need to commence Court proceedings, which may be costly and time-consuming. And so, we believe the Bill's proposed End-User Identification measures offer an accessible option.

To start with, clause 49 empowers the OSC to obtain information and documents for the discharge of its functions. This includes identity information of an end-user which is in the possession of platforms. This is akin to how law enforcement agencies are empowered to obtain such information for investigative purposes, akin but not identical.

Second, clause 52 empowers the OSC, where it reasonably suspects a user of committing an online harm, to require prescribed platforms to take reasonable steps to obtain specified information that may identify the user. This can take the form of the user's name or perhaps, verified phone numbers or credit card information, which can then be used to make further inquiries with telcos or banks.

This obligation to collect information is carefully scoped to target those users who are suspected of carrying out online harms. This follows close consultations with industry partners, who expressed difficulty with an upfront general obligation for platforms to collect the information of all their Singapore users.

Third, clause 53 empowers the OSC to disclose the perpetrator's identity information to a victim or to their authorised representatives, upon receiving an application from the said victim. At the initial stage, disclosure will be limited to the purpose of enabling victims to bring their claim. We intend to eventually extend this for other purposes as well, such as allowing victims to safeguard themselves from the perpetrator, and to take proactive future measures.

Mr Speaker, we recognise that some may have concerns that these measures might intrude on users' privacy or go too far. Let me be clear: that is not the case, and we thought about this framework quite carefully.

The measures are aimed squarely at those who hide behind anonymity to cause online harm. They are not meant to affect ordinary users who act responsibly. In fact, for the vast majority of users, nothing will change. Most platforms today already require some form of verified contact or payment information at the point of registration.

Additionally, when the OSC discloses a perpetrator's identity to a victim, there will be safeguards to ensure that the information is protected and not misused. First, the OSC may impose strict conditions on how the information can be used, such as limiting the use of the information to seeking protection or pursuing legal remedies. Any breach of those conditions will be a criminal offence.

Second, the misuse of the information may itself attract legal consequences. For example, if the victim were to use the information obtained from the OSC to dox the perpetrator, that could, itself, be an offence under POHA or under an online harm under this Bill.

In short, the Bill has in-built safeguards. They balance and protect both the victim's right to know and the perpetrator's protection against misuse of their information.

Sir, this Bill is designed above all, as Members can see from how I have articulated the framework and the schema of this Bill, to be as victim-centric as possible, giving swift and accessible relief to those who have suffered real harm online. The provisions have been drafted with that goal in mind. Minister Josephine Teo spoke about how the OSC will be empowered to issue directions quickly to address harmful content, and Minister of State Rahayu explained the appeal mechanisms available.

Sir, the hon Member, Ms He Ting Ru, has proposed two amendments which I would like to address. Her amendments speak, first, to the removal of the finality of an Appeal Committee's decision and, second, to adding a right of appeal to the General Division of the High Court.

Sir, I would like the House to know that both the MinLaw and MDDI teams had carefully considered the appeal process, and it includes options similar to Ms He's proposals. However, we felt that we could not support them in this Bill and let me explain why.

These mechanisms will make the process slower, with less finality to the proceedings. It will make it more complex and ultimately, less accessible for victims.

Let me reiterate that the purpose of the OSC is to deliver speedy, practical relief, giving redress for what has objectively been determined to be an online harm. Allowing repeated appeals would prolong litigation, and each new appeal means fresh rounds of arguments, delay and uncertainty in dealing with harmful content, as well as renewed anxiety for those already hurt, who quite likely will have to remain engaged throughout the appeal process. We also expect a likely higher case volume in OSRA cases, which adds to the administrative load of the OSC.

And, Sir, the further point is this. If a case goes on appeal to the High Court, lawyers will probably be instructed. In such an instance, will there be equality in how this might play out? One can imagine – most platforms are very well resourced, and likewise, a number of administrators and content creators too. What happens when an individual victim might need to seek redress against one of these giants, with deep pockets, in Court and with lawyers? With the additional prospect of having to bear substantial costs in the litigation if one does not succeed? Overall, we fear that this will dissuade victims from coming forward. And over time, this will render the framework toothless, not because of the provisions, but because individual victims will find it more difficult to seek redress and might shy away. This will make the framework less inclusive, and we hope not to see that.

In contrast, we believe that the current framework already strikes the right balance. Under the framework that you heard Minister of State Rahayu outline earlier, these are administrative decisions by the OSC, which assesses the harm based on the prescribed factors and makes a poly-centric decision, taking into account policy and public interest considerations when deciding whether it is a harm and, if so, what the appropriate remedy ought to be. Such administrative decisions are subject to judicial review, not an appeal. In fact, this is not unusual.

Sir, at its heart, this Bill, as I said at the outset, is about empowering victims. The OSC's process is deliberately designed to be straightforward, fast and simple and focused on stopping harm quickly and hopefully, not spending time arguing about it. We think that Ms He's proposed amendments, though well-intentioned, would probably make that journey harder and not easier.

Mr Speaker, Sir, this Bill is pragmatic, proportionate and principled. It protects victims, sets fair expectations for online actors and strengthens trust and accountability in our digital commons. If we proceed steadily and work together – Government, industry and users – I believe we can keep our online spaces open, but also safe; vibrant, but also responsible.

Sir, we have shown clearly how online harms exact a cost – on individuals, on families and on the social fabric that holds us together. And as technology evolves, new harms will emerge. Our laws must, therefore, remain future-ready. We must be bold and innovative to stay ahead but also compassionate in how we protect those who are most vulnerable. This Bill gives victims a clear and practical framework to seek relief when harm occurs. It also sends a clear, unambiguous signal – that everyone who shapes our digital spaces in Singapore must act responsibly.

Through the OSC, the Statutory Torts Framework, as well as the End-User Identification provisions, we are building a coherent system of protection and accountability. Each prong complements the others: the OSC is a safety net, providing rapid relief to victims of online harms; the Statutory Torts Framework provides private remedies and is also the standard-setter, encouraging all actors to play their part; and the End-User Identification measures ensure that no one can cause harm from behind a mask.

Sir, public support for these measures has been strong across communities, professions and also generations. In the Public Consultation launched by MinLaw and MDDI in 2024, respondents expressed strong support for: establishing a dedicated agency to address online harms (over 90% support); allowing victims to take legal action, such as seeking compensation in Court for private remedies on top of the OSC framework (over 95%); and disclosing a perpetrator's user information to the victim for certain specified purposes (over 80%).

Sir, the Government started this work a long time ago. We started looking at developing this Bill as far back as 2021, even as the amendments to the Broadcasting Act and OCHA were being worked on. We spent close to five years carefully examining the issues. We conducted numerous surveys and studies into the issue of online harms in Singapore, the findings of which have been presented to this House earlier. We also partnered with the SMU Yong Pung How School of Law to organise the Online Harms Symposium, where distinguished speakers and panellists, including experts on online safety from around the world, shared their insight on key issues and solutions for online harms.

In addition to the Public Consultation exercise, we also met and consulted extensively with over 100 different stakeholders over the years. This includes local and foreign experts, foreign regulators, victims of online harms, social service agencies, lawyers and the Judiciary, the Ministry of Education (MOE) and other educational institutions.

We recognise that the impact of online harms may be felt and experienced differently across different communities. We have heard from various segments of society, including youths, disability groups and community groups. We learnt much from their experiences, insights and stories. We also conducted over 20 engagement sessions with technology companies in the past two years to ensure that the provisions in the Bill are robust, workable and feasible, and can be carried out when the OSC issues directions.

We discovered, through these extensive engagements, a shared belief that the online world should reflect the same values of respect, decency and fairness that we all know and often assume, and which guide us in the offline space.

Around the world, societies are grappling with similar challenges – in Europe, the United Kingdom (UK), Australia and the US. We are moving in step with these global efforts, but shaping our own path, our own course, contextualised and nuanced to what Singapore needs.

Sir, ultimately, law and regulation alone cannot keep our people safe. We will require a whole-of-society effort. Public education must teach users to protect themselves, and every user must take ownership of their safety and behaviour online.

But if we can do this together – build sound laws, responsible platforms and a thoughtful public – we will strengthen not only our digital safety, but over time, our social fabric. And in time, our online norms will not erode, but endure – grounded in respect, anchored in responsibility and guided by the same values that make Singapore strong.

Sir, it remains for me to express gratitude to a few persons and groups who have contributed deeply to our work. As mentioned by Minister of State Rahayu, we convened a Steering Committee which guided our team in shaping our policy. In particular, let me acknowledge two members from the private sector: Ms Stefanie Yuen Thio, joint managing partner of TSMP Law Corporation, member of the Sunlight Alliance for Action and founder and chairperson of SHE; as well as Assoc Prof Eugene Tan from SMU.

In addition, SMU Yong Pung How School of Law partnered us to organise our Online Harms Symposium. The sharing from the experts and survivors at the Symposium informed much of our thinking on this matter. SHE's surveys and research on online harms and their experience in running SHECARES@SCWO, Singapore's first support centre for targets of online harms, provided us with data and insights to refine our policy.

And, finally, all those, many from the public who responded to our public consultation or who have engaged with us with very constructive comments and suggestions or written to my Ministry to share their stories. Every story helps us to shape the contours of this Bill. We thank them for their suggestions over the years.

Mr Speaker, we have in the audience today, in the gallery above, a few who have contributed deeply to our work and gave valuable feedback in developing our proposals. We have members from SHE and SHECARES@SCWO – Natalie, Hemavalli, Lorraine, Saira and Si Han – who together with their teams, served as vital pillars of support for those experiencing online harms today.

We also have representatives from YouthTechSG – Ben, Zoe, Beatrice and Kok Thong – who, together with many others, shared the perspectives of young Singaporeans with us. We are deeply grateful for their partnership and commitment to making our digital spaces safer for all and we record our thanks and gratitude for the time that they have taken and the experiences that they have so generously shared.

Mr Speaker: Mr Henry Kwek.

2.37 pm

Mr Kwek Hian Chuan Henry (Kebun Baru): Mr Speaker, Sir, I stand in support of this Bill, because for victims of online harm, speed matters and justice delayed is justice denied.

A deepfake now takes seconds to create. With powerful AI, harm scales fast and it is not rare. Surveys show that more than 84% of residents have encountered harmful content online and roughly one in three have faced harassment or bullying. In particular, as many of our Ministers and political officeholders have mentioned, for our youth, online and offline are one continuous reality. So, harm in one space, whether it is deepfakes, doxxing or non-consensual intimate content, bleeds into the other.

Yet removal through current Court processes can take months or even years, and traces often persist after takedown. This is the gap where victims suffer. Mental health declines; some turn to self-harm, drop out of school or lose jobs. Embarrassment follows, harassment piles on and cancel culture amplifies the pain. These are real psychological injuries. Sometimes a mental prison is worse than a physical one, and we cannot keep telling victims to wait another three months for relief.

That is why, for years, I have consistently called for legislation to curb this danger. In the 2022 Penal Code debate, I urged for laws to prevent cancel culture from taking root. In the 2018 debate on online falsehoods, I argued that democracies depend on facts, transparency and truth in public discourse, and warned that social media algorithms turn some of the most sensational content viral, some of which is false and harmful.

That is why I welcome this Bill. It closes the gap and strengthens trust – the foundation of a "we first" society. It protects people, it does not police speech. Expression is not a licence to harass, and freedom comes with responsibilities. It gives our people, especially our youths, room to think and disagree without destruction – a healthy digital commons to engage, create, and learn without humiliation. It delivers timely justice because harm spreads fast and hours matter, and OSC can act in days to remove content, restrict accounts and disclose identities so victims can seek damages.

Sir, for online falsehoods, speed is justice. At the same time, I have a number of clarifying questions to the Ministers for this landmark Bill.

One, the implications of the amendments added on top of this Bill. In the amendments put forward, the evidentiary threshold moves from "reason to suspect" to "reasonable grounds to believe". The amendments also call for a process of appeal to the General Division of the High Court. And the amendments call for the inclusion of serious harms like the abetment or encouragement of suicide and sexual grooming into this Bill.

But I believe these are already addressed in criminal law, including sections 306 and 376E of the Penal Code and the Vulnerable Adults Act 2018. To me, these amendments seem to be a substantive change to the due process proposed by this Bill, with implications for victims: more legal cost and delayed justice. As such, I would like to ask the Ministers to inform us of the implications of these additional amendments before Parliament votes.

My second point: no wrong doors. We already have laws on harmful online content – POHA, the Broadcasting Act, POFMA, the Foreign Interference (Countermeasures) Act and OCHA. If someone is harmed online, how do they figure out which law to use? Will OSRA overlap with these or fit in with them? Do victims have to report to several agencies or will the agencies coordinate? And if the OSC and the Courts disagree about whether online harm has happened, what happens then? Who has the final say and how is the conflict resolved?

Three, mental health integration. Removing content is necessary, but not sufficient. Victims need healing, not just remedies. Protection and healing must move together. How will the OSC work with community groups and appropriate services to support victims of online harms, especially youths? Can we consider offering psychological support alongside legal relief? And how will the OSC make sure that its staff are properly trained, so that they can help without causing further harm?

Four, the Personal Data Protection Act (PDPA) and disclosures. Victims need identity disclosure to pursue redress. Platforms must, at the same time, meet data protection duties. Will regulations clarify how Disclosure Directions interact with PDPA obligations, including purpose limitation and legal-obligation exceptions? And will good-faith compliance with valid directions be recognised, so that platforms can act quickly without fear of breaching confidentiality?

Five, cross-border enforcement and evasion. How will the OSC enforce its directions outside Singapore, and will claims under the statutory torts actually help in such cases? The Bill lets the OSC tell platforms to collect certain user information, but a person can ignore the request, ditch old accounts and open new ones. In such cases, how will the Bill still obtain the perpetrator's user information?

My final point, six, is preventing abuse and protecting identities. What guardrails will prevent the statutory torts from being weaponised for "lawfare", by which I mean frivolous or unmeritorious claims brought to harass or pressure respondents? As the Bill empowers the OSC to identify users and disclose that information to purported victims, what safeguards are there to prevent misuse of that disclosed data for reverse doxxing?

In closing, Speaker, Sir, this Bill makes our online space safer, especially for our youths. It strengthens trust, delivers speed as part of justice and holds platforms and perpetrators to account. With these clarifications, I support this landmark Bill.

Mr Speaker: Ms He Ting Ru.

2.44 pm

Ms He Ting Ru (Sengkang): Mr Speaker, I believe that Members of this House appreciate the growing concerns relating to online harms, and also the complex and evolving nature of these potential harms. From cyberbullying and the rise in hate speech to AI-generated content, online exploitation, misinformation and gender-based violence, we are having to deal with a wide variety of harms which carry the added threat of going viral. Singapore has a virtually 100% Internet connectivity rate and, consequently, everyone is at risk from online harms.

Research in Singapore appears to bear out this fear. A survey released by MDDI last month found that four in five respondents have encountered harmful online content. In May and June 2023, SHE's survey on online harms found that one in two respondents aged between 15 and 24 reported having been victims of online harms. The same survey also reported that two in five victims reported experiencing at least one form of severe adverse impacts, including suicide ideation and physical and mental health issues.

To further add to this, a World Economic Forum article published at the start of this year noted that there is a broader trend where online platforms move away from centralised content moderations and instead, rely more on user contributions to address potentially misleading or harmful content. This has led to various groups being concerned that this shift would worsen the situation for vulnerable groups and create a less safe digital environment.

And this is concerning. As I mentioned during the Committee of Supply for the Ministry of Home Affairs, in my cut on safe AI this year, women and children are two groups that have been found to bear disproportionately the harms associated with such online activities. With the subject garnering greater concern in recent years, the Online Safety (Relief and Accountability) Bill is now before us.

The Workers' Party (WP) agrees that tough measures are required to tackle these harms, to allow victims to seek better redress and healing, and to ensure that platforms act responsibly. Whatever the online harms, be they categorised by content, contact, conduct or contract, or by types of harm, such as aggressive, sexual or extremist values, the thing that unifies them is that their effects on victims or witnesses are cross-cutting. All these harms threaten physical or mental health, threaten privacy in the form of violations, or further promote inequalities or discrimination.

However, our position is that the Bill before us leaves certain areas of concern unaddressed. It is, thus, in the spirit of strengthening our online protection regime to better protect victims of such harms that we propose these three main areas of amendment.

First, the Bill leaves out certain harms which have a severe impact on victims. Second, we have concerns about the legal procedures enabling the Commissioner to be the final arbiter, and certain clauses on what constitutes an online harm need further refinement. Third, there should be reporting requirements from the Commission, which will enhance public understanding and education about the harms and our protection regime, and ultimately build confidence in what we are doing to tackle them.

I will focus my speech on the first set of amendments and how we can better ensure that our tackling of online harms puts victims and vulnerable persons front and centre of our efforts. My other WP colleagues will also cover the other sets of amendments.

In our amendments, we proposed a statutory addition of two sets of activities and a definition of online harm activities.

First, the sexual exploitation of children or vulnerable adults, otherwise known as sexual grooming, with wording proposed in a new subsection O to the definition of online harmful activities contained in clause 3 and expanded definitions of what this comprises in the new clause A.

Second, the publication of online material encouraging or promoting suicide or self-harm, with our proposed insertion of a new subsection O to clause 3's definition of online harmful activities and expansions on this in the new clause B.

In a study published by the Institute of Policy Studies in October 2025, child sexual exploitation and the promotion of dangerous behaviours were identified as top harms and perceived as online harms of greater severity. Yet, the Bill before us does not include the publication of online material encouraging or promoting suicide or self-harm. This is concerning, as in September 2025, it was reported that teenagers on Instagram were still able to access content relating to suicide and self-harm, and that its teen accounts function did not appear to be stopping sexual content being uploaded by children.

Instagram is not the only platform noted for the risks associated with the promotion of self-harm and suicide. In Singapore, SHE's survey ranked sexual harassment as the top online harm encountered by survivors and witnesses, with female youths aged between 15 and 34 more concerned about sexual-based harms. An MDDI survey also noted that 26% of respondents reported that they have encountered harmful content of a sexual nature.

Thus, with easy access to posts and content online, the high risks of sexual grooming of our minors and vulnerable adults are also of grave concern, but do not appear to be explicitly covered by the provisions of this Bill. Even though the OCHA covers certain sexual offences and various Codes of Practice have been introduced over the years, these harms do not appear to benefit from the full suite of mechanisms in this Bill, which I believe are more responsive and effective in tackling online safety issues that are highly context dependent and time sensitive.

In particular, I refer to the Bill's ability to request to restrain or stop the communication of a class of material by an administrator or communicator. Furthermore, perpetrators seeking to groom minors or vulnerable adults may share online content that encourages, promotes or provides instructions for sexual communication or sexual activity with minors or vulnerable adults. Additionally, the publication or communication of such online content, which may be directed at certain groups of persons and not just an individual, would appear not to be covered by the OCHA.

Thus, by excluding the sexual grooming of minors or vulnerable adults from this Bill, we miss out on an important protection for our children and vulnerable adults in the form of, for example, allowing takedown orders to be made. As the European Union (EU) has pointed out, when a child is exposed to or engages with inappropriate sexual content, they end up at greater risk from related content, be it by becoming targets or perpetrators of the sexual grooming of minors, because it has become normalised for them or they have become desensitised to it. These children may also then become targets for sexual exploitation or the streaming of child sexual abuse material.

The harm posed by content posted to glorify suicide or self-harm, and by the sexual grooming of minors and vulnerable adults, is therefore substantial. We hope that Singapore takes a strong stance against such online content by accepting our proposed amendments and including them under the scope of this Bill, to provide more tools to tackle these types of online harm.

Next, victim protection and support. This is complex, as harassment, humiliation and abuse of victims come in various shapes and forms. Harassment, as we all know, is not limited to physical stalking, sending thousands of text messages or making dodgy phone calls. For example, our Family Courts are now also en route to recognising that it is not just physical abuse that causes real, and sometimes lifelong and life-threatening, harm to victims. Increasingly, they recognise that other forms of abuse can be just as harmful, and online media can be one means through which they are propagated.

We already have laws on the books regarding self-harm, the sexual grooming of minors and protecting vulnerable adults. But they do not explicitly extend to online behaviour. Our amendments seek to harmonise the laws and extend protections, given emergent risks in the online space. Additionally, we have to understand that the harm experienced by victims does not end after a report or complaint is filed. It could extend beyond the point when the perpetrator has been sentenced, long after the justice system has taken its course. If there are any parallel processes, such as actions taken out under the statutory torts or Penal Code offences, the victim would often have to relive and recount their experiences and the impact these have had on them and, if there is a trial, be subjected to cross-examination on the extremely traumatic event or events.

This is backed up by a SHE survey, which found that two in five victims of online harm experienced serious emotional or mental health impact, such as depression or fear for their safety. Many withdrew from social media entirely, as documented in one of the case studies from the Institute of Policy Studies survey. A handful of respondents even contemplated harming themselves physically or attempted suicide. Even more crucially, for each person who steps forward, how many decide against pursuing matters through the justice system because it is daunting, because they are fearful of it, or because they do not want to re-traumatise themselves by recounting and reliving what has happened to them many times over as they go through the system to seek redress?

Thus, we also have concerns about the access to justice for victims who are contemplating pursuing justice under the statutory tort provisions. The inclusion of these statutory torts is welcome and empowering, but we must not forget that to file a civil claim, victims must gather evidence, bear legal costs and relive the harms done to them time and again. Many would be young people, students, workers who already feel powerless. For them, the idea of commencing a lawsuit is unimaginable.

So, how do we tackle concerns that a fragmented system of relief may emerge, where those who are well-resourced can fight, while those without must simply tolerate the harm and try to move on? Will the legal processes be simplified for victims to obtain the remedies that they deserve? Directions or orders issued by the Commissioner should also be granted swiftly.

And since, under the Bill, the Commissioner is to exercise quasi-judicial powers, sitting in judgment and handling reports and complaints filed by victims or agencies, the Commissioner's office must be staffed and resourced like a quasi-judicial body, not a customer service centre. Decisions to be made by the Commissioner are not mechanical, run-of-the-mill decisions. They require legal, psychological, societal and cultural sensitivity. Will the new agency have specialists, such as psychologists who understand trauma-informed approaches, gender-based violence experts and lawyers experienced not just in defamation and harassment law, but who also understand the often insidious and subtle means by which perpetrators of online harms attempt to assert power and coercive control over their targets, sometimes in the context of post-separation abuse?

This means that our processes to address online harms and provide redress and justice for victims must incorporate a victim-centric and trauma-informed approach. Staff, from frontliners receiving reports from victims to decision makers within the agency, should be supported by professionals who understand how best to continue to support and protect victims. Taking a stand against harassment, humiliation and abuse requires much from victims. It will run contrary to the intent of this Bill if victims instead find themselves without support, or are even met with disbelief, when seeking justice.

The principle behind tackling these harms would be to ensure that there is restorative justice, especially for non-criminal harms, which we, as a society, have decided do not warrant criminalisation. As for perpetrators, we too should try to get to the bottom of their motivations and what causes such behaviour. Rehabilitation is thus as important as deterrence.

As the Association of Women for Action and Research (AWARE) has recommended, could counselling orders for perpetrators be one of the available tools, so that, where appropriate, perpetrators will receive appropriate treatment to stop them from re-offending after their sentence has been meted out?

In this vein, I believe that, aside from providing psychological support for victims, the Commission should prioritise education as an inoculator against these harms. For OSC officers, this should mean evidence-based, up-to-date education and training for those handling reports or complaints, continually incorporating the latest victim-centric approaches to give victims the support they need. Ground-up knowledge sharing is also important, given that so much of online behaviour is driven by a social media culture that is global, fast-changing and trend-driven.

We must continue and step up cross-agency and cross-sector education efforts for both children and adults to increase awareness of online harms and the help available to them. These efforts have to continue to be informed by updated research on the evolving nature of the harms and how they are propagated, and take on board the latest online trends, which often move rapidly. Apart from targeting the broad public who may be or were exposed to harmful online content, they should also be proactive in nature and target perpetrators or those at risk of offending. Educational efforts should also be sensitive to and address research findings that those exposed to harmful online content seem to display a higher probability of becoming offenders themselves, and be designed to reach those who may be causing both criminal and non-criminal harms.

Now, I will turn to the other categories of amendments.

While we take a strong stance against online harm, we must balance it against the risk that overreach will strip away the normalcy of our online usage. As such, we have tabled amendments to clauses 9, 11, 19 and 26, to ensure that individuals can communicate online material that constitutes fair comment on matters of public interest. The Commissioner can still issue directions or orders when it has reasonable grounds to believe that online harmful activity was conducted.

To further strengthen our understanding of and faith in the regime, we have also included a new clause C, allowing appeals to the judicial system. This will give both online entities and individual users transparency. My colleague, Non-Constituency Member of Parliament (NCMP), Mr Andre Low, will elaborate more on these points.

The final group of amendments that we have tabled call for the Commissioner to prepare and submit an annual report to Parliament and specify the areas which should be covered by the report. These include the number of reports received by the Commissioner, the categories of these reports, the number of directions and orders issued by the Commissioner, the categories of persons or entities who have been issued with a direction or order, and findings by the Commissioner on the risk assessment and trends of online harm.

The proposed new clause E also gives the Commissioner explicit powers to require online service providers to disclose information about their own measures to tackle online harms. Such information is particularly necessary given the Institute of Policy Studies' finding that 75% of respondents think that, apart from the Government and users, tech companies must also do more to tackle online harms, and the 2024 MDDI survey's finding that 80% of respondents who reported online harms experienced issues with platforms' reporting processes.

These proposed amendments are driven by the need to ensure that our laws are better understood and remain ahead of the rapidly evolving landscape of online harms. They also provide that groups disproportionately affected by online harms – women, children and vulnerable adults – have their interests represented. NCMP Eileen Chong will speak more on these amendments.

Mr Speaker, in conclusion, I would like to emphasise that these proposed amendments aim to strengthen and develop this Bill in line with the Government's stated intentions. With this Bill, we believe that there is an opportunity to act more effectively against material on suicide, self-injury and sexual grooming of minors and vulnerable adults. We have the chance to legislate more clearly to ensure that victims can efficiently, effectively and safely get recourse for online harms. We should also work to make online harms less likely to happen in the first place through education, research and by ensuring platforms are aware of their role in facing up to this challenge, all the while ensuring the justice system works effectively and as intended.

I believe that all of us in this House, as Parliamentarians, as Singaporeans and as human beings, wish online harms were less common and less hurtful than they are. That said, we cannot uninvent the Internet and we are at a point in time where the digital world has made it so easy for people to harm one another without having to directly experience the hurt they have caused others. Our duty to remedy harms is highly complicated and the balance that we seek in legislation and practice will be tested by edge cases, unforeseen circumstances and changes in trends and technology. But today, to the best of my knowledge and from speaking to experts, non-governmental organisations (NGOs) and everyday people who have to deal with these harms, I believe we can legislate better with the amendments I have proposed and I hope that this House will accept them.

Mr Speaker: Mr Sharael Taha.

3.10 pm

Mr Sharael Taha (Pasir Ris-Changi): Thank you, Mr Speaker, Sir. Mr Speaker, for many in my generation and those before, we remember life before the Internet, when communication meant pagers and, for some lucky few, tumbler handphones, and the familiar sound of a dial-up modem. Some may recall an even earlier time, when snail mail was just known as mail.

But for today's generation, our children, students and younger workers have grown up knowing only a world integrated with the online space. For most of us, the digital space has become inseparable from daily life. It is where we learn, build networks and work, and it forms part of our identity.

While the digital realm has brought immense good in connecting us, empowering creativity and expanding opportunity, it has also become a space where real harm can occur. Over the years, we have seen painful examples right here in Singapore that remind us that what happens online can wound deeply offline.

In one tragic case, a teenage student took her own life after enduring prolonged cyberbullying from classmates on social media. Hurtful messages and ridicule in the digital space compounded the pressures she faced in the real world.

There was also the incident of a National Serviceman sentenced to jail after uploading his ex-girlfriend's private images onto a website, tagging her name and social media handle. The images spread within hours and, despite takedown efforts, the victim continued to face humiliation long after.

There are many more examples of online harms and these are not isolated incidents. They show that online harms are not just virtual. They are real, they are lasting and they are deeply personal.

From the early days of Singapore's digital journey, we have always been mindful of the risks that come with greater connectivity and many have played their part in building a safer online space. At the ground level, numerous organisations have been promoting digital well-being and online safety education and many of my fellow People's Action Party (PAP) MPs have also done so in their constituencies. In Pasir Ris East, we began our Digital Well-being and Safety lessons in 2020 and later expanded them through the M³ Goes Digital programme to reach more members of the minority community.

At the legislative level, Singapore has taken progressive and steady steps to strengthen online safety. We enacted POHA in 2014, introduced the OCHA in 2023, and updated the Broadcasting Act and IMDA Codes of Practice to hold online platforms accountable for harmful content.

A safe and responsible digital environment has long been a key advocacy point among PAP MPs. In 2024, our Government Parliamentary Committee (GPC) for Communications and Information, then chaired by Ms Tin Pei Ling with Members Ms Hany Soh, Ms Jessica Tan, Mr Alex Yam and myself, filed a Motion titled "Building an Inclusive and Safe Digital Society." Together with more than 10 PAP MPs, we called for stronger safeguards against online harms such as cyberbullying and urged platforms to do more to protect users.

Hence, our MDDI GPC welcomes the Online Safety (Relief and Accountability) Bill, which builds on these foundations – from ground feedback, to action, to legislation and now continued advocacy in this House. This Bill strengthens the framework to ensure that victims of online harm can receive swift, effective and meaningful relief, while reinforcing the responsibilities of platforms and perpetrators alike.

Globally, jurisdictions are grappling with the same question through laws such as the UK's Online Safety Act, the EU's Digital Services Act, Australia's Online Safety Act, Canada's proposed Online Harms Act and New Zealand's Harmful Digital Communications Act: how to protect users while preserving legitimate expression?

Online harms evolve rapidly. To remain effective, our laws must first and foremost protect victims and provide quick and accessible recourse, for in the digital age, harm spreads faster than healing; and they must ensure that justice is fair, balanced and proportionate, shaping a safer and more respectful online space for all.

I therefore stand in support of this Bill, which takes a pragmatic and, as both Minister Teo and Minister Tong described it, "victim-centric" approach, with which I agree. It focuses on timely relief, clear accountability and shared responsibility across all stakeholders in the digital ecosystem.

That said, I would like to raise three areas for clarification: firstly, the definition and application of OSRA; secondly, how we can provide better support for victims of online harm; and thirdly, the possible unintended effects of the Bill.

Mr Speaker, the Bill defines 13 categories of online harms, of which the first five – online harassment, doxxing, online stalking, intimate-image abuse and image-based child abuse – will be implemented first. Minister Teo shared that the remaining eight will follow progressively.

The Bill defines online harassment as the communication of material that a reasonable person would consider threatening, abusive, insulting, sexual or indecent, and likely to cause harassment, alarm, distress or humiliation to the victim. While this provides a sound foundation, may I seek clarification on how the threshold for online harassment will be determined? Will it depend on the number of incidents, the severity of the content or the type of platform involved, something to which Minister of State Rahayu alluded? How will the OSC determine what breaches the online harassment threshold and how will one case be weighed against another?

I would also like to seek clarification about the timeline and criteria for introducing the remaining categories. Also, if a victim experiences a form of harm covered only in a later phase, will the OSC have discretion to act on the case?

For the law to be effective, the OSC's investigations and directives must be swift and timely. Hence, may I ask, what is the expected case load under the first five categories? In Handout 2, which was distributed earlier, on the OSC User Journey, what is the expected time between a report being filed with the OSC and when the OSC may issue a direction against the communicator, administrator and platform?

In the earlier speech, Minister of State Rahayu shared that victims can file a report after reporting to the online platform where the harm is occurring. How do we ensure that this proactive reporting is not abused?

Who will be the case workers? How will OSC case workers assess the incidents? Will they be doing it manually and how do we scale it up quickly? The key point I am trying to make is, how do we adequately resource the OSC with the expertise, the manpower and the technology required to ensure that victims can get effective and timely relief? We need to set the OSC up for success.

Lastly, the Bill empowers the OSC to issue directions to the communicator, administrator or online service provider involved. Can the Government clarify what qualifies as an "online service provider"? Will this extend to closed groups or platforms such as WhatsApp, Telegram and WeChat?

Many Singaporeans are administrators of private chatgroups among friends, colleagues, classmates or even strangers, such as in Build-To-Order chatgroups. Some of the most hurtful cyberbullying can occur within such closed platforms. How will the Bill handle cases arising from private or semi-private groups, where the boundary between personal and public communication is blurred?

Mr Speaker, while the Bill rightly focuses on timely and accessible relief, a truly victim-centric approach must not only enable victims to act against perpetrators but also help them recover from the harm. Victims may take civil action through a statutory tort and claim enhanced damages in serious or malicious cases, a clear message that accountability applies equally online. However, the Bill does not address psychological and social support, especially for victims of cyberbullying, doxxing, or image-based abuse. The emotional scars often outlast the digital content itself.

The OSC can also play a broader and more supportive role in caring for victims of online harm, acting as a one-stop point of contact that connects them to counselling services, community partners and social agencies that can provide the help they need.

In Handout 2, it is also noted that counselling support today is self-initiated by victims at the point when the harm is experienced. But we know that, for many, this is often the most difficult time to reach out, when victims may feel alone, frightened or unsure of what to do next. May I suggest that the system be more proactive? When a victim files a report with the OSC, could there be an automatic prompt or offer of counselling support, so that help is not just available but actively extended? This small change could make a big difference, ensuring that victims receive structured, compassionate and timely support at the moment they need it most.

Working with the Ministry of Social and Family Development, Ministry of Education, the Institute of Mental Health and social service providers, the OSC could establish a Victim Support and Assurance Framework comprising a confidential helpline or online portal for guidance; access to counselling and trauma-support services; and collaboration to track and prevent recurring patterns of online abuse.

Being victim-centric means more than punishing offenders. The law must deliver justice but, beyond that, it must also help victims rebuild confidence and trust in our digital space. With such a framework, the OSC can be not just a regulator but a guardian of digital well-being for all Singaporeans.

Mr Speaker, while the statutory torts clarify the duties of online actors, we must be alert to unintended effects. Clearer legal rights may lead to more frequent or premature litigation, as victims, intermediaries or even platforms resort to the courts as a first response rather than a last resort. We must guard against creating a more litigious society, where disputes are too readily taken to court. This would only make it harder for victims, many of whom are already distressed and may hesitate to come forward because of the financial, emotional and psychological toll of litigation, and the pain of having to relive their trauma. Some may also fear losing a civil case and being burdened with legal costs.

This concern is even more acute when victims face large, well-resourced companies or global platforms. Such entities could exploit the prohibitive cost of legal action to deter victims from pursuing justice. Many may simply give up, unable to sustain long or expensive proceedings.

Hence, I support the approach of limiting appeals, to ensure that relief remains timely and accessible. If every dispute were taken through lengthy court processes, many victims would refrain from seeking help altogether, something which Minister Tong shared earlier too. Justice must never depend on one's capacity to afford it.

That said, I urge the Government and the OSC to monitor the use of statutory torts closely, ensuring that they empower victims rather than intimidate them; promoting fairness, accountability and confidence in our online safety framework. Mediation and conciliation should be encouraged as first-line remedies, and the OSC could monitor the civil actions to ensure the law remains fair and accessible to all.

Ultimately, this Bill must empower victims, not overwhelm them, making justice attainable, not intimidating; strengthening accountability without fostering a culture of fear or excessive litigation. Mr Speaker, allow me to provide a summary in Malay please.

(In Malay): [Please refer to Vernacular Speech.] Mr Speaker, this Bill is extremely important because the digital world is now integral to our daily lives. Online spaces bring about many benefits for learning, working and networking, but they also come with harms such as cyberbullying, the sharing of private images without consent and doxxing, which can cause profound emotional impact.

All this while, the Government and our people have worked to protect online users. We already have POHA, OCHA and IMDA guidelines to ensure digital platforms behave more responsibly.

Many PAP MPs have also conducted programmes such as Digital Well-being and Safety, and at M³ Pasir Ris, we organised M³ Goes Digital to educate citizens about cybersecurity. This new Bill complements those efforts and provides greater protection to victims.

With that, I would like to seek three clarifications.

First, how do we determine online harassment and when will the remaining categories of online harms be implemented?

Second, the importance of support for victims, including counselling assistance and guidance that can be coordinated by the OSC.

Third, ensuring this law does not bring about an excessive litigation culture, especially against the bigger firms that have the ability to intimidate victims with high legal costs.

This law must ensure swift action and immediate relief to victims, so that justice can be achieved expeditiously. Our goal will be to build a safe and responsible digital space for all Singaporeans.

(In English): Mr Speaker, Singapore's online space must be a place of opportunity, not harm. The Online Safety (Relief and Accountability) Bill represents another important step in our collective effort to make our Internet a safer and more trusted place for all.

This Bill builds on more than a decade of sustained progress and advocacy, from ground-up initiatives by community partners and PAP MPs who have championed digital well-being and online safety, to key legislative milestones such as POHA and OCHA, and many others.

Together, these efforts reflect our continued commitment to protect Singaporeans from online harm while promoting a culture of digital responsibility. The Bill before us today is not a standalone measure, but part of a long-term journey; one that combines education, legislation and compassion to safeguard every user, especially the most vulnerable.

This, Mr Speaker, is the spirit of our digital future – a connected society that is also a caring one, where freedom of expression is matched by responsibility and where justice online is as real and reachable as justice offline.

I stand in support of this Bill and the shared goal it represents: to build a safe, responsible and inclusive digital Singapore for all.

Mr Speaker: Mr Andre Low.

3.17 pm

Mr Low Wu Yang Andre (Non-Constituency Member): Mr Speaker, before I begin, I would like to first declare that I work for a financial technology company that may fall under the definition of online service provider as envisaged in this Bill.

Mr Speaker, the WP understands the motivations behind this Bill and we deeply wish to support it but we have some reservations.

Ms He Ting Ru has addressed the first question the Bill raises, which is: would it adequately protect the most vulnerable victims? She has outlined how we can strengthen that protection – I support these amendments.

I address the second question: will this Bill be fair, accountable and properly calibrated in its exercise of its powers?

The Bill grants the Commissioner significant authority to issue directions, compel the removal of content and impose obligations on service providers to reduce engagement with material without the creator's knowledge. These are necessary powers to address real harms, but they are also powers that should be carefully designed for their intended purpose and appropriately constrained by institutional checks and balances.

I will speak to three amendments that address our concerns about the Bill's architecture, and six areas requiring Ministerial clarification.

Let me begin with our proposed amendments.

First is raising the threshold for state action. Clause 26 of the Bill sets the threshold at which the Commissioner may issue directions. The current text reads "reason to suspect". We believe that threshold is too low, too low for the powers being granted. "Reason to suspect" is a subjective test; it permits action based on intuition or preliminary information without requiring objective evidence.

Our proposed amendment raises this to "reasonable grounds to believe". This is the same standard used in comparable legislation overseas, including the UK's Online Safety Act and Canada's Bill C-63, their proposed Online Harms Act. It requires evidence that would satisfy a reasonable person, not merely a suspicion. Some may argue this is semantics. We do not believe so.

The UK Act explicitly distinguishes between "reasonable grounds to suspect", in order to begin an investigation, and "reasonable grounds to believe", in order to take enforcement action. We believe that this is the right approach. The former permits inquiry; the latter permits coercion. The distinction matters. If we are serious about protecting victims, we must be equally serious about ensuring the Commissioner's enforcement powers rest on evidence, not suspicion. Our amendment achieves both.

The second set of amendments I will be addressing concerns legitimate discourse.

Clauses 9, 11 and 19 define three online harms: online harassment, non-consensual disclosure of private information and instigation of disproportionate harm. These definitions are necessary, but we think they are incomplete.

Let me give three scenarios.

First, a citizen posts fair criticism of a public official's conduct. If a reasonable person were to conclude that the criticism is "abusive or insulting" and likely to cause the official distress, under clause 9, this could be harassment. Secondly, a victim of harassment publishes text messages from their harasser online as a call for help or, perhaps, a warning to others; under clause 11, this could constitute non-consensual disclosure of private information. In the third scenario, a journalist publishes leaked documents exposing corruption in a government-linked entity; under clause 11, this could also fall within the definition of non-consensual disclosure.

Mr Speaker, I do not suggest these outcomes are intended, but the Bill, as drafted, seems to permit them.

The definitions contain minimal carve-outs. There are no obvious exclusions for the public interest and they do not go far enough in recognising that not all disclosures of private information are harmful and not all uncomfortable speech is harassment.

So, our amendments insert these safeguards.

For clause 9, we add that communication is not harassment if it constitutes fair comment on a matter of public interest. This is drawn from the established common law defence to the tort of defamation.

For clause 11, we add that a disclosure does not fall within the definition if the public interest in disclosure outweighs the public interest in privacy. We list seven examples – including exposing wrongdoing, informing the public on matters of significant concern and protecting public health and safety. This amendment is modelled on a well-established balancing test in comparable common law jurisdictions, including the UK. As drafted, it is also very similar to the test as codified in the Australian Privacy Act.

For clause 19, we add that communication does not constitute instigation if it relates to a matter of public interest.

Our amendments do not weaken the Bill. They sharpen it. They ensure that the Commissioner's powers are used to protect victims, but not to chill legitimate speech. They prevent this Bill from inadvertently silencing criticism, investigative journalism or public interest disclosures.

The third set of amendments that I will be addressing establishes independent oversight.

The Bill establishes an interim appeal mechanism. Clause 60 creates an appeal panel whose members are appointed by the Minister. Clause 63 provides the right to appeal Commissioner decisions to an Appeal Committee drawn from this panel. Crucially, clauses 63(5) and (6) state that no further appeals will be permitted beyond this Appeal Committee. This makes a Ministerially-appointed committee the final arbiter.

Our amendment deletes these two sub-clauses and inserts a new clause C, establishing a right of appeal to the General Division of the High Court. Our proposed appeal mechanism is not unlimited. It is confined to three grounds: first, a point of law; secondly, that the harmful activity did not occur; and thirdly, that compliance is not technically feasible.

This is similar to the appeal mechanism in POFMA. This ensures that the Courts do not become a general review body for every single Commissioner's decision but remain available as an independent check on questions of legality, fact and feasibility.

Mr Speaker, this is not about distrusting the Commissioner or the Minister. This is about institutional design. When the state exercises coercive powers, especially powers that could affect livelihoods, reputations and businesses, there must be a route to independent judicial appeal.

So, some may say that our proposed amendments may introduce additional burdens on the Courts.

I also appreciate Minister of State Rahayu's and Minister Tong's earlier clarifications as to the policy considerations at play here. Minister of State Rahayu referred to the inherent right of judicial review, which remains available as there is no ouster clause within the Bill. We appreciate that judicial review is always available, but its scope is generally limited; in this case, it would be limited to the process by which the Commissioner makes decisions and not the merits of the case itself.

The proposed amendment we have tabled is very limited in scope. It proposes expanding this right of appeal to the Courts to three limited grounds. We understand that a balance needs to be struck and we are striving to achieve that.

With reference to Minister Tong's suggestion that there may be a David and Goliath situation if we avail ourselves of appeals to the Courts, we believe that the existing provisions within the Bill as drafted, notably clause 63(4), which provides that there is no automatic stay on directions even while an appeal is proceeding, help to ameliorate that concern. There will be no continuing harm to victims while the appeals process is under way. Furthermore, when we avail ourselves of the Courts, there are mechanisms that can address some of these concerns as well, such as in-camera private proceedings and gag orders to protect the identities of victims.

Mr Speaker, beyond these three amendments, I now turn to six areas where the Bill requires clarifications from the Minister.

First, I would like to address the scope of doxxing. Clause 10 defines doxxing as publishing identity information which a reasonable person would conclude was likely intended to cause harassment, alarm, distress or humiliation.

Here is my concern. If a victim identifies their hitherto anonymous harasser online to warn others in the community, could a reasonable person conclude that this was intended to cause the harasser distress or alarm? The Bill does not answer this; so, I ask the Minister, will victims who expose their harassers in this manner be caught by clause 10? If not, what safeguards exist in the Bill's design to prevent this?

Second, I would like to seek clarifications on standing to appeal where directions are given to platforms. Clause 28 sets out who receives Part 5 directions. Some, including the stop communication and restraining directions, can be issued directly to the communicator, but others, such as Access Disabling, Account Restriction and Engagement Reduction Directions, are issued to platforms or administrators. When a direction is not issued to the communicator directly, can the communicator still appeal?

Clause 61(1)(e) allows the recipient to appeal, but the communicator is not the recipient. The subsequent subsection says that they may appeal only if they fall within a description the Minister may prescribe under clause 82. So, I ask the Minister, will regulations be made to ensure that communicators have standing to appeal directions that restrict their content, even if those directions were not issued directly to them, or will this remain subject to Ministerial discretion?

Third, the meaning of prescribed connection to Singapore. I understand Minister of State Rahayu has addressed this point. She has given the example of long-term residents in Singapore who will fall under the ambit of this provision. We would like to seek further clarifications.

So, it is clear that citizens, Permanent Residents (PRs) and long-term residents are eligible to make a report. We would like to understand what else the ambit of prescribed connection to Singapore could cover. Does it cover a foreign spouse on a long-term visit pass harassed by someone in Singapore? Does it also cover a former Singapore resident who has since moved overseas but is still being targeted by a Singapore-based individual? So, I ask the Minister: who is protected by this Bill, who is excluded and will the prescribed connection be made clearer?

Fourth, the exemption for public agencies. Clause 4(2) states that public agencies cannot be given directions or orders under Part 5. Part 5 contains the Commissioner's enforcement powers. This means that if harmful content originates from, is hosted by or is facilitated by a public agency, the Commissioner cannot compel the agency to act. Clause 4(3) goes further: public agencies cannot be sued under the civil proceedings provisions in Parts 10, 11 and 12.

So, Mr Speaker, online harm is online harm, regardless of its source. A citizen harassed through content on a Government platform or by a Government account experiences the same distress as one harassed on a private platform or by a private account. So, I ask the Minister, why are public agencies exempt from the Commissioner's enforcement powers and from civil liability? What is the policy rationale for the asymmetry and what recourse does an individual have if they experience harassment, perhaps, from a rogue public employee using an official account?

Fifth, Engagement Reduction Directions and Class of Material Directions. Clause 40 grants the Commissioner power to issue Engagement Reduction Directions. This allows the Commissioner to require a service provider to reduce end users' engagement with specified material without removing it. Clause 41(3) explicitly states that it is not necessary to give any person who may be subject to a Part 5 direction an opportunity to be heard before the direction is given. This means content posters are not necessarily informed and the content stays online. It remains visible to the poster; it also remains visible to the victim. Neither will be aware that the content's reach has been throttled. The victim will continue to see the harm, wondering if the report achieved anything at all.

Mr Speaker, this is a shadow ban. So, my question to the Minister is simple: what is the envisaged use of this particular direction? Surely, in cases of genuinely serious harm, one of the other more forceful directions is the better solution for victims; and if this is meant as a less forceful direction, why does the Commissioner see the need to intervene at all?

Clause 40, along with clauses 30 and 33, also provides for Class of Material Directions, so this raises related concerns. These Class of Material Directions allow the Commissioner to issue directions, such as Stop Communication Directions, against entire categories of content identified by specific identifiers, such as a username, a term or an online location.

There is a high risk that this will function as a digital dragnet capturing legitimate content alongside harmful material. I give an example. If a victims' advocacy group posts about an emerging category of online harms as a warning to the community, its material may be swept up by mistake.

So, on Class of Material Directions, I have two questions: what safeguards exist to prevent over-blocking and what remedy exists for users whose legitimate content is caught inadvertently by the directions?

Sixth, and finally, I address consistency in decision making. Parts 10 to 12 of this Bill establish statutory torts that will be adjudicated by the courts. These will generate judicial precedents and published decisions. There will be transparency and consistency through the common law process.

But for Part 5 directions, which are quasi-judicial in nature, there is no requirement to publish reasons or decisions. There is no public record of how the Commissioner interprets and applies the definitions of harm in Part 3 and other sections. So, I ask the Minister, will the Commissioner be bound by prior decisions that they have made, or will each case be decided on fresh discretion? And secondly, will the Commissioner be bound by clarifications given today in this House? If the Minister states, for example, that clause 10's doxxing definition does not capture victims exposing their harassers, can future Commissioners be held to this clarification?

Mr Speaker, this Bill introduces fundamental protections for victims of online harm. We support these protections. Our amendments ensure that the powers are exercised on evidence, with safeguards for legitimate speech and with independent judicial oversight. But oversight is meaningless if those affected cannot access it. So, the Bill's ambiguities on doxxing, on standing to appeal, on coverage, on public agency exemptions, on shadow bans, on consistency in decision making, these are not minor. They go to the heart of who is protected, who is excluded and whether the regime operates fairly. I hope that in the course of this debate, we can get clarity and comfort on these questions.

So, I urge the Minister to accept our amendments. If the Minister declines, I ask for clear answers to the questions I have raised. I think Singaporeans deserve a regime that protects victims and respects fairness in equal measure.

Mr Speaker: Ms Yeo Wan Ling.

3.34 pm

Ms Yeo Wan Ling (Punggol): Mr Speaker, Sir, I rise in support of the Online Safety (Relief and Accountability) Bill, because behind every clause of this Bill is a face, a family and a cry for help. This is not just a Bill about digital platforms or statutory duties. It is about people, especially our young, our parents and those whose lives have been shaken by what is said or spread online.

Mr Speaker, the Internet moves faster than our feelings can heal and, indeed, faster than our reputations can be rebuilt. A harmful post, an exposed address, a doctored photo – once uploaded, it spreads in seconds. That is why the speed of takedown must match the speed of harm.

In an earlier debate, I shared the story of a young lady, Ms K, living in Punggol who was doxxed by her best friend's ex. He had created an account just for doxxing and had put up one post with her face, name, place of work and where she used to study. When she came to me for help, the post had been up for two weeks. But even with the help of her loved ones, friends and, indeed, the extended Punggol family reporting the doxxing, the post was still not taken down even after a month had passed. I reported it as well, but was told, two weeks after I reported, that the platform was experiencing high volumes of reports and that I should report the post, not the account.

Note, this was a one-post account. It finally took several interventions by Government agencies and a trusted reporter/flagger NGO to get the post taken down. By then, the post had already been seen by a few hundred people. Ms K was aghast, but relieved and thankful. That is, until a few weeks later, when she contacted me again to tell me that her friend's ex had doxxed her again, this time on the same account and on another platform. If this House feels weary just hearing about this, imagine how defeated she felt. This "Groundhog Day" is what many victims face – harm that repeats and hope that fades.

That is why I support the creation of the OSC, a one-stop centre to coordinate Government action and issue Takedown Directions across platforms. Victims should not have to file multiple reports to multiple agencies. The OSC can simplify the reporting process for victims, helping them navigate which protective online harms statute to invoke (POHA, POFMA, the Foreign Interference (Countermeasures) Act, OCHA or the Broadcasting Act) and sparing them from having to make multiple reports to multiple agencies and multiple social media platforms.

I also urge the Ministry to set clear service standards for takedown speed. This should be measured not in days, but in hours. Online harm does not wait and neither should help. For someone under siege, an hour can feel like forever. Families deserve both a response and a resolution.

I have several clarifications for the Minister: (a) how will the Commission ensure reports are simple to file, yet protected from false or vexatious submissions? In extreme cases, how do we guard against misuse of the system and prevent it from becoming weaponised against the very people it set out to protect; (b) given that platforms themselves face high reporting volumes, how will the OSC manage a potentially large caseload without compromising response speed; and (c) finally, while the First Schedule covers harassment and doxxing, other categories, such as online impersonation and deepfake abuse, will only be implemented later. Could the Minister share the timelines and what recourse victims have if their harm type is not yet covered or even identified?

Mr Speaker, I also support the Bill's recognition of the different players in the online ecosystem and, in particular, the responsibility of platforms. Platforms are not neutral bulletin boards. They are amplifiers of information, of emotion and, sometimes, of harm. When harm is done through them, they must be part of the solution, not silent observers.

I hope that with this Bill, delays in response will no longer be acceptable. Platforms must design systems that detect and throttle the spread of harmful content; provide real-time cooperation channels with the OSC; and be transparent about how quickly and how often they act on Takedown Directions.

Families should not have to fight for accountability post by post. Responsibility must also rest with those who profit from the platforms that host such content. And the Bill rightly requires platforms to "take reasonable measures" once notified of an online harm. But I would like to seek clarification on how this will be measured, what counts as "reasonable" and how the OSC will assess compliance, particularly for major platforms with wide reach and significant influence.

Mr Speaker, laws can set boundaries, but they cannot mend a wounded heart. Real safety depends not just on legislation, but on the strength of our community: one that notices, listens and steps forward when someone is in distress. In Punggol Shore, we started a community initiative several years ago – the Punggol Positivity Pizza Movement. It began with our youth leaders and families who wanted to create a visible, warm symbol of encouragement for our young residents.

Their message was simple: "P.S. Punggol Shore – I love you. If you're going through a hard time, please reach out, because your community loves you."

Through the Positivity Pizza movement, volunteers from every corner of Punggol Shore – seniors, youths, women's groups, sports teams and religious organisations – have come together to crochet pizza plushie keychains, with a heartfelt goal: to make 20,000 pizzas, one for every young person living in Punggol Shore. Each slice carries a message of solidarity, "It's not okay to be not okay, reach out and your community has your back."

The inspiration for this movement came from a young resident, C, whom I met at a local mall. I had known her since her primary school days and when I asked how she was coping with secondary school, she confided that she was struggling with online bullying and thoughts of self-harm. As we spoke, she noticed a small keychain plushie on my bag and she said softly and shyly to me, "Ms Yeo, that is very cute. May I have it, so that I know that someone cared enough to stop to ask me if I was okay."

To C, if you are watching this, thank you. Know that you have started a movement that has touched thousands of lives.

Our Punggol Shore handcrafted pizza slices have become small but mighty ambassadors of empathy – connecting neighbours, bridging generations and proving that kindness, even if it is handmade and in real life, can still go viral. If cruelty can spread fast, then kindness must spread faster. When we see a friend being bullied or humiliated online, do not scroll past. Reach out, report, reassure. When we teach our children to stay safe online, let us also teach them to be kind online.

To complement this Bill, we, the Members of this House, the OSC, the community, must strengthen partnerships with our Family Service Centres, schools, youth networks and grassroots organisations, to build digital literacy, emotional resilience and first aid for the heart. So that our parents, our seniors and our young persons alike, can learn not only to protect themselves, but also how to support one another. Because real safety, Mr Speaker, does not come just from stronger laws. It must also come from stronger community bonds. Mr Speaker, in Mandarin.

(In Mandarin): [Please refer to Vernacular Speech.] In the online world, a harmful post or a malicious photo can spread across the entire island within minutes. So, we must act quickly. When harmful content appears, it must be taken down as soon as possible. Every hour of delay causes additional pain to affected families. We must also care for and protect one another. At the same time, I want to call upon all major online platforms to work together with us.

Let us join hands to create a safe, caring and responsible online environment. Harmful content is not someone else's problem, but an issue that affects every Singaporean family.

(In English): Mr Speaker, at the heart of this Bill lie our families and children, who form the soul of our communities. Our online world must reflect the same values that keep our physical neighbourhoods safe: respect, responsibility and care. This Bill is part of the answer. Mr Speaker, I support the Bill.

Mr Speaker: Ms Eileen Chong.

3.44 pm

Ms Eileen Chong Pei Shan (Non-Constituency Member): Mr Speaker, in Mandarin, please.

(In Mandarin): [Please refer to Vernacular Speech.] Mr Speaker, I support the objectives of the Online Safety (Relief and Accountability) Bill.

This can provide more timely channels for victims of online harm to seek help, improve online safety, deter and prevent online harmful behaviour and promote rational and responsible online conduct.

We can, and should, ensure that victims receive more timely and comprehensive relief. At the same time, the WP believes this Bill can be further improved and therefore proposes several amendments to the Bill.

We suggest expanding the definition of "harmful online content". The Bill currently includes 13 categories of online harm, including common online harassment, doxxing, online stalking, intimate image abuse and child sexual abuse material. We suggest including two additional categories of online harm, namely content that encourages self-harm and suicide and sexual grooming of children and vulnerable adults. This would enhance the scope of the Bill's jurisdiction.

Furthermore, we can also strengthen the fair and just implementation of the Bill. The Bill will establish the OSC to assist victims of online harm through unified regulatory and appeals mechanisms. Our amendment proposal advocates for the courts to be the final arbiter for appeals, in order to improve the appeals mechanism and strengthen independent oversight.

Another suggestion is to enhance transparency and accountability mechanisms. We propose amendments requiring the Commission to submit annual reports to Parliament. The report should include the number and types of reports received, processing times, the content of orders and directions issued, online harm risk assessments and trend analyses, and personal privacy protection measures taken. Publishing annual reports can make the mechanism more transparent and provide public reassurance that this Bill is effective.

(In English): Mr Speaker, I want to begin by expressing my support for the intent of the Online Safety (Relief and Accountability) Bill to provide victims with timely means of redress, promote and improve online safety, deter and prevent online harmful activity, and promote accountability as well as responsible and reasonable conduct online.

The statistics tell us a troubling story. MDDI's Perceptions of Digitalisation survey found that 84% of Singapore residents encountered harmful content last year. One in three experienced harmful behaviour directly. The SHE report also shows that female youths were twice as likely to experience sexual harassment online.

Mr Speaker, online harms are not a mere inconvenience. They can be severe, life-altering threats to safety and peace of mind. Because online harms can be constant, invasive and hard to escape, they can be as harmful, if not more harmful than physical harm. We know that the current system is inadequate. Victims of online harmful activities face challenges reporting such behaviours to online service providers. They report the content; they wait hours, usually days. Often the content stays up.

Ministers Teo and Tong referred to IMDA's 2024 report, which found that platforms took an average of five days or more to act on user reports of harmful content that violated their own community guidelines. Even then, most platforms acted appropriately on only half of the harmful content reported. For harms that are, by nature, constant, invasive and hard to escape, five days can feel like forever. This Bill offers victims more tools: ways to get harmful content taken down faster and to hold perpetrators accountable. That matters, and I support it.

At the same time, a good idea still needs good implementation. Given that this Bill grants significant powers to a new agency, the OSC, which will safeguard our rights in this fast-evolving space, I believe the Bill warrants further examination with care and rigour.

Mr Speaker, whether the Bill's policy objectives are achieved hinges heavily on the proper functioning of the OSC. The Commission will have an extensive mandate. It will receive and triage reports, investigate whether thresholds of harms are met, issue directions and orders, monitor their compliance and handle appeals. This is a lot of responsibility and it is crucial to get three things right: capacity, independence and transparency.

First, will the Commission have what it takes to do the job? It will be on the frontlines dealing with cases that will require an exacting mix of legal expertise, technical knowledge and sound judgment under pressure, all set amidst the fast-moving digital landscape. Given these responsibilities, I second my colleague He Ting Ru's point that the Commission should be staffed and resourced like a quasi-judicial body, not a customer service centre.

I look forward to the Minister's clarification of what is the anticipated resourcing for the Commission in terms of manpower and budget. What specific expertise will be recruited? Will it include lawyers and technologists who can keep up with the fast-evolving digital landscape, mental health professionals who understand trauma, and people with lived experience of online harms?

Additionally, Minister Teo also rightly pointed out that victims are at the heart of the Commission, and she also importantly flagged the need to minimise the traumatisation of victims. How will the Ministry ensure that there are trauma-informed care and processes for both the victims who engage with the Commission and Commission staff who will have extensive interactions with these victims?

Second, will the Commission remain independent when it matters the most? The Commissioner is appointed by a Minister and is subject to Ministerial direction. The Commissioner will make important assessments of the factual veracity and reputational harm of statements, a profoundly difficult and, at times, contentious exercise. Unlike public agencies, political officeholders themselves can make reports to the Commission. This means that the Commissioner may find herself having to make judgments about factual accuracy and reputational harm, judgments that could, in some cases, involve content critical of political officeholders, possibly including the very Minister who appointed her.

To be clear, this is not a question about the integrity of any future Commissioner or Minister. It is about how good governance is not solely built on trust in individuals but also on systems that work, and which remain independent and accountable. For the Commission to become a trusted institution that endures, we need to ensure that it is structurally resilient against potential conflict of interest and/or abuse. I welcome the Minister to clarify what mechanisms there are to ensure the Commission and Commissioners will remain operationally independent, particularly when handling content that is politically sensitive.

Third, how will Singaporeans know that the Commission is working as intended? Mr Speaker, the powers granted to the Commission can only be justified if Singaporeans can see that they are being used effectively and appropriately. We deserve to know. Is the system working? Are reports being handled fairly and quickly? What kinds of harms are the most common and are we adapting to emerging threats?

I invite the Minister to clarify how the Commission’s effectiveness will be measured and to whom it will be held accountable. We believe that the Commission can and should do better to counteract the opacity of online service providers in regulating online harms. For this reason, the WP has tabled an amendment requiring the Commission to publish annual reports. The report should contain: the number of reports received; quantity and types of directions and orders issued by the Commissioner; turnaround time for resolving reports; findings on risk assessments and trends relating to online harms.

This is neither radical nor unprecedented. It is the standard practice of the Commission's foreign counterparts with similar mandates: Australia's eSafety Commissioner, the UK's Ofcom and the EU's Digital Services Coordinators. Australia and the EU even go a step further to enshrine mandatory reviews of the legislation after a fixed period to ensure that it remains relevant. I was heartened to hear Minister of State Rahayu Mahzam state that the Commission will consider publishing regular reports for public awareness. I would like to urge this House to go a step further by getting the Commission to commit to doing so, by accepting our amendment.

Public reports by the Commission will help build public confidence in its work. They will also help researchers, civil society groups and even Government agencies better understand what is happening online so we can respond more effectively. Transparency is not a burden. It makes good regulation sustainable.

And related to transparency is the question of consistency. Beyond aggregate reporting in the form of annual reports, there is also the question of how the Commissioner makes individual decisions for each case and whether they create precedent. As the Bill does not require decisions made by the Commissioner to be published, my colleague, Andre Low, has sought clarification on whether the Commissioner will be bound by prior decisions or if each case will be decided with fresh discretion. This matters because consistency in decision making is fundamental to fairness. Published decisions, even in anonymised or partially redacted form, provide predictability and prevent arbitrary outcomes.

I would also like to thank the Minister for his clarification that the framework on end-user identity disclosure is developed through close consultation with industry partners, and I appreciate the graduated approach.

One defining feature of the Bill is the powers that it grants the Commissioner to unmask anonymous users. Clause 52 allows the Commissioner to require an online service provider to obtain end-user identity information even if it is not already in the provider's possession. Clause 53 allows the end-user identity information to be disclosed to victims for a prescribed purpose, based on reasonable suspicion that the end-user has engaged in harmful activity.

This power has real value. First, it serves as a deterrent. When perpetrators know they can be identified and held accountable, some might think twice before posting that intimate image, sending a threatening message or coordinating a harassment campaign. Second, it empowers victims. They deserve the option to pursue justice beyond content removal. They will have the option to pursue civil remedies under the new statutory torts. They can seek damages for the harm they have suffered. But none of that is possible if they do not know whom they are taking to court.

So, while I support this power in principle, the bigger question is how it can be implemented responsibly. Because, Mr Speaker, identity disclosure is a one-way door. Once disclosed, information cannot be undisclosed. And the risks are real, not theoretical. We have seen the bad that can happen when someone’s identity is exposed. They include some of the very harms this Bill targets, like doxxing, and in extreme cases, even extend to actual physical harms.

And this is why the standard matters. I welcome the Minister’s assurance that these measures are not meant to affect ordinary users who act responsibly. The Bill allows identity information disclosure for a prescribed purpose based on “reasonable suspicion” of harmful activity, short of a formal determination. This means that someone could be unmasked based on an allegation that may not ultimately be proven. And yes, penalties exist for misuse. But penalties are also reactive. They punish the harm after it occurs. They do not prevent someone from using identity information for vigilantism or sharing it with others in the window before enforcement.

Mr Speaker, the Minister has also pointed out that anonymity is not inherently bad. For some users, it serves as protection. Academic literature on online safety recognises the dual nature of online anonymity. It can enable authentic expression while also necessitating education to promote constructive dialogue. It enables marginalised voices to speak up, facilitates frank discussion of sensitive topics and protects some from discrimination or violence.

And this is why we are not arguing against end-user identity information disclosure. We are urging that we get the balance right so it works as intended. Deter perpetrators while empowering victims, yet also carefully calibrated to prevent misuse and protect legitimate anonymity. The threshold should be clear enough for consistent application, yet flexible enough to account for context and severity.

In this regard, I would still appreciate the Minister’s clarifications on the following.

First, on the standard of “reasonable suspicion”. What specific guidelines, training and objective criteria will be used to ensure the definitions of the various “online harmful activities” are applied consistently, predictably and fairly by the Commissioner?

Second, on "prescribed purpose". The Minister earlier mentioned that it would enable victims to bring their case, and that extending it to other purposes may eventually be considered. Could the Minister clarify how broad this scope could be? Additionally, how will the Commission verify that applicants genuinely intend to use the information for a prescribed purpose?

And third, safeguards for disclosure. I was heartened to hear the Minister mention limiting its use to legal remedies. What conditions will be imposed on applicants who receive the identity information? Will there be consequences if they subsequently choose not to pursue the prescribed purpose? How will the Commission intervene if it suspects that information was misused before criminal liability is established?

And finally, Mr Speaker, no enforcement mechanism, however well-designed, can eliminate online harms. Our goal should not be to create a system where Singaporeans constantly defer to regulatory power. We must continue to empower Singaporeans young and old to navigate the online world safely, think critically about what they encounter and act responsibly.

While it is important that victims have practical solutions to seek timely recourse, it is equally important to prevent that harm in the first place. How do we build a generation of digitally literate, resilient Singaporeans who can recognise risks, respond appropriately and support others? We should actively work to normalise the reporting of harmful content, promote positive online behaviour and challenge the bystander effect. Research consistently shows that bystanders often fail to intervene or report, not from apathy, but from uncertainty about whether intervention is appropriate or how to do it effectively.

There is also the potential for the Commission to be a resource that helps all Singaporeans navigate digital spaces more safely and constructively. Enforcement is but one pillar of work for the Commission's global counterparts. Australia's eSafety Commissioner and the UK's Ofcom also invest in supporting, conducting and evaluating research on online safety. Conducting demographic-specific research can help us form a better, more up-to-date understanding of the changing nature of online risks, both in terms of content and medium, and help inform policymaking. It also feeds into the development of accessible, regularly updated resources to increase the confidence, skills and online safety of citizens.

Mr Speaker, I reiterate my support for the fundamental intent of this Bill. Victims of online harmful activity need and deserve better recourse than what is available today, and the establishment of the Commission is a necessary step towards that. In the near term, with the establishment of the Commission, we may see more reported instances of online harm, not because more harm is committed, but because fewer people will suffer in silence. This will be a positive development as it means that people trust the system enough to use it.

In the long term, online harms will evolve as quickly as technology itself, because they are driven by perverse incentives, financial profit, manipulation or malice. This is why getting the Commission's design right matters. It needs to be capable, independent and transparent enough to adapt to whatever comes next.

Mr Speaker, the true marker of Singapore's success in safeguarding our digital spaces will not be one that can be easily quantified. Instead, it will be something harder to measure, but more important: do Singaporeans feel better equipped to identify and respond to online risks? Do we feel supported and empowered to seek help? Are we more likely to intervene and report harmful content? Is our online discourse becoming more constructive and less toxic? If more Singaporeans can answer yes to most or all of these questions, then we know we have made good progress in building not just a safer digital space, but a healthier digital society.

Mr Speaker: Ms Valerie Lee.

4.01 pm

Ms Valerie Lee (Pasir Ris-Changi): Mr Speaker, I rise in support of the Bill. The Internet has transformed how we live, work and connect, driving innovation, enabling ideas and creating digital marketplaces where transactions occur at unprecedented speed. Yet, as with every invention, it has also created new opportunities for harm, for deception, harassment and abuse.

This Bill is therefore both necessary and timely. It equips vulnerable individuals with tools to protect themselves and ensures that our laws stay relevant in the digital age. It allows victims of online harm to have faster access to justice and we must ensure that in doing so it does not introduce new obstacles that make it harder for victims to seek redress.

Sir, allow me to highlight three key aspects that mark a significant step forward in addressing online harm in Singapore: the first, empowering victims through civil remedies; the second, introducing a new tort of online impersonation; the third and last, addressing inauthentic and manipulated media. Together, these measures strengthen accountability, protect dignity and promote responsibility in our digital spaces.

Sir, I am heartened that under Part 10, certain online harms, such as image-based child abuse and online impersonation, are recognised not only as criminal wrongs but also as civil wrongs. This allows victims to seek redress directly without depending solely on state prosecution. This represents a crucial evolution. As the online sphere becomes central to daily life, individuals should have both the agency and responsibility to safeguard themselves – contributing to a safer shared online environment.

Mr Speaker, I particularly applaud the creation of the tort of online impersonation. By recognising this as a statutory civil wrong, the Bill offers victims a clear path to justice. They no longer need to wait for perpetrators to be unmasked or for criminal proof beyond reasonable doubt. Victims can act to restore their good name, remove false content and prevent further misuse of their identity. This reform affirms that digital identity is personal identity – restoring dignity and control to victims and signalling that accountability does not end at the keyboard.

Mr Speaker, I also support the introduction of the tort of inauthentic material abuse. With the rise of generative AI, manipulated images and fabricated videos are increasingly used to mislead the public, exploit reputations and erode trust. As technology blurs the line between truth and fabrication, our laws must evolve. Timely action against false and deceptive imagery will help preserve the integrity of public discourse and protect individuals from reputational harm.

Mr Speaker, but I would still like to seek clarification on a few points. First, on the Right-of-Reply User Directions under clause 34. This mechanism offers a fast, low-cost way for victims to correct false or defamatory content. However, will it complement or substitute existing remedies under the Defamation Act? Can a victim pursue both concurrently, or only if the Commissioner declines to issue a direction? Clarifying this interplay will help lawyers and users understand how best to obtain swift redress.

Second, on directions and orders under clause 26. While immediate intervention is sometimes essential, will the Commissioner conduct a preliminary screening before issuing directions? This ensures that powers are exercised not only quickly but proportionately.

Relatedly, would the Ministry consider issuing guidelines to clarify which harms will be prioritised for urgent attention? Such transparency would enhance trust and efficiency. It would also help manage public expectations, streamline workflows within the OSC and enhance confidence in its impartiality. Such a measure could serve as a more proactive alternative to retrospective mechanisms, such as the publication of annual reports and risk assessments covering the preceding financial year.

Third, on the OSC. As the OSC is to be established by next year, I seek elaboration on its structure, its staffing and its expertise. Given its wide-ranging powers over privacy, speech and accountability, public confidence in its competence and neutrality is essential.

Mr Edwin Tong has acknowledged that public education is essential, but I would also like to clarify whether the OSC will take on this educational role – promoting public awareness of online safety, informing users of their rights and responsibilities, and guiding victims to seek help – because an informed public is our very first line of defence.

In conclusion, Mr Speaker, Sir, I fully support this Bill. It represents a comprehensive and forward-looking framework to make our digital space safer and more accountable. Globally, we see similar efforts: from the UK’s Online Safety Act 2023, Germany’s Network Enforcement Act and the EU’s Digital Services Act. Yet Singapore’s approach goes further by recognising new civil wrongs such as online impersonation and inauthentic material abuse, empowering victims beyond mere takedown or criminal models.

At the same time, I think we can also learn from others: proactive risk mitigation in the UK, systemic harm prevention in the EU, and clear removal timelines from Germany. These remind us that while the digital frontier is global, regulation can reflect and must reflect our local context.

I commend MDDI, MinLaw, the Attorney-General’s Chambers and all involved agencies for developing this landmark legislation. This Bill continues Singapore’s proactive response to online harms, giving victims real avenues for redress and ensuring that our digital space remains one of safety, trust and accountability. Mr Speaker, Sir, this Bill has my full support.

Mr Speaker: Mr Pritam Singh.

4.08 pm

Mr Pritam Singh (Aljunied): Mr Speaker, I support the amendments filed by Sengkang Group Representation Constituency (GRC) MP, Ms He Ting Ru, and further elaborated upon by NCMPs Andre Low and Eileen Chong, and agree that these amendments strengthen the Bill for victims and better calibrate the significant powers Parliament is investing in the Commission.

I have some queries that cover about 10 clauses in the Bill as tabled by the Government for Second Reading.

First, clause 5(6) of the Bill allows the Minister to give the Commissioner directions of a general or special character, not inconsistent with the provisions of the Act. The Explanatory Statement does not shed light on what the ambit or scope of these two types of directions are. What general or special character directions does this clause envisage, which are not already fully captured or foreshadowed by the Bill? Can the Minister clarify what are some of the directions that the Minister envisages to give under this clause, or is this clause to be understood as a general one that confers broad and inchoate powers to the Minister?

Clause 8 empowers the Commissioner to issue advisory guidelines. As much as public attention on this Bill has focused on online harmful activities and the statutory torts that arise from them, I believe one important barometer for the success of the Bill will lie in its capacity for preventive education.

The Australian eSafety Commissioner's website is a useful guide to take note of – a one-stop site to assist victims in navigating the difficult terrain of online harms and to educate young and old Singaporeans about the impact of their words and actions online, with a view to establishing acceptable norms of online behaviour. To this end, can the Minister share the scope and outreach plans the Government has under this clause to address online harms, with particular focus on juveniles and young adults? Does it intend to adopt the Australian approach with adjustments for the local context, or does it, for example, intend to work closely with the Ministry of Education to consider novel approaches? Are there specific autochthonous approaches to preventive education on online conduct and behaviour that the Ministry plans to introduce, suited to local circumstances?

Clause 11, on the non-consensual disclosure of private information, is not clear as to what private information entails. While the clause leaves room for future regulations to provide clarity, as drafted it reads too widely to be of practical application, and is thus unsatisfactory. On a plain reading, it raises more questions than it answers.

The clause states that private information is information about a person that is not widely available to the public at large. The Explanatory Statement to the Bill does not aid at all to clarify the ambit or context of "not widely available", and what it means. I hope the Minister can shed light on this clause and briefly outline how regulations will address the ambit of private information in this clause. If the private information is, for example, found behind a paywall of an exclusive and expensive online publication with limited circulation, would that information be construed as not being widely available?

Clause 20 covers a specific online harm, namely incitement of enmity, which means the communication of online material that a reasonable person would conclude incites, or is likely to incite, feelings of enmity, hatred or hostility against any group in Singapore. To this end, in finding the same, clause 20A of the Bill considers relevant factors, such as whether the material dehumanises any person or persons, or otherwise portrays them as less than human. For example, in the event a foreign embassy puts up a post that tends to incite, would the Commissioner issue an order against the embassy, an online service provider, or Singaporeans who share the post, or all three?

In connection to this, I seek clarity on the ambit of clause 22, which states that a victim of online harmful activity can be anyone who is a citizen or Permanent Resident of Singapore. Clause 22(1)(c) also allows an individual who has a prescribed connection to Singapore to make a report to the Commission. The interpretation section of the Bill does not define this prescribed connection, unlike how it defines prescribed online service provider.

Can the Minister clarify if the Bill can be expected to extend to all work pass holders in Singapore, including foreigners, or even diplomatic staff who make reports to the OSC seeking relief outside the excluded online harmful activities covering incitement of enmity and incitement of violence? And if not, what are the exceptions?

I move to Part 6 of the Bill covering information and end user identity matters. The Commissioner is empowered to require any person to provide in any form or manner any information or document, whether kept in Singapore or not. This includes requiring the person to provide an explanation of the information or document, including providing access by way of username, password or any other authentication information.

Clause 49(4) states that the Commissioner is entitled, without payment, to keep any information or document. Can the Minister confirm, notwithstanding the Minister's comments about the usual police procedures, whether anything in this part of the Bill allows the Commissioner to, for example, seize or retain a mobile device or laptop on grounds of an investigation into an alleged online harm under this Bill?

Clause 52 gives the Commissioner powers to obtain specified information about an end user, and it does not matter if that end user is outside Singapore. I would like to enquire how the online service providers the Ministry has sought feedback from for the purposes of the Bill have reacted to this requirement and whether they would be able to comply, particularly providers such as Telegram, an entity the Government is reported to have had difficulty with in the past. In this connection, can the Minister provide an update to the House on whether Meta has complied with the competent authority's implementation directive, issued under the Online Criminal Harms Act (OCHA), to strengthen Facebook's measures against scams by 31 October this year. While I accept that the relevant law is different, it would be useful for the House to understand how promptly, and to what extent, social media platforms comply with our laws, and if not, why not.

Part 7 of the Bill covers reconsiderations and appeals. Clause 59(1)(d) requires the Commissioner to inform an applicant of the result of a reconsidered decision within a reasonable time. This clause does not require the Commissioner to provide reasons for his decision. In contrast, on the matter of the Appeal Committee, clause 64(5)(a) requires the Committee to state the reasons for its decision in respect of the appeal. Can the Minister share the rationale behind this differentiated treatment, because from a layman's perspective, both are essentially appeals?

One approach to consider is for the Commissioner to provide reasons in both cases. Doing so would be helpful in reducing the number of appeals, if the Commissioner's decisions are clearly explained at the outset, subject to the usual privacy concerns. Can the Minister explain the approach taken in the Bill, so as not to give the impression that the first avenue – asking the Commissioner to reconsider the decision – is perfunctory?

Clause 94 creates the tort of failing to respond reasonably to an online harm notice. In this particular case, while it is foreseeable that the form of an online harm notice will be forthcoming, can the Minister shed light on what constitutes a reasonable response time from the online service provider? The illustration to this clause makes reference to the word "promptly" when a social media service responds to an online harm notice. This suggests that the interpretation of "reasonable" ought to be construed as "forthwith" or "almost at once". Would this be a correct reading and expectation of how quickly an online service provider is expected to react?

Sir, these are the clarifications I seek and I look forward to the Minister's reply. Notwithstanding the amendments in the name of He Ting Ru and subject to the clarifications from the Ministers, the WP will support the Bill.

Mr Speaker: Order. We have been in the Chamber for almost six hours. I propose to take a break now. I suspend the Sitting and will assume the Chair at 4.40 pm. Order. Order.

Sitting accordingly suspended

at 4.17 pm until 4.40 pm.

Sitting resumed at 4.40 pm.

[Deputy Speaker (Mr Christopher de Souza) in the Chair]

Online Safety (Relief and Accountability) Bill

Debate resumed.

Mr Deputy Speaker: Mr Zhulkarnain Rahim.

4.40 pm

Mr Zhulkarnain Abdul Rahim (Chua Chu Kang): Mr Deputy Speaker, Sir, I rise in support of this Bill. I wish to make three points.

First, I speak from a position of advocacy that began in 2021. As a lawyer assisting victims of online harms on a pro bono basis, I have seen firsthand how online abuse can devastate victims, not only emotionally, but socially and professionally. This Bill will be a powerful tool to help victims stop the immediate harm, obtain justice swiftly and crucially, avoid being re-traumatised by a long and painful process.

Second, this is a novel piece of legislation. It places Singapore at the vanguard of legal innovation in the online safety space, with provisions not yet seen even in comparable jurisdictions such as Australia.

Third, drawing from my experience on the ground, I urge that we keep the victim's perspective front and centre by ensuring process and access to swift, affordable and final justice, without unnecessary delay or re-traumatisation.

First, on advocacy. In 2021, I founded Defence Guild SG, a ground-up network of pro bono lawyers who assist victims of online harms and sexual harassment. This was started following the lewd online poll involving female religious teachers in our Malay/Muslim community. Together with fellow lawyers, we wanted to send a clear message that such acts have no place in our society and that victims deserve access to legal protection and recourse. The initiative, supported by the Lawyers@M³ Network and the M³ movement, provides legal advice and assistance for survivors who may not qualify for legal aid. It is a small but important step to close the gap for those who fall between the cracks and cannot afford representation, yet are in desperate need of justice.

Through Defence Guild SG, I have seen how online harassment often extends beyond the screen as it affects confidence, careers and even the will to participate in the community. Many victims suffer in silence because justice feels distant, complex and often expensive.

In 2022, together with PAP MPs Ms Nadia Ahmad Samdin and Ms Hany Soh and the PAP Women’s Wing, we developed an Online Harms Resource Toolkit to empower community activists, women and girls, with practical guidance on what to do and where to seek help. This effort complements existing legislation by building bottom-up community awareness, not just laws on paper, but empowerment in practice.

In Parliament, I have also consistently raised questions on legal aid and tribunal access for online harm victims. I spoke on the Online Safety Bill in 2022 and the Online Criminal Harms Bill in 2023, pressing for accountability of platforms, standardised reporting and swifter takedown processes.

That is why I welcome this Bill because it directly strengthens the protection that many of us, on the ground, have been advocating for years.

My second point on the forward-looking and the novel features of this Bill. Mr Deputy Speaker, as a lawyer, I believe that this Bill is groundbreaking in several aspects. I will highlight two.

First, it introduces new statutory torts for online harms. Victims will now have direct civil claims, not just against perpetrators, but also against intermediaries or platform administrators who fail to act reasonably upon receiving a victim's notice. This bridges the gap between public regulation and private redress.

Unlike Australia's eSafety framework, which relies mainly on regulator-led penalties, our model empowers victims themselves to seek justice directly with enhanced damages or an account of profits where appropriate. This is empowerment. This development does more than create new rights. It shapes societal norms. By codifying what responsible digital conduct looks like, the law educates all users that our online actions carry consequences and that respect, civility and accountability are non-negotiable.

Second, the Bill introduces Right-of-Reply and Engagement Reduction Directions – remedies that go beyond simple content removal. Platforms can now be required to push corrective replies to users who viewed the harmful content, ensuring that misinformation or defamation is countered in the same digital space.

Further, the Commissioner may direct platforms to reduce engagement or amplification, tackling not just the content, but the algorithms that fuel virality and harm. This is both technically astute and morally sound, because it addresses not only the symptom, but the system that can perpetuate online harms.

Finally, Mr Deputy Speaker, on justice with speed, with fairness and with finality. I urge caution in two areas: first, the over-extension of "fair comment" and "public interest" defences; and second, allowing extensive appellate processes that risk re-traumatising victims.

Let me address the first. Some may argue that comments made in the public interest should be exempt from liability. But while fair comment is important, it must never be a carte blanche excuse for online abuse or harassment.

Singapore’s jurisprudence has long recognised that freedom of expression comes with duties and limits; particularly when words cross into the realm of targeted harassment, doxxing or intimidation.

Our current legal framework already strikes a careful balance by weighing public interest and fair comment alongside intent, tone, persistence and also impact. This ensures that genuine whistleblowing or responsible criticism remains protected, while malicious campaigns disguised as "commentary" are not. We must retain this proportional approach by allowing robust discourse, but drawing a clear line where speech becomes conduct that harms.

On the second issue: appeals and finality. In online harm cases, Mr Deputy Speaker, speed is justice. Victims are often individuals like students, parents, working adults who are facing powerful platforms or determined harassers. Every day of delay compounds the harm. A specialised tribunal system offers the right balance: quick, affordable, trauma-sensitive decisions, with limited appeal only on points of law.

If we open the door to full appeals, victims may find themselves trapped in lengthy and costly litigation, thus defeating the very purpose of relief. Most of the time, it is the party with the deep pockets that can bear this war of attrition. In any event, recourse to judicial review remains available as another check and safeguard.

Mr Deputy Speaker, Sir, in Singapore, the principle of judicial review serves as a cornerstone of our rule of law. It ensures that executive and administrative decisions are made within the bounds of legality, fairness and rationality. Our courts have consistently held that judicial review is not an appeal on merits, but a safeguard against illegality, irrationality and procedural impropriety.

Within the context of this Bill, that principle remains fully preserved. The decisions of the Commissioner and of any tribunal established under this Bill are not insulated from scrutiny. They remain subject to judicial review on conventional administrative law grounds. However, the Bill rightly limits appeals on the merits to ensure that outcomes are delivered swiftly and proportionately, especially where victims need urgent relief.

I believe that this is a balanced approach: it maintains judicial oversight to prevent abuse of power but avoids turning each case into protracted litigation. In doing so, we uphold both the integrity of our legal system and the need for timely justice for victims of online harm.

Justice must not only be fair; it must also be fast and final for those in pain.

In this regard, would the Minister agree that our current legislative framework already strikes a fair balance between safeguarding free expression and protecting individuals from harm? And that, for the sake of victims of online harms, we must avoid approaches that prolong their trauma by allowing perpetrators, well-resourced platforms, providers or any other party to hide behind technical defences, escalate legal costs and drag victims through a drawn-out process that ultimately denies them the swift justice and protection they deserve? Mr Deputy Speaker, in Malay, please.

(In Malay): [Please refer to Vernacular Speech.] Mr Speaker, as a lawyer who has helped victims of online harassment and cybercrime on a pro bono basis, I have witnessed firsthand how online incidents can destroy a person not just emotionally, but also socially and professionally.

In 2021, my fellow lawyers and I established Defence Guild SG, a pro bono lawyer network that defends victims of online harassment and sexual harassment. This came about after the incident involving a lewd online discussion that demeaned the female religious teachers and asatizah in our community.

Together with my fellow lawyers, we wanted to send a clear message: That such behaviour has no place in our society and our country. That no one can hide or conceal themselves online and do these things without facing any consequences. And that every victim has the right to legal protection that defends their dignity.

This Defence Guild SG initiative, supported by the Lawyers@M³ network, provides advisory services and legal help to those who may not qualify for formal legal aid. In the span of just a week, over 20 lawyers came forward to offer their pro bono services. It is a small but significant step, because there are gaps in our legal system where people may have been left behind – people who are unable to bear the cost of legal fees, but are too hurt to remain silent.

Through Defence Guild SG, I have seen that cyber harassment does not simply end at the screen. It extends into real life; shaking self-confidence, destroying careers and crushing a person's spirit. Many victims choose to remain silent, because the judicial process takes time and money.

But today, this Bill is not merely a piece of legislation, it will enable victims to stop harassment immediately, obtain justice quickly and most importantly, avoid prolonged trauma.

In this fight of ours, we are upholding not just the law but also the values that we hold dear, that justice, compassion and wisdom must go hand in hand; both in the physical world and online.

(In English): Mr Deputy Speaker, Sir, let me conclude by giving voice to some of the victims who have reached out to me and my colleagues.

One victim, J, shared that she had been "subjected to prolonged and persistent online harassment since 2014", but could not afford the cost and time for injunctive relief. Another, Y, a Singaporean who was studying overseas, said she remained "fearful of returning to Singapore" because the harassment had so deeply affected her sense of safety. A third victim, S, a single mother suffering from severe depression and anxiety, described how her personal information was circulated in unsolicited emails to various parties.

Each and every one of them was desperate for help quickly and had to turn to the system for protection. Their stories remind us – all of us – that behind every case, every case number, every matter is a human being: seeking peace, not revenge; dignity, not exposure; closure, not spectacle. They do not want sympathy. They want swift and humane justice.

Mr Deputy Speaker, Sir, let us ensure that our laws do not merely punish wrongdoing, but protect the vulnerable. Let us make the online world safer, so that our generations to come may inherit not a space of fear, but one of respect and responsibility.

As I once pledged with my colleagues, my pro bono lawyers at Defence Guild SG: let us stand together, against violence and harassment here and online so that the world we live in today will be a better one when we leave it for our children. I stand in support of this Bill.

Mr Deputy Speaker: Ms Cassandra Lee.

4.56 pm

Ms Cassandra Lee (West Coast-Jurong West): Mr Deputy Speaker, the Online Safety (Relief and Accountability) Bill is timely. I have met with residents affected by online harms and Singaporeans have written to me asking for reforms. From how we stay in touch to how we work, shop, learn and govern, the Internet has evolved into more than just a means of communication – it has become a second life. And the divide between the online and offline worlds continues to blur as technology becomes increasingly interwoven into our daily lives. Our laws need to keep pace with modern day advances and the threats of today.

The recent report by SHE, "404 Help Not Found: Lived Experiences of Online Harms Survivors" sheds light on the harsh realities that define today’s digital landscape. Let me share the story of one of the online harms survivors mentioned in the report.

As a child, Zane experienced bullying in school. His peers took to online platforms weekly to spread negative remarks about him. Later in life, during his polytechnic years, Zane experienced doxxing. The perpetrator shared his name, school and course on Twitter and threatened Zane. These incidents had an impact on Zane in his growing years. He kept his pain to himself, fearing retaliation and judgement. The emotional toll eventually caught up with him. During his National Service days, Zane experienced panic attacks and had to seek professional help.

Zane's journey reflects how cumulative online harms can leave deep psychological scars; ones that linger long after the screen goes dark. And what happened to Zane is not rare. It is a glimpse into a culture where online harm has become casual, even accepted. Where cruelty hides behind screens, and victims often suffer in silence – many times too afraid, too tired and too ashamed to seek help.

It is important for us to be deliberate in shaping the tone of our online world for our children, our friends and our neighbours. Because the wounds may be invisible, but they are real – and they last.

The Bill is a step in the right direction. The OSC is vested with the authority to issue directions and orders to online service providers, communicators and administrators. Additionally, victims can turn to civil recourse and remedies for redress. In the age of AI, where false and harmful content can spread faster and cut deeper, such protections have never been more important.

Youths are particularly vulnerable to online harms. To find out what youths thought of the Bill, I spoke with YouthTech SG, formerly known as Cyber Youth Singapore. YouthTech SG is a youth-led social movement and charity. One of its divisions, YouthTech Institute, focuses on helping young people navigate the risks and opportunities of technology, including online safety.

They wrote to me and expressed their support for the Bill, describing it as a signal for Singaporeans to foster a more cohesive, civil and wholesome digital environment. They also welcomed the Bill's introduction of new avenues for youths to seek help and relief when harmed online – options that previously did not exist. While the Bill makes important strides in fostering a responsible digital environment, online culture is ultimately something that users themselves must build, and that endeavour goes beyond legislation. Having reviewed the Bill, I find it a meaningful step forward. It is unlikely to be a quick or perfect fix, given how common and deep-rooted online harms are in our daily lives, but it signals our shared commitment to build a kinder, safer Singapore – both offline and online.

I support this Bill and would like to share a few reflections and seek some clarifications.

My first observation: the creation of the OSC and the scope of its enforcement powers is noteworthy. The speed of takedown is critical. While victims often suffer immediate harm, current means of redress are reportedly slow and sometimes ineffective. IMDA’s 2024 Online Safety Assessment Report found that major platforms acted appropriately only on half of the reported harmful content, often taking more than five days. The unfortunate reality is that victims are often left to navigate complex and sometimes opaque complaint processes, and structural challenges may further hinder platform responsiveness.

The OSC plays a vital role in addressing these gaps. Its powers, such as issuing Stop Communication or Restraining Directions, enable victims to take timely and decisive steps to safeguard themselves. Moreover, clause 26(1) provides that the Commissioner may issue a Part 5 direction if the OSC has reason to suspect that online harmful activity has been conducted. This gives the OSC the flexibility to swiftly take down online harmful activity.

I see that Australia had similar experiences from which we can draw some lessons. The Australian eSafety Commissioner has the power to issue directives to address online harmful activities. Speaking at a 2023 symposium organised by MinLaw and the SMU Yong Pung How School of Law, the eSafety Commissioner from Australia highlighted the value of a dedicated and independent safety regulator in addressing online harms. In a Straits Times article, dated 26 September 2023, it was reported that she said: "Often when people come to us, they just want their images taken down, they don’t want to be tethered to their former partner through litigation, which, of course, can take a long time. So, what we provide is rapid assistance. I don’t think we would have the same degree of success with the platforms, particularly where there are grey areas, if we didn’t have these (remedial) powers."

To this extent, I note that there is a requirement for victims to first file a report with the online platform before approaching the OSC. The duty imposed on the platforms also only starts when the report is accurately and properly filed with the platform. Therefore, the accessibility and ease of reporting on the platforms will be critical in ensuring access to justice for victims through the OSC. I seek clarification on how the Ministries plan to work with the platforms to enable this.

Secondly, I also seek clarification on how a victim should take action if the same content appears across different platforms and sites. Should the victim report all instances he or she is aware of at the same time? Are the OSC's directions specific to each platform? Or will we land in a situation where the victim is required to repeatedly go through the process of reporting to the platform and then to the OSC for each piece of material? Can a victim apply to the OSC for a pre-emptive ban on similar materials resurfacing?

Member Zhulkarnain had mentioned the advantages that OSC offers in being a more responsive and less adversarial mechanism as compared to traditional litigation. I will not repeat that, but I stand in support of his point.

My second observation is on anonymity. Victims of online harm may be reluctant to seek redress due to fear of repercussions, whether from the perpetrator, societal stigma, workplace consequences, or familial pressures. This hesitation is particularly pronounced in cases involving intimate image abuse, doxxing, or harassment, where the victim’s identity is often central to the harm itself.

I am heartened to see that the Ministry has been sensitive to this issue. In respect of court proceedings, the Bill envisages that the Rules of Court may be amended to empower the Court to order the redaction of identifying information, including parties' names, in court documents, to conduct proceedings in private, and to issue orders preventing the disclosure of parties’ or witnesses’ identities. I am concerned that in practice, however, victims may remain vulnerable and additional administrative steps may be required to secure their anonymity during the interim period before such orders are granted.

I wish to suggest for the Minister's consideration whether a more victim-centric approach would be to introduce automatic redaction, subject to judicial discretion to lift such redaction where appropriate. This would mirror existing protections under the Children and Young Persons Act, where proceedings involving minors are automatically anonymised, and the Women's Charter, which provides automatic identity protection in cases involving sexual offences in the form of a prohibition on the publication or broadcast of information identifying victims or witnesses. Extending automatic anonymity to victims of online harm, particularly those from vulnerable groups, would not only reduce procedural burden but also encourage victims to avail themselves of the options under this Bill.

Beyond the courtroom, I would be grateful for further clarity on how the Ministry intends to safeguard the confidentiality of victims during the reporting process. Specifically, how will the Ministry ensure that victims’ personal information remains protected, while also upholding the rights of alleged perpetrators to be informed of the nature of the allegations made against them?

My third observation relates to the new statutory tort of online instigation of disproportionate harm. Clause 87 of the Bill introduces the "tort of online instigation of disproportionate harm", which is designed to offer civil recourse to victims and their associates where public statements, made in response to a victim’s conduct, are likely to incite others to act in ways that could cause harm to the victim.

Civil action under this tort requires proof of actual or reasonably foreseeable loss or damage. While some heads of loss, such as financial losses, may be easier to prove, others, particularly psychological injury or emotional distress, have traditionally presented evidentiary challenges. I would be grateful for the Minister's clarification on how the latter cases will be treated.

My fourth observation is that the Bill seeks to address "inauthentic material abuse", which is any audio, visual or audiovisual material that has been digitally altered or generated to present a false or misleading depiction of the victim’s words, actions or conduct, but is realistic enough that a reasonable person would believe that the material emanated from the victim. In this regard, a depiction of the victim saying or engaging in anything includes a depiction of the victim’s likeness saying or engaging in that thing.

The Bill would benefit from clarity on how likeness is assessed in the context of AI-generated content. For example, does resemblance alone suffice? Must there be an intent to impersonate or mislead? What evidentiary standards apply? In my view, these clarifications are especially important since resemblance may be subtle, stylised, or partial.

In closing, Mr Deputy Speaker, I welcome this Bill. It is a timely and necessary step towards preventing the propagation of online harmful activities. This is especially important today, when many of us, young and old, spend much of our time online. Youths in particular are vulnerable to online harms, and I am heartened that this Bill provides additional avenues for their protection.

In particular, I am encouraged by the Ministries’ efforts to consult widely and engage with youths while drafting this Bill. Youths form a significant demographic that would be affected and protected by this Bill. In their letter to me, YouthTech Institute shared that they participated in the public consultations and dialogues organised by the Government. They voiced their concerns and provided feedback on how the Bill can continue to stay true to its policy goals without chilling legitimate online participation by youths. They tell me that their feedback was taken seriously and considered. By participating in the drafting process of this legislation, our youths felt seen and heard, inspired and empowered. The youths had this to say: "Thank you for showing us that policy can be made with us, not just for us." I am grateful to the Government for allowing youth voices to be heard on a Bill that affects matters close to their hearts. I thank the Ministers and their teams for their efforts, and I am happy to support this Bill.

Mr Deputy Speaker: Mr Gabriel Lam.

5.10 pm

Mr Gabriel Lam (Sembawang): Mr Deputy Speaker, I rise to speak on the Online Safety (Relief and Accountability) Bill, not merely as legislation but as an assertion of how we wish our digital society to function: justly, humanely and with room for redemption.

Let me start with a scenario many Singaporeans will recognise. Suppose a person commits a crime. The courts hear the evidence, the verdict is pronounced, the sentence is served. That is where justice should end. Yet, far too often, a second sentence is delivered online by self-appointed vigilantes. Almost immediately, commenters weave threads of public shaming: images, addresses, workplaces, friends – all exposed, shared, argued about. This is not journalism or public interest. It is doxxing, harassment, digital mob justice.

In Singapore, we already see the signs. According to SHECARE@SCWO, victims of doxxing have risen from 13 in 2023 to 32 in 2024. Harassment and cyberbullying cases rose from 48 to 80 in the same period. Applications to the Protection from Harassment Court have increased steadily: from 346 in 2021, to 520 in 2022, 526 in 2023, and 631 in 2024.

One legal practitioner recounted a man in his 30s with mild autism, whose personal traits, speech, appearance and insecurities were relentlessly attacked online in gaming communities. Whether or not the original crime was serious, the secondary assault online multiplies harm and deepens exclusion.

The burden does not fall evenly. The person who already faces legal consequences now must contend with perpetual digital scrutiny, unable to move forward even after "paying their dues". And sometimes, the person targeted online is not even guilty at all. In 2020, many will remember the case of the woman who refused to wear a mask at Shunfu Mart and claimed to be a "sovereign". The video went viral almost instantly. In the rush to identify her, online users wrongly accused another woman of being that person. Within hours, her photographs, company details and even the names of her colleagues were posted across social media. She and her firm faced a barrage of racist and xenophobic abuse, even though she had absolutely nothing to do with the incident. The victim later described feeling bewildered and frightened, a bystander suddenly thrust into the centre of a public storm.

It was a textbook case of doxxing and mistaken identity. And it showed how quickly online outrage can spill over into real harm, even for those completely uninvolved. If that is what can happen to an innocent person, imagine what happens to someone who has actually made a mistake, served their sentence and is trying to move on.

We already have laws against doxxing in Singapore: amendments to the Protection from Harassment Act make it an offence to publish someone’s personal details with intent to cause harassment, alarm or distress. And yet, enforcement is limited: the Singapore Police Force investigated fewer than 10 cases of doxxing in the first four months of a recent year. Why the gap? Because doxxing often lies in a grey area where the exposure itself is legal, for example, a court report, but the layering of personal data and commentary turns exposure into attack. It tests the limits of what laws can do alone. That is where this new Bill is crucial: it gives victims stronger recourse and empowers the Commissioner to direct platforms to remove or restrict content, require a right of reply, or label harmful material.

I return to the question: when someone serves their sentence, should society still punish them? When does justice become vengeance? Worse still, harm can hide under the guise of public interest or fair comment. If a so-called fair comment is sent repeatedly every day to a victim, their family, or their employer, it ceases to be comment. It becomes harassment. We must be clear about where legitimate expression ends and abuse begins, and ensure that enforcement is consistent so victims do not hesitate to come forward for fear of uncertainty.

If we allow perpetual online shaming, we deny redemption. We do not rehabilitate; we annihilate futures. Employers, neighbours, friends, all filtered by search results, by rumours, by what the digital crowd decided to publish. This Bill helps guard against that unending sentence. It lets the Courts and the Commissioner intervene. It says: the past does not have to be a life sentence in the digital realm.

Mr Deputy Speaker, while I support the Bill, I would like to offer three modest suggestions for its implementation.

First, enforcement must be accessible. Many victims of online harm may lack the legal literacy, confidence or financial means to approach the Commissioner or file for redress. The process should therefore be simple, affordable and trauma-informed with clear guidance and support for vulnerable groups such as youths, persons with disabilities and lower-income users. A centralised "one-stop" reporting channel that connects the Singapore Police Force, IMDA and the Commissioner's office could make seeking help far less intimidating.

Second, prevention must go hand in hand with enforcement. We cannot legislate empathy, but we can educate for empathy. Public campaigns and digital citizenship programmes, especially in schools and workplaces, should teach not only what is illegal online, but what is unethical. It is as important to cultivate digital restraint as it is to punish digital harm.

Third, we should examine the appeals framework carefully. The Bill currently limits further appeals after an online safety Appeal Committee's decision. That finality provides efficiency but there is a risk that victims could still face drawn-out proceedings or repeated challenges from offenders exploiting procedural loopholes. We should ensure that the appeals process is fair but finite, one that safeguards due process while sparing victims prolonged distress.

In conclusion, Mr Deputy Speaker, I support this Bill – not because it is perfect, but because it is necessary. Because justice demands both accountability and mercy. Because our society should not allow people to be condemned twice: once by the Court, and again by the crowd. I ask that as we debate, we remember those caught in the crossfire of legal punishment and digital vengeance and build protections so that justice ends where it should.

Mr Deputy Speaker: Ms Tin Pei Ling.

5.18 pm

Ms Tin Pei Ling (Marine Parade-Braddell Heights): Mr Deputy Speaker, online harms – whether harassment, non-consensual intimate imagery, deepfakes, scams or coordinated disinformation – inflict real damage to mental and physical well-being, economic security, public discourse, safety and public trust in digital systems.

Unlike physical or verbal harassment, online harms spread instantaneously, exponentially and often beyond our immediate control. Content posted today can be duplicated, archived, indexed and re‑shared, leaving permanent scars on victims' reputations and livelihoods. The asymmetry of power is stark. It is easy for anyone to publish, but hard for victims, especially ordinary users, to have harmful content removed. The burden of proof and the effort to seek redress often fall on those already traumatised. Hence, whether at the individual or society level, we must have in place effective and firm mitigation across policy, enforcement, technology, education and civil society to tackle the ills of online harms.

As mentioned, this House has debated these issues before. In January last year, I moved the Motion on Building an Inclusive and Safe Digital Society together with our then-Digital Development and Information GPC colleagues. During the Motion debate, we reaffirmed our commitment to adopt a whole-of-nation approach to sustain trust by building an inclusive and safe digital society. Members who participated actively discussed, amongst other things, online harms. In all, the then-Digital Development and Information GPC members and PAP MPs who spoke set out a total of 13 Calls to Action, which included:

Calling on Social Media Services to be accountable for the proliferation of harmful content and malicious ads on their platforms.

Calling for the Social Media Services (SMSs) and App Distribution Services (ADSs) to step up measures to better protect young users from being exposed to age-inappropriate or harmful content.

Calling for these SMSs and ADSs to improve timeliness in responding to user reports on harmful content on their platforms.

Here, I must stress that the importance of timely interventions cannot be overstated. Thankfully, both sides of the House supported the Motion. Therefore, I am glad that we are now taking a step further in building an inclusive and safe digital society through stronger legislation and measures against online harms.

The key features of this amendment are: firstly, a clearer definition of online harms; secondly, the establishment of an Office of the Commissioner of Online Safety; and thirdly, powers to enable speedy interventions when harms are detected. These measures address the problems victims often face – speed and uncertainty.

Allow me to share the story of my resident's fiancée back in 2014. She was a victim of "revenge porn". This story remains pertinent today. Her ex-boyfriend uploaded very compromising photos of her onto websites such as "Revenge Porn", which had been outlawed by California but remained available online because the sites were hosted overseas. These photos were taken while they were still in the relationship and when there was still trust. Following the uploads, there were many unsavoury comments by people she knew and did not know, and after a while, even her friends started to notice and asked her about it. Imagine the shame, the embarrassment, the kind of distress that she had to endure.

Quoting my resident, he stated, "to complicate matters, the website is hosted in the US. The company is registered in Amsterdam and even the individual, who happens to be my fiancée's ex-boyfriend, who uploaded the photos is unable to remove them without the website host's approval who, ultimately, would only do so if the removal fee is paid. Since then, my fiancée has lost sleep, could not eat or work in peace, especially not while witnessing the links to her compromising photos appear on search results while looking up her name."

At that time, POHA had just been introduced and was yet to take effect. But the issue at hand was real and pressing. I recall we had to approach various agencies to try to figure out how to stop it and protect the fiancée. Eventually, we managed to block access domestically to protect her.

Fast forward to today, things have improved. We have more legislation in place and stronger enforcement capabilities. There is greater societal awareness and to be fair, the social media industry has also been more responsive than before. But online harms have not gone away. We may no longer find the "Revenge Porn" website now, but we can still remember the fairly recent "SG Nasi Lemak" Telegram chatgroup. The format and vectors have changed – from public websites to private chatgroups and dodgy apps. And so, our legislation and interventions must change too.

So, I believe the clearer definitions in this round of amendments will accelerate investigation and enforcement. Speed is critical. The longer harmful material remains public, the greater the lasting damage. Yet, victims today still find the reporting process confusing and daunting. Reporting to social media platforms often feels like a "lucky draw": you cannot be sure whether they will agree to remove the post or, if they do agree, when they will do so. Ordinary users, that is, the "small fry", are often disadvantaged.

It is not that there is no legal recourse at present. But when a victim is already stressed out by the fire burning online, how can he or she find the mental space to navigate the laws and take the right actions? As it is, the relevant statutes addressing harmful online content and conduct span a wide range – POHA, the Broadcasting Act, POFMA, the Foreign Interference (Countermeasures) Act and OCHA. How can a victim navigate these laws, and will OSRA, which is the Bill that we are debating now, overlap with existing laws and frameworks?

Must the victim make reports to multiple agencies, or will agencies coordinate amongst themselves? In the 2014 case that I shared earlier, my resident and his fiancée felt lost; even I felt somewhat exhausted trying to help them navigate the various agencies. How will the new OSC help, or not help, to overcome this? That being the case, we must also guard against abuse of this new mechanism. What measures are in place to prevent the OSC from being exploited by devious characters for vexatious reporting, or weaponised to target another person over private disputes?

Adding on, I welcome the amendments' focus on finality in appeals. Without timely finality, resourceful perpetrators can exploit appeals to delay takedowns, prolong exposure and continue with the abuse. All things being equal, some considerations simply should take priority. Hence, protection against harm must be prioritised here.

I refer to hon Member He Ting Ru's proposed amendment. I have a concern that I wish to set out here. I find the use of "fair comment" here somewhat fuzzy and highly subjective. Playing devil's advocate: since the premise of this is to prevent the premature ending of discussion of topics of public interest, possibly because of fear of political motivation, does "fair comment" also mean that if a public figure happens to fall victim to baseless allegations, fake news or even being sexualised online, this "fair comment" exception will prevent or delay protection and justice, just because the subject happens to be a public figure?

Also, the Member's proposed amendments suggest, or seem to suggest, the removal of finality to appeals and a switch from "reason to suspect" to "reason to believe". These will drag out the process and extend a victim's exposure to harm. I do not think these are helpful to victims of online harm. Perhaps the Ministers, in your responses, can comment and shed more light on the possible consequences of the amendments proposed by the hon Member.

Ultimately, legislation and enforcement are necessary but not sufficient. Social norms and education matter. In 2021, PAP's Women's Wing launched a campaign, #ActionForHer, to encourage Singaporeans to take action to support women. In 2022, the campaign focused on tackling online harms. There were pledges to action, and a toolkit on "Supporting Residents Facing Online Harms", just to name a few. And as part of the campaign, youth activists from my branch, MacPherson, decided to do more and organised the #ActionForHer Webinar Series to discuss topics pertinent to their peers and with a strong focus on protecting women in the virtual space.

I am sure there are many other examples of positive ground-up efforts by different volunteer groups and organisations. We should certainly recognise and encourage such efforts, to encourage positive social behaviours and norms. Allow me to speak in Mandarin.

(In Mandarin): [Please refer to Vernacular Speech.] Online harm, whether it is harassment, non-consensual intimate images, deepfake scams or coordinated disinformation, damages mental and physical health, economic security, public discourse, safety and public trust in digital systems.

Online harm differs from physical and verbal harassment. Online harm spreads rapidly, often exponentially and frequently beyond our immediate control. Content published today can be copied, archived, indexed and repeatedly shared, leaving long-term, or even permanent, damage to victims' reputations and livelihoods. Therefore, the Internet often favours perpetrators, while being unfair to victims. Anyone can easily publish information, but victims, especially ordinary users, find it difficult to remove harmful content.

The burden of proof and seeking redress often falls on those who are already traumatised. This is an asymmetry of power.

Therefore, both at the individual and societal levels, we must establish effective and robust mitigation mechanisms in policy, law enforcement, technology, education and civil society to address the consequences of online harm.

I am therefore pleased that we will now combat online harm through stronger legislation and measures. The key points of this amendment include: first, providing a clearer definition of online harm; second, establishing the OSC; and third, granting powers to intervene swiftly when harm is discovered. These measures can better address the problems victims frequently face, that is, speed and uncertainty.

As technology continues to develop, new methods of cyber threats and online harms will also emerge. Therefore, legislation and intervention measures must also keep pace with the times. We may not be able to totally stamp out such risks or threats at one go. We must continue to evolve our legislation and ensure there are teeth in the enforcement.

Meanwhile, it is not enough to just put in place such hard measures. Social norms and education are equally important. If anyone is willing to take on a larger share of the responsibility, we can protect more people and reduce the harms or damage to individuals, so that ordinary citizens can be free to live full and meaningful lives.

(In English): As technology continues to develop, new methods of cyber threats and online harms will also emerge. We may not be able to totally stamp out such risks or threats at one go, but we must continue to evolve our legislation and ensure there are teeth in our enforcement. That being said, it is also not enough to just put in place such "hard" measures. We need the whole-of-society to believe in upholding a safe online space as well.

So, if we can do more, if everyone is willing to take on a larger share of the responsibility, we can protect more people and reduce the harms or damage to individuals, so that ordinary citizens can be free to live full and meaningful lives. With that, I support the Bill.

Mr Deputy Speaker: Mr Alex Yeo. Before that, Ms He Ting Ru, do you have a clarification of a speech by a Member before you? May I know what that clarification is, please?

5.32 pm

Ms He Ting Ru: Thank you, Sir. I have a clarification on Ms Tin's speech from earlier. I think I heard Ms Tin say, on my amendment to change the standard from "reason to suspect" to "reasonable grounds to believe", that this might actually prolong the process and cause further harm to victims. I just wanted to clarify how she thought this would be the case.

Mr Deputy Speaker: Ms Tin, would you like to respond to the clarification?

Ms Tin Pei Ling: Deputy Speaker, "reason to suspect" means that when the harm is first detected, the agency or the authority can quickly step in and take measures to remove the harm. That is one. But to change it to "reason to believe" will mean that sufficient grounds must be established to prove that there is indeed harm, and this process will take time.

So, during this period of time, I am fearful or I am worried that this will cause the victim, who is already traumatised, to be exposed to the harm further. And this is what I meant by if we change from "reason to suspect" to "reason to believe", it may therefore extend that exposure to harm on the victim and therefore, cause more harm to the victim.

Mr Deputy Speaker: Ms He, do you have a clarification arising out of Ms Tin's response to you?

Ms He Ting Ru: Yes, I do.

Mr Deputy Speaker: Please proceed.

Ms He Ting Ru: I thank Ms Tin for her explanation. I just want to explain our thinking behind why we called for the amendment. Our thinking is that if there is a serious harm needing to be addressed immediately, the test of reasonable grounds to believe will meet the threshold upfront and therefore allow a takedown of the content.

However, where there is ambiguous harm that is worthy of further investigation, our thought process was that this might not satisfy reasonable grounds to believe, and therefore the investigative powers would be used, for the Commissioner to satisfy himself that it is appropriate to take action.

Mr Deputy Speaker: So, do you have a specific question for Ms Tin or not, Ms He? No. Alright. Ms Tin, would you like to respond to any of that?

Ms Tin Pei Ling: Thank you to the hon Member for the clarification. I do have a question in response to that. I am not sure whether I am allowed to ask it. Perhaps, given the thought process that Ms He has articulated, could she share some examples of what such ambiguous situations may be, so that I can have a better understanding of her thought process?

Mr Deputy Speaker: Ms He, would you like to respond to that?

Ms He Ting Ru: Yes, thank you. I think, for something that is ambiguous, for example, what is considered harmful to a reputation can sometimes be a little bit ambiguous; and that is when, perhaps, there is a chance for the investigative powers to be used to investigate further.

Mr Deputy Speaker: Would either Ms Tin or Ms He like to say anything before we move on to Mr Alex Yeo, who is the next speaker? None? Mr Alex Yeo.

5.36 pm

Mr Alex Yeo (Potong Pasir): Mr Deputy Speaker, the objective of this Bill, if passed into law, is to enhance online safety for all in Singapore.

We operate every day in the digital world. It is part and parcel of our lives. While digital connectivity has brought undeniable benefits, a potent side effect has been the proliferation of online harms. Online harms are all pervasive, have the capacity to go viral very quickly and often bring immeasurable harm to the victims.

Victims find it difficult and challenging to seek recourse or relief from the mental and emotional toll that online harms inflict on them. For many, there is an innate helplessness in not knowing how to make it stop; how to make it go away. Ironically, even though it is the digital world, they have nowhere to run and nowhere to hide.

Many jurisdictions like us are also grappling with this phenomenon. Australia, New Zealand, Germany, the UK, the EU, the US and India have enacted varied legislation to address this.

In Singapore, the introduction of this Bill is, therefore, a timely one. I appreciate the policy intent of this Bill and the actionable levers that it seeks to introduce to address online harms in Singapore. I shall highlight three aspects that stand out for me.

One, this is a victim-centric Bill. Its main purpose is to provide quick and effective remedies and relief to victims of online harms, through a straightforward and accessible reporting platform, the Commissioner of Online Safety. All 13 classes of online harms are addressed from the victim's perspective.

Two, it addresses the pervasiveness of the online harm, by factoring in considerations, such as the number, frequency, nature and circumstances of the online communication. The Commissioner will take a holistic approach in considering each report to carry out his or her duties effectively and in good time.

Third and importantly, this Bill should not be seen as an attempt to regulate online content. Over time, it is hoped that this will promote accountable and responsible conduct that moves us towards a safer online environment in Singapore.

Mr Deputy Speaker, at this juncture, I would also like to address one aspect of the Notice of Amendments to the Bill filed by the hon Member Ms He Ting Ru.

I would just like to make two observations on the inclusion of fair comment as an exception in certain clauses. Let us take clause 9, on online harassment, as an example. At the outset, I will say that I appreciate the good intent of the amendment. It is to carve out an exception for fair comment on a matter of public interest, so that any such communication of online material will not constitute online harassment. Mr Andre Low, for example, gave the illustration of a person who made a purported fair comment to a public official that could be perceived as online harassment.

I will, however, make two observations. First, I note that the amendments do not propose a definition of what constitutes fair comment. As such, we have to look elsewhere for interpretation. One possible area is existing defamation law, where fair comment is a defence. There are four elements to the defence, but in simple terms, it must be a comment or opinion that a fair-minded or reasonable person could honestly make based on facts and, importantly, it must be made without malice.

With that in mind, let us look at what constitutes online harassment in clause 9 of the Bill. It is the communication of online material that a reasonable person would conclude is threatening, abusive, insulting, sexual or indecent, and likely to cause a victim harassment, alarm, distress or humiliation. So, my first observation is this.

If we are to compare both definitions of online harassment and fair comment side by side, could a fair comment reasonably constitute online harassment? Even with the example provided by Mr Andre Low, based on the definitions, is it possible for fair comment to be threatening, abusive or insulting? If the answer is no, then the amendment, in my opinion, is unnecessary. In fact, I am of the respectful view that a fair comment should be a comment or opinion conveyed in a respectful, civil and courteous manner. I am sure such a comment will not be captured under clause 9 as online harassment.

This leads me then to my second observation. As I shared earlier, my reading of the intent of the Bill is for the Commissioner to make an assessment that is, first, victim-centric, looking at how the online communication is received by the victim; second, focused not on its content, but rather on the nature and manner in which the online communication is conveyed; and third, able to provide victims with quick and effective relief.

To necessitate the Commissioner to make a further legal assessment on whether an online communication is fair comment, especially in light of my first observation, will, in my opinion, distract from and potentially defeat the purpose of the Bill.

Having said that, Mr Deputy Speaker, while I rise in support of this Bill, I would like to take this opportunity to seek some clarifications on how the Government intends to address online harms suffered by a particularly vulnerable class of victims, children or minors.

In my maiden speech, I spoke about the concerns surrounding our children and their exposure to the digital space. Several Members of this House also raised similar concerns.

I had shared that, as a father of two Gen Alpha children, one of whom will be a teenager soon, no matter how much we try to regulate their interactions with the digital world, my wife and I worry. Where parents once told their children to beware of the strangers outside of the home, today, we must also guard against "strangers" and dangers lurking within, in the digital world. Even the home is no longer a safe sanctuary for our children.

Digital connectivity has dramatically altered the experience of growing up. In the analogue world of the past, if you had an unpleasant experience at school or with your friends, you could go home and get "away" from the problem. You could, in essence, "disconnect". Today, these experiences follow you everywhere. In fact, they become viral, multiplying and reinforcing the awfulness of the experience or encounter. It becomes the last thing you read or see before you sleep and the first thing you see or read when you wake.

For a child or a youth, this incessant barrage could lead to unthinkably devastating consequences. We have all read of such cases happening all over the world. I do not need to raise them here.

Mr Deputy Speaker, in this context, I raise two points for clarification.

First, how will the proposed Office of the Commissioner of Online Safety distinguish between online harms suffered by adults and those suffered by children or youths? Second, will the proposed Office of the Commissioner of Online Safety take a differentiated approach in the reporting regime and the provision of remedies and reliefs when the victim is a child or a youth? I will elaborate on each of the above in turn.

Under the Bill, most of the classes of online harms depend on the "reasonable person" test. This is the test that the Commissioner will apply to assess and conclude whether the reported online material fulfils the criteria of an online harm. Currently, this is a single, uniform test to be applied to persons of all ages.

I accept that the Commissioner is within his or her remit to apply a differentiated standard of the "reasonable person" depending on whether the alleged victim is an adult or a child. This, however, as we can all appreciate, could be an extremely delicate exercise. There is, at present, little guidance provided in the Bill.

Other jurisdictions have drawn this distinction in their legislation. The Australian Online Safety Act 2021, for instance, draws a distinction between cyberbullying for children and cyber-abuse for adults. In fact, it appears that the Australian eSafety Commissioner has set up different channels in dealing with a range of different persons: adults, educators, parents, young people, children, seniors and women.

I believe that the perception of harm, such as what is threatening, abusive or insulting, between an adult and a child and even between minors of different ages is different. The type of online materials that may cause harassment, alarm, distress or humiliation to an adult as opposed to a child or youth, will also be different.

In fact, Minister of State Rahayu touched on this when she shared Handout 4. If I can bring the House to Handout 4, paragraph 7 under (A) Online Harassment says this: "However, if the facts were different and B had disagreed with A, only by calling A's stance stupid and flawed, this is not necessarily online harassment. Criticism, even if strongly worded, is not online harassment if it does not involve what a reasonable person would conclude is threatening, abusive, insulting, sexual or indecent language, which is likely to cause harassment, alarm, distress or humiliation".

It is my humble opinion, Mr Deputy Speaker, that when words such as "stupid" and "flawed" are addressed to a child or a minor, they are perceived very differently. And so, when we look at the reasonable person test, we must be mindful that it must be viewed in the context of age and maturity.

Mr Deputy Speaker, this is a hypothetical, and I am certain that any future Commissioner will calibrate this accordingly. My only wish is to make the point that there needs to be a distinction between how we view online harms against adults and against children. Studies have also shown that online harms impact children differently from adults. For example, online harms are likely to have a longer-lasting impact on children. Cyberbullied children experience depression, anxiety, suicidal ideation and self-esteem deficits that persist into adulthood.

Secondly, I hope that the Office of the Commissioner of Online Safety could consider a differentiated approach when setting up the reporting regime and the provision of remedies and reliefs when the victim is a child or a youth. Reports and surveys suggest that in cases of cyberbullying of children or minors, under-reporting is prevalent. In February 2024, the then Ministry of Communications and Information released findings from their youth online gaming survey. The results are as follows: 17%, or almost one in five, gamers aged 13 to 18 felt that they had been bullied in video games by other players. Of the 17% who felt bullied, nearly half, or 48%, did not take any action at all. Only about 8% spoke to their parents about the experience. Of the parents surveyed, one in four was not aware of who their child gamed with.

This is not unique to Singapore. Studies elsewhere have also reflected the under-reporting phenomenon. In another survey conducted by researchers from the University of California, Los Angeles (UCLA) on how 1,400 youths aged 12 to 17 deal with cyberbullying, 90% of the respondents reported that they did not tell an adult about the incident. The common reasons for this were the belief that they should "deal with it themselves", at 50%, and the fear of losing Internet access or facing device restrictions, at 31%. In other surveys, embarrassment, shame and fear of judgement or reprisal are also reasons children or youths cite for not reporting cyberbullying.

These surveys struck a chord with me. Ensuring that the reporting mechanism is one that gives our minors confidence and a safe space to share their experiences and fears without judgement is vital.

There are many reasons why a child or a youth may shy away from sharing cyberbullying incidents with parents or family. A compromising indecent image of themselves, for example. Will there be a reporting mechanism that would allow children or minors to directly report cyberbullying, even in the absence of a parent or a trusted adult? How will we reassure and encourage our children and youths that such a reporting mechanism is safe for their use?

To achieve this, it will likely entail a multi-pronged approach led by the Commissioner of Online Safety, involving the entire ecosystem that supports our children and youths: parents, educators, social workers, child psychologists and student guidance counsellors, to name a few. In this context, I would ask the Minister for the Government's views and plans to address the complexities and nuances of online harms suffered by children or youths in the digital world and whether there are plans to augment the Office of the Commissioner of Online Safety with the necessary support, such as a professionally trained team of experts, to address these issues. Sir, notwithstanding my clarifications, I support the Bill.

Mr Deputy Speaker: Mr Andre Low, do you have a clarification of a Member who gave a speech prior to you?

5.50 pm

Mr Low Wu Yang Andre: Yes, I do. I have a clarification of Mr Alex Yeo.

Mr Deputy Speaker: Of whom?

Mr Low Wu Yang Andre: For Mr Alex Yeo.

Mr Deputy Speaker: Yes, please proceed.

Mr Low Wu Yang Andre: Thank you. Before I proceed with my substantive clarifications, I think I want to get a preliminary clarification from the Member. Is the crux of his point on the fair comment issue that essentially he thinks the amendment we have proposed is extraneous because it is already covered by the definition of the harm?

Mr Alex Yeo: Yes.

Mr Low Wu Yang Andre: Thank you. So, I think my response to that would be, I believe Mr Yeo was referring to the defence of fair comment in the tort of defamation. I think it has been stated that the test for fair comment involves four elements, but none of them refers to malice. So, as stated in the case of Review Publishing v. Lee Hsien Loong, there are four elements to the test: the words complained of are "comments", "though they may consist of or include inferences of facts"; "the comment is on a matter of public interest"; "the comment is based on facts"; and finally, "the comment is one which a fair-minded person can honestly make on the facts proved".

So, the element of malice actually comes later as a rebuttal to the usage of this defence. I do not intend to belabour the court with an extended discussion on the legalese here. I think suffice to say —

Mr Deputy Speaker: This is not a court. Yes. This is the House.

Mr Low Wu Yang Andre: Yes, sorry.

Mr Deputy Speaker: Carry on.

Mr Low Wu Yang Andre: Belabour the House. So, the crux of my point is that clause 19, also a definition of an online harm, has the structure of first setting out a definition before proceeding to a carve-out. So, I do not see why we cannot treat other definitions and other harms the same way, where we are proposing that you have a broad definition of what online harassment could be, and we explicitly and expressly carve out fair comment, so that there is clarity for everyone who is reading the legislation. Thank you.

Mr Deputy Speaker: So, is that a clarification of your own volition, or are you seeking a response from Mr Alex Yeo? I would like to understand.

Mr Low Wu Yang Andre: Yes, I would like to seek a response from Mr Alex Yeo. Thank you.

Mr Deputy Speaker: Very well. Mr Yeo, would you like to respond?

Mr Alex Yeo: Mr Deputy Speaker, I thank Mr Andre Low for his clarification. My position is a simple one. When I say that fair comment is unnecessary, I mean that there is no example in my mind, given the definition of fair comment. And I agree with Mr Andre Low, I did not want to list out all four elements and then list out the rebuttal to the defence, and so on, because I did not want it to be too legalistic in my speech.

Based on that definition, I cannot imagine that there could be any fair comment that would be caught under the definition in clause 9 – online harassment – and that is my point. And if you look at what it is, as I said in my speech earlier, online harassment under clause 9 requires a communication of online material that is threatening, abusive, insulting, sexual or indecent, and causes a victim humiliation – and something else, I cannot remember, my notes are over there – but suffice it to say, Mr Deputy Speaker, that a fair comment would not be caught under clause 9, and therefore, the amendment is unnecessary.

Mr Deputy Speaker: Mr Andre Low, do you require a clarification arising out of that? No? Mr Yeo, do you have anything else to add? In that case, we will move on. Ms Mariam Jaafar.

5.54 pm

Ms Mariam Jaafar (Sembawang): Mr Deputy Speaker, I rise in strong support of this Bill. This Bill matters – to every parent who has ever worried about what their children see online, to every small business owner targeted by scams or false associations and to every young person who has faced abuse, ridicule and humiliation online, sometimes spilling offline. This Bill tells them something powerful: that the law sees their pain. That we are serious about safeguarding the dignity, privacy and well-being of every person online without stifling creativity, innovation, expression or enterprise.

This Bill is a major step forward. It is principled and practical – the product of years of careful consultation and iteration, including insights from last year's Motion on Building a Safe and Inclusive Digital Society in this House. In that debate, I called for an inclusive and safe digital society even in the age of AI. I said that platforms that profit from attention must also take responsibility for harm, including AI-generated content such as deepfakes.

This Bill delivers. For the first time, victims have a single front door and a clear pathway to redress. Perpetrators and platforms can no longer hide behind the anonymity of code or contract or algorithm. It holds intermediaries and platforms accountable when they fail to act reasonably upon notices. It clarifies the shared responsibility. We move beyond "blame the victim" or "blame the platform" to reasonable duties on everyone with power to prevent harm – users, group owners and administrators, and service providers and platforms, paired with liability for frivolous or abusive notices to prevent weaponisation.

It empowers good actors. By defining administrators' duties, we give WhatsApp, Telegram and Discord group owners clarity: act reasonably and promptly when put on notice and you will be fine. Most community admins want safe spaces; the law should back them up with guidelines and safe harbours for good faith action.

It empowers the OSC to issue Right-of-Reply Directions, to reduce engagement or amplification and to award enhanced damages for persistent offenders – strong but calibrated safeguards for each type of online harm.

And it is forward-looking. Even though the first phase focuses on five online harms, the planned inclusion of inauthentic material or deepfake abuse and online impersonation recognises how AI is changing the very texture of online truth.

The Bill is also legally coherent. I have to admit that preparing to debate this Bill was quite a challenge, given the number of consequential changes to related laws, such as harassment protections, remedies for identical reports and recognition of humiliation as harm. I commend the Ministry for the hard work of aligning our justice system with the digital reality.

But I have three clarifications. First, for the new civil torts, what factors will courts consider? Speed of removal? Context? Labelling? Prior warnings? A short practice note would guide platforms, administrators and users to act correctly and swiftly.

Second, timelines, including for appeals. What are the expected service levels from report to direction and typical reconsideration timelines? A fast track for high-risk harms would be reassuring.

Third, data protection and identity notices. When identity disclosure is required to pursue relief, what privacy safeguards and purpose restrictions will apply? We should deter abuse while protecting victims’ and bystanders’ data.

I also have five suggestions where we might go further.

First, moving from reactive to anticipatory protection. Today, duties are triggered after harm occurs, hence victim redress. Other jurisdictions now require major platforms to assess and mitigate certain risks before harm occurs, especially for children and vulnerable groups. A limited, risk-based set of anticipatory duties – not blanket surveillance but responsible design – would help to prevent harm.

Second, enhance algorithmic accountability. Platforms shape what we see. Algorithms are the new gatekeepers of truth. Platforms should explain, in principle, how their systems amplify or reduce harmful content. Much like how banks must disclose material risks, systemic accountability requires visibility.

Third, strengthen rehabilitation, digital literacy and AI awareness. Not every harmful act online is malicious. Sometimes it comes from ignorance, inexperience or immaturity. Courts should have discretion to combine compensation with digital-literacy or restorative programmes, especially for younger offenders. Long-term safety is not just about punishment. It is about cultivating a culture of respect online.

Fourth, extend transparency to outcomes. Platforms could publish anonymised statistics: notices received, actions taken, average response times – so that the public can trust and policy-makers can measure progress.

Fifth, as technology evolves – so must the law. Periodic reviews are essential to ensure the framework remains effective, especially for emerging technologies like generative AI and immersive platforms.

Gen AI tools like ChatGPT can create harmful content. Left unchecked, this can put children and vulnerable users at risk. We should consider whether "content generating intermediaries" should be prescribed under our online safety framework with clear obligations on content moderation, age assurance and transparency. This could include, for example, documentation of model training or data governance, or disclosure of AI safety plans as has been passed in California.

Any adaptation must also reflect the unique harm pathways (user prompts leading to AI outputs, in contrast to social media) and be risk-based to safeguard Singaporeans without stifling innovation.

Sir, I know these will be difficult issues and balances to grapple with. But in times of rapid technological change, our task is not to fear the future. Uncertainty should not make us pessimistic; it should make us better builders of systems that are inclusive, safe and resilient. Our laws must remain principles-anchored, technology-neutral and future-ready.

I have also studied with interest the amendments proposed by the hon Member Ms He Ting Ru and other Members from the WP. I appreciate the desire for safeguards against overreach but, as many have said today, the law must put victims front and centre. A "fair comment" exception risks confusion and loopholes. Harassment is clearly defined; "fair comment", as seen in the debate earlier, is vague and legally complex and may give bad actors an easy excuse.

Raising the threshold for OSC action from "reason to suspect" to "reasonable grounds to believe" would raise barriers for victims and delay timely relief. Waiting for victims to produce evidence of reasonable grounds to believe before acting is not protection. It is delay, it is harm, it is injustice.

The hon Member Mr Andre Low has made some comparisons to other jurisdictions – UK and Canada. We are not here to copy the UK and Canada. We are here to protect Singaporeans today.

But he also seems to believe that this is a blanket threshold which applies not only in this instance but also to victim redress. I wonder if the Minister can clarify that.

The UK Online Safety Act does use "reasonable grounds to believe" in specific contexts, such as when defining the scope of regulated services, when defining the platform's duty of care and for final enforcement, but I believe it does not apply to victim redress, which happens at the platform level. So, it is not equivalent to the OSC stepping in to issue takedown or redress orders. In practice, Ofcom's own guidance shows it can and does act on complaints and indications of harm; Ofcom or the platform can act to prevent further harm in the early-stage assessment phase, while evidence is still being gathered.

That does not seem dissimilar to the balance that this Bill strikes. The "reason to suspect" threshold ensures victims are seen and helped immediately, while safeguards prevent abuse of the system, including appeals for reconsideration. This is practical, principled and humane lawmaking, putting people over procedure.

Can the Minister confirm my understanding?

Removing the finality clauses from the appeals provisions and adding a right to appeal OSC cases to the High Court could turn quick relief into long and costly litigation. The Bill already provides for a clear appeal architecture – reconsideration by the Commissioner, then an independent Online Safety Appeal Panel and Committee. This ensures that powers are reviewable, not arbitrary.

Let me make this relatable. Parents worried about cyberbullying, or professionals whose reputations are harmed online, do not care about legal theory. They care about timely, practical action – recognition of their harm and real, enforceable recourse and real accountability. And a system that prevents someone else from being hurt in the same way. This Bill gives them justice that is felt, not just written on paper.

Our goal must be quick relief and effective accountability, not endless second-guessing and legalese. And that is why I do not support these amendments.

Mr Deputy Speaker, "online harm" is not faceless. It is the woman who is a victim of doxxing; the honest small business owner whose reputation is destroyed overnight; the retiree who is targeted and manipulated online, losing his life savings; the woman whose private images become public forever; the young creator attacked for speaking up. But it is also every Singaporean who has ever hesitated before posting something kind, for fear that cruelty will follow.

This Bill protects all of them – and reminds us all of our responsibility. Safety online cannot be legislated into existence; it must be built through culture, enforcement and example.

Some may ask: why does this matter so much for Singapore? Because the Internet is where our children learn, our businesses grow and our communities gather. When people feel safe online, they feel they belong. And when they feel they belong, they can speak, create and care for one another.

Our laws already protect people in physical spaces – at work, at school, in our homes – better than most other places do. This Bill extends the same promise into our digital spaces.

Mr Deputy Speaker, I recently met a teenage girl in Woodlands who was a victim of cyberbullying. When I asked what she wanted most, she told me, "I just want to feel safe when I’m myself online." That, at its heart, is what this Bill is truly about. This Bill does not just respond to online harm. It rebuilds trust in the digital public space. It tells victims: "You are seen". It tells platforms: "You are accountable". It tells society: "We can be both open and safe".

So, I urge all Members today: let us pass this Bill with a united voice and continue refining it, so Singapore remains not only the safest place to live, but the safest place to be yourself, online and offline. Online safety is everyone’s right.

Mr Deputy Speaker: Mr Foo Cexiang.

6.07 pm

Mr Foo Cexiang (Tanjong Pagar): Mr Deputy Speaker, the Internet has become the space where many Singaporeans spend most of our time on any given day. It is where we work, socialise, play, engage services, amongst many other things.

Unfortunately, "online" has also become "where" and "how" malicious actors inflict harm on others. And the harm inflicted in the virtual world has far-reaching and deep implications in the physical world because they tend to permeate the very being of the individual victim and our communities, with potentially long-lasting damage.

A teenage resident came to my Meet-the-People Session recently. She shared with me the daily fear she lives under, being subjected to online harassment from a perpetrator based overseas who has somehow been able to gain access to her private online accounts to blackmail her and even threaten her friends in her social network. She yearns for effective enforcement action that can stop the perpetrator quickly and decisively.

Sir, this landmark Bill could not have come sooner for my resident and many others in Singapore. It reflects the Government’s commitment towards providing swift and decisive recourse to victims of online harms.

This is why I cannot agree with the amendments proposed by the WP to raise the threshold for the Commissioner of the OSC to give directions from "reason to suspect" to "reasonable grounds to believe". As Mr Andre Low says, "this is not mere semantics". The higher threshold proposed by the WP will mean that the Commissioner will need to take more time to put together the "reasonable grounds to believe" before giving directions, during which the victim continues to suffer in silence.

I agree with Mr Andre Low that we need institutional safeguards to ensure fairness for all parties, but I would argue these safeguards have already been provided for in the Bill. Clause 58 provides for the application for reconsideration of the Commissioner's decision. Clause 63 provides for the appeal to the appeals board against the Commissioner's reconsidered decision. And further, clause 69 makes the provision of false information an offence to be dealt with in the courts.

So, what these institutional safeguards mean is that the Commissioner can act swiftly to protect purported victims, while also injecting urgency on the part of the accused to seek clarification, reconsideration or an appeal if they feel that the order was not the right one or was made on incomplete facts. It gives them the urgency to do so and gives the victim justice.

So, given the rapid and exponential impact of online harms, this approach strikes the right balance.

However, this Bill alone is not a silver bullet to the threat of online harms – nothing can be, given its complexity – and we will require enforcement capabilities, as well as local and international partnerships, to implement the Bill effectively. So, I hope you can allow me to raise a few points and clarifications for the Minister.

First, given the severe mental and emotional distress online harms have caused, I would like to ask the Minister whether the prescribed penalties of a fine of up to $20,000 and a jail term not exceeding 12 months for failure to comply with the OSC's direction are adequate to serve as a deterrent to stop the offending actions promptly.

Also, is the Minister able to provide any guidance on how the courts should calibrate the penalties to be meted out? Because as a general principle, Parliamentary intent is key in the application of the law, and we should, as far as possible, make it clear for the purposes of administration of justice.

Next, I touch on the extra-territorial aspect of the offences under consideration. Clause 78 would empower our courts legally to deal with offences under the Bill which are committed outside Singapore, but the courts’ ability to deal with such offences would be contingent on the OSC’s efficacy in detecting the wrongdoing, getting the wrongdoers who are located overseas to comply with the directives and apprehending them if they do not.

Operationally and practically speaking, will the OSC’s arm of the law be long enough to reach the wrongdoers who may be located overseas? Who will be our key regional and international partners? Will the overseas Internet service providers and app distribution services comply with OSC’s order?

Relatedly, does the Bill contemplate online harms perpetrated by foreign actors, perhaps as a means of foreign interference?

Sir, I now move on to the issue of less conventional means of disseminating malicious content. Like the non-digital world, the Internet comprises dark and less overt corners as well.

So, while the Bill provides a comprehensive framework to deal with offences committed through the mainstream channels, are there sufficient and effective safeguards and capabilities against circulation by dark social media, dark traffic or perhaps even the dark web? For the membership or following of certain obscure forums can far outnumber groups on more familiar platforms such as Instagram or TikTok. Members of this House will know that the SG Nasi Lemak Telegram chat group at its peak had more than 44,000 members.

The risk of involvement by organised or syndicated crime groups could further significantly increase the scale of harm. Identity protection tools such as Virtual Private Networks (VPNs) will add a layer of complexity to detection, investigation and enforcement efforts.

To be clear, I am not proposing the policing of private conversations. However, we ought to do our best to eliminate or reduce the gaps where they are.

Sir, my final point is on the resourcing of the OSC. Several Members have made the same point. Has MDDI formulated a projection of the OSC's workload? As a newly established agency, the OSC will have to deal with multiple issues on various fronts besides the core work itself, including administrative matters, logistics, human resources and financing.

How will MDDI ensure that the OSC is endowed with sufficient resources to meet its core mission? Also, has MDDI considered whether the OSC's officers will require enhanced powers, possibly including those under the Criminal Procedure Code, to be more effective in investigation and enforcement?

Sir, the OSC has a heavy and urgent task ahead of it. And in this starting phase, my view is that its focus should be on getting its effectiveness on the ground to stop cases of online harm. This is also why, while I understand WP's proposal for annual reporting to Parliament, I am not supportive of the proposed amendment to this effect at this point, given the administrative load it will add to the OSC in its current phase of development.

Mr Deputy Speaker, notwithstanding my clarifications, I stand in support of the Bill.

6.16 pm

Mr Deputy Speaker: Dr Wan Rizal. Before that, do you have a clarification, Ms Chong?

Ms Eileen Chong Pei Shan: Yes, I do, Mr Deputy Speaker.

Mr Deputy Speaker: Of a speaker before you in this debate?

Ms Eileen Chong Pei Shan: For Mr Foo Cexiang.

Mr Deputy Speaker: Sorry?

Ms Eileen Chong Pei Shan: I have a clarification for Mr Foo Cexiang.

Mr Deputy Speaker: Yes, please proceed.

Ms Eileen Chong Pei Shan: Thank you, Mr Deputy Speaker. Thank you, Mr Foo. I would just like to seek a clarification regarding the Member's point on fair comment, and if the Member can elaborate on what he means.

Because surely, a person cannot be guilty until proven innocent, because anyone can lodge a report to this Commission, and I would urge Mr Foo to refer to Part 4 of the draft Bill. And if the Commission has to assess and take action based on a single person's viewpoint, however valid as it may be, surely it should be based on reasonable grounds to believe that something has actually occurred, rather than just a reasonable suspicion?

Mr Deputy Speaker: Sorry, just to confirm, did you say a person cannot be proven guilty until proven innocent?

Ms Eileen Chong Pei Shan: Mr Deputy Speaker, I meant, surely a person should not be guilty until proven innocent; should not be assumed to be guilty until proven innocent.

Mr Deputy Speaker: I see, I understand. Thank you for that clarification. Mr Foo, would you like to respond?

Mr Foo Cexiang: Yes, I would. I would just like to clarify that I do not think the order prescribed by the Commissioner presumes guilt. There is no judgment on whether somebody is guilty. In fact, that is why the Bill allows for reconsideration of the order and, thereafter, an appeal as well. So, the orders serve that function.

As for the second clarification that the Member made, which was how one would come to the conclusion that there is something to suspect vis-a-vis reasonable grounds to believe, I think that is something the Commissioner will need to make a judgment on, based on the facts available to him or her. For the Commissioner to have reason to suspect – and I do not want to prejudge what the Commissioner's decision will be – there must be information provided by the victim, admittedly, but it will also be based on actions that have been taken by the perpetrator.

Mr Deputy Speaker: Ms Chong, would you like to raise any clarifications arising from Mr Foo's point?

Ms Eileen Chong Pei Shan: Yes, Mr Deputy Speaker. I think to Mr Foo's point about how the Commissioner's assessment will be based on the report that is filed as well as the actions taken by the perpetrator, would Mr Foo not agree that if it is based on a report filed by the victim, then these are alleged actions by a perpetrator, and the Commissioner or the Commission would then need to investigate to ensure that the action has indeed happened and therefore warrants any directions or orders?

Mr Deputy Speaker: Mr Foo, would you like to respond to Ms Chong? Please proceed.

Mr Foo Cexiang: Yes, pardon my haste. I wanted to clarify that, indeed, and this is why I mentioned that there are strong safeguards within the current framework of the Bill, because if it is indeed found that the purported victim had submitted false information to the Commissioner, that matter will be dealt with in the Courts and the punishments are severe. And for that, I refer to clause 69 of the Bill.

Mr Deputy Speaker: Mr Andre Low, do you have a clarification for Mr Foo?

Mr Low Wu Yang Andre: Yes, I do.

Mr Deputy Speaker: Please proceed.

Mr Low Wu Yang Andre: Yes, my clarification for Mr Foo is, essentially, on "reason to suspect" and "reasonable grounds to believe", the two thresholds that we are discussing and which we have proposed in our amendment. These are well-established legal tests: one is a subjective test and one is an objective test. And the reason we have, on balance, decided that the objective test is better is precisely to avoid situations where you have online harms where, basically, no parties are angels – he said, she said, we said, they said. We have all seen scenarios where influencers have cat fights online. We want to avoid a situation where the Bill is weaponised by one party against another and that is why we have proposed that the threshold be set at a higher level.

Mr Deputy Speaker: Is that a clarification of your own volition, or are you seeking a clarification? [Interruption.] Hold on. Is that a clarification of your own volition, or are you seeking a response from the hon Member Mr Foo?

Mr Low Wu Yang Andre: Yes, I am seeking a response from the hon Member Mr Foo.

Mr Deputy Speaker: Mr Foo, would you like to reply?

Mr Foo Cexiang: Thank you, Mr Deputy Speaker, I would indeed like to reply. I think the hon Member brought up the issue or the case of two different groups of bickering online influencers. And I would put it to the hon Member that, in such instances, where the bickering plays out in the public online world, the Commissioner and all the officers in the OSC will have access to this information and they will be able to come to their own judgment on whether or not there are reasonable grounds to suspect online harm.

Mr Deputy Speaker: Would either Mr Foo or Mr Low seek to raise any further clarifications before we move on with the debate to Mr David Hoe? None. Mr David Hoe. Oh, Dr Wan Rizal.

6.23 pm

Dr Wan Rizal (Jalan Besar): Mr Deputy Speaker, I rise in support of the Bill. The Internet is central to how we learn, work and connect. Yet the same space that empowers can also wound. Online harassment, doxxing and impersonation cause not only reputational damage but also emotional and psychological harm.

In my years of working with students, I have seen how online words, or even emojis, can weigh heavily on a young mind. What begins as a passing comment or post can linger for weeks. It shapes how young people see themselves, how they trust others and how they engage with the world.

But we know that online harm affects more than just youths. A 2025 survey by MDDI found that one in three Singaporeans had faced online harm and two in five victims reported serious emotional distress. Behind every statistic is a person whose confidence, rest or peace of mind has been shaken. As digital interaction becomes constant and instantaneous, the pressure to respond, to perform and to stay visible has grown. You can ask many of the MPs here.

For many, that brings new forms of stress and exposure, the kind that erodes rest and confidence quietly but deeply. This Bill, therefore, is not merely about regulating conduct. It is about reinforcing dignity, trust and well-being in our digital lives. It recognises that feeling safe online is part of feeling secure in daily life.

Sir, allow me to touch on three key areas of the Bill – timely relief, accountability and collective protection – and how each strengthens the well-being of our people.

First, through timely relief. The Bill establishes the OSC to give victims a faster and more direct path to help. Each day that harmful content remains online, it deepens harm. So, consider the Singapore Sports School deepfake case in 2024, where manipulated images of students were circulated. Counsellors reported anxiety and social withdrawal among those who were affected. A faster redress system could have reduced that distress and restored a sense of safety to our youths. This reflects a simple principle: we step in early, we support recovery and we prevent harm from deepening.

In the many community dialogues that I have held, parents often share that when their child is targeted online, they do not know where to turn. To them, Police reports feel daunting and platform responses are slow, or may I say, very slow. The OSC will fill that gap with a trusted, accessible pathway for relief.

Second, on accountability. The Bill introduces statutory torts for online harms, such as harassment, stalking, doxxing and impersonation. It defines responsibilities for both perpetrators and for platforms. This is not about fuelling litigation. It is about setting boundaries. When expectations are clear, behaviour improves without constant enforcement. Ultimately, a safer online space grows from shared values from individuals, from families and institutions choosing responsibility over convenience.

I stress, as I did on the last Bill, that platforms must do their part. For too long, victims have waited days for responses while hurtful content spreads unchecked. That is not neutrality. To me, that is neglect. These platforms hold immense influence over public life, and with influence comes duty. This Bill reminds them that their responsibility to people cannot end at their servers' edge.

The MDDI's mystery shopper tests found that more than half of valid reports of serious content were ignored or delayed. This Bill changes that, for platforms will now have a duty of care, not merely a policy preference. And when young people see that harmful actions carry consequences, lessons on digital citizenship gain meaning. It teaches them that civility is not weakness, it is strength.

Third, on collective protection through representation. Sir, the third pillar extends protection beyond individuals to institutions. Authorised bodies, including trade unions and professional associations, can now represent victims.

This is significant because it extends workplace safety into the digital realm. Unions, such as the Healthcare Services Employees Union (HSEU) and the Singapore Teachers' Union (STU), can now act when members face online harassment in the course of duty. Over the years, we have seen so many examples, and I am glad that this Bill is very timely. As a labour MP, I have met educators, healthcare workers and young professionals who faced online criticism that turned very personal. They told me it is not just the comments, but the silence that followed – not knowing who to turn to and whether anyone would stand with them. That sense of isolation is something we must change. And I am glad this Bill enables unions and institutions to act swiftly, because swift support can make the difference between despair and recovery. When unions step in, they offer more than legal aid. They offer reassurance, solidarity and care – support that protects both morale and mental well-being.

In May 2025, the Ministry of Health reported that nearly two-thirds of community-care workers had witnessed or experienced abuse or harassment. Those numbers remind us that online harm is not abstract. It touches people who serve us daily with empathy and commitment. In my community work, I have also heard from residents and parents who say they feel exposed online, sometimes hesitant to speak up for fear that comments or images may be twisted or misused. That quiet fear undermines confidence. And this Bill offers a safety net, from institutional support to clear remedies and assurance that no one has to face such harm alone.

Sir, despite my strong support for this Bill, I have five clarifications for the Ministry to respond to.

Firstly, on service benchmarks and triage. Will the OSC set clear service benchmarks, for example, acknowledgement within hours and resolution within defined days, and include escalation lanes for frontline public officers and workers, such as healthcare workers or educators? A delayed response to these workers who serve us in the frontlines prolongs their distress and can affect their confidence in carrying out their duties.

Second, on parallel duties for platforms. How will the Government ensure that platforms remain first responders even after the Commission is activated, so that they cannot delay or defer action? For workers who engage publicly, like the teachers, nurses and the social workers, harmful posts can spread faster than formal action, and platforms must act quickly to protect those who serve the public.

My third point is on protection of privacy when identity disclosure is ordered. How will personal data be protected even as perpetrators are unmasked? Confidentiality is not about data, it is about psychological safety. Victims are more likely to seek help when they know their identity will be treated discreetly and with care.

My fourth point is on sectoral readiness and guidance. Given that schools, hospitals and community agencies are often the first to encounter online-related distress, will the relevant Ministries consider developing practical guidance to help staff activate these protections and respond with empathy? Such inter-agency readiness, even if outside of this Bill’s scope, will make the OSC's work more effective and ensure that victims receive consistent, trauma-informed support.

My last point is on union and professional-body activation. What will be the process for unions or professional associations to act swiftly on behalf of their members? Unions are often the first line of support when workers face online abuse. Clear guidelines and authorisations allow them to move quickly to offer legal protection, pastoral care and reassurance before harm escalates and I hope the Ministry can consider this. Mr Deputy Speaker, in Malay please.

(In Malay): [Please refer to Vernacular Speech.] Sir, this Bill carries three main purposes. It provides timely relief to victims of online harassment through the establishment of the OSC; it sets clear responsibilities for individuals and platforms so that all parties act more responsibly; and third, it allows unions like NTUC and professional associations to represent members who become victims so they do not stand alone when attacked in cyberspace.

Fundamentally, this Bill does not merely protect digital safety, it also safeguards our mental health and well-being, stability of the family and social cohesion. We want a society that has the courage not just to speak up, but to do so in a civilised manner, one that is active online but also mindful of the impact of words and actions.

Online safety is part of social safety. When citizens feel safe enough to interact politely and respectfully, both in cyberspace and the real world, we will strengthen trust, compassion and the sense of solidarity in our nation.

(In English): Mr Deputy Speaker, this Bill protects more than digital boundaries. It protects peace of mind. It complements our wider efforts in preventive health and mental well-being by reducing one of today’s quiet stressors: online hostility. When a teacher, a nurse or parent knows that the law stands behind them; when a student or a young person feels that their dignity will be defended; when the union can stand with its members in solidarity and assurance, then trust grows. Laws alone cannot guarantee kindness. But laws can make clear what we stand for: respect, responsibility and care. This is how we build a digital society that strengthens, not fractures, our people. Notwithstanding the clarifications I raised, I support the Bill.

Mr Deputy Speaker: Leader of the House.

Debate resumed.

Mr Deputy Speaker: Mr David Hoe.

6.34 pm

Mr David Hoe (Jurong East-Bukit Batok): Mr Deputy Speaker, this Bill is a significant step towards a more accountable digital space. Specifically, it gives victims of online harms new avenues of relief through the Commissioner of Online Safety, imposes clear duties on online service providers, and creates civil rights of action for victims. This Bill is close to my heart, because I have met Singaporeans who suffered from the harms described in Part 3 of the Bill. For them, the humiliation and emotional distress that they have gone through are overwhelming.

This Bill, if implemented well, will go some way to deter perpetrators of online harms and give victims of online harms much clearer routes of recourse. Against this backdrop of my support for this Bill, I would like to seek clarifications on how this Bill works in practice across five areas.

My first clarification is about the operational tempo and service standards of the Commissioner’s office. On operational tempo, the Bill rightly allows victims to seek the Commissioner’s intervention without first going to court. Beyond "being fast", we also need to communicate what is being done fast. This means that efforts should be made to lay out the process as clearly as possible to the general public, so that the victim knows what actions will be taken and what to expect from the moment they decide to make a report.

You see, it makes a difference whether you tell someone that within a certain number of hours they will receive a response, or you tell someone that over the next X number of hours, actions a, b and c will be taken and they will be updated at the end. As part of implementation, the Commissioner should consider issuing advisory guidelines that set out triage principles, communication milestones and indicative response times across various categories of online harms by severity.

On service standards, clause 23 in Part 4 enables victims to make a report of alleged online harmful activity. But while the Bill mentions that it aims to provide a timely means of redress, it does not yet prescribe what counts as a timely response. The point here is this: online harm spreads by hours, not by weeks or months. Hence, I wonder if the Commissioner’s office can consider setting and publicly communicating service standards or target response times for various categories of online harms by severity.

Saying this, I am cognisant that speed also requires resourcing. The Commissioner’s office should be staffed and supported to act with urgency. I would welcome the Ministry’s plan for resourcing so that operational targets are achievable rather than aspirational.

I also think that it is more appropriate to leave service expectations and operational key performance indicators (KPIs) as guidelines as compared to statute because hard timelines in primary legislation like this Bill may risk overwhelming staffing when spikes in online harm cases occur. Hence, the guidelines can be updated swiftly and calibrated to capacity at the agency level, so that the bar can be set at achievable and meaningful levels. To strengthen this approach, the Commissioner should also consider having a standing advisory group of relevant practitioners such as technologists, clinicians, child-safety specialists and academics to refresh workflows, KPIs, and give guidance on online harms periodically.

My second clarification involves accessibility and public readiness. Clause 23 of Part 4 provides that victims can report online harms to the Commissioner, but it is also equally important that they know how to do it. Online harm victims will also include those who are less technologically savvy. It is hence vital that we ensure that the process of online harm reporting is kept simple, multilingual and accessible. This could mean: (a) ensuring that the online and mobile service platforms use plain English, not technical jargon, and official languages to communicate to the public and guide victims of online harms through reporting; (b) proactively providing step-by-step guides via major online and mobile platforms, so that they know how to file an online harm notice; and (c) at the same time, setting up offline helplines and community touchpoints for those who are digitally less-confident users to keep them informed of the work of the Commissioner and the Office.

Clause 94 in Part 12 of the Bill also allows victims to send an online harm notice to an online service provider. This is a positive step towards direct accountability. But for this to be utilised effectively, online services must make sure that reporting is easy to find and use.

In this regard, would the Government, along with the Commissioner and the Office, study how major platforms will implement their online harm reporting in practice and consider whether they are accessible, especially to vulnerable users. For instance, sending an online harm notice within a mobile app or website should never require a user to spend more than 20 minutes navigating multiple landing pages and entering unnecessary inputs that the provider would already have collected when the user registered for an account. You see, these design features are not cosmetic, because they determine whether victims can exercise their rights effectively.

Thirdly, we should also consider the practical aspects of civil proceedings and the cost of justice. The Bill introduces statutory torts, which give victims the right to bring civil proceedings for defined harms such as intimate image abuse, child abuse imagery and so on. However, civil action involves practical hurdles: legal costs, disclosure processes and emotional strain.

I would like to ask the Government to provide more clarity on what the typical steps will be for a victim to bring such proceedings and whether there will be support measures, especially for those who might not have adequate financial resources to do this. The Commissioner and the Office should also consider communicating the availability of these resources from the onset, because at the end of the day, justice should never be determined by income. This upfront assurance will reduce the barriers to entry to report and deter the actions of perpetrators.

My fourth point of clarification is on issues related to damages, humiliation and redress. Part 5 of the Bill on the directions and orders of the Commissioner, as well as Part 13 of the Bill on Damages and Remedies provide various reliefs and remedies to victims of online harms such as removal of content, account restrictions and compensation. However, I would like to seek clarification on whether non-monetary losses could also be considered within this Bill framework. This may go further in helping the victims to rebuild their lives.

Relatedly, public communications and victim-facing materials by the Commissioner and the Office should be trauma-informed and non-stigmatising. They should state plainly that the responsibility lies with the perpetrator and not invite interpretations that the victim's clothing, occupation or social activity caused the harm done to them. This is important because victim-blaming deters reporting, delays help-seeking and compounds humiliation and distress.

My final point: prevention and safety-by-design. Deterrence after harm is necessary, but prevention of online harms upstream is better. This Bill defines serious harms clearly, but we should also work to ensure that major online platforms operating here continue to prioritise user safety and privacy-by-design principles.

These include features such as: (a) age-appropriate design; (b) high-privacy default settings applied for minors; and (c) safer recommender settings for young users. Such upstream measures would ensure that our protections are not only reactive but proactive, making the digital environment safer before harm occurs. Hence, I hope that the Government will actively work with major online and mobile platforms on this.

In conclusion, Mr Deputy Speaker, this Bill gives Singapore a strong framework: clearly defined harms, swift administrative directions and civil routes for redress. But to make online safety real, we must pair law with speed, prevention and accessibility. This is how we build an online environment that is safe, fair and inclusive, while upholding responsible speech and due process. I stand in support of the Bill.

Mr Deputy Speaker: Mr Xie Yao Quan.

6.44 pm

Mr Xie Yao Quan (Jurong Central): Sir, it is worthwhile to take a step back and see where we are as a jurisdiction today in securing the online commons for our people, and where we will be if we pass OSRA today.

It was opportune that we debated and passed the Criminal Law (Miscellaneous Amendments) Act yesterday. That Bill provided enhanced penalties under our criminal laws for many of the same online harms that are the subject of this Bill. Online harms, such as image abuse, circulation of obscene materials, including child abuse materials, AI-generated images, doxxing and so on.

Previously, Parliament had also passed other Acts, like OCHA, to deal with online scams, hate speech and so on. In other words, we have been keeping our criminal laws up to date and fit for purpose, and these updated laws provide our people the assurance of statutory retribution for online harms.

Operating alongside our criminal laws, OSRA, before us today, will provide statutory relief, including injunctive relief and statutory torts and civil remedies for online harms. Taken together, this body of laws, while providing justice to our people experiencing online harms, can also help deter online harmful actors, thus further protecting our people in the online commons.

I will say that on the whole, the Government has done a tremendous job architecting our laws to protect and secure the online commons for our people, in a very thoughtful, thorough, yet calibrated and pragmatic manner.

This is an important thought, because after OSRA is passed today, if it is passed, the focus will increasingly turn towards the enforcement of these laws, the implementation of policy and the painstaking work of building up organisations, capabilities and practice to deliver on the intended legislative and policy outcomes. Implementation is policy, and the focus will turn towards implementation. Minister Josephine alluded to this in her speech as well.

And so, the point I wish to make is this: that in implementation, let us have trust that the Government will go about this implementation in the same thoughtful, thorough, yet calibrated and pragmatic way as it had architected our laws for the online commons.

Being a later speaker in this debate, I have had the privilege to hear many colleagues who spoke earlier. Most called for sufficient resourcing of OSC to handle workload and do its job, and most also called for clear service levels – time taken between reporting and relief – to give Singaporeans certainty and assurance of the relief they can expect when they report to OSC.

I understand where colleagues are coming from. Speed of relief is, indeed, so crucial for victims of online harm. And this Bill, which provides for the establishment of a dedicated agency to deliver such relief to victims of online harm, could not have come soon enough. And so our natural instincts will be to say, "Let us get the relief out to the victims who need it today, who needed it yesterday. Let us get this relief out quickly."

But the reality is that building up an organisation, building up capabilities, especially fast-cycle capabilities, takes time. We cannot rush the building of fast-cycle capabilities, paradoxically. So, I hope all of us, including and, perhaps, starting from this House, can give the Government the time and space needed to build up the OSC properly, so that it can deliver the quick, effective, on-point relief that we, ultimately, are all seeking for victims, present and future.

On workload and resourcing of OSC, my thoughts are as follows. MDDI's survey has found that one-third of respondents experienced online harm in the past year. We have more than three million Singaporeans alone in our population. And so, if the MDDI survey is representative of the general population, we could well be looking at hundreds of thousands of potentially reportable online harms every year.

How might we be able to resource OSC to deal with a potential workload like this? I am going to posit that we may never be able to resource it enough, because, fundamentally, there will be, to a large degree, what we call supply-induced demand for the OSC's services. Meaning, the more services the OSC supplies, the more demand it may actually induce and therefore, demand will likely keep running ahead of supply.

And so, beyond doing our level best to resource OSC to keep up with demand, the central task of implementation, in my mind, is to keep the gap between supply by OSC and demand for OSC tight. This means we must also find a way to manage demand, to manage expectations – manage our expectations – on OSC.

On service levels, report to relief times and having clear guidelines and standards at the outset, especially given the Australians' experience with operationalising their eSafety Commission, I, myself, have thought about this issue a fair bit and I have come to the conclusion that it may be better for us to not impose definitive performance guidelines on OSC at this stage, as we are building it up. Rather, I say, let us give time and space for OSC to establish a baseline, to establish its rhythm and give time and space for supply and demand to find an equilibrium and most importantly, give time and space for all of us to go through the necessary discovery process on relief standards collectively – as a society. And then, we can look to codify these service levels and standards at the right time.

Because as what the Minister alluded to, we have very few precedents to refer to and OSC has to pretty much create its own playbook. So, the best way to set OSC up for success, borrowing the hon Member, Mr Sharael Taha's words, "the best way to set it up for success is to give it the time and space to build itself up properly."

In the meantime, like I said, let us trust that the Government will go about implementing OSC with the same verve and in the same thoughtful, thorough, yet calibrated and pragmatic manner that it has architected our laws to secure the online commons. Sir, I support the Bill.

Mr Deputy Speaker: Mr Ng Shi Xuan.

6.54 pm

Mr Ng Shi Xuan (Sembawang): Deputy Speaker, Sir, I rise in support of the Bill. For the young, or whom we call the digital natives, the Internet is their classroom to the world, where they learn, play and form friendships. Yet, this classroom has few teachers at the door.

I have met parents who tell me their children can mute a chat or delete a post, but they cannot delete the hurt or the humiliation that came with it. Online harms – whether harassment, doxxing or intimate-image abuse – can follow a child home long after the device is switched off.

This Bill gives victims faster, cheaper routes to seek help. It also holds platforms more responsible for keeping their spaces safe. I support this Bill because it recognises two simple truths: that people need protection and platforms need accountability. However, I would like to seek clarifications on the points below.

Sir, I seek clarification on how the statutory torts under OSRA interact with those already in POHA. Both laws allow victims to take civil action, but if their boundaries are unclear, victims may end up filing under multiple laws – OSRA, POHA and common law, all at once.

That could inflate time and cost, which runs counter to OSRA's intent for quick and affordable relief. I hope the Ministry can issue guidelines or practice directions to make these boundaries clear on when OSRA applies and when POHA remains the better channel. Clarity will help claimants, lawyers and judges navigate cases efficiently, reducing unnecessary litigation.

Sir, one key motivation behind this Bill was to protect children and youths from cyberbullying and emotional harm. Yet, the current text does not distinguish between victims who are minors and those who are adults. I hope the OSC will develop tiered response protocols that give priority to victims under 18, especially when harm is ongoing.

If we can respond quickly to a child being targeted online, we can often prevent lasting damage. Online safety should be seen as part of raising happy and healthy children, not separate from it. Our online space must give children room to explore safely – to learn, play and make mistakes, without being preyed upon.

I hope that OSC can also work closely with relevant Government agencies and non-governmental organisations to educate and empower victims to speak up. It is imperative that the protections and recourse available to victims, especially those from vulnerable groups, are easily understandable and accessible. Doing so will help to encourage victims to seek help early and safely.

I have received feedback expressing concerns that victims may run the risk of being exposed to further harm by seeking recourse under the proposed new legislation. To ensure that the new legislation achieves its intended purpose of protecting victims, adequate safeguards should be implemented, including to safeguard the victims' identities and to ensure that the process is fair. In this regard, I am heartened to see that the Bill contains certain safeguards, including clause 54, which imposes secrecy obligations on persons who may receive sensitive information in the course of carrying out their duties; and Part 7, which establishes mechanisms for reconsiderations and appeals.

In addition, I have received feedback that seeks clarity on the types of information that would be considered "private information" for the purposes of clause 11 of the Bill, including sexual orientation, gender identity, sexual diseases or sex-related occupation status. Although the definition of this phrase in the Bill is worded in wide terms and the illustration makes it clear that non-public information relating to medical conditions constitutes private information, it would, perhaps, provide clarity and assurance to potential victims if regulations, which are contemplated under clause 11(2) of the Bill, are swiftly introduced following the passage of the Bill.

I have also received feedback regarding the list of persons who are eligible to make a report to the Commissioner on behalf of a victim under clause 22(2)(a) of the Bill. In particular, there are concerns that there may be victims, who are under the age of 18, who may either be estranged from their parents or guardians, or are fearful of informing them of the online harm that is occurring to them. For example, due to shame associated with the information that is being published about them online, a minor could be fearful of informing his or her parents. Could the Minister clarify the recourse that the minor would have in such a scenario, please?

This Bill builds on lessons from other countries. The UK and Australia have implemented similar online safety laws, and it would be useful to understand whether their experiences led to fewer harms or faster response times. If certain measures have proven effective, we should localise and adapt them early. For example, the Australian Online Safety Act requires platforms to comply with removal notices relating to children within 24 hours. Could the Minister clarify the principles that would guide the Commissioner's imposition of deadlines for compliance with directions issued under our Act, and whether guidelines embodying such principles will be published, please?

Sir, as someone who supports a pro-enterprise environment, I recognise that compliance must also be realistic. Large platforms have legal teams, moderators and automated filters. Smaller and community-based platforms, including start-ups and local networks, may not.

To keep the playing field level, I hope the Commission will provide clear workflows and templates for reporting and response; set service benchmarks on how quickly cases are acknowledged, investigated and resolved; and offer capacity-building support, such as training, reporting templates or open-source moderation tools, for smaller platforms to plug into. Otherwise, we risk a two-speed digital economy where global platforms comply easily while smaller innovators struggle to keep up. Regulation should lift everyone, not leave the smaller players behind.

Sir, the success of this Bill will also depend on how OSC is set up and supported. OSC will carry heavy responsibilities – from handling victim reports to issuing directions to platforms and coordinating with law enforcement. It must, therefore, be adequately resourced, not only with technical and legal expertise, but with experienced officers who understand the social and emotional dimensions of online harm.

To build public confidence, OSC should not only set clear service benchmarks on how quickly cases are acknowledged, investigated and resolved, but also publish them publicly. This is different from annual reporting, as service standards are set out clearly and pre-emptively, rather than post-incident. This is akin to how teachers would set out clear rules in their classrooms before conflicts and bullying arise. The success of OSC should lie in whether parents, children and all users of our online space feel safe even before logging on.

In time, I hope the Commission will also act as an educator and facilitator, promoting digital literacy and empathy across communities.

Deputy Speaker, Sir, OSRA is like a set of classroom rules in our digital school to ensure every Singaporean, especially our young, can learn, share and express themselves safely. Sir, in Mandarin, please.

(In Mandarin): [Please refer to Vernacular Speech.] Mr Deputy Speaker, Sir, I support the Online Safety (Relief and Accountability) Bill.

The Internet is like our digital campus and should be a place where people can communicate, learn and grow with peace of mind. This Bill can help the victims obtain relief more quickly and requires online platforms to take responsibility. However, the Government should also pay attention to the following during implementation.

First, clarify the distinction between this Bill and POHA. Second, prioritise cases of online harm involving minors. Third, ensure that the OSC has sufficient resources and capabilities to assist small- and medium-sized platforms to improve their content control standards.

(In English): Sir, in conclusion, I support this Bill because it protects our people, strengthens accountability and encourages a healthier digital culture. But beyond regulation, it also reminds us of our shared duty to build a safe digital environment for our young.

Mr Deputy Speaker: Dr Choo Pei Ling.

7.04 pm

Dr Choo Pei Ling (Chua Chu Kang): Mr Deputy Speaker, in Mandarin, please.

(In Mandarin): [Please refer to Vernacular Speech.] I support the Online Safety (Relief and Accountability) Bill. The Bill is timely.

The Bill will empower the newly established OSC to take action against users who abuse the Internet to harm others. Victims of online harassment or malicious disclosure of their privacy will receive more timely and comprehensive protection and will be able to seek compensation. The Commission will be able to instruct distributors of harmful content, platform administrators and platform operators to remove harmful content, restrict perpetrator accounts or publish victim responses. These measures will help curb many forms of online misconduct and are crucial for building a responsible and healthy online ecosystem.

The enactment of this Bill is a landmark achievement in shaping a civilised, responsible and trustworthy online environment. I sincerely thank all relevant departments and the public for their efforts and contributions over the past five years. I am gratified and encouraged to see everyone working together to make the online environment safer and more civilised.

The Internet's appeal lies in connecting people and ideas around the world, fostering understanding and solidarity. However, it has also been abused for disseminating pornography and violence, cyberbullying and inciting racial or religious conflict. Many of these misdeeds are carried out anonymously and this trend is escalating.

Currently, the avenues for help available to Internet victims are extremely limited. Beyond the direct psychological and emotional harm, we are also seeing increasing indirect impacts. Many people are beginning to self-censor, even choosing to reduce their Internet use to protect themselves and their families. If we do not address this issue promptly, this online environment will further deteriorate.

I welcome the establishment of the new OSC to enable victims to seek timely remedies. I cheer the Government's decision to enhance information and identity disclosures. Especially important is the introduction of statutory torts. They will provide victims the legal basis to seek accountability.

I understand that enforcement will require significant resources and hence, the OSC will start with five of the 13 types of online harms, focusing on online harassment, doxxing, online stalking, intimate image abuse and image-based child abuse. I support the Ministry's plan to progressively expand coverage. At the same time, I would like to ask, when would the other online harms be progressively included?

I would like to express appreciation on behalf of some of my residents who have shared they have been victims of doxxing. They shared that their addresses, car numbers and details of when they are resting are being shared online, as are videos and photos showing their family members, neighbours and visitors.

I would like to ask the Minister whether the OSC will address such problems in online private chat groups, such as WhatsApp and Telegram. If so, what actions will the OSC take? In addition, POHA provides protection from online harassment as well. How can victims decide whether to seek help from the OSC or under POHA?

With this new legislative framework, new standards will be set for online behaviour and communication. Over time, these guidelines will guide our Internet users. I urge relevant departments to continue raising public awareness of this new framework and its corresponding penalties. Internet abusers must understand that they can no longer easily hide behind seemingly anonymous avatars to commit offences; they will be punished if they do so.

Under the Bill, individuals may be fined up to $20,000 and jailed for up to 12 months, while entities may be fined up to $500,000 for failing to comply with the OSC's orders. I urge the Ministries to consider increasing the caps for fines and jail terms and including caning, to give judges more discretion for particularly egregious crimes. In the UK, the Online Safety Act allows penalties of up to S$31 million.

[Mr Speaker in the Chair]

Finally, I would like to ask about adult victims who refuse to lodge complaints with the OSC due to fear of the perpetrators, for example. Under special circumstances, can someone else lodge a complaint on their behalf, even if the victim does not give consent?

(In English): Mr Speaker, online harm is real, and its impact is deeply personal. This Bill sends a clear message: our digital spaces must be safe, accountable and respectful. Victims deserve swift protection and perpetrators must know – anonymity is not immunity.

This new online safety framework enhances protection online and provides an additional option for victims who want quick termination of online harm. We currently have several statutes to deal with harmful online activities, content and interactions – POHA, the Broadcasting Act, POFMA, the Foreign Interference (Countermeasures) Act and OCHA.

I hope that there will be public education and awareness campaigns to help Singaporeans and residents understand their uses; and hotlines or centres where victims can approach for advice and assistance. With this, I would like to conclude with my support for the Bill.

Mr Speaker: Mr Cai Yinzhou.

7.12 pm

Mr Cai Yinzhou (Bishan-Toa Payoh): Mr Speaker, Sir, I rise today in support of the Online Safety Bill.

A recent survey by MDDI found that in the past year, more than four in five Singapore residents encountered harmful online content across platforms like Facebook, YouTube, Instagram, TikTok, X, Reddit, Telegram and WhatsApp. This highlights a clear and urgent need to enhance current mechanisms for seeking help and relief.

The Bill is, therefore, a crucial and timely step to ensure that the safety and civility we value in the physical world, on our streets, extend to our digital lives, a point that Minister Edwin Tong had alluded to as well.

An Institute of Policy Studies survey found that sexual content depicting voyeuristic or intimate images recorded without consent was overwhelmingly perceived as the most severe of online harms. Victims need fast action. The current reality, as an IMDA report revealed, is that major platforms often took five days or more to act appropriately on user reports of harmful content. For a victim whose intimate images have been non-consensually recorded and distributed publicly, five days can be catastrophic.

I am, therefore, concerned about the need for timely relief and clarity of the administrative process for victims of online harm.

The Bill rightly grants the Commissioner wide discretion to carry out its own investigations and empowers it to issue legally enforceable directions to take down content and restrict accounts.

I understand that it will take time and extensive consultations, after the appointment of the Commissioner and other relevant office holders, to formulate the detailed rules governing how the Commissioner may exercise investigative powers. I, therefore, seek clarification on the Commissioner's investigative process, specifically on the following questions:

Will the Commissioner publish advisory guidelines indicating the expected timelines for the investigation and resolution of a complaint, with processes put in place for urgent investigations to take place? Will the perpetrator being investigated be made aware of the investigation, and at what stage? What information will be disclosed to the perpetrator about the complaint made against them?

I understand the Commissioner is already working with SHE in the design of this initiative, including referral to SHECARES@SCWO for counselling and pro bono legal advice.

Will the Commissioner consider engaging other organisations that provide victim care and support, like AWARE's Sexual Assault Care Centre and Project X, to improve trauma-informed processes, joint case management and outreach, so as to encourage more victims, especially those in minority communities, to be assured when seeking help?

For victims under immense stress, clarity on the steps involved in bringing a complaint and the expected time for resolution is vital. With great power comes great responsibility, and the transparency of the process can help reassure victims and encourage help-seeking efforts.

Mr Speaker, Sir, I appreciate how this Bill is a product of extensive consultation. It underwent a month-long public consultation, which received broad public support and directly informed the final Bill. Our citizens have asked for accountability and this Bill delivers.

This legislation strikes a critical balance. It empowers the victim, yet holds the perpetrator accountable, and sets clear expectations for platforms. It is a necessary act of legislative fortitude to protect our citizens in an increasingly complex digital world. Notwithstanding these clarifications, I support the Bill.

Mr Speaker: Ms Lee Hui Ying.

7.16 pm

Ms Lee Hui Ying (Nee Soon): Mr Speaker, Sir, the Online Safety (Relief and Accountability) Bill is one that I, as well as many Singaporeans and my residents, support fully.

With the growing prevalence of cyberbullying and the rise of illegal black markets and harmful online chatrooms that profit from disseminating obscene and dangerous materials, this Bill could not have come at a more timely moment.

It is all the more significant that Singapore will be among the first countries to introduce comprehensive legislation on this issue, a significant milestone that did not happen overnight. It took years to get here; years of heart, hope and hard work by many stakeholders and public officers.

In supporting this Bill, I have four clarifications: two concerning the contents of the Bill and two concerning its enforcement.

Part 12 of the Bill imposes obligations on online service providers to respond to reports against certain harmful content in a timely and adequate manner, with the potential for tortious liability if they fail to do so.

Currently, many major social media platforms, such as Instagram and TikTok, have internal reporting mechanisms and community guidelines, which will need to be aligned with the new legislation. Some of these platforms already provide warning labels for potentially distressing images, for example, content that may contain self-harm, violence or sensitive materials.

However, I note that there may still be grey areas. Certain artistic or educational materials, which are excluded from the definition of obscene materials under clause 9(2), may still be distressing for some users.

In the UK, for example, their Online Safety framework has included protections for what is called "epileptic trolling", content that is designed to trigger seizures and other materials that may trigger individuals with mental health conditions.

To ensure that such Singaporeans do not fall through the gaps, may I ask the Minister whether the Government would consider requiring online service providers to implement warning systems for potentially harmful content?

My second question concerns an issue close to the hearts of many Singaporeans, especially our youths, and that is cyberbullying. This has become an issue of serious public concern in recent years and there is growing anxiety that our youths and young children are being exposed to obscene and harassing behaviours online. It is critical that we create a safe online environment for our young.

So, I would like to ask the Minister whether we are considering introducing strict liability for certain classes of offences under the Bill, to strengthen deterrence and provide stronger protection for the public, especially our young children?

A further concern relates to enforcement, especially in cases where harmful content is disseminated through transnational online platforms. Much of this harmful material is transnational in nature and, even when it is not, the online platforms that host it often are. Some global companies have proactively aligned their internal policies with global legislation on online harms, while others have not been as forthcoming.

Given that Parts 5 and 6 of the Bill will require the cooperation of entities and individuals beyond our borders, I would like to ask the Minister how the Government intends to work with these transnational stakeholders. And if work has already started, what are the current stakeholder sentiments, and what negotiations have been done to ensure that such online platforms, in particular, adopt and abide by the Online Safety Bill?

Finally, Mr Speaker, Sir, this Bill matters deeply to Singaporeans because the harms it addresses are personal and oftentimes, deeply traumatic and can be financially burdensome to some.

Under the current framework, victims must first report harmful content to the relevant service provider and in more serious cases, escalate it to the newly established Online Harms Commission. The Bill also provides for victims to bring civil actions against offenders or platforms in cases of online harm.

As the hon Member Mr David Hoe has also mentioned, many Singaporeans may lack the financial resources to pursue such claims. May I ask if we could consider allowing online harm torts to be heard before the Small Claims Tribunals or alternatively, provide legal or financial assistance for victims seeking redress through civil action?

The technology and social landscape will continue to evolve, and while this Bill marks an important step forward, the work should not stop here.

Ensuring a safe and trusted online space requires a holistic approach, one that looks beyond legislation to how we build a supportive and resilient ecosystem. This includes strengthening community networks and drawing on the support of stakeholders, such as social and religious organisations, as well as deepening public education efforts.

Together, these efforts will foster a culture of responsibility, empathy and care in our digital spaces.

This Bill represents a crucial step forward in safeguarding our digital spaces and ensuring that our laws keep pace with the realities of the online world. I hope this brings safer spaces online, with protection and recourse for all. Mr Speaker, notwithstanding these clarifications, I support the Bill.

Mr Speaker: Ms Elysa Chen.

7.22 pm

Ms Elysa Chen (Bishan-Toa Payoh): Mr Speaker, Sir, I rise in support of the Online Safety (Relief and Accountability) Bill. This legislation is a thoughtful and compassionate response to the growing challenge of online harms, such as cyberbullying, intimate image abuse, doxxing and online stalking.

These are not just breaches of privacy, but attacks on dignity and are harms that leave deep scars on victims, scars that are no less real for being inflicted through screens rather than in person. When people experience online harm, every hour of delay multiplies their suffering. That is why we need to act quickly.

The establishment of the OSC by the first half of 2026 demonstrates the Government's commitment to protecting Singaporeans in our increasingly digital lives. Our citizens deserve an agency that can act swiftly on their behalf, providing the support and relief they need during what can be a highly vulnerable moment. I am deeply grateful to the Minister and the entire Ministry team for their dedication in crafting this important piece of legislation and I hope to explore several queries where additional clarity and detail would help us achieve our common goal of protecting Singaporeans online.

First, on extraterritorial enforcement and international cooperation for cross-border protection. The Bill wisely grants extraterritorial reach, recognising the global nature of online harm. Online harms rarely stop at our shores. A perpetrator in another country can cause immense damage here. I appreciate that the Bill includes important tools, such as access blocking in section 44 and app removal in section 45 and that it establishes calibrated penalties that reflect the seriousness of non-compliance. These provisions show careful thought about practical enforcement.

Like Mr Foo Cexiang, I hope the Minister can share how the Government intends to enforce this law outside Singapore's territory other than access blocking and app removal orders. I note that access blocking can be circumvented with technology like VPNs; apps can be accessed by means other than app stores, for example, through web versions. What are the concrete steps for enforcement of directions and orders against overseas actors? Are there plans to expand the available slate of mechanisms and means of enforcement, or if there are any international cooperation protocols for extraterritorial enforcement? Let us build international partnerships that match the borderless nature of the Internet with equally borderless protection for victims.

Second, many Members have already pointed out the need to adequately resource the Commission. I will not belabour this point. The Commission's promise is clear – to offer swift relief for those facing online harm. The Commission will also seek to handle reports across 13 categories of online harms, with a phased approach implementing five categories by mid-2026 and others following progressively.

Could the Minister share what is the estimated case load expected for the new Commissioner's Office and Appeal Panel, and how resourcing will scale with time?

The victims who turn to the Commission will be in their most vulnerable state. Let us make sure the system they encounter is not just efficient, but also humane, staffed by people who listen, understand and act with compassion.

Third, on appropriate oversight and accountability. Section 64 establishes an Appeal Committee to review the Commissioner's decisions. The Appeal Committee has been granted powers to affirm, revoke or vary decisions and directions, providing an important internal check on decision-making.

Given the significant powers being granted to the OSC, including directions carrying criminal consequences and orders with substantial penalties, I believe that clear accountability mechanisms will strengthen public trust and confidence in the system.

I would like to ask if the decisions of Appeal Committees regarding the Commissioner's reconsidered decisions may be further appealable to the courts. Judicial oversight by direct appeal would ensure accountability, fairness and transparency and thus strengthen the credibility of the OSC. To address concerns of ensuring swift and low-cost resolution for the relevant parties, there can be prescribed short timelines, standard forms and low court fees. This would be a similar approach to that under POFMA.

Fourth, on inter-agency collaboration. I appreciate that section 5(1)(e) requires the Commissioner to collaborate with IMDA and other public agencies, and section 24(4)(d) allows appropriate referrals. This reflects understanding that protecting Singaporeans online requires coordinated effort across Government.

Many situations, including doxxing with intent to harass, non-consensual sharing of intimate images and image-based child abuse, constitute both harm under this Bill and criminal offences requiring Police involvement.

How will the Commissioner's actions and investigations coordinate with possible Police investigations into potential criminal offences arising from the same online activity? Will there be a framework to automatically refer certain types of online harm cases to the Police? When agencies work seamlessly, victims would not need to repeat their trauma at multiple doors, ensuring that the system itself becomes a source of healing, not exhaustion.

Fifth, on potential over-censoring by online service providers (OSPs) and administrators. OSPs and administrators may over-censor content to avoid risks. They may err on the side of caution and remove content upon receiving any purported online harm notice that superficially appears to have some basis, unnecessarily silencing voices. Unlike the Commissioner's decisions, which can be reconsidered or appealed against, their decisions are not statutorily subject to reconsideration or appeal. While there is a statutory tort available to OSPs and administrators against people sending frivolous or false notices, people adversely impacted by their decisions have no statutory recourse. In most cases, there will be an imbalance of power and resources such that those impacted will have little or no means to pursue recourse.

Can we make available to people whose content has been removed by OSPs and administrators statutory torts similar to sections 91 and 94, where OSPs and administrators fail to respond reasonably to online harm notices by unreasonably removing their content? Would the Ministry consider mechanisms for recourse, a form of appeal or review, to ensure that fairness applies not just to victims of online harm, but also to users who are wrongly penalised by overzealous moderation?

I would like to end off this speech by calling for comprehensive online protection for our children and young people.

Today, the Code of Practice for Online Safety covers six designated social media services, while the new Code of Practice for App Distribution Services, effective 31 March 2025, introduces important age assurance measures.

Building on this strong foundation, I hope we might explore opportunities to extend these protections even further. Our young people are creative and curious, exploring online spaces we might not immediately think of as social media. Games like Roblox, Minecraft and Fortnite bring children together but include private messaging functionalities where risks can emerge. Messaging platforms like Discord, WhatsApp and Telegram enable connections but can also expose young users to unwanted contact and harmful content.

Would we consider extending privacy safeguards and age assurance to gaming platforms, messaging apps and other services where young people gather by requiring these platforms to implement similar protective features such as disabling unwanted messages, restricting profile viewing, disabling location sharing, and empowering parents to guide the content their children access?

Furthermore, as our young people increasingly engage with generative AI services and chatbots, I hope we might consider whether age assurance and safety features should extend to these emerging technologies. Children approach AI with trust and curiosity. We should ensure these powerful tools are designed with their safety in mind.

The Broadcasting Act amendments and Codes of Practice represent genuine progress. I hope we can continue this important work, advancing "safety by design" principles across the entire digital ecosystem our children and young people inhabit. This must be coupled with continued public education across all segments of our society, especially parents and young persons, on digital safety, so every child and young person can explore, learn, create and connect safely.

Mr Speaker, Sir, when there is a fire, we do not delay while the flames spread. Online harm spreads in the same way a fire does: fast, invasive and devastating, and every second counts. With this Bill, we are ensuring that when victims cry out, the system answers not with delay, but with decisive, immediate protection.

A law is only as strong as its capacity to be enforced. Let us ensure that this Commission has the teeth, resources, and clarity to deliver justice by providing clarity on how legal pathways work together, guidance that helps everyone understand proportionality, robust international cooperation, adequate resourcing, appropriate oversight, seamless inter-agency coordination, balanced platform responses and comprehensive protection for children. Mr Speaker, Sir, I support the Bill.

Mr Speaker: Minister of State Rahayu Mahzam.

7.32 pm

Ms Rahayu Mahzam: Mr Speaker, I thank Members for their support and interest in this Bill. We are all in agreement that victims should have access to swift and effective relief. Statistics and stories tell us this is what victims primarily want. We also all agree that the implementation of OSRA is key and that the OSC must always be fair and consistent in its decisions, with accountability through due processes. There is consensus that platforms also need to do their part in creating a safer online space for their users.

I am heartened that both sides of the House are aligned in wanting to do more to help victims of online harms.

Where we differ is in some of the details. We thank the WP for proposing the amendments, which we have considered seriously, because we share the common goal of helping victims of online harms.

We think OSRA, as it stands, strikes the right balance between swift protection for vulnerable victims and sufficient accountability through reconsideration and appeal. What matters are the outcomes when the OSC commences operations. We all want the OSC to succeed, and in this vein, I ask Members to give the OSC the time and space to stand up what we all agree are new and novel functions, so that it can progressively build its muscles to help victims.

I acknowledge that Members have questions about the statutory reporting mechanism and how the OSC will carry out its functions, understandably so, given that this is a new agency we are setting up. I will address Members' questions thematically.

First, on the scope of the OSRA Bill and its interaction with other laws. Members have noted that there are several statutes addressing harmful online content and conduct.

Mr Speaker, when faced with an online harm, what most victims want is direct and timely relief. This includes removing the harm as soon as possible. With the OSRA Bill and the set-up of the OSC, we hope that victims will be granted more timely relief with a new channel for support.

Besides timely redress, the OSRA Bill also expands the scope of online harms covered in our current criminal and regulatory regime. From the research and surveys on online harms in Singapore, we know of emerging issues like inauthentic material abuse, also known as "deepfakes" and online instigation of disproportionate harm, or "cancel campaigns". With this, we hope to protect victims not just from existing but future harms.

Therefore, the OSRA Bill is complementary to the existing laws and frameworks. Most importantly, it supports victims in ways they have asked for, putting a stop to the online harm as soon as possible.

Ms Tin Pei Ling, Mr Henry Kwek and Dr Choo Pei Ling have asked how victims can navigate the various laws, and whether they would need to make multiple reports when seeking help for online harms. Mr Gabriel Lam and Ms Elysa Chen raised the need for clear processes and guidance on how victims can approach the various agencies like the Police, IMDA and the OSC.

Mr Speaker, I wish to assure Members that we will adopt a no-wrong-door policy for victims of online harms. The OSC will work closely with other agencies, including the Police, to ensure backend coordination and minimise the need for multiple reports. This means that regardless of which agency a victim first approaches, they will be guided on the appropriate help. I also take Mr David Hoe’s point that the process of online harm reporting should be kept simple and accessible. We will bear this in mind as we refine the OSC’s operational details.

Some Members have also asked about the scope of the harms covered by the OSRA Bill. Ms Tin Pei Ling raised the point that victims' exposure to online harms may be prolonged if we are unclear about what constitutes online harms in the OSRA Bill.

With the OSRA Bill, we sought to clearly capture the key characteristics of each harm. To aid public understanding, explanatory notes and illustrations are included, where appropriate. I would also like to refer Members to Handout 4, which gives further illustrative examples.

While we have given definitions for each of these harms in the OSRA Bill, the OSC will assess each case based on its unique context and facts. I think Members would know that no two types of online harm will be the same. Some flexibility must be accorded to the OSC to act according to the harm concerned.

Ms He Ting Ru suggested that we consider including sexual grooming and the publication of online material that encourages or promotes suicide or acts of self-injury as specified online harms that OSRA covers. We agree with the Member that these are egregious harms, and I have explained why these harms would be better addressed through other legislation in our online safety framework.

Under the Broadcasting Act today, the IMDA would already be able to issue directions to platforms to disable Singapore users' access to egregious content on online communication services, such as social media services and app stores. Such content includes child sexual exploitation material and content advocating or instructing on suicide or self-harm. If such online content is connected to the commission of criminal offences, action can be taken under the Penal Code or Online Criminal Harms Act.

If sexual grooming is displayed through online harassment or stalking conduct, the victim will have recourse to seek directions from the OSC. For victims under 18 years old, a parent or guardian can make a report on their behalf.

On the suggestion to add a "fair comment" exception to online harassment and a public interest exception to the harms of non-consensual disclosure of private information and online instigation of disproportionate harm, I would like to highlight to Members that these factors will be taken into consideration as the Commissioner assesses each case. If we adopted the suggestions and made them exceptions to the online harms, communication that meets the "fair comment" or "public interest" criterion would automatically not be actionable by the OSC.

In comparison, the current formulation allows the OSC to consider a basket of factors when deciding whether to issue directions.

Specifically, clause 27 states that the OSC can consider whether the comment was reasonable, and the circumstances in which the comment was made, in deciding whether the OSC should issue a direction. This allows the OSC to weigh whether something counts as "fair comment" or "public interest", while also considering the features of the post, and not immediately say it cannot act. So, if a member of the public raises a concern in good faith or exposes serious wrongdoing, this will be taken into account as part of the OSC's case assessment. We should not pre-emptively restrict the OSC from considering any one factor.

Leader of the Opposition, Mr Pritam Singh, and Mr Ng Shi Xuan sought clarity on what would be considered private information. Private information refers to information about a person that is not widely available to the public at large, as explained in Handout 4 that Members received earlier. This includes someone's medical history.

Mr Ng has asked whether the harm of non-consensual disclosure of private information would cover information such as sexual orientation and gender identity. We believe, as a matter of principle, that everyone in Singapore should generally be protected against having their private information shared publicly without consent. Hence, if such information was not public before and was disclosed without consent, the person can file a report to the OSC. As we implement this harm in future, the Minister may issue regulations to clarify the types of information that are or are not private information.

Ms Mariam Jaafar highlighted the need for the OSRA Bill to be technology-neutral and future-ready to deal with harms generated by algorithms and AI. I wish to assure the Member that the 13 harms under the OSRA Bill are generally technology-neutral. As shared in Handout 4, if person D uses an application to digitally alter an image of person C fully clothed into a nude image and posts it online without consent, that would be intimate image abuse.

We recognise that online harms are constantly evolving, and new harms may emerge over time, especially with the proliferation of new technologies. The OSRA Bill allows the Minister to prescribe additional types of online activity that are likely to cause harm to persons in Singapore. We will continuously review the online landscape and expand the list of harms, if there is a need to, in future. This will be done judiciously.

For the harm of “inauthentic material abuse”, Ms Cassandra Lee asked for clarity on how "likeness" would be assessed in the context of AI-generated content and whether there needs to be intent to mislead. The OSC will assess, based on objective standards, whether the content is a false or misleading depiction of the victim’s words, actions or conduct; and whether it is realistic enough that a reasonable person would believe that the victim said such words or did such actions or conduct. Such content may cross the threshold of being likely to cause the victim harassment, alarm, distress or humiliation, even if there was no intent to mislead.

Ms Elysa Chen asked about how the harm of "publication of statement harmful to reputation" interacts with defamation under common law. For the harm of "publication of statement harmful to reputation", the only direction that may be issued by the Commissioner is a Right-of-Reply Direction. This enables a victim to put out their reply or their side of the story quickly in order to protect their reputation, which matters especially in the online world where allegations spread with great speed and ease.

We see this as a complementary pathway for the victim, which may be an alternative to, or concurrent with, a defamation suit. A victim may find sufficient relief in a Right-of-Reply Direction, such that they no longer need to resort to a defamation suit. Or, they may still need or want to sue, for example, to seek monetary damages. In either case, putting their reply out helps to limit the damage suffered.

Ms Elysa Chen asked how the OSC would ascertain disproportionality in the case of "online instigation of disproportionate harm" (OIDH) and if there will be guidelines published for what constitutes disproportionate harm. [Please refer to the clarification later in the debate.]

As illustrated in Handout 4, disproportionate harm can take on various forms, such as physical harm. The Bill currently lists non-exhaustive factors that the OSC may consider when assessing the requirement of disproportionality, such as whether the act instigated is or is likely to constitute a criminal offence and the nature and severity of the harm mentioned.

The OIDH framework does not mean that members of the public can no longer call out certain problematic behaviours, or express views on issues of public interest. Even as we reject mob behaviour, we continue to uphold the principle that there must be space for online discussions, within the bounds of civility and respect.

Some Members, including Ms Valerie Lee, Ms Elysa Chen and Ms Eileen Chong, asked about the set-up and composition of the OSC. The appointed Commissioner for Online Safety will be someone of suitable seniority and experience. We will also ensure that the OSC is appropriately staffed with individuals who have the relevant experience and expertise, and a good understanding of our society and online norms, so as to address reports as they come in.

Mr Henry Kwek and Ms He Ting Ru asked about training for OSC officers and if the OSC will be working with community partners to support victims of online harms who need psychological and legal support. OSC officers will be trained in communications and victim management to ensure that each case is handled sensitively. They will also refer victims requiring further support beyond the mitigation of an online harm to community partners.

Mr David Hoe suggested having a standing advisory group of relevant practitioners to provide guidance on online harms periodically. I thank Mr Hoe for the suggestion. Under the OSRA Bill, the Commissioner is empowered to consult with any person that the Commissioner thinks appropriate, for the purposes of performing the Commissioner's functions and duties. This could include the relevant experts and practitioners in the field of online safety.

Ms Elysa Chen, Mr Foo Cexiang, Mr Sharael Taha, Ms Yeo Wan Ling, Mr Xie Yao Quan and Mr David Hoe asked about the expected case load of OSC, and how it will be adequately resourced to provide effective and timely redress. We estimate the initial caseload of OSC to be high, based on various factors, including Australia's eSafety caseload for similar harms, adjusted for Singapore's population. We also took into account Singaporeans' Internet-use practices, such as the time spent online. Depending on the volume of the cases, we will calibrate and reallocate resources, as necessary, to resource and size OSC adequately.

Mr Cai Yinzhou asked about the rules that will govern how the OSC carries out its investigations to determine further action. Each case will be assessed based on the nature and severity of the harm, and investigations will be carried out fairly, working with the relevant agencies to determine the appropriate follow-up actions. The relevant parties will be notified at the appropriate junctures. The OSRA Bill accords the OSC powers to require documents or information, and to examine and secure the attendance of persons for investigations. To Mr Pritam Singh's question, the Commissioner's power would not extend to the seizure of devices.

Overall, the focus will be to ensure that victims are not denied timely relief, even as we accord due process to all involved. Mr Foo Cexiang also proposed expanding the OSC's powers, such as by giving its officers powers under the Criminal Procedure Code. The scope and nature of every law is different; hence, the respective officers are accorded different powers to ensure they are fit for purpose. In considering how the OSC might conduct investigations in various scenarios, such as investigating reports to determine whether to issue a direction, or investigating offences under OSRA, we have accorded the OSC the appropriate powers.

On the suggestion for the Commissioner to submit an annual report to Parliament, the OSC will consider publishing regular reports on its website for public awareness on online harms and the OSC's work. The OSC will require time to assess what and how to put up information that would complement its processes, as it gradually stands up its operations. We are, therefore, taking a more adaptive approach towards the OSC's publication of reports, as opposed to legislating this as a requirement.

The regular reports may include information on aggregated caseloads and anonymised case information, insofar as these do not re-traumatise victims. Members are also welcome to file Parliamentary Questions to request such information, if it is not already published in the public domain.

On the suggestion for the OSRA Bill to require platforms to publish annual reports on, among other things, their measures to enable users to seek recourse from harm and their response times, I would inform Members that under the Code of Practice for Online Safety – Social Media Services, designated social media services (DSMSs) with significant reach or impact in Singapore are already required to submit annual reports to be published on IMDA's website. These reports contain information on the DSMSs' measures to combat harmful and inappropriate content, and metrics such as the number of reports received from users, as well as the DSMSs' response time in acting on these reports. IMDA also publishes an Online Safety Assessment Report. We will continue to drive the point on platforms' accountability through the OSRA Bill.

On the phased implementation of OSC's reporting mechanism, Mr Sharael Taha, Ms Yeo Wan Ling and Dr Choo Pei Ling asked about the timeline for introducing remaining categories of harm. Mr Sharael Taha and Ms Yeo Wan Ling also asked what happens when a victim experiences a form of harm covered only in a later phase.

Mr Speaker, as mentioned in my opening speech, we want to do right by the victims. A phased implementation approach is necessary so that the OSC builds its capabilities in a sustainable and scalable way, focusing the right level of attention on each case, to ensure a positive user journey. It will also manage the OSC's caseload and allow officers to properly develop the necessary guidelines and frameworks, to ensure that the OSC's decisions are consistent and appropriate.

For a start, OSC will focus efforts on addressing the most severe and prevalent harms, such as online harassment and intimate image abuse. The remaining harms will follow progressively. If victims write in to the OSC regarding their experience of a specified online harm that is not covered in the first phase, officers will, nevertheless, guide the victims on the appropriate process. This could be making a report to the platforms, filing a Police report or reaching out to community partners for support.

Ms Cassandra Lee asked about measures to safeguard victims' confidentiality during the reporting process. I assure Members that all victims' reports will be stored securely, and the agency will adopt practices to ensure victims and all other parties involved are treated with sensitivity and care. We have also built in legislative safeguards on the preservation of secrecy. Unauthorised disclosure of information is an offence. Any information published for public education and awareness will be in an anonymised form.

Members, including Dr Choo Pei Ling and Mr Alex Yeo, sought clarifications on who can or cannot file reports to OSC. Dr Wan Rizal asked about the process for unions and professional bodies to file a report on behalf of their members.

Generally, victims have two options: they can submit a report on their own; or authorise another individual or entity to do so on their behalf. Where the victim is under 18, their parent or guardian can file a report on their behalf. To Dr Wan Rizal's question, unions and professional bodies can file reports to OSC if they obtain written authorisation from their member who suffered the specified online harm.

Mr Andre Low and Mr Pritam Singh asked for clarity regarding those with a prescribed connection to Singapore. I mentioned in my opening speech that we intend to prescribe foreigners who stay in Singapore for the long term. For a start, this would include foreign spouses who are in Singapore on a Long-Term Visit Pass. We are still studying the full scope and will share more details in due course.

Mr Ng Shi Xuan and Mr Alex Yeo had suggested that the OSC take differentiated approaches between groups of victims. I thank the Members for their suggestions. For a start, the response time will likely be shorter for more severe harms.

Ms Tin Pei Ling and Ms Yeo Wan Ling asked about the measures to prevent devious characters from submitting vexatious reports or weaponising the OSC reporting mechanism. Dr Choo Pei Ling also asked how the OSC can minimise mischievous reports. The OSC will have mechanisms in place to filter out and dismiss trivial, frivolous or vexatious reports. Those submitting a report will have to explain how the content or conduct is a specified online harmful activity, and will be required to declare that the information that they have provided is true. Submitting false information to the OSC will be an offence. If such complainants persistently make such reports, the OSC will also not consider any further reports from them. This would allow the OSC to focus its time and resources on the real victims who need redress.

On the OSC's assessment of reports, Mr Sharael Taha asked how the threshold for online harassment will be determined; Mr Andre Low asked whether victims who publish identity information of their harasser would be caught under doxxing; Mr Low and Ms Eileen Chong also asked about consistency in the OSC's decisions; and Mr Pritam Singh sought clarifications on the information that the OSC will share at the reconsideration stage.

As drafted in the OSRA Bill, OSC will take an objective approach when assessing reports, and the thresholds for online harms are based on objective standards. While each case will be fact-specific, the "reasonable person" test is a well-established legal standard, including in tort and criminal law. The question will not be whether someone personally feels offended, but whether OSC has reason to suspect that online harmful activity was conducted, before issuing a direction.

Members discussed the thresholds of "reasonable grounds to believe" versus "reason to suspect". I would like to reiterate the points I made in my opening speech. First, that "reason to suspect" is an established legal threshold that applies in other legislation, such as the Online Criminal Harms Act. Second, we have assessed this threshold to be appropriate to meet the intent of ensuring that online harms can be stopped in a timely manner.

As mentioned in my opening speech, the OSC will also publish guidelines that will inform its decisions to ensure consistency and objectivity. These guidelines will inform the public on the factors that the Commissioner will take into account for present and future cases.

Victims and recipients of the OSC's directions will be able to seek the OSC's reconsideration if they disagree with a decision, and appeal to an independent Appeal Committee thereafter.

Ms Cassandra Lee asked about the requirement for victims to file reports to the platforms first before the OSC. We believe the platforms should be the first port of call or, as Dr Wan Rizal put it, "first responders". They have a duty to protect their users, and the statutory torts will make this clear, especially when notified by the users of harms on their service. The OSC will continue to work closely with the platforms to ensure that they comply with directions in a timely manner. Where platforms fail to act on online harms within 24 hours, victims can then file a report to the OSC.

Mr David Hoe, Ms Yeo Wan Ling, Ms Lee Hui Ying and Ms Mariam Jaafar raised broader points on platforms' design and accountability to prevent online harms, beyond OSRA and the OSC. In this regard, we will also continue working with the major platforms to ensure that they strengthen their online safety measures.

That said, we recognise that some harms are simply so serious or urgent that they warrant immediate intervention. Ms Valerie Lee asked for clarification on the categories of online harm that will be prioritised for urgent attention. For a start, victims can file a report directly to the OSC for the following harms: image-based child abuse, intimate image abuse and doxxing. In such cases, speed is of the essence to minimise the harm that may be caused to the victims.

Several Members of Parliament, like Mr Ng Shi Xuan, Dr Wan Rizal, Mr Sharael Taha, Mr Cai Yinzhou and Ms Yeo Wan Ling, have asked or spoken about the OSC's service benchmarks to respond to and resolve cases. Mr David Hoe also suggested laying out the operational process clearly, so victims are aware of the actions to be taken.

We want to ensure that victims get the help they need, as soon as possible. The OSC's response time and corresponding compliance timeline to a direction, will likely be shorter for more severe harms. Practically, some cases may be more easily resolved than others; for example, it may be easier to make out if the harm of intimate image abuse is present, as compared to a report on online harassment. For the latter, the OSC officer will have to go through the details more carefully, to understand the nature and the severity of the case.

Ms Valerie Lee asked whether the Commissioner will be required to conduct a preliminary assessment before issuing directions. The answer is yes. OSC will assess if it has reason to suspect that online harmful activity was conducted before issuing a direction.

To recap, the Commissioner will be empowered to issue directions to stop online harms from continuing to occur or to prevent further online harms from affecting the victim, where there is reason to suspect that the online harm was conducted in respect of the victim or the victim group. These directions will be used judiciously.

Mr Andre Low raised a few clarifications on specific directions. First, on Engagement Reduction Directions. Mr Low had asked about the scenarios in which Engagement Reduction Directions should be considered. Mr Speaker, I refer the Member to Handout 4, where we had illustrated a case in which an Engagement Reduction Direction could be considered for an online harassment case, to temporarily limit end users' ability to make posts mentioning a victim's name, thereby reducing engagement with insulting posts regarding the victim. The intent of such a direction would be to stem the virality of online harms, which can spread quickly through similar posts.

Second, Mr Low also asked about potential overreach when the OSC issues a Stop Communication (Class of Material) Direction. This direction protects victims from floods of harmful content sharing common identifiers, such as in a coordinated harassment campaign. Rather than requiring the Commissioner to identify each harmful post individually, the direction covers all such content, in proportion to the scale of the harm occurring.

Mr Ng Shi Xuan raised the need to keep the playing field level for smaller platforms. We recognise that we must take a pragmatic approach when engaging the platforms on compliance with directions. This is why we intend to engage online platforms with significant reach or impact first, noting that the harms on such platforms have the potential to travel further and faster. Such platforms would also have the technical capabilities to implement more complex directions.

Mr Sharael Taha asked what qualifies as an online service provider and whether the term extends to closed or encrypted platforms, such as WhatsApp, Telegram and WeChat. He and Dr Choo Pei Ling also asked how the Bill would deal with cases arising from private or semi-private chat groups. Under the OSRA Bill, online service providers would include social media services and online messaging services, such as WhatsApp, Telegram and WeChat.

While we have no intention to proactively intervene in private communications, I believe Members know how quickly something harmful can spread via private messages or in online chat groups. The OSC’s reporting mechanism is victim-led. For all harms except incitement of enmity, if a victim files a report to the OSC and brings to its attention specified online harms committed via private communications, the OSC can then assess the case.

I refer Members to the example case given for intimate image abuse in Handout 4. Here, B posts an intimate image of A in an online chat group, without A’s consent. In such a case, the OSC can issue a Stop Communication Direction to the administrator of the group to remove B’s post. If the chat group continues to circulate A’s intimate image on subsequent occasions, or circulates intimate images of other persons, the OSC, if made aware, could also consider an Access Disabling Direction to the platform on whose service the chat group is maintained. This will disable access to the chat group for Singapore users.

Next, Members sought clarifications on the OSC's reconsideration and appeal processes. Mr Zhulkarnain urged caution against allowing extensive appellate processes that risk re-traumatising victims. We are still working through the implementation of the Appeal Committee, including its powers to affirm, revoke, vary or substitute a reconsidered decision by the OSC. But I want to assure the public that each case will be given the necessary time and due consideration. Ms Elysa Chen would also be assured to know that it is possible for persons who are still dissatisfied with the decision of the Appeal Committee to seek to challenge it in the Courts by way of judicial review. We will provide more details on the appeal process at a later stage.

Ms Elysa Chen and Mr Foo Cexiang asked how the OSC will take enforcement action against bad actors who are overseas, noting the added complexity that technology like VPNs brings. Ms Lee Hui Ying and Mr Henry Kwek also asked how we would ensure that transnational or cross-border online platforms comply with OSRA directions.

As mentioned, the OSC's directions apply to all communicators, administrators or platforms, regardless of where they are based. This applies as long as the harmful content is communicated in Singapore or accessible by users in Singapore. Where Access Disabling Directions are issued to overseas platforms, the platform will be required to geo-block the content to prevent Singapore end-users from accessing it. Non-compliance with OSC directions is an offence. Further, in such cases of non-compliance, the OSC is empowered to take escalatory measures to require providers of Internet access services to block access to non-compliant online services and online locations or to require providers of app distribution services to remove non-compliant apps from their app stores. The issuance of these escalatory measures will be carefully considered and used judiciously.

Lastly, the use of VPNs to circumvent restrictions and detection affects all Internet laws and regulations, including the OSRA Bill. We will have to take a practical approach and do our best to deal with such circumvention within the bounds of our law.

Mr Foo Cexiang asked whether the penalties prescribed under OSRA are adequate. Dr Choo Pei Ling also suggested increasing the maximum penalties.

The Bill sets out maximum penalties for non-compliance with the OSC's directions and orders. These were determined with reference to similar offences already on the books, such as those under the Online Criminal Harms Act, the Foreign Interference (Countermeasures) Act, the Protection from Online Falsehoods and Manipulation Act and the Broadcasting Act, because these are statutes that provide for directions similar to the OSC's. You will see that while there are similarities, the penalties are not exactly the same, because we have calibrated them against the different purposes of these laws, and the gravity of the situations that would usually give rise to non-compliance.

Mr Foo also asked if guidance could be given on how the Courts should sentence. Sentencing is a multifactorial exercise and highly context-specific. It is a matter for the Courts to decide in applying general sentencing principles, with the benefit of submissions by the prosecution and the defence. The Court would generally treat the maximum penalty in legislation as representing the appropriate sentence for the most serious form of the offence and calibrate accordingly for the case before it. This can be adjusted in view of aggravating factors, such as defiant conduct, and mitigating factors, like being a first-time offender.

Ms Mariam Jaafar suggested incorporating digital literacy or restorative programmes, especially for younger offenders. Ms He Ting Ru also asked whether perpetrators could be issued counselling orders to stop them from reoffending.

We agree with the Members that, in some cases, rehabilitation may be the more appropriate action than prosecution. As shared in my opening speech, the Commissioner may put in place an Online Harmful Activity Remedial Initiative, which consists of the completion of volunteer programmes. This may be taken into account when deciding whether to prosecute a person for non-compliance with the OSC's directions.

Mr Sharael Taha, Mr Cai Yinzhou and Mr Henry Kwek suggested ways that the OSC can work with partners in the public and people sector to strengthen victim care and support. Dr Wan Rizal asked whether practical guidelines will be issued to schools, hospitals and care agencies on how to activate the OSC’s remedies. I thank Members for their suggestions. Better support for victims of online harms is at the heart of what this Bill serves to do.

Firstly, the OSC will work with other Government agencies and community organisations to ensure that victims can access the OSC's reporting mechanism. Secondly, we recognise that OSC remedies may not meet all the needs of victims. Should they require further support as they cope with an incident, the OSC will refer cases to its appropriate partners. One such partner is SHE, which partners the Singapore Council of Women's Organisations (SCWO) to run SHECARES@SCWO, an online harms support centre that provides counselling services and, where needed, legal assistance. The centre is supported by full-time counsellors and volunteers, and offers pro bono legal clinics with the support of Pro Bono SG.

Mr Gabriel Lam, Ms Valerie Lee, Ms Tin Pei Ling and Ms Eileen Chong have highlighted the importance of public education in the promotion of responsible online behaviour. Mr Pritam Singh also asked about outreach and preventive education programmes.

Indeed, legislation is not a silver bullet. As Ms Tin Pei Ling said, social norms and education matter. Beyond legislative measures, we will improve public education and outreach to make online safety resources more accessible, practical and action oriented. In collaboration with community, industry and corporate partners, more ground-up initiatives, such as workshops and webinars, will be organised to foster healthy digital habits and raise awareness of key online safety and digital well-being issues. These efforts will equip Singaporeans with practical skills and knowledge to navigate the online space safely and responsibly, and empower them to support others in their communities.

Mr Speaker, the Bill before us today seeks to protect victims from online harms. Let me conclude by leaving Members with three key takeaways.

Firstly, the OSRA Bill's victim-centric approach in offering relief from online harms plugs a gap in the online safety landscape. It complements our existing measures to address various harmful online content.

Second, the OSC will provide timely redress for victims of online harms. This will be done in a calibrated manner, starting with the most severe and prevalent harms.

Third, the OSRA Bill will strike a balance between protecting victims from online harms, while preserving space for healthy discourse.

The road ahead will bring its share of challenges, but also opportunities to make a real difference. I seek Members' support for the office of the Commissioner of Online Safety, and your understanding as we refine the details of OSC’s implementation. We are committed to doing right by victims of online harms and to do so in good time, with care.

I urge all Members to support this Bill. Together, we can foster positive norms of online behaviour, such that Singaporeans can go online safely and confidently. [Applause.]

Mr Speaker: Minister Edwin Tong.

8.10 pm

Mr Edwin Tong Chun Fai: Mr Speaker, thank you. I also thank all Members who have spoken in support of the Bill and for their thoughtful contributions to this very important debate. I am very heartened to see that actually we are on very common ground, not just in our thoughts and in the way in which we want to advance the Bill and the work of the OSC, but also in our views and in our values on what this means to society.

And, ultimately, after almost eight hours, I think we are not really disagreeing on much. I found the speeches on both sides of the House thoughtful, with good suggestions, and even if we are not able to take them on board today, I think they will be relevant for future iterations of this Bill.

Members have raised a number of questions, in fact, many questions, and I thought I would take them thematically, as Minister of State Rahayu Mahzam has done. But I beg your indulgence that I am not going to be able to respond to every Member’s specific questions, nor, I suspect, would you want me to.

But let me quickly recap the fundamentals of what this Bill seeks to achieve to set it in perspective.

The OSC is set up as an independent agency to provide quick relief and can grant directions quickly in respect of platforms, something that, if you tried to achieve directly with the platforms today, you would almost certainly find a huge hurdle to scale.

Second, we provide, as a complement to that, a private remedy framework by way of the statutory torts. The law of torts itself is not new and its jurisprudence is well-established, but allowing victims, as individuals, to avail themselves of that framework is something that is new and novel.

Finally, the End-User Identification framework supports the tortious framework, because without knowing who the perpetrator is, you cannot bring the action. It is important to remember that we do not have any of these tools today. This Bill, therefore, in my view, will create a low-barrier and accessible framework for victims to obtain remedies. In many ways, it has democratised relief for victims.

And as I respond to questions, I would like to say that I hope we can keep it that way because it is a key design feature, as I said at the outset, for this Bill to be speedy, to be efficacious and to allow there to be a low barrier to getting the relief obtained. And I think we would like to keep it that way as far as possible. And I like the way in which Mr Sharael Taha and Mr Xie Yao Quan put it – give it space to grow up, build up and let it take off. And I think that is exactly what we would like to see it do.

Many Members spoke about the Statutory Torts Framework, and I will respond to this as much as I can. The Courts have contended with issues of establishing liability, quantifying damages and enforcing judgments for as long as the law of torts has existed, and it is not a novel area for them to assess breach of duty, proof of damage, and whether the damage is foreseeable or too remote.

But I think the speeches have put the different issues that are at tension with one another into perspective. On one hand, Ms Elysa Chen made a very thoughtful speech and cautioned against over-censorship – that administrators and platforms might then remove content too readily. On the other hand, Mr Henry Kwek said, we should be careful not to allow this to be weaponised. Ms He Ting Ru also said that over-censorship is already happening today. People today are already, in some cases, withdrawing from social media entirely.

So, you can see the tension, pulling in different directions. What this Bill tries to do is to set up a framework that allows individual relief but also one which allows you to go to a third party, in this case, the OSC, for quick relief by a regulator.

On top of that, I want to also emphasise that this Bill is, however, not intended to remove discourse. Take, for example, the Explanatory Statement on Incitement of Enmity. You will see that we have specifically provided there that statements expressing an opinion, even an opinion on the law or on Government policy, are permitted. An opinion alone is not something that would fall foul of the provisions of this Bill.

Neither is expressing a belief, or the view that the practices of a group of persons are inconsistent with another set of beliefs, something that will fall foul of this Bill. It would not. You are entitled to your views, to expressing your opinion and to pursuing discourse as much as you can, as long as you ensure that you do not fall into one of the buckets of online harms.

To do this, we have carefully calibrated the Statutory Torts Framework, and I thought I would just reiterate in my answer to Members' questions on how this framework is designed to work.

First, we are dealing with online harms with a clearly defined threshold. There is a formula that is set out in the Bill and in respect of all online harms, the OSC must make an assessment, and the Statutory Torts Framework must reach that standard. Those are the duties that are set out in this Bill. You must fall below those duties before it becomes actionable. So, there is a threshold that is fairly clear-cut. In fact, many of them, you will find that these standards are not inconsistent with what the platforms self-profess to be their standards as well.

Second, I think Members will know that there is no criminal offence or liability or criminal punishment for the online harm. This is not the design architecture of this Bill. When there is an online harm, OSC determines it. It then makes one of the directions as may be appropriate.

Third, by and large, liability under the Statutory Torts Framework is compensatory. It seeks to compensate a victim, sometimes for loss of earnings, sometimes for distress, but it is compensatory in nature, by and large. And to Mr Henry Kwek's question, the courts will know when a case is frivolous or taken on trivial or unmeritorious grounds. It is well-established. There are cost consequences. There are ways in which we can stop a vexatious litigant.

Fourth, liability for platforms is conditional on them receiving proper notice. We designed that framework because we do not expect the platforms to trawl the Internet or trawl social media to look out for these harms. But once you get a notice, then that duty arises and you then have to act in a manner in which the duty would be commensurate with the remedy.

Mr Pritam Singh made the point about what is reasonable, and I think the hon Member talked about an illustration and asked whether "prompt" means it must be forthwith and so on. We have put the formula as "reasonable" because in some cases, you can simply disable access with a flip of the switch. In that situation, unless you have some other reason why you cannot do that very quickly, we expect the timeframe to be much shorter.

In other situations where you might take a bit more time to comply with the order and some steps might be taken, or you might have to make some enquiries or there are some technological issues or challenges, then the framework that we have for "reasonable" allows the Court to take all that into account. And so, it is not a single standard. We are not able to say it must be X number of days in every case, but "reasonable" in the context of the particular circumstances of that case.

Fifth, we will prescribe the contents and mode of service of these notices. I know many Members have talked about whether it is going to be difficult to fill up, or is it going to be something that we will have to go back and forth with the platforms. The answer is, as far as we can, we will prescribe the contents and the way in which we will do service of these notices so that you have, almost like a fixed framework of information that you will have to provide. And once the platforms receive it, they will have to respond to this.

I know what the hon Member Ms Chen said about how there might be an over-reaction, that platforms might then be a bit more cautious when dealing with online harms. Sir, in many ways, I would say it is not a bad thing that, with these frameworks, there is, for the platforms, the administrators and the communicators, a pause for thought, to think about whether an item crosses the threshold and whether you should be posting it. That is the kind of mindset we hope to build: for everyone to be a little more careful, not to self-censor in the way I have described. All opinions are welcome. Discourse is welcome. But have a care, and think one more time about whether something passes muster for online harms. And if they have to take a second look and thereby improve protections for users and for potential victims, then I think we have achieved something.

Mr David Hoe, Mr Sharael Taha, Ms Lee Hui Ying and Ms He Ting Ru observed that some claimants might lack the means to pursue a claim. Let me put it in perspective. We have introduced this as an additional avenue for victims to pursue a claim. Previously, whatever resources you might have had, it was not possible. But today, you have an additional framework.

In the context of how we discuss and enhance access to justice, there will always be people at the margins who may not be well-resourced enough to pursue a claim. In those cases, my Ministry, MinLaw, will continue to work on Access to Justice principles to ensure that those who deserve assistance receive it, whether through Legal Aid with the usual means and merits tests, Pro Bono SG, the Community Law Centres, or various other legal clinics and pro bono schemes around Singapore. We will continue to support these schemes and ensure that those at the end of the spectrum, who do not have the resources to pursue their own claims, will not be left by the wayside.

As an aside, that is precisely why we also created the OSC framework: it is simple, really cost-efficient and fast. You do not have to pursue a claim in Court to get the immediate relief of a takedown, a right-of-response or any other of the measures that the OSC can prescribe.

I note that Ms He Ting Ru talked about the prospect of engaging in court proceedings will be daunting for victims and asked what simplified processes could be introduced. In a similar way, Ms Lee Hui Ying made the same point and talked about the Small Claims Tribunal.

I am not currently persuaded that the Small Claims Tribunal is set up to investigate and hear tortious claims like this. But the points are taken, and we will see whether there is a way to introduce a simplified framework and process that can handle these cases, particularly the simpler ones, much more quickly and cost-efficiently.

Mr David Hoe suggested that it should be easy for victims to send "online harm notices" to platforms and to avoid unnecessary information. I covered this earlier. I will just make one other point. We will also be working with the platforms to ensure that the process is straightforward and accessible, and we will be designing this process with them in mind. They will tell us what it is that they need to identify the harm appropriately, and we will work with them to design this into the information that will be needed to trigger the process with the platforms.

Mr David Hoe and, I think, Ms Cassandra Lee asked whether non-monetary damages or losses can be considered under the Bill's Statutory Torts Framework. The short answer is yes. Claimants are not limited to monetary compensation. Indeed, many of the harms contemplated in this Bill, whether harassment or intimate image abuse, lead to distress and humiliation, and the Bill does empower the court to grant damages as it thinks just and equitable. Of course, the usual tortious principles, as I outlined earlier, will have to apply: is it foreseeable? Is it too remote? And so on.

Ms Cassandra Lee asked if the identifying information of victims who commence civil proceedings can be automatically redacted. I would say that, whilst redaction is appropriate in some cases, there will also be many where it would not be. And I think you have to put it in context. You are now having to face a civil suit, and you have to understand and know who is bringing the claim. So, in most cases, redaction would not be appropriate. Not all, but most cases. And since an assessment has to be made as to whether it is appropriate, an automatic redaction upfront would not be suitable for this scheme.

Mr Ng Shi Xuan and Ms Tin Pei Ling asked for clarification on how the Statutory Torts Framework in OSRA might interact with that in POHA. Let me quickly explain. Victims of harassment, doxxing and stalking should continue to sue the communicator or bring action under POHA. And I made the point in my opening speech that we will provide for remedies against the administrator and the platforms for these harms under OSRA. We did consider, but ultimately decided not, to subsume the POHA statutory torts under OSRA. Harassment, doxxing and stalking under POHA all have offline dimensions to them as well, which continue to call for protection, and it is therefore necessary to preserve this. OSRA deals with online harms, so we thought it would be more expedient, and provide better coverage, not to subsume them under OSRA.

Mr Ng also suggested that guidance be issued to help parties understand the interaction. We will do so. I noted what the hon Member said about having a healthy online culture, and I think that is what we would like to promote.

The OSC and the Courts provide complementary avenues for victims to obtain protection from online harms. Some Members asked what happens if there are inconsistencies. In most cases, victims will want to use the OSC's directions in the first instance. As I said, it is quick, fast and swift relief. Many might consider going further. For example, you might need to seek an injunction from the Courts. Like the OSC's directions, injunctions are also intended to be quick and can protect the victim. But a tort claim in Court seeks to impose civil liability on the other party. For these reasons, the Courts are better placed to address such cases, which sometimes need a more complex evaluation of the evidential positions on both sides, as well as consideration of the legal position and the arguments of the parties.

Mr Henry Kwek asked what happens if the OSC and the Courts take a different view on whether an online harm occurred. The decisions of the OSC and the Court will not bind each other. The former is an administrative decision by an agency, and the latter is a matter of law decided by the Judiciary. But the OSC may take the Court's decisions, and in time the body of case law and jurisprudence, into account. There is nothing to stop the OSC from having regard to the jurisprudence developed over time as a result of this framework. The OSC is also empowered to revisit past decisions and to vary or cancel them to the extent relevant or appropriate.

Mr Andre Low asked whether the OSC would be bound by the Minister's clarifications made in the House today. Sir, it is usual that Parliamentary intention, as discerned from the debate that we have engaged in and as recorded in Hansard, can and should be taken into account when interpreting the Bill, and not just by the OSC, but also by the Courts at a subsequent juncture when construing a particular provision of this Bill.

Ms Valerie Lee asked whether the Right-of-Reply Directions which the OSC may issue are intended to complement civil remedies for defamation. The short answer is yes, because they serve different purposes. Currently, a successful claim in defamation provides compensation for reputational loss. But as I said at the outset, not all victims want financial compensation. They do not want to go to court just for monetary compensation, and we have learned this from many consultations with stakeholders.

Instead, many of them want to set the record straight as quickly as possible before their reputation is further harmed, and that is where the Right-of-Reply mechanism meets this need. Victims can choose to pursue either or both remedies, and I would say that it is not a prerequisite to seek a Right-of-Reply Direction before commencing court proceedings for defamation. But the Court may consider the claimant's duty to mitigate loss, and if an avenue for a right of reply is available and you, without good reason, do not invoke it, that might be taken into account as part of the claimant's duty to mitigate.

Several Members asked questions about the balance between the desire to ensure accountability of wrongdoers and the need to safeguard personal information. This is under the End-User Information. Let me explain the framework in this context.

In my opening speech, I covered the safeguards to ensure that information disclosed is not misused. The OSC may impose any condition in disclosing the information, including limiting the use of the information to the approved purpose. Further, any misuse of information may be an actionable wrong in itself.

So, for example, if the victim uses the information to dox the perpetrator, that may be an offence under POHA. It may also be an offence under OSRA. These safeguards would protect the information disclosed and prevent misuse. The foregoing answers the questions of Dr Wan Rizal, Mr Henry Kwek and Ms Mariam Jaafar.

Mr Henry Kwek also asked about the OSC's directions under the End-User Identification measures and how they interact with the PDPA. The short point is that the obligations imposed by the PDPA do not prevent platforms from complying with the OSC's directions.

Ms Eileen Chong also asked, I think, similar questions on the End-User Identification. Apart from what I have just said, I would like to refer to clause 53, which allows conditions by the OSC to be set in the course of providing information. I know the hon Member Ms Chong said that these proposals have, and I quote the Member, "real value", and I thank her for that; but that giving information to the victim is a "one-way door". That is also correct. But likewise, posting an online harm to the world, is also a "one-way door".

And so, the question for us, as with most of the issues in this Bill, is really to grapple with the right balance to be struck. On the one hand, you have information or a post to the world that is harmful. On the other hand, you are getting information that might be subject to conditions, with consequences, some of which are penal in nature, if you misuse it. So, we believe that overall this provides the right balance: it redresses the victim's harm and provides an avenue to pursue the action (without the identity, you cannot pursue the action), while ensuring that the information is not used in a collateral way. I hope that answers Ms Chong's question.

Mr Henry Kwek noted that some anonymous perpetrators may abandon their account when a platform subsequently, after finding out that there is an online harm, attempts to collect their identity through the Collection Notice. Sir, that is possible. In such cases, the reality is that their information will then not be collected and both OSC and victims will, then, not be able to identify the perpetrator.

But we did consider what the alternative would be. The alternative would be to require all platforms to collect all information upfront, regardless of whether anyone has committed an online harm. As I said at the outset, the vast majority of users will not fall into that category. So, it would inconvenience users, in some cases add an additional hurdle, and also place a burden on platforms. Taking into account that balance, as I said earlier, we decided to require platforms to do so only once we have established the online harm. We know that, in doing so, there is a real risk that in some cases perpetrators will simply abandon their accounts. But at the very least, with the provisions under this Bill, the OSC will be able to take action to stop the harm from continuing online.

Sir, I believe I have covered most of the points. I just want to address Mr Low's point about clause 4. The general position in law is that the Government is not bound by legislation unless it expressly provides for that. Under clause 4, Mr Low will be aware that the Government will neither be bound by this legislation nor be able to avail itself of it. So, on that basis, we think it is appropriate not to include the Government either way, for the reasons we set out in the Explanatory Statement. I hope that answers his question.

Sir, I think that is really all the questions and, as I said, I apologise if I have not been able to go into the specifics but I think what I have said on both End-User Information and the Statutory Tort Framework elucidates our thinking behind those two measures in this Bill, and we ask for Members' support because, like Minister of State Rahayu, I believe this will be a game-changer in the online space, not just in what we do online but it will translate into how we interact with one another offline as well.

Mr Speaker: Minister of State Rahayu.

8.32 pm

Ms Rahayu Mahzam: Mr Speaker, I would like to make a clarification regarding my speech earlier. I had made reference to Ms Elysa Chen's speech; and I note that she actually did not ask about how defamation interacts with the harm of publication of statements harmful to reputation and how the OSC would ascertain disproportionality of OIDH. Instead, my reply actually addresses the general point on how victims can put out a reply quickly to protect their reputation and how it interacts with other laws, and I shared further on the threshold for OIDH.

Mr Speaker: Thank you. Can I now invite Minister Josephine Teo?

8.33 pm

Mrs Josephine Teo: Mr Speaker, I thank all Members who have spoken on the Bill, for your thoughtful comments and suggestions. Minister Edwin Tong and Minister of State Rahayu have responded to all of your comments comprehensively. Let me share just a few more observations about the overall thrust of the debate today.

The first observation is that Members agree no one should be beyond accountability when their actions harm others. Ms Mariam Jaafar was most emphatic about it. The stories shared by Members today illustrate the mental toll that victims of online harms experience. They are visceral and emotional, and it is why it is so important to help victims get closure through restitution. It goes to the heart of the OSC's mission to provide timely, effective and accessible relief for victims of online harms.

My second observation, Sir, is that Members recognise the complexity and tensions within the online sphere. Online interactions and the nuances of harmful activities can shift and evolve rapidly. No two cases of online harms will be the same. Every hour that passes by makes a difference to victims. This is why the OSC's work will be very challenging. Mr Xie Yao Quan posits that it will also be voluminous.

The Government has explained how the OSC's processes strike a balance between speed and due process, efficiency and effectiveness. As much as possible, we will rely on clear criteria for action. We will also provide avenues for independent review of decisions so that outcomes are consistent and fair.

Our framework will hold all stakeholders accountable – from platforms to perpetrators – for the safety of users. The OSC will not be undertaking this work alone. We have regularly worked together with our network of partners. Many have been on this journey with us since the beginning. The Bill, itself, has been years in the making. At every stage, local community organisations and advocates provided valuable insights that shaped our approach.

Feedback from industry stakeholders further refined our framework. I am heartened they see Singapore as a determined, yet pragmatic regulator. I also hope they will continue to actively contribute to improving online safety. The OSC will draw upon the extensive experience and expertise of our partners. It is essential to how we navigate the evolving digital landscape. It will also help us improve our practices and stay responsive to victims' needs.

My third observation is that Members agree the OSC must earn the trust of Singaporeans that it is effective and impartial. There are different thresholds for acting on online harmful activities that may occur. The Commissioner and the OSC have the delicate job of assessing reports and issuing directions on a case-by-case basis. We need to consider, for example, that what causes distress to a child may not be distressing to an adult. Mr Sharael Taha, Ms Eileen Chong, Ms Elysa Chen and Mr Alex Yeo have rightly pointed these out.

We may adopt a victim-led approach to filing reports, but the OSC will need clear guidelines, adequate oversight and accountability mechanisms. This is so that all parties, whether victims, alleged perpetrators, platforms or administrators, know that the OSC makes its assessments based on objective criteria.

The OSC's directions should go only so far as to protect victims from any immediate or further harm. It should not be seen to be taking the side of one party over another. This is important, whether or not political personalities are involved. Trust in the OSC's objectivity and fairness will be critical to its success.

Minister of State Rahayu has explained how the OSC will take an objective approach in assessing online harms. These thresholds are not met simply because someone says they feel offended by the content, no matter who that someone is. Clear definitions of each harm have been provided in the Bill, supported by explanatory notes and illustrations. Mr Alex Yeo and Mr Foo Cexiang acknowledged and agreed with this approach.

In time, the OSC will also publish guidelines on factors that it will consider in its decision making. This is important to aid public understanding. The transparency also ensures that the OSC's decisions can be held to clear standards. It is a necessary safeguard for fairness and consistency. I hope these assurances address Ms Chong's concerns on the independence of the Commissioner.

To Mr Pritam Singh's question on the use of Ministerial Directions, I would like to assure him that there is no intent to issue directions to interfere in the day-to-day workings of the Commissioner in specific cases. An example of a type of direction that may be provided for is one on resource prioritisation for the Office of the Commissioner of Online Safety.

At this juncture, let me also thank Ms He Ting Ru for her proposed amendments to the Bill. Over the years, many Members of Parliament have given suggestions on strengthening online safety for Singaporeans. The Government has welcomed them. I spoke about this when I opened the debate.

As with all well-intentioned suggestions, the Government's approach is to review them carefully and rationally. Minister of State Rahayu and Minister Edwin have comprehensively addressed why the amendments filed by Ms He are either already provided for in the Bill or could inadvertently limit the work of the OSC in providing relief for the victims.

For instance, we agree that public interest is an important factor. In fact, the Bill provides that the Commissioner may consider it. But making public interest an overriding exemption could inadvertently provide a shield to bad online actors. Ms Tin Pei Ling underscored this risk. Under the pretext of public interest, these bad actors could continue to undermine victims' recourse.

To be clear, the Government has no disagreement with the intent behind Ms He's amendments. We will also operationalise the law in the same spirit. But as with any law, we must strike a careful balance between being specific to make the law clear and operable, and being too specific as to inadvertently make the law less effective.

MDDI and MinLaw have tried to get this balance right. OSC has both the latitude to act quickly and the duty to act responsibly within clear bounds. In designing the OSC's remit, we have taken every care to scope it precisely, proportionately and as practicably as we can, so that it fulfils its mission. OSC will learn from experience, continually refine its processes and always strive to do better. It will continue to monitor and report on its progress, refining the system openly as needed.

We welcome the Workers' Party's (WP's) shared interest in ensuring that OSC is well-positioned to achieve its objectives. Our end goals are aligned. We all want better support for victims of online harms. I hope Members will agree that our priority is to allow OSC to focus on building its capabilities, clarifying its policies and cultivating its partnerships. It will take time and things may not work perfectly at first but, step by step, we will get it right.

I thank Mr Singh for stating that the WP supports the Bill. If the WP agrees, I hope it will also consider withdrawing its amendments. Mr Speaker, I seek to move.

Mr Speaker: It is time for clarifications. Ms He Ting Ru.

8.43 pm

Ms He Ting Ru: Mr Speaker, the WP still believes that the amendments we tabled would enhance the effectiveness of our laws in working for victims of online harms, while providing careful consideration for the Bill's powers. We disagree that these changes will cause the protection regime to be problematic or toothless.

Nevertheless, I am grateful that we have had the chance to have a debate on these important issues and to the officeholders who have provided substantial clarifications. I think we all agree that the legislation covering online harms is very much something that will need continued review, refinement and development as the scale and nature of online harm shifts alongside changes in technology in our society.

On Minister of State Rahayu's point that the Broadcasting Act covers the two categories of harmful online activity covered by our proposed amendments, we note that the Broadcasting Act does not allow victims to seek swift recourse. As I stated earlier, both these categories were also rated by respondents as the most egregious of harms and, thus, we believe they should be explicitly covered.

Regardless, IMDA should work with the new agency under the Bill and study the types of reports received, emerging trends and effectiveness of the Broadcasting Act. We believe that victims should more appropriately lodge a report to OSC, not IMDA, to seek redress, and for OSC to issue directions and orders fast.

Additionally, the Broadcasting Act is more of a governance framework for entities and does not provide recourse for victims. Also, I believe that the new agency's powers are wider and contain more remedies that will be available to victims. Thus, will the Minister of State clarify how our concerns outlined above will be addressed?

Ms Rahayu Mahzam: Mr Speaker, I thank the Member for her further question.

As I have explained in my opening speech as well as in my response, this Bill is really victim-led, addressing certain specified online harms created as a result of interaction online. In the case of sexual grooming, for example, as explained, if the victim realises that he or she is affected and there are actions that are causing harassment, alarm, distress or humiliation, and there is content for the OSC to act upon, we can act upon it.

So, it is the nature in which you are describing that action, because sexual grooming in itself is a whole transaction and it is a crime that can be dealt with better under the Penal Code. With regard to the materials that are online, again, if the materials are something that is causing harassment, distress, alarm and humiliation to the victim, that person can file to OSC and we will address it.

So, it is really a function of this: if the victim sees particular content that triggers certain characteristics pursuant to this Bill, they can pursue it through the OSC. The other point I would make is that we are taking a no-wrong-door approach with the OSC. So, in the event some of these issues are raised to the OSC, the OSC will assess them and make the appropriate referrals to make sure that the appropriate help is given to the victims.

Mr Speaker: Mr Andre Low.

Mr Low Wu Yang Andre: Mr Speaker, I have a quick clarification, directed to Minister of State Rahayu.

In addressing a point made by hon Member Ms Elysa Chen, I believe Minister of State stated that, "Ms Chen would also be assured to know that it is still possible for persons who are still dissatisfied with the decisions of the Appeal Committee to seek to challenge it in the Courts by way of judicial review. We will provide more details on the process at a later time."

Sir, I just wanted to clarify if Minister of State Rahayu is still referring to the existing statutory power of the Courts to exercise supervisory jurisdiction over executive actions or is this envisaged as a distinct process?

Ms Rahayu Mahzam: I am referring to the existing power of judicial review.

Mr Speaker: Mr Pritam Singh.

Mr Pritam Singh: Mr Speaker, just a quick clarification. I am assuming this will be to Minister Tong.

On one of the clarifications that I had sought, and I thank the Minister for addressing a number of those that I had put out: the point about the Online Criminal Harms Act, sister legislation, and the implementation direction that the Government had issued to Facebook on their measures against scams. I think they had a timeline: first, end of September; then, end of October. I asked that question, actually, not specific to that, but to have a better understanding of how the platforms will respond to the Bill the Government seeks to pass today.

As an example, Handout 2 states that a victim will essentially go to the platform in the first instance, except in a few circumstances, namely intimate image abuse, doxxing and image-based child abuse, where the victim goes straight to the OSC.

In the first case, I am trying to understand, from the victim's perspective, how quickly we can expect the platforms to respond to victims of online harm. I think this aspect of it is really the substance of that particular clarification, where I sought to understand how promptly and to what extent the social media platforms would be able to comply.

Mr Edwin Tong Chun Fai: Thank you, Sir. I understand where the Leader of the Opposition is coming from.

I will not provide information on the first question because that is not the subject of OSRA. But I think on your more general point, the question really is this – two points.

First, as I mentioned, we did have extensive consultations, including with the platforms. The idea here is obviously we do not want to devise a framework in a scheme that will be unworkable and inoperable, practically speaking, so we have been consulting with them.

I mean, if truth be told, you start from a blank canvas and you ask the platforms, would they want to see a Bill like this? The answer probably is, no, if they had a choice. But we were quite determined to ensure that there would be a Bill of this nature, and a Bill of this nature would not have the teeth if it did not involve the platforms and have powers over the platforms, and I think they quickly understood that. So, we have been working with them.

There will be differences of views on what can be done, what cannot be done, but we have navigated the path forward and we will continue to do so.

To the Member's point about how quickly it can be done in this case, I would like to say that all of the provisions work with one another: the OSC framework on the one hand, the Statutory Tort Framework on the other. And bear in mind that, in the Statutory Tort Framework, we have now prescribed, in the case of platforms, administrators and communicators, tortious duties that are triggered from the time they get notice.

So, unlike in the past, where a platform can get notice and – the Member has heard what Minister Josephine said earlier about requests from individuals being ignored, in some cases for five days or more. That would not happen here, because the time runs from when they get the notice. And, thereafter, their conduct as to what is reasonable will be measured by how easy it is to disable the content; if it is very easy, then it should be very quick, as I mentioned earlier. Or, if there is some reason why they cannot do so expeditiously, all that will be taken into account.

But the conduct, they cannot sit on it. The conduct from the time they receive notice will be under scrutiny. And in a way, that Statutory Tort Framework complements the OSC framework in ensuring that the platforms respond, react and do something about the harm that is complained of.

8.52 pm

Mr Speaker: Are there further clarifications for the Ministers? No. Just to be clear, Ms He Ting Ru, you are not withdrawing the amendments, right?

Ms He Ting Ru: No.

Mr Speaker: Alright, thank you.

Question put, and agreed to.

Bill accordingly read a Second time and committed to a Committee of the whole House.

The House immediately resolved itself into a Committee on the Bill. – [Mrs Josephine Teo].

Bill considered in Committee.

[Mr Speaker in the Chair]

The Chairman: Members, this Committee stage will be a rather long one, so please bear with me.

Clauses 1 and 2 inclusive ordered to stand part of the Bill.

Clause 3 –

The Chairman: Clause 3. There is an amendment. Ms He Ting Ru.

Ms He Ting Ru: Mr Chairman, I move the amendment* to clause 3, standing in my name as indicated in the Order Paper Supplement. The reasons for the amendment have been addressed in my speech during the debate on the Second Reading of the Bill.

*The amendment read as follows:

In page 14: after line 12, to insert –

"(n) publication of online material encouraging or promoting suicide or an act of deliberate self-injury;

(o) sexual grooming of any person below 18 years of age or vulnerable adults;"

Consequential Amendment:

In page 14, line 13: to reletter paragraph (n) as paragraph (p).

The Chairman: As the amendment to clause 3 is related and interdependent with new clause A and new clause B, I shall allow the debate on the amendment to clause 3 to range over new clause A and new clause B.

Ms He Ting Ru, is there anything else you would like to add to the debate? You are entitled to speak. Not that I am asking you to speak, but you are entitled to speak if you wish to make any additional points.

Ms He Ting Ru: No, Sir. Thank you.

Question put, and amendment negatived.

Clause 3 ordered to stand part of the Bill.

Clauses 4 to 8 inclusive ordered to stand part of the Bill.

Clause 9 –

The Chairman: Clause 9. There is an amendment. Ms He Ting Ru.

Ms He Ting Ru: Mr Chairman, I move the amendment* to clause 9, standing in my name as indicated in the Order Paper Supplement. The reasons for the amendment have been addressed in my speech during the debate on the Second Reading of the Bill.

*The amendment read as follows:

In page 24: after line 21, to insert –

"(5) For the purposes of subsection (1), communication of online material is not "online harassment" if it constitutes, or is part of a course of conduct that constitutes, fair comment on a matter of public interest."

Question put, and amendment negatived.

Clause 9 ordered to stand part of the Bill.

Clause 10 ordered to stand part of the Bill.

Clause 11 –

The Chairman: Clause 11. There is an amendment. Ms He Ting Ru.

Ms He Ting Ru: Mr Chairman, I move the amendment* to clause 11, standing in my name as indicated in the Order Paper Supplement. The reasons for the amendment have been addressed in my speech during the debate on the Second Reading of the Bill.

*The amendment read as follows:

In page 25: after line 27, to insert –

"(4) A person's conduct does not constitute "non-consensual disclosure of private information" under subsection (1) if, having regard to all the circumstances, the public interest in the disclosure of the private information outweighs the public interest in maintaining privacy.

(5) Without limiting subsection (4), matters which may constitute a public interest in the disclosure of the information include —

(a) informing the public on a matter of significant public concern;

(b) exposing wrongdoing, corruption, or a serious miscarriage of justice;

(c) the proper administration of government or the conduct of public services;

(d) open justice;

(e) protecting public health and safety;

(f) national security;

(g) the prevention or detection of serious crime or fraud."

Question put, and amendment negatived.

Clause 11 ordered to stand part of the Bill.

Clauses 12 to 18 inclusive ordered to stand part of the Bill.

Clause 19 –

The Chairman: Clause 19. There is an amendment. Ms He Ting Ru.

Ms He Ting Ru: Mr Chairman, I move the amendment* to clause 19, standing in my name as indicated in the Order Paper Supplement. The reasons for the amendment have been addressed in my speech during the debate on the Second Reading of the Bill.

*The amendment read as follows:

In page 37: after line 26, to insert –

"(6) Communication of online material does not constitute "online instigation of disproportionate harm" under subsection (1) if the communication relates to a matter of public interest."

Question put, and amendment negatived.

Clause 19 ordered to stand part of the Bill.

Clauses 20 to 25 inclusive ordered to stand part of the Bill.

Clause 26 –

The Chairman: Clause 26. There is an amendment. Ms He Ting Ru.

Ms He Ting Ru: Mr Chairman, I move the amendment* to clause 26, standing in my name as indicated in the Order Paper Supplement. The reasons for the amendment have been addressed in my speech during the debate on the Second Reading of the Bill.

*The amendment read as follows:

In page 42, line 28: to leave out "reason to suspect" and insert "reasonable grounds to believe".

Question put, and amendment negatived.

Clause 26 ordered to stand part of the Bill.

Clauses 27 to 62 inclusive ordered to stand part of the Bill.

Clause 63 –

The Chairman: Clause 63. There is an amendment. Ms He Ting Ru.

Ms He Ting Ru: Mr Chairman, I move the amendment* to clause 63, standing in my name as indicated in the Order Paper Supplement. The reasons for the amendment have been addressed in my speech during the debate on the Second Reading of the Bill.

*The amendment read as follows:

In page 75: to leave out lines 22 to 25.

Question put, and amendment negatived.

Clause 63 ordered to stand part of the Bill.

Clauses 64 to 111 inclusive ordered to stand part of the Bill.

New Clause A –

The Chairman: New Clause A. Ms He Ting Ru.

Ms He Ting Ru: Mr Chairman, I introduce a new clause entitled "Publication of online material encouraging or promoting suicide or an act of deliberate self-injury."

Brought up, and read the First time.

The Chairman: Ms He.

Ms He Ting Ru: Mr Chairman, I move that the new clause* be read a Second time.

*The new clause read as follows:

“Publication of online material encouraging or promoting suicide or an act of deliberate self-injury

A.— (1) In this Act, "publication of online material encouraging or promoting suicide or an act of deliberate self-injury" means the communication of online material that encourages, promotes, or provides instructions for suicide or an act of deliberate self-injury.

(2) Despite subsection (1), "publication of online material encouraging or promoting suicide or an act of deliberate self-injury" does not include the communication of material that:

(a) has a legitimate purpose related to science, medicine, education or art which a reasonable person would regard as such; and

(b) does not pose an undue risk of harm to any person below 16 years of age.

Explanation. — Material has a legitimate purpose related to an academic enquiry, or as an expression related to art which a reasonable person would regard as art.

Illustration

A is a professor at an educational institution. A conducts a study relating to suicide or self-injury and communicates this online as part of A’s work as a professor. A’s communication has a legitimate purpose related to education.”

Note: It is intended that this new clause A be inserted immediately after clause 21.

Question put, and amendment negatived.

New Clause B –

The Chairman: New Clause B. Ms He Ting Ru.

Ms He Ting Ru: Mr Chairman, I introduce a new clause* entitled "Sexual grooming of any person below 18 years of age or vulnerable adults."

Brought up, and read the First time.

The Chairman: Ms He.

Ms He Ting Ru: Mr Chairman, I move that the new clause* be read a Second time. The reasons for the new clause have been addressed in my speech during the debate on the Second Reading of the Bill.

*The new clause read as follows:

“Sexual grooming of any person below 18 years of age or vulnerable adults

B.— (1) In this Act, "sexual grooming of any person below 18 years of age or vulnerable adults" means the communication of online material by any person of or above 18 years of age (A) with another person (B) on at least one previous occasion —

(a) A intentionally communicates online material with B which encourages, promotes, or provides instructions of sexual communication or sexual activity;

(b) B is below 18 years of age or a vulnerable adult; and

(c) A does not reasonably believe that B is of or above 18 years of age or a vulnerable adult.

(2) For the purposes of subsection (1) —

(a) "Sexual communication" means intentional communication for the purpose of obtaining sexual gratification or of causing another person (B) humiliation, alarm or distress, and the communication is sexual.

(b) "Sexual activity" means —

(i) intentional engagement of an activity for the purpose of obtaining sexual gratification or of causing another person (B) humiliation, alarm or distress, and the activity is sexual; or

(ii) For the purpose of obtaining sexual gratification or of causing another person (B) humiliation, alarm or distress, A intentionally causes B to observe an image or recording which is sexual.

(c) "Vulnerable adult" means an individual who is 18 years of age or older, and by reason of mental or physical infirmity, disability or incapacity, incapable of protecting himself or herself from abuse, neglect or self-neglect.

(3) To avoid doubt, it is not a defence that B consents to the communication of the online material under subsection (1).

(4) Despite subsection (1), "sexual grooming of any person below 18 years of age or vulnerable adults" does not include communication of material that —

(a) has a legitimate purpose related to science, medicine, education or art which a reasonable person would regard as such; and

(b) does not pose an undue risk of harm to any person below 18 years of age.”

Note: It is intended that this new clause B be inserted immediately after new clause A.

Question put, and amendment negatived.

New Clause C –

The Chairman: New Clause C. Ms He Ting Ru.

Ms He Ting Ru: Mr Chairman, I introduce a new clause* entitled "Appeal to General Division of High Court."

Brought up, and read the First time.

The Chairman: Ms He.

Ms He Ting Ru: Mr Chairman, I move that the new clause* be read a Second time. The reasons for the new clause have been addressed in my speech during the debate on the Second Reading of the Bill.

*The new clause read as follows:

"Appeal to General Division of High Court

C.— (1) An appeal against, or with respect to, a decision or direction of an Appeal Committee on an appeal under section 65 lies to the General Division of the High Court.

(2) An appeal under subsection (1) may be made only on one or more of the following grounds:

(a) on a point of law; or

(b) that the online harmful activity did not occur; or

(c) that it is not technically possible to comply with the decision, direction, or order that is the subject of the decision or direction of the Appeal Committee.

(3) In any appeal under this section, the General Division of the High Court may confirm, vary or set aside the decision or direction of the Appeal Committee and make such further or other order as the Court deems fit.

(4) The procedure for an appeal under this section is to be governed by the Rules of Court."

Note: It is intended that this new clause C be inserted immediately after clause 67.

Question put, and amendment negatived.

New Clause D –

The Chairman: New Clause D. Ms He Ting Ru.

Ms He Ting Ru: Mr Chairman, I introduce a new clause entitled "Risk Assessment, Reporting, and Accessibility."

Brought up, and read the First time.

The Chairman: Ms He Ting Ru.

Ms He Ting Ru: Mr Chairman, I move that the new clause* be read a Second time. The reasons for the new clause have been addressed in my speech during the debate on the Second Reading of the Bill.

*The new clause read as follows:

“Risk Assessment, Reporting, and Accessibility

D.—(1) The Commissioner shall, on an annual basis, cause to be prepared and transmitted to Parliament a report, which must state the particular kinds of activities and content present during the preceding financial year, including the following —

(a) consolidated information on the number of reports or complaints received by the Commissioner;

(b) information on the types of complaints and reports from persons affected by online harmful activity as received by the Commissioner;

(c) number of directions and orders issued by the Commissioner, the type of directions and orders issued, the classes of material pursuant to which a direction or order was issued (if any), and the time taken from the date of a report and the issue of directions and orders;

(d) categories of persons or entities who have been issued a direction or order by the Commissioner, which includes an administrator, communicator, online service provider, owner, or prescribed online service provider;

(e) findings by the Commissioner on the risk assessments and trends relating to online harms;

(f) steps the Commissioner has taken, and its processes to address privacy concerns in accordance with the Personal Data Protection Act 2012; and

(g) an assessment of the accessibility of recourse provided by online service providers for vulnerable adults.

(2) For the purposes of subsection (1), "vulnerable adult" means an individual who is 18 years of age or older, and by reason of mental or physical infirmity, disability or incapacity, incapable of protecting himself or herself from abuse, neglect or self-neglect.

(3) In preparing the report under this section, the Commissioner may consult —

(a) persons (including a public agency) to represent the interests of women and children (generally or with particular reference to online safety matters);

(b) persons (including a public agency) to represent the interests of vulnerable adults; and

(c) such other persons as the Commissioner considers appropriate.”

Note: It is intended that this new clause D be inserted immediately after clause 111.

Question put, and amendment negatived.

New Clause E –

The Chairman: New Clause E. Ms He Ting Ru.

Ms He Ting Ru: Mr Chairman, I introduce a new clause E.

Brought up, and read the First time.

The Chairman: Ms He Ting Ru.

Ms He Ting Ru: Mr Chairman, I move that the new clause* be read a Second time. The reasons for the new clause have been addressed in my speech during the debate on the Second Reading of the Bill.

*The new clause read as follows:

“E.— (1) For the purposes of the preparation of the report in section 112, the Commissioner shall have the authority to require online service providers to provide and publish publicly accessible information on an annual basis on —

(a) risk assessments and trends relating to online harms;

(b) privacy of its users;

(c) clear procedures for reporting content and making complaints;

(d) the measures taken or in use to enable users and others to seek recourse on such online platforms for victims; and

(e) the time taken and remedy granted for such online platforms in response to reports and complaints.”

Note: It is intended that this new clause E be inserted immediately after new clause D.

Consequential Amendments:

Amendments to be made to the numbering of clauses, cross-references, and contents page consequent on the addition of any new clause(s).

Question put, and amendment negatived.

The Schedule ordered to stand part of the Bill.

Bill considered in Committee; reported without amendment; read a Third time and passed.

9.07 pm

Mr Speaker: Deputy Leader.