Second Reading Speech of the Online Safety (Relief and Accountability) Bill 2025, Mr Edwin Tong SC, Minister for Law and Second Minister for Home Affairs
5 November 2025
I. Introduction
Mr Speaker, Sir,
1. Today, we take an important step to make our online spaces safer and fairer.
2. The Bill before this House clarifies duties, provides relief, and also strengthens accountability across the online ecosystem to deal with online harms.
3. My colleagues have earlier taken Members through the key aspects of the Bill.
4. I will focus on outlining the new Statutory Torts framework, as well as the End-User Identification measures in this Bill.
II. Stories of victims of online harms
5. Let me start by reiterating why we saw a need for this Bill.
6. The victims of online harms are not just statistics. They may be our children, classmates, colleagues or neighbours. Their confidence, studies, and indeed, their livelihoods can be shaken and seriously affected.
7. For one Primary 4 student, cyberbullying reared its ugly head when she started using Instagram.
8. She based her self-esteem on how many ‘likes’ and ‘followers’ she had. She would ask her friends to ‘like’ her photos or ‘follow’ her account to appear popular. If a photo did not receive over 100 ‘likes’, she would delete it.
9. She began receiving comments about her appearance and hurtful messages from those who were supposedly her friends.
10. Her mental health worsened. She self-harmed, and was later diagnosed with PTSD when she eventually sought help at IMH. She was on medical leave from school for most of the year.
11. Sir, this is a sad story. But what is worse is that it is not an isolated case.
12. With your permission, Mr Speaker, may I ask the Clerks to distribute a handout? Sir, Members may also access these materials through the portal, the MP@SGPARL App. Sir, Handout 6 sets out several other accounts – of students, women, working adults, and their loved ones.
13. Online harms affect not only individuals and their families, but also the confidence and the trust of our society as a whole.
14. In the SHE 2023 Study on Online Harms cited by Minister of State (Health and Digital Development and Information) Rahayu Mahzam a short while ago, 76% of respondents were NOT comfortable expressing their personal views on potentially controversial issues and topics online.
a) Women, youths and minorities are especially vulnerable. 22% of female youths experienced sexual harassment, compared to 14% across all respondents.
b) 52% of respondents aged 15 – 24 years old personally experienced online harms, compared to 38% across all respondents.
c) 14 – 21% of respondents believed they experienced harms because of their identity, such as race and religion.
15. As a result, victims withdraw from public life or choose to stay offline. Their voices – and their representation – are lost. Proper discourse in the community becomes weaker.
16. If this continues, we will have a divided society, and a weakening of our social fabric and collective trust.
III. Why do harms occur online
17. Sir, in contrast, most Singaporeans feel safe on our streets. About 98% of Singaporeans feel safe walking out alone, even at night.
18. In the offline world, we are mindful of how we behave towards one another. We know not to threaten, harass or insult other people. In our everyday interactions, there are social norms we know to abide by in public spaces, in schools and also in workplaces. Such norms allow us to function and thrive as a society.
19. But when it comes to the online world, these norms are not quite observed in the same way. The sense of safety we have offline does not translate into the online space.
a) The SHE 2023 study found that 58% of respondents reported personally experiencing and/or knowing others who faced online harms.
20. We studied why this might be so, and in the course of our study, three issues stood out:
a) First, fragmented standards and weak accountability in the online harms regulation space.
b) Second, economic incentives that unfortunately reward sensational content.
c) Third, online anonymity that emboldens misbehaviour.
21. These three issues have very much shaped the thinking behind the construct and the framework of this Bill, and let me elaborate on them for Members.
22. Thus far, the development of the rules and norms of the Internet has been largely left to the Tech Companies.
23. Left on their own, different Platforms will apply different rules, largely shaped by their own interests. In the absence of any common set of enforceable norms, it is not easy to expect or enforce accountability. Instead, these differing standards will continue to exist and operate in a manner which allows wrongdoers to exploit the gaps.
24. This lack of accountability is exacerbated by the structural features of the Internet. There is a misalignment between Tech Companies’ profit motive and the need and desire to enhance online safety.
25. At the Online Harms Symposium co-organised by MinLaw and the SMU Yong Pung How School of Law in 2023, former Facebook employee turned whistleblower Frances Haugen spoke about how Tech Companies do have solutions to address online harms, but implementing them would eat into their profits and hurt their bottom line.
26. Social media platforms profit from the amplification of sensational and inflammatory content. Viral, harmful content draws engagement, higher engagement draws eyeballs, and so its removal will impact revenue.
27. We have some basis to believe that online safety considerations might well yield to profit generation if left unregulated. Today, Tech Companies have community standards which ostensibly address online harms. Members can see some examples of these standards which are set out in Handout 7. However, most Platforms do not necessarily adhere to their own community standards.
a) Indeed, in the IMDA study cited by Minister for Digital Development and Information Josephine Teo a short while ago, over 50% of legitimate user complaints were not addressed in the first instance.
28. This, we believe, is far from ideal, as victims are reliant on Platforms to stop online harms.
29. At present, there is no framework which can give redress to victims for harms that happen at the speed and extent to which they happen online. While victims may try to seek relief in Court, there is a limit to how fast Court proceedings can move, and they may also be costly. We all know that in these types of cases, speed of redress is crucial. Most victims of such harms do not want to have to seek relief through the Court process.
30. It is also clear that online users behave on the Internet differently from the offline world. Perpetrators are emboldened to act with impunity, especially when they can remain anonymous.
31. First, they experience what we call the “Online Disinhibition Effect”, a term coined by Professor John Suler. Prof Suler explains that anonymous internet users separate their online persona from their in-person identity and they do not feel responsible for their behaviour online.
32. Second, there is little that victims can do to hold anonymous users accountable, simply because they remain anonymous. Perpetrators know that they are unlikely to get caught and do not fear consequences. In general, anonymity fuels bad behaviour.
IV. Statutory torts
A. Existing measures to address online harms
33. To address these issues, the Government has taken proactive steps. Minister Josephine outlined the reforms that protect users. MOS Rahayu explained how the new Online Safety Commission (OSC) will provide timely relief to stop harm. Let me address the remaining mechanisms in this Bill for Members.
34. Sir, we start with the proposition that, in some cases, stopping the harm alone might not be enough. Victims might require additional recourse, for example, compensation.
35. To do that, they will need to know who lies behind that anonymous social media handle.
B. New statutory torts to clearly address online harms
36. The Bill therefore seeks to introduce a framework to close those gaps.
a) It will introduce clear statutory duties for online actors – defining what responsible conduct is, and therefore what standard of duty ought to be expected of them;
b) It provides civil remedies for breaches of those duties – giving victims the right to seek justice in Court; and
c) It creates an additional avenue of relief – complementing the quick administrative recourse that can be obtained through the OSC – so that victims can choose the path that best suits their needs.
37. During the Online Harms Symposium that I referred to a short while ago, Ms Haugen likened internet and social media regulation today to the evolution of road safety, something that happened several decades ago. In the 1960s, US car manufacturers vigorously resisted safety reforms. But informed legislators persisted with greater regulation, coupled with a push, a very determined push, from concerned citizens, as well as investors.
38. Through legislative changes, safety was made a core design principle in the manufacturing of cars, and it has been estimated that between 1960 and 2012, over 600,000 lives were saved in the US as a result of this.
39. In a similar way, not exactly the same, but in a similar way, we hope that this Bill will make online safety a design principle of the online space upfront, and not just an afterthought.
C. Defendants and their duties under the statutory torts
40. Sir, let me now outline how the Statutory Torts framework is intended to work.
41. At the outset, our focus and intent is to empower victims. It is a very victim-centric approach. Today, if you look at the example of the one victim who tried to speak to the platform – and you heard the statistics that Minister Josephine cited earlier – it is very difficult, and victims are pretty much powerless today.
42. At the Government level, we have the Broadcasting Act, the Protection from Online Falsehoods and Manipulation Act, and the Online Criminal Harms Act that empower the Government to act in a variety of ways. OSRA’s provisions will empower private persons to obtain relief, and Members will note that Clause 4 makes it clear that a public agency cannot commence claims under the Statutory Torts framework. This is a private citizen’s private remedy.
43. I will now explain the types of online harms covered by the Statutory Torts.
44. This framework will cover the same categories of harms that the OSC will act on, the same categories that MOS Rahayu took you through earlier, with some exceptions and refinements for coherence.
45. Clauses 83 to 88 cover the following harmful online activities:
a) Intimate Image Abuse;
b) Image-based Child Abuse;
c) Online Impersonation;
d) Inauthentic Material Abuse;
e) Online Instigation of Disproportionate Harm; and
f) Incitement of Violence.
46. Harassment, Doxxing and Stalking will continue to be dealt with under the Protection from Harassment Act (POHA) for Communicators, but the new statutory duties for these harms will extend to Administrators and to Platforms under this Bill, since POHA does not cover them.
47. The online harms omitted from the Statutory Torts – namely, (i) False Material; (ii) Statements Harmful to Reputation; and (iii) Non-Consensual Disclosure of Private Information – are already well covered under the existing legal framework: the laws on defamation, as well as on privacy and confidentiality.
48. This alignment ensures coherence in our legislation – no overlapping, no double remedy.
49. In this Bill, the Statutory Torts will also not cover Incitement of Enmity. We think it is unwise to encourage such matters – which can be potentially explosive, emotive and divisive – to be dealt with litigiously, in a courtroom. And so, it will be dealt with by the OSC.
50. The Statutory Torts will be implemented in phases, in coordination with the OSC, in line with the OSC’s implementation of the harms framework which you heard MOS Rahayu sketch out earlier.
51. Second, Sir, the Bill assigns clear duties to the key actors in the online ecosystem – and they are the Communicators, the Administrators, and the Platforms.
52. Clauses 83 to 88 therefore impose duties on Communicators not to make or share any communication which constitutes an online harm.
53. Clauses 90 and 91 impose two duties on Administrators.
a) First, they must not develop or maintain an online location in a manner that facilitates or permits online harm to take place – with the intention or knowledge that online harm is likely to take place.
(i) This duty covers Administrators who are complicit in the online harms. Members might recall the infamous chat group “SG Nasi Lemak”; this duty will cover the administrators of such a chat group.
b) Second, when notified of harm, they must act reasonably – more specifically, they must take reasonable care to assess if there is harm and if so, to take reasonable steps to address it.
54. Clause 94 therefore imposes a similar duty on Platforms to act reasonably when notified of harm. I want to make clear to Members that the duty to take “reasonable steps” does NOT require the platform to do constant surveillance and monitoring.
55. Liability arises only when an actor fails to act reasonably after receiving proper notice. They are not liable if, through no fault of their own, they did not receive the notice sent by the victim.
56. In assessing reasonableness, the Court will consider the circumstances of the case – including the seriousness as well as the persistence of the harm.
57. Let me illustrate this:
a) An Administrator or Platform that receives notice of a harmful post for the first time may act reasonably by simply removing the post, if that is the appropriate remedy under the framework.
b) But if the same account repeatedly causes harm in the same way, simply taking the post down each time it is put up may no longer suffice.
c) In such cases, taking stronger action – such as suspending or disabling the offending account – may be the reasonable steps required under Clauses 91 and 94.
That is what I meant when I said you assess the entire factual matrix and situation holistically.
58. What is “reasonable” will therefore depend on the facts. The Courts can take into account factors such as the nature of the conduct, the context in which it occurred, and the effect and impact on the victim.
59. For example, putting up a post or creating a website to whistleblow on serious misconduct may well be “reasonable” if done for a legitimate purpose and in a proportionate manner, even if it might cause harassment, or might be considered online instigation.
60. There are also safeguards to address concerns that Administrators and Platforms may be inundated with frivolous notices, or notices with insufficient information. The Bill provides that the particulars which an online harm notice must contain are to be prescribed. We will set these out clearly in a prescribed form, so that the categories of information are known upfront. This ensures that only genuine, properly documented cases trigger the duty to act, and that Administrators and Platforms have enough information to take “reasonable steps”.
61. Ultimately, it is the Court that will look at the facts of each case, weigh the totality of the evidence and decide whether a claim is made out, and if so, what remedy should follow and against which party. These are all fact-sensitive judgments that reflect the diverse realities of online interaction.
62. The Bill therefore avoids the use of fixed or rigid formulas, to allow the Court to develop the law incrementally, while at the same time, keeping the focus squarely on online safety and responsibility. The Bill recognises that Platforms and Administrators need not proactively scan for all harms. They only have to act responsibly once notified.
63. Taken together, these duties encourage vigilance without imposing impossible burdens. They reflect a simple ideal embodied in many legal principles, which is – if you control the space, then you must play your part in keeping it safe.
D. Remedies under the statutory torts
64. Next, Sir, let me turn to remedies. If a victim successfully establishes a claim, the victim must have access to effective and fair remedies.
65. Under Clause 96, victims may seek:
a) damages that the Court finds just and equitable; and
b) other heads of damages that the Minister may prescribe in regulations – such as compensation for loss of earnings or an account of profits where perpetrators benefitted from the harm.
66. The intent is to ensure that victims are properly compensated, and wrongdoers are not allowed to benefit from their behaviour. For some harms, the victim’s earning capacity or livelihood may be affected, and they should be compensated for loss of future earnings or loss of earning capacity, as may be appropriate.
67. In other cases, such as where intimate images have been put online for sale and perpetrators profit from the harm, an account of profits may be ordered so that the wrongdoer does not get to retain the benefits of the harm caused.
68. The regulations made under this Bill reduce victims’ uncertainty as to what remedies they are entitled to. But ultimately, it is for the Court to decide on the appropriate orders, based on the facts of each case.
69. Clause 98 introduces the concept of enhanced damages, and empowers the Court to award such damages where a Communicator or Administrator persists with their conduct despite notice.
70. We think that enhanced damages should apply to those who are the root cause of the harm – such as recalcitrant Communicators, or Administrators who create harmful websites or chat groups. We have therefore excluded the Platforms.
71. This framework is intended to incentivise and drive reasonable compliance, in some cases as quickly as possible. The enhanced damages framework also compensates victims for any additional harm resulting from a failure to comply. Overall, we hope to send a strong enough message, with a deterrent impact, to the actors in the online space.
72. Therefore, enhanced damages may be awarded to compensate the victim for additional harm caused by the refusal to stop the online harm, or to penalise the Communicator or the Administrator for bad conduct. The Court will consider the overall justice of the situation when assessing whether to impose enhanced damages.
73. In addition, Clause 99 empowers the Court to issue injunctions, both interim as well as permanent, to stop harm swiftly.
74. These injunctions operate independently of any direction from the OSC, giving victims complementary routes to relief. The OSC and the Courts operate independently of each other, and neither is bound by the decisions of the other.
75. The OSC seeks to act quickly, and takes the public interest into account in making its decisions. The Court decides any claim for Statutory Tort relief based on the applicable legal principles and the framework for remedies.
76. Taken together, Sir, we believe that we have fashioned a suite of remedies that strikes the right balance:
a) Accountability for wrongdoers;
b) Fair recourse for victims; and
c) Flexibility for the Courts.
E. Summary
77. The Statutory Torts are designed to change and strengthen norms – to make self-responsibility a default in our online space.
78. I come back to the time when the first motor-safety laws were introduced in the 1960s. An American publication, Automotive News, lamented the passing of these laws with the headline, and I quote: “Tough safety law strips auto industry of freedom”.
79. There was fear. There was resistance in the industry.
80. But, with the passage of time, history has proved those laws right – they made cars safer, saved countless lives, changed attitudes and mindsets, and re-shaped how the industry designed every vehicle thereafter.
81. We hope that this Bill, with a clear framework, can also have the effect of setting the right tone for online behaviour, and shaping mindsets and attitudes for both users and service providers. Its provisions define what is acceptable, and what is not. They will guide conduct not only through monetary damages, but through shared expectations made explicit.
82. Overall, our intention is that as our online norms mature, we will rely less and less on reports and lawsuits. Because, like road safety, the law will have done what it set out to do: not just punish harm, but nurture the habits that prevent harm in the first place.
V. End-user identification
83. Sir, I move on now to address how the Bill handles anonymity.
A. Effects of anonymity
84. From time to time, anonymity can serve a good purpose. It allows users to speak freely, sometimes to obtain assistance, and on occasion allows marginalised groups to speak up.
85. But at the same time, it must not shield wrongdoing.
86. Unfortunately, many online users abuse the power and privacy which online anonymity affords them.
87. I had earlier covered how online anonymity is a driver of online harms and leads to the Online Disinhibition Effect.
88. Anonymity also exacerbates the impact of harm on victims.
a) First, victims may become more distrustful of those around them. They wonder who it is posting on their social media sites. They do not know if the perpetrator is indeed someone they might know.
b) Second, victims will not be able to obtain legal recourse from perpetrators. By definition, they cannot commence legal proceedings, or enforce Court judgments, against an unknown person.
B. Limitations of existing options
89. There are mechanisms currently available in Court, such as pre-action discovery and non-party discovery, which can be used to obtain information about the identity of wrongdoers.
90. But victims will still need to commence Court proceedings which may be costly and time-consuming.
91. So, we believe the Bill’s proposed End-User Identification measures offer an accessible option.
C. How the end-user identification measures will work
92. To start with, Clause 49 empowers the OSC to obtain information and documents for the discharge of its functions. This includes identity information of an end-user which is in the possession of Platforms. This is akin to how law enforcement agencies are empowered to obtain such information for investigative purposes – akin, but not identical.
93. Second, Clause 52 empowers the OSC, where it reasonably suspects a user of committing an online harm, to require Prescribed Platforms to take reasonable steps to obtain specified information that may identify the user. This can take the form of the user’s name, or perhaps verified phone numbers or credit card information, which can then be used to make further inquiries with telcos or the banks.
94. This obligation to collect information is carefully scoped to target those users who are suspected of carrying out online harms. This follows close consultations with industry partners, who expressed difficulty with a general upfront obligation for Platforms to collect the information of all their Singapore users.
95. Third, Clause 53 empowers the OSC to disclose the perpetrator’s identity information to a victim or to their authorised representatives, upon receiving an application from the victim. At the initial stage, disclosure will be limited to the purpose of enabling victims to bring their claim. We intend to eventually extend this to other purposes as well, such as allowing victims to safeguard themselves from the perpetrator and to take proactive measures in future.
D. Addressing concerns about end-user identification
96. Mr Speaker, we recognise that some may have concerns – that these measures might intrude on users’ privacy or go too far.
97. Let me be clear – that is not the case, and we thought about this framework quite carefully. The measures are aimed squarely at those who hide behind anonymity to cause online harms. They are not meant to affect ordinary users who act responsibly.
98. In fact, for the vast majority of users, nothing will change. Most Platforms today already require some form of verified contact or payment information at the point of registration.
99. Additionally, when the OSC discloses a perpetrator’s identity to a victim – there will be safeguards to ensure that the information is protected and not misused.
a) First, the OSC may impose strict conditions on how the information can be used – such as limiting the use of the information to seeking protection or pursuing legal remedies.
b) Any breach of those conditions will be a criminal offence.
100. Second, the misuse of the information may itself attract legal consequences. For example, if a victim were to use the information obtained from the OSC to dox the perpetrator, that could itself be an offence under POHA, or constitute an online harm under this Bill.
101. In short, the Bill has in-built safeguards. They balance and protect both the victim’s right to know, as well as the perpetrator’s right against misuse.
VI. Proposed amendments by Ms He Ting Ru
102. Sir, this Bill is designed, above all, as Members can see from how I have articulated its framework and schema, to be as victim-centric as possible, and to give swift, accessible relief to those who have suffered real harm online. The provisions have been drafted with that goal in mind.
103. Minister Josephine spoke about how the OSC will be empowered to issue directions quickly to address harmful content, and MOS Rahayu explained the appeal mechanisms available.
104. Sir, the Honourable Member Ms He Ting Ru has proposed two amendments which I would like to address. They are, first, to remove the finality of an Appeal Committee’s decision, and second, to add a right of appeal to the General Division of the High Court.
105. Sir, I would like the House to know that both the MinLaw and MDDI teams had carefully considered the appeal process, including options similar to Ms He’s proposals. However, we felt that we could not support them in this Bill, and let me explain why.
106. These mechanisms will make the process slower, with less finality to the proceedings. They will make it more complex, and ultimately, less accessible for victims.
107. Let me reiterate that the purpose of the OSC is to deliver speedy, practical relief to give redress to what has objectively been determined to be an online harm. Allowing repeated appeals would prolong litigation. Each new appeal means fresh rounds of arguments, delay and also uncertainty in dealing with harmful content, as well as renewed anxiety for those already hurt, who quite likely will have to remain engaged throughout the appeal process.
108. We expect a likely higher case volume in OSRA cases, which also adds to the administrative load of the OSC. Sir, the further point is this – if a case goes on appeal to the High Court, lawyers will probably be instructed. In such an instance, will there be equality in how this might play out?
109. One can imagine – most Platforms are very well resourced, and likewise a number of administrators and content creators too. What happens when an individual victim might need to seek redress against one of these giants, with deep pockets, in Court and with lawyers? With the additional prospect of having to bear substantial costs in the litigation if one does not succeed? Overall, we fear that this will dissuade victims from coming forward.
110. Over time, this will render the framework toothless, not because of the provisions, but because individual victims will find it more difficult to seek redress and might shy away. This will make the framework less inclusive, and we hope not to see that.
111. In contrast, we believe that the current framework already strikes the right balance. In the first place – under the framework that you heard MOS Rahayu outline earlier – these are administrative decisions by the OSC, which assesses the harm based on the prescribed factors in this framework. The OSC makes a polycentric decision, taking into account policy and public interest considerations when deciding whether there is a harm and, if so, what the appropriate remedy ought to be. Such administrative decisions are subject to judicial review, not an appeal. In fact, this is not unusual.
112. Sir, at its heart, this Bill, as I said at the outset, is about empowering victims. The OSC’s process is deliberately designed to be straightforward – fast, simple, focused on stopping harm quickly, and hopefully not spending time arguing about it.
113. We think that Ms He’s proposed amendments, though well-intentioned, would probably make that journey harder, and not easier.
VII. Conclusion
114. Mr Speaker, Sir, this Bill is pragmatic, proportionate and principled. It protects victims, sets fair expectations for online actors and strengthens trust and accountability in our digital commons.
115. If we proceed steadily and work together – Government, industry and users – I believe we can keep our online spaces open, but also safe. Vibrant, but also responsible.
116. Sir, we have shown clearly how online harms exact a cost – on individuals, on families, and on the social fabric that holds us together.
117. As technology evolves, new harms will emerge. Our laws must therefore remain future-ready. We must be bold and innovative to stay ahead, but also compassionate in how we protect those who are most vulnerable.
118. This Bill gives victims a clear and practical framework to seek relief when harm occurs. It also sends a clear, unambiguous signal – that everyone who shapes our digital spaces in Singapore must act responsibly.
119. Through the OSC, the Statutory Torts framework, as well as the End-User Identification provisions, we are building a coherent system of protection and accountability.
120. Each prong complements the others:
a) The OSC is a safety net, providing rapid relief to victims of online harms.
b) The Statutory Torts framework provides private remedies, and it is also the standard-setter, encouraging all actors to play their part; and
c) The End-User Identification measures ensure that no one can cause harm from behind a mask.
121. Sir, public support for these measures has been strong – across communities, professions and also generations.
122. In the Public Consultation launched by MinLaw and MDDI in 2024, respondents expressed strong support.
a) For establishing a dedicated agency to address online harms – over 90% in support.
b) Allowing victims to take legal action, such as seeking compensation in Court for private remedies on top of the OSC’s framework – over 95%.
c) Disclosing a perpetrator’s user information to the victim for certain specified purposes – over 80%.
123. Sir, the Government started this work a long time ago. We started looking at developing this Bill as far back as 2021, even as the amendments to the Broadcasting Act and the Online Criminal Harms Act were being worked on.
124. We spent close to five years carefully examining the issues. We conducted numerous surveys and studies into the issue of online harms in Singapore, the findings of which have been presented to this House earlier. We also partnered with the SMU Yong Pung How School of Law to organise the Online Harms Symposium, where distinguished speakers and panellists, including experts on online safety from around the world, shared their insights on key issues and solutions for online harms.
125. In addition to the Public Consultation exercise, we also met and consulted extensively with over 100 different stakeholders over the years. This includes local and foreign experts, foreign regulators, victims of online harms, social service agencies, lawyers and the Judiciary, and the MOE and other educational institutions.
126. We recognise that the impact of online harms may be felt and experienced differently across different communities. We have heard from various segments of society, including youths, disability groups and community groups. We learnt much from their experiences, their insights and their stories.
127. We also conducted over 20 engagement sessions with Tech Companies in the past two years to ensure that the provisions in the Bill are robust, workable, feasible and can be carried out when the OSC issues directions.
128. We discovered, through these extensive engagements, a shared belief that the online world should reflect the same values of respect, decency and fairness that we all know and often assume, and which guide us in the offline space.
129. Around the world, societies are grappling with similar challenges – in Europe, the United Kingdom, Australia and the United States. We are moving in step with these global efforts, but shaping our own path, our own course, contextualised and nuanced to what Singapore needs.
130. Sir, ultimately, law and regulation alone cannot keep our people safe. We will require a whole-of-society effort. Public education must teach users to protect themselves, and every user must take ownership of their safety and behaviour online.
131. But if we can do this together – build sound laws, responsible platforms, and a thoughtful public – we will strengthen not only our digital safety, but over time our social fabric.
132. And in time, our online norms will not erode, but endure – grounded in respect, anchored in responsibility and guided by the same values that make Singapore strong.
133. Sir, it remains for me to express gratitude to a few persons and groups who have contributed deeply to our work. As mentioned by MOS Rahayu, we convened a Steering Committee which guided our team in shaping our policy. In particular, let me acknowledge two members from the private sector – Ms Stefanie Yuen Thio, joint managing partner of TSMP Law Corporation, member of the Sunlight Alliance for Action and founder and chairperson of SG Her Empowerment (SHE), as well as Associate Professor Eugene Tan from SMU.
134. In addition, the SMU Yong Pung How School of Law partnered with us to organise the Online Harms Symposium. The insights shared by the experts and survivors at the Symposium informed much of our thinking on this matter.
135. SHE’s surveys and research on online harms, and their experience in running SHECARES@SCWO, Singapore’s first support centre for targets of online harms, provided us with data and insights to refine our policy.
136. Finally, I thank all those, many from the public, who responded to our Public Consultation, engaged with us with very constructive comments and suggestions, or wrote to my Ministry to share their stories. Every story helped us to shape the contours of this Bill. We thank them for their suggestions over the years.
137. Mr Speaker, we have in the audience today – in the gallery above – a few who have contributed deeply to our work and given valuable feedback in developing our proposals. We have members from SHE and SHECARES@SCWO – Natalie, Hemavalli, Lorraine, Saira, and Si Han – who, together with their team, serve as vital pillars of support for those experiencing online harms today.
138. We also have representatives from YouthTechSG – Ben, Zoe, Beatrice and Kok Thong – who, together with many others, shared the perspectives of young Singaporeans with us.
139. We are deeply grateful for their partnership and commitment to making our digital spaces safer for all, and we record our thanks and gratitude for the time they have taken and the experiences they have so generously shared.
140. Thank you, Sir.