Keynote Address by Minister for Law and Second Minister for Home Affairs, Mr Edwin Tong SC, at 35th Annual IBA Communications and Competition Law Conference 2026
Distinguished guests
Members of the International Bar Association
Ladies and gentlemen
Introduction
1. Good morning everyone. It is a hard act to follow the distinguished speakers earlier, but I’d like to start with a good morning to all of you, especially friends from overseas. We wish you a very warm stay in Singapore, not just weather-wise, but in terms of hospitality and the generosity of spirit, which I hope you will be able to see a bit of whilst you are here.
2. I thought I would share a few remarks in response to what we have just heard. I spent 25 years in practice. I was practising arbitration and litigation. I was with a firm in Singapore throughout that time. During my time in practice, I was very much a member of IBA - taking part in many of these conferences. Among the most memorable ones that come to mind were Prague in 2005 and Chicago in 2006.
3. It is therefore with a great sense of privilege and honour that we will host IBA again next year in 2027. It is something that we are very much looking forward to, and we will work with the IBA team to make this a memorable experience. I think IBA has set some rather high expectations. We hope that we will be able to live up to them.
4. The second point I will make is this: in all my time at IBA, I have always seen IBA as a very forward-leaning, very visionary organisation. The announcements that Jaime (Co-President of IBA, 2025) made earlier about the AI initiatives just underscore that, because today you can’t really talk about lawyers without speaking about AI.
5. It is important that we, as lawyers, come to terms with technology. In Singapore, at least, over the past couple of months, we have been talking about AI and the prospects of what else we can do with AI - perhaps with some anxiety, because we know that if we don’t embrace AI or take AI into the workflows that we have and embed them into the work that we carry out with our clients, we will be very quickly left behind.
6. IBA’s AI initiative is a very important initiative, and we will be happy to see what else we can do to support that initiative in the coming months, leading up to 2027.
7. For many years, this Conference has been a very important meeting point and a leading forum for the global legal community. It is a platform for you to come together to exchange and reflect on views. The diversity of views that we have across so many practitioners from different countries and different legal systems makes it a really important platform for us.
8. I took a look at the programme for the next two days and I see that you will have many distinguished speakers on a range of different topics - from online communications to competition, and sometimes an intersection of both of those topics.
9. So today, in my remarks, I thought I would focus on a question that lies at the heart of some of the discussions you will be having, which is: how do we uphold the rule of law in an increasingly digital, borderless, and fast-moving world?
Focus of Keynote: Rule of Law in a Digital Age
10. 2026 marks a very important milestone for us in Singapore. It is timely and apt that we have this conversation about the rule of law, because 2026 marks 200 years since the inception of the Second Charter of Justice. We received the Second Charter of Justice from Britain 200 years ago. Today, it forms the foundation of our legal system - our judiciary, the way in which we practise law, our common law system and so on.
11. It is a system that is anchored in fairness, in certainty, and I think quite importantly, the equal application of the law. No one is above the law, and the law applies to everyone equally.
12. And that is the rule of law in Singapore. Over the two centuries since we received the Second Charter of Justice, the world has changed a lot. But what has really signified this change is how we now interact. Where once we interacted in a physical, face-to-face, local world, today our interactions are increasingly digital and global, and you often do not see, face-to-face, the person you are transacting with.
13. We now live, work and communicate extensively, in some cases almost exclusively, online, and our messages can reach vast audiences right across the world in a split second. They cross borders easily, which makes enforcement, and a jurisprudence that used to be located within the boundaries of a single country, seem like yesterday’s news.
14. So we have got to start thinking about how we can deal with this across different borders. And in recent times, most importantly, all of this is also amplified, powered and created using AI tools. A significant share of this activity is concentrated on a handful of major platforms; public discourse, social interaction and even communities are mediated through these spaces, with a tendency for norms and expectations to diverge across different platforms and user groups.
15. It really goes without saying that these dynamics shape behaviour and perceptions, not just at speed, but also at scale. They form echo chambers and tribes that transcend geography, where views on what is acceptable or harmful might well differ. Yet across all these digital spaces and forms, which are proliferating at breakneck speed, the one principle that remains constant for us in Singapore is that the rule of law must continue to apply. Put very simply, what cannot happen offline should also not happen online, and that has been the challenge we have been grappling with.
16. I thought I would share some of Singapore’s perspectives with you on how we have sought to sustain the rule of law across a diverse and very quickly evolving online space, and how we have progressively strengthened it to address new forms of online harm.
Singapore’s Experience: Legal Fragmentation to Order
17. To understand our approach today, it is useful to understand what came before. As I mentioned, we mark the bicentennial of Singapore’s rule of law this year, and I think this moment is more than just a historical milestone, more than just a date in the calendar, because it offers a useful lens through which to view the challenges confronting our world today.
18. 200 years might not seem a long time to some of my colleagues in the audience, because you come from legal jurisdictions that have a longer history and heritage, but for us this is our 61st year post-independence and we have a much shorter jurisprudential history. 200 years is therefore a different timescale for us in Singapore. If you go back 200 years, legal authority in this region was both fragmented as well as uneven. Different rules might well apply to different communities and disputes were resolved through very disparate arrangements. There was not, as yet, a single framework that applied to all.
19. The Second Charter of Justice made this decisive shift. It established a unified legal system for the first time, grounded in the rule of law. This then brought together a common applicable framework that applied to all, clarity over rights and responsibilities, and predictability and confidence in institutions.
20. After independence, that is, after 1965, our founding leaders upheld the rule of law as a cornerstone of our nation‑building. It may not always be visible front and centre. It is not brick and mortar, it is not healthcare, it is not education, it is not transport systems. But the rule of law lies at the centre of all that we did in building Singapore, and it has underpinned Singapore’s development into a trusted business and services hub. The application of the rule of law is also not immutable. It must grow. It must evolve with the times and with a changing population, especially the different aspirations of the young. We need the rule of law to move with the times.
21. Today, we also face a modern parallel. The rules that govern the digital communications environment are also fragmented, not yet harnessed into the same coherent form as we might see, for example, in the laws around contract, tort or property. We also have a scenario where online interactions cut across platforms and speed across different jurisdictions at unprecedented scale, raising familiar questions about what norms should govern conduct and how rights can be enforced, but today in a very different modality, powered by the online space.
22. Once again, the task that we have is to bring order to a fast-changing digital world. The law, in my view, must provide clarity and set expectations, because the law is also our way of reflecting the social mores and values of a particular society. What does that society value? How do we see discourse amongst each other? All of that must be reflected in the laws that we enact and enforce, because this ultimately gives each individual in our society the confidence to participate even in online discourse, and the businesses that operate in Singapore the confidence to invest in the digital economy.
Communications in the Digital World: New Challenges and Risks
23. Digital technologies and platforms are reshaping our world, and with it, the way we communicate with one another.
24. Social media has evolved from relatively simple straightforward tools of interaction into a complex, AI-driven, algorithm-driven, borderless communications system, shaped by generative AI. Content today is not only produced by individuals — it is generated, curated, ranked, and then amplified by algorithms. We are seeing something that is generated for us, pushed to us, fed to us, and curated in the way in which platforms want us to see them. Information can spread across platforms instantaneously, reaching anyone, anywhere in the world, with digital access.
25. This has in turn expanded access to information and enabled connection, participation, and innovation. So do not get me wrong: the technologies that we have seen, social media included, have been largely good. They have brought people together, organised communities, and really democratised access to information, all of which is very good. But they have also presented new risks, risks that we as policymakers, as lawmakers, have to grapple with and confront. What are some of these risks, which we have thought about for some time as we enacted our laws? Let me share a few with you.
26. First, there is the risk that these algorithmic systems are not neutral. They are, in fact, set up and designed to maximise engagement, as engagement drives revenue across platforms’ different business models. I think that is the bottom line. It is not neutral because it is not designed to be neutral. It is designed to be profit-enhancing. In practice, this leads to the amplification of sensational, emotionally charged, and highly polarising content. After all, no one wants to go online to see news that is normal, that is not salacious or sensational. You want to go to a platform that gives you something exciting. These systems also reinforce users’ pre-existing preferences, as people tend to gravitate towards content that aligns with their own views, or towards people of their own societal makeup. An echo chamber.
27. As a result, information can spread rapidly and unevenly across the information ecosystem. This then skews perceptions and, over time, deepens divisions within society. And I will make the point later on that this is particularly important for us in Singapore, because we are a multicultural, diverse, multi-ethnic society. If you go around Singapore, you will see that represented in almost every corner, because we have done so by design. If you go into any public housing precinct, you will find that people of all races occupy every single block of flats in that precinct. We do that by design, because we know that people gravitate towards, or polarise around, people who look like them, sound like them, and share their background and interests. And so we have built a multicultural society, and we do not want to allow online algorithms and platforms to disrupt that.
28. Second, initial concerns about the abuse of AI tools centred on targeted harm against individuals, such as the creation of manipulated intimate images using deepfake technology to harass, to humiliate, or to cause distress. In fact, early studies have found that 90% to 95% of deepfake videos globally involve non-consensual intimate imagery, most of which feature women. That is another point we have taken into account as we thought about revisions to our rules around the framework for vulnerable victims online, young girls and women especially.
29. In Singapore, we have also seen a growing number of cases involving technology-facilitated sexual abuse, including the use of manipulated images to blackmail, to threaten, or to coerce victims. I am sure, even as I share my experiences with you about Singapore, that you will find resonance elsewhere in the world. I think these are all common ills that we see right across the globe.
30. And the implications go beyond isolated actors targeting individual victims. The same technologies are now being abused by criminal syndicates in an organised fashion as well. Recent findings by the Joint Economic Committee of the US Congress illustrate this shift. Global criminal networks have begun using deepfake voice tools to carry out increasingly personalised and convincing scams at scale, fuelling a growing scam ecosystem.
31. If you look at Singapore, we have been victims of such cyber scams as well. In 2024, we reached an all-time high, where we lost more than S$1 billion online. For a country our size, that is a huge concern. That number has since come down in 2025, and we hope to see a downward trajectory as we take more and more steps to combat and police online scam syndicates, and also to educate our population to think carefully, and exercise care, when they receive a request for passwords or for information about banking systems.
32. Taken together, these developments show how AI and deepfake technology, coupled with algorithms and the platforms’ financial and revenue incentives, can be used not only to mislead, but to cause real and significant harm.
33. Third, we have come to realise that this harm, as I said earlier, can now spread rapidly and at a scale that was previously unimaginable. And we are not even at the end of the evolution of that technology. Harmful activity such as harassment, the non-consensual sharing of intimate images, and content that incites enmity between groups can now reach wider audiences very quickly. So we do not just have to deal with the negative content; we now have to act with speed, because with each passing day, such content will generate more and more harm at an exponential rate.
34. I mentioned earlier that Singapore is a multicultural, multireligious, diverse society. We are very conscious that hate speech, or images that denigrate any religion, culture or ethnicity, travelling at speed and driven along by algorithms that push such divisive content, can very quickly undermine the social fabric of Singapore.
35. In Singapore, more than 84% of residents encountered harmful online content within the past year. Singapore is very plugged in. Most of us have a digital device or a phone, some more than one. Our digital penetration is very high.
36. Each time harmful online content spreads, quite a number of our population gets exposed to it. The risk to younger users is especially stark. Our surveys show that the vast majority of parents, 81%, are deeply concerned about their children’s exposure to inappropriate content, interactions with strangers, and cyberbullying. Yet only 37% of this group felt confident guiding their children’s digital habits themselves. Most remain quite lost and uncertain in that space.
37. A 2025 report by Singapore’s Infocomm Media Development Authority (“IMDA”) confirmed that these risks remain pressing. While the platforms have shown improvement in acting on reported violations of their own community guidelines, the action rates vary from 54% to 93%. This is so even in Singapore, where we have a framework of rules regulating platforms, which I will touch on in a moment.
38. Most platforms took an average of two to five days or more to act on user reports. To me, this is way too long. The way in which harm gets proliferated, gets amplified, gets powered online, two to five days for a response to take down something like a sexual image that is intimate, that is fake, that causes tremendous distress, is way too long. The report also found that children’s accounts, those that are registered in the name of children, on some major platforms could still very easily access harmful and age-inappropriate content.
39. From survivor accounts, we see that online harms can have sustained and serious effects on mental health and daily life, including anxiety, panic attacks, sleep disruption, and in some cases even suicidal thoughts. With all the learning we now have, and all the research that has been done, we know that there is a very tangible nexus between social media and mental well-being, and a measurable negative impact on the mental health of victims, and I think we need to consider this very carefully. Australia has of course gone one way, banning social media access for those under 16. We in Singapore are studying what the right option is for us.
40. Our answer so far has been to ask ourselves a fundamental question: how do we ensure that the rule of law, one that we celebrate this year in our Bicentennial, continues to hold in this fast-changing, highly amplified, AI-powered, digital environment?
Law as a Living System: Singapore’s Approach
41. Our answer has been to ensure that the law remains a living system that can keep pace with change and evolution. The laws that we inherited, of course, we have built on, nuanced and contextualised. But nothing prepared us for the advent of social media, technology and AI as we have seen in the past five to ten years. Yet one thing remains very clear and constant for us: the rule of law must remain, and it applies to all and provides a clear and common framework that everyone understands and abides by.
42. This has been our approach over time.
43. We recognise that harm does not change its character simply because it takes place online. As I mentioned earlier, what is not permissible offline should also not be permissible online. From that understanding, we built the first of our series of legislation around this space, and that was the Protection from Harassment Act enacted some years ago. It makes clear that threats, abuse and other forms of harassment are unlawful whether inflicted in person or through a screen. We criminalise some behaviour, and we have also provided a framework for self-help for individuals who have been the target of harassment.
44. In addition, as online communications became faster and more far‑reaching, a new risk emerged. Falsehoods could be created and spread at speed, take hold before they could be corrected, and affect public understanding and social cohesion. We thus introduced the Protection from Online Falsehoods and Manipulation Act (“POFMA”) to address this, with tools to deal with falsehoods that harm the public interest.
45. What we did with this legislation was to allow ourselves to use the platforms to put out more information. In the cases where we have used this piece of legislation, we have not asked for the false content to be taken down. Rather, because false content, like salacious and sensational content, can travel very quickly, we wanted the right to append a response to the original content. So the original content does not get taken down, but a response gets added to it. Overall, readers get to see more, not less, content, and they then judge for themselves.
46. This is our response to falsehoods that spread quickly. We say to the platforms: if you want to continue to operate in Singapore, then each time there is a falsehood that affects the public interest, we have the ability to put out a response, and you put the two together so that they travel together. In addition, there is a framework for anyone who feels that they have been hard done by, or that the decision is unfair, to make an appeal in court. Both of these avenues are available, and overall, this helps us to protect the online space, allowing the infrastructure of fact, which is the bedrock of public discourse, to remain strong and firm.
47. We have learnt from many other jurisdictions that if you allow falsehoods to be pervasive, to seed doubt, that creates doubt in our institutions, in the courts, in judges, in police, in systems, then, very quickly, society will begin to crumble inwards.
48. Over time, it also became clear to us that online harm is shaped not only by individual conduct, but also by the systems through which information flows. Online platforms influence what users see and how content spreads. In response, we made amendments to the Broadcasting Act, which introduced system‑level safeguards. These require, for example, designated online communication services to put in place measures to limit the spread of harmful content, with particular attention to protecting younger users.
49. In parallel, where the online environment is used to facilitate criminal conduct, the law has also had to respond accordingly. A few years ago, we enacted the Online Criminal Harms Act, which strengthens law enforcement’s ability to disrupt potential criminal activity, recognising the speed and scale at which harms such as scams now occur. To give one example, we have the ability under that piece of legislation to direct platforms to take measures to protect users against potential scam material and advertisements. We also have the ability to direct the removal of scam advertisements and materials.
50. Most recently, the Online Safety (Relief and Accountability) Act further improves the legal response to online harm. This is a piece of legislation that has created or set up a new dedicated agency, the Online Safety Commission. This Commission enables victims to report harm and obtain timely redress. It is not yet operational. We expect it to go live within the next two to three months. What does it do?
51. It sets up an agency that assesses complaints by individuals under listed categories of online harm. Taking intimate images or child pornography as an example: if that is brought to the Commissioner’s attention, his team assesses whether it constitutes a harm. If it does, he very quickly directs the platforms to remove it.
52. In addition, the legislation also sets up a framework of statutory torts, which is a private remedy. This framework clarifies the duties and liabilities of key online actors, providing a legal basis for victims to seek civil remedies against those responsible. So we create a framework that allows the Commissioner to intervene, but we have also provided a framework which we believe will, over time, bring about an equilibrium in a self-policing fashion, where individuals harmed by any of the torts established under the legislation will be able to find relief in the Courts.
53. Amongst other things, the Act also clarifies the duty of online platforms to act and respond reasonably when notified of harm. We do not expect all the platforms to be monitoring and policing the online space all the time, 24/7, but when they receive notification that there is harm, that is when it triggers the duty to respond.
54. We recognise that perpetrators may be emboldened by anonymity to commit online harm with impunity. I think that is one big driver of online harms, when you can act behind a shield that makes you invisible, an anonymity screen that is opaque to the outside world. The Act introduces measures for the disclosure of end-user identity information in certain situations, to ensure that those who perpetrate online harm can be held accountable. This legislation will, as I said, come into operation in the next two to three months.
55. Across these developments, the law continues to provide a clear baseline for conduct and a reliable means of recourse, and that is the operating principle behind our framework of legislation that deals with online harms. In this way, we ensure individuals remain protected, and I think very importantly, that trust in the system is maintained. If you find that you are not able to make a police report that will be acted upon or make a complaint that will be dealt with seriously, and that the online platforms continue to publish, amplify and worse still, continue to digitally enhance such content, then I think, very quickly, one will lose faith in the system.
This Conference and the Role of the Global Legal Community
56. These challenges that I have outlined do not stop at national borders. They have no respect for national boundaries or different legal systems; they travel right across, depending on where there might be more eyeballs. Online harms can originate in one jurisdiction, but very quickly spread across multiple platforms and different countries.
57. No single jurisdiction, therefore, has all the answers.
58. But taken together, if we can act in a cohesive, concerted and I would say very intentional fashion, then I think this is where the global legal community can really make a difference.
59. By comparing regulatory approaches, testing assumptions, and sharing experience, the Bar, and I would say just in this room alone, the quality of the talent that we have here and the extent of the networks that you have in your own countries, will help to shape a common understanding of what reasonable conduct should look like in the online digital environment. More importantly, it helps us to translate the enduring legal principles of responsibility, accountability, and fairness. All of these principles we know well, but how do we translate them into the online space, and create standards that platforms cannot arbitrage across different jurisdictions?
60. In doing so, we as practitioners can then contribute to greater transparency, stronger accountability, and development of a safer and more trusted online communications ecosystem. I hope that this is a conversation that will continue beyond today, beyond this conference, and that at some point we will find consensus across the different countries, across the globe. I think IBA, with its initiatives into AI, might well be well positioned to take that lead.
Closing
61. Let me now very quickly conclude. Our digital technologies continue to evolve, and I think they will do so at pace and at speeds that will render today’s technology obsolete and unrecognisable. There must be continued dialogue across jurisdictions, and I think this will be essential, because we have different rules, different laws, but I think we have a fundamental common principle, which is to protect our young, our vulnerable, and to ensure the online space remains a space for truthful, fact-based public discourse.
62. The responsibility for ensuring that modern modes of communication remain open, vibrant, and worthy of public trust is a shared one. It is not just for policy makers, not just for legislators, but really for the entire community and the whole ecosystem.
63. And so on that note, I wish you a very productive and engaging conference, and I am sure you will have lots of takeaways from your discussions. But before I leave you, I want to encourage you: as important as it is to listen to the speakers and panels on stage, and you should pay attention to them, make sure that you also reserve some time to see Singapore. You are in the middle of a district that is rich with history and heritage. Not far away is Little India; I think three minutes from where you are is the Malay Heritage Centre, which the Prime Minister re-opened just two days ago after some years of refurbishment; and you are also not far from Chinatown. So, the three largest communities in Singapore. You really are in a perfect place to explore Singapore, and to see the culture and the diversity of ethnicity that we have right here.
64. On that note, I wish all of you a really productive and enjoyable Conference.
65. Thank you very much.
Last updated on 27 April 2026