Opening Address by Mr K Shanmugam, Minister for Home Affairs and Minister for Law, at the "Keep It Real: Truth And Trust In The Media" Forum
Mr Alan Chan, CEO, and
Mr Patrick Daniel, Deputy CEO, Singapore Press Holdings,
Mr Warren Fernandez, Editor-in-Chief of the English/Malay/Tamil Media Group, and Editor, The Straits Times,
Mr Gilles Demptos, Director, Asia, World Association of Newspapers and News Publishers,
Professor Arnoud de Meyer, President, Singapore Management University,
Ladies and gentlemen,
1. Good morning. Thank you for inviting me here to be with you today.
2. This forum comes at an important point in how we, as a society and as a Government, deal with fake news.
3. Truth and trust in the media are pressing issues for our contemporary society. At the heart of that is the online misinformation that we all have to face. That is what I will speak about today.
4. There are four aspects I will touch on:
· What is the nature of the problem, both foreign and domestic;
· Why does it happen;
· What is the reality that we face; and
· What we should do about it.
II. THE NATURE OF THE PROBLEM
5. In 2014, at the World Economic Forum, the rapid spread of misinformation online was considered the 10th top trend in terms of global significance.
6. About 39 per cent of those who were surveyed said they had limited knowledge of what this trend was about.
7. Today, it is no longer something that serious people can ignore.
8. Attempts at misinformation have been around as long as human societies. It is not new, but the way it is being used was brought home to a lot of people after the 2016 US presidential elections.
9. Of course, many of you would have heard about the “Pizzagate” incident, the rumour that Mrs (Hillary) Clinton and her campaign chairman John Podesta were running a paedophile ring out of a pizza restaurant in Washington.
· It appeared in a small online message board, then it went viral on Twitter.
· It fed into the whole psyche of conspiracy mongering.
· The owners of the restaurant, the staff, and the customers received serious threats.
· Many of you will recall one man went with a rifle and opened fire, but no one was hurt.
10. There were also claims that the Pope had endorsed then-candidate Mr (Donald) Trump, and that Mrs Clinton had sold weapons to ISIS.
11. None of this is to say that these things did not happen before – it is just that they have much greater influence this time around. Nor am I suggesting that they actually influenced the elections.
12. But what we can learn from this, is how viral this can go, how many millions of people can be influenced, how it impacts societies and how it brings mistrust into the whole process of democracy.
13. All these were outright lies, and they were shared online widely.
14. We move on to Brexit.
· Many of you would be aware of the news circulating about how many millions of Turks were going to turn up in the UK, fuelling and feeding xenophobia.
15. Or Germany, the “Lisa case”.
· False news that a Russian-German girl was kidnapped and raped by Middle Eastern migrants.
· Completely false. Media outlets from one country spread the story online.
· EU security experts who looked at it believe that it was part of info-ops.
· Had real-world consequences: protests across Germany against migrants and of course, hardened attitudes towards migrants.
16. If you look closer to home, Indonesia before the elections for the Jakarta governorship:
· Online sentiments fanning anti-Chinese feelings.
· One claimed that China was using “biological weapons” to target Indonesia’s agricultural sector.
· Another claimed that a free HPV vaccine was part of a Chinese conspiracy to make Indonesian women infertile.
· When spread continuously, a lot of people will start to believe it.
17. We have seen this in India too, where misinformation provoked public panic and vigilante violence, and people were killed.
18. So, there are very serious consequences.
· It undermines the very fundamentals of a democratic society.
· It undermines the media.
· It undermines trust in government.
· It undermines what the truth is.
· It spreads fear and panic.
· It undermines domestic politics and society as a whole.
· It de-legitimises leaders.
· It divides societies.
· It endangers lives.
19. Misinformation is not new. But why is it much more serious now, than ever before?
20. This is because the information ecosystem has changed dramatically.
21. Today, you have echo chambers. Everyone can be an expert in whatever they wish because they can just look up the Internet and read whatever they find. And people talk amongst themselves, with people of like-minded views and reinforce their biases. Any news that is put into that ecosystem, that supports their underlying beliefs, catches on.
22. The multiplicity of information systems and platforms means that there are many different ways in which this can be spread. Particularly by people who are quite clever in using the information systems.
23. As you can see, trust in media has been weakened.
· The deputy editor of The Economist recently said conventional newspapers no longer keep fake news in check because they are operating in different environments.
24. This is the “post-truth” world, where flimsy and ludicrous misinformation can be believed by millions of people.
· It travels faster than ever.
· The identities are fake or anonymous.
· It is legitimised by social networks, working within echo chambers.
III. WHY DOES THIS PROBLEM HAPPEN?
25. When we think about why misinformation exists, we will understand that this is a problem that is not going to go away any time soon, and I doubt it will ever go away.
26. For one simple reason: misinformation is an easy and effective way in which a range of agendas can be advanced. And unless you believe that human beings, societies, and groups of individuals will stop having agendas for profit, agendas for political advancement, agendas for destroying institutions, this problem is going to exist.
27. Let us look at the potential types of actors.
28. As I said earlier, misinformation has been considered a legitimate tool of propaganda for as long as human societies have existed. It has been used in war for thousands of years, and by countries as a tool against other countries.
29. The Russians are not the first, and they are not going to be the last. A lot of countries, I am sure, have the ability to use it.
30. Misinformation is set out in black and white as part of Russian military doctrine.
· As early as 1983, you had a Soviet-sponsored newspaper which claimed that the US military had created the AIDS virus as a weapon.
· That story was published in more than 80 countries, in 30 languages.
31. Today, the Internet has been harnessed for this.
· I make no suggestions about what actually happened. But if you look at what US intelligence authorities say, the suggestion is that Russians financed the spread of online anti-Clinton fake news during the 2016 US presidential elections.
32. You must assume that every country with an interest in another country will do the same.
33. Given that we are such an open country in terms of Internet penetration, such a small country but very strategically important, and given our racial and religious profile, we would be of particular interest to a number of countries who will want to influence specific racial groups, or social groups, or religious groups within Singapore.
34. Therefore, we are particularly vulnerable, and it would be naïve for us not to recognise that and see what we can do to defend ourselves, because that is a matter of our national survival.
35. Second, misinformation can be used to spread hate.
36. After the 2015 November attacks in Paris, a video went viral online.
· It was titled “London Muslims celebrate terror attack”.
· Half a million views within two hours.
· But in reality - that was a celebration by Muslims of a victory by Pakistan in a cricket match.
· All it took was changing the title, and clicking a button.
· People were angry about the attacks in Paris. So these fed that anger by taking a video, changing its title, and circulating it.
· It was a perfect marriage of emotions on the part of the people, and somebody exploiting those emotions to stoke anti-Muslim, Islamophobic feelings.
37. There have been many attempts to use online misinformation to smear Muslims as a group and turn non-Muslims against Muslims.
38. This is done both by people who are Islamophobic, as well as people who want to make the Muslim community more exclusive – terrorists, who want to create more anger against the Muslim community and turn the non-Muslims against moderate Muslims, and turn the moderate Muslims toward the more extreme ideologies.
39. For example, a group of moderate Muslims got together to protest against the 2017 London Bridge terror attack.
· A journalist had asked them to gather together and pose for a photograph.
· The scene was taken out of context by hate-mongers, who claimed that the demonstration was staged.
· It discredited the moderate Muslims who wanted to step forward, and made them a target of attacks, so that in future, moderate Muslims will not hold their heads high and stand up.
40. Hate also breeds terrorism, and it is now common ground that terrorists use online misinformation to spread hate for their own ends.
41. Third, misinformation can be used for profit.
42. We have all heard what happened in Macedonia: teenagers manufactured and spread sensational fake news stories during the 2016 US Presidential Elections, driven solely by money, not ideology.
43. We have had it in Singapore – The Real Singapore fabricated articles, and made more than $500,000. It came out in court.
44. It is easy money, because you feed on anger and hate.
Unintentional Reporting Failures
45. Fourth, there are of course unintentional reporting failures. It applies to many newspapers around the world, where they believe a story, publish it and then the next day, write an apology.
46. Earlier this year, there was a widely-read newspaper in Germany that reported that a mob of migrants had sexually assaulted women on New Year’s Eve.
· The story was false.
· They had relied on two chaps who lied about it.
· In today’s competition with online media, they wanted a scoop; they wanted to move ahead. The story spread widely.
· But at least there was no deliberate intention to create fake news.
· They believed it to be true, so this is an unintentional creation and spread of fake news.
· I think we have to treat it slightly differently from the other types of fake news.
47. In 2015, many of you would remember a Singaporean student created a fake government website and posted a false announcement that Mr Lee Kuan Yew had passed away.
· CNN and others went ahead and reported that Mr Lee had passed away.
48. It was not intentional. These things can happen.
IV. SINGAPORE’S CONTEXT
49. The drivers and causes of misinformation are the same, all over the world. I’ve shown you many examples, most of which we know.
50. So what do we face?
51. Again, we start with a little bit of history. Attempts at foreign influence in Singapore are not new, and these are now coming back.
52. Let me start with the Eastern Sun.
· 1960s, 1970s.
· English daily in Singapore.
· Financed by officials from China.
· They wanted to ensure that the Eastern Sun did not oppose China on important issues.
53. Likewise, we had the Singapore Herald, which was financed by a Malaysian official. Ask yourself – why would a Malaysian official invest a huge sum of money in a Singapore newspaper that then talked about domestic issues and campaigned against National Service, which is a backbone of our defence?
54. Online misinformation does not need to be financed out of Singapore. It can be produced anywhere in the world - little WhatsApp messages and videos. We have seen some examples of these, last year and this year. These messages can be put in emotive terms, and in language that appeals to specific racial and religious groups.
Racial and Religious Tensions
55. There have been attacks all around the world, based on hate speech and religious divides. Locally, we have managed to integrate more successfully than many other countries. But that does not mean we are immune to such attacks.
56. We are vulnerable - not just us, but every society - to misinformation that exploits racial and religious divides.
57. We have to try to stop, and deal with, attempts to spread hate and xenophobia here.
58. I have taken a slightly more neutral example – a fake news story by The Real Singapore.
· They claimed that a Filipino family had complained about a Thaipusam event, and the police had then acted.
· There had been no such complaint. It had been written by one of the two editors of the website, who attributed the claim to a third party.
· This cleverly captured the mood of those among the public with an anti-foreigner sentiment.
· The claim spread widely.
· The truth came out maybe more than a year later in court. I wonder how many people read news of the proceedings, and realised the claim was fake news.
· It was reported in The Straits Times and other media. But that did not have the same emotive value. Correcting the record a year and a half later does not have the same effect.
59. There is now an additional phenomenon along the same lines, one that confuses and promotes distrust. I will give two examples.
60. Back in 2015, someone spread a rumour that the Government was conducting cloud-seeding not to help the Indonesian fires, but to help the upcoming Formula One race.
· And that the Government was harming people with chemically induced rain.
· The rumour was spread widely.
61. Earlier this year, there was a relatively less serious example. A viral WhatsApp message claimed that people had been fined for leaving used tissue paper in coffee-shops and hawker centres.
· This caused confusion.
· Even though months have passed, people may still be concerned about whether they could leave their tissue paper behind, and whether there would be a fine. There was a recent article by TODAY on this.
62. Fake news of this kind:
· causes confusion;
· sows doubt; and
· creates a climate of distrust.
63. If the distrust becomes deep-rooted, people will have serious doubts about institutions, about governance, and you then get a fractured polity.
64. The traditional approach to the marketplace of ideas is that everyone puts out their viewpoints. The reader looks at the different viewpoints, and the best ideas win out.
65. Now, it is a completely different ecosystem, where groups tend to be closed off and reinforce each other’s viewpoints. There is little that mediates between different viewpoints, and brings different groups together.
66. The Singapore Government has been considering ways of dealing with the problem for the last year or so. We have started thinking about legislation. I am personally convinced that legislation is essential. But the nature, shape and contours of the legislation will depend on what we think is right. It will be informed by the consultations that we have with stakeholders.
67. We wanted to see what Singaporeans think about the problem, so we commissioned a survey.
· Three-quarters came across fake news at least occasionally, most often on Facebook and WhatsApp.
· 25 per cent came across fake news frequently.
· One quarter shared information they later discovered to be false.
· Around two-thirds could not recognise fake news when they first saw it.
· And only around 50 per cent believed that they could recognise fake news.
68. That tells you the nature and extent of the problem. News underpins how a society functions. This is a serious problem.
V. WHAT SHOULD WE DO ABOUT IT?
69. We have been studying this issue, as I said.
· We need to focus on creating a strong climate of trust.
· And we need to find a way in which we can –
o Dispel; and
o Disrupt fake news
70. This is not something any of us can do on our own, not the Government, the tech companies, or the media. Each has a role to play. I will touch on each.
71. First: Society.
· We need to make sure our society is able to react to fake news.
· A study was done on the rumours that spread over Twitter during the 2011 summer riots in London. Twitter users helped to rebut rumours very quickly. They showed themselves to be resilient. They helped to point out logical fallacies, and were usually quick to put the rumours to rest.
· The key task is to make society more resilient. We need to see how we can strengthen our resilience through media literacy education, through critical thinking.
· Members of the public and civil society have to help to foster an online culture where truth is valued and protected.
72. Second: The Media.
73. The media plays an important role in being a trusted source of news for the public. But this trust is under challenge.
74. We need to be much more careful about reporting. Journalists face time pressures, especially with the challenges of the new media landscape. It is much more important to make sure standards of reporting are robust.
75. Third: Tech Companies.
76. Internet companies like Facebook, Google, Twitter, and WhatsApp bear significant responsibility.
· Some have said they are not publishers with editorial control over content on their websites.
· But that is not credible any more. They have created algorithms to know what your tastes are, what you are looking for, and to send you the right advertisements.
· Now, they are
o flagging disputed information,
o taking down posts,
o removing fake accounts, and
o refining search results.
77. Last year, some of these companies gave the EU voluntary commitments to remove reported hate speech within 24 hours.
78. But the general prevailing view across a range of countries is that these voluntary commitments are not enough. We have surveyed what is happening elsewhere.
· The EU, Germany, and Israel:
o Are considering laws to compel social networks to take down various types of unlawful content.
· In the UK:
o A Parliamentary Committee conducted an inquiry into hate crime online.
o The Committee had very strong words for social media companies. It said they were “shamefully far” from adequately tackling illegal and dangerous content online.
o It recommended that the Government review existing legislation, and assess whether failure to remove illegal material is in itself a crime.
o Prior to the UK elections, another UK Parliamentary Committee had been looking into online fake news, including whether new offences should be created to hold social networks responsible for inappropriate content, including fake news.
79. That brings me to my final point – what is the role of the Government in this context?
80. Ideally, most misinformation will be dealt with through
· a resilient society;
· a responsible and effective media; and
· the voluntary actions of Internet companies.
81. But the Government still has a key role to play.
· It must stand ready to deal with misinformation that impacts society. There are many examples. Such as if someone spreads a rumour that if you go to a certain hospital, you will be infected by a disease. Or a rumour that a bank is in trouble. It could be financial, health or governance-related.
· Media companies may have differing standards. I will give you one example.
o According to Google, a video claiming that Jews had organised a “White genocide” does not meet its test for removing content. Governments may take a different view.
· I told you earlier about echo chambers.
· In the poll that we commissioned on fake news, we also asked Singaporeans what their views were on legislation to deal with fake news.
o 91% of those surveyed supported stronger laws to ensure the removal or correction of fake news.
82. Legislative action therefore seems a no-brainer. We will continue to study what other countries are doing; a team has gone to Germany and the UK to learn what they are doing, and thinking of doing.
83. In the second half of this year, we will consult with stakeholders, the media, the legal profession, and the Internet companies, and see what the contours and shape of the legislation ought to be. Hopefully, we will have it in place next year or so.
84. Thank you.