
Video of Senior Minister of State for Law and Health, Edwin Tong at the hearing of the International Grand Committee on Fake News and Disinformation in London


You may view the video at the UK Parliament website here.




Senior Minister of State, Mr Edwin Tong (ET)

Richard Allan, Facebook's Vice President of Policy Solutions (RA)


ET: Good afternoon, Mr Allan. Would you accept that there is no place for content on Facebook which attacks people along lines of race, religion or ethnicity?


RA: Yes, and our policies are very clear that that is not acceptable.


ET: In fact, your policy is very clear that when such attacks are made, you are committed to removing them any time you become aware of them. Would that be right?


RA: Yes.


ET: And if you just go to page one of that extract, you will see Mr Zuckerberg’s Facebook extract from August 2017. Now I will read you the portions which I have highlighted. He says, “When someone tries to silence others, or attacks them based on who they are or what they believe, that hurts us all and is unacceptable. There is no place for hate in our community. That’s why we have always taken down any post that promotes or celebrates hate crimes or acts of terrorism.” Would that be correct?


RA: Yes.


ET: And you do so, because you are aware that such content has the potential to divide communities, incite violence, tension, hatred and strife. Would you agree?


RA: Yes.


ET: Would you go to the second page of that extract. I think you would be familiar with what happened in Sri Lanka recently, in March this year.


RA: That’s right.


ET: The post that was put up originally is the part that is in pink. It’s in the Sinhalese language – the language of the native Sri Lankans. And the post translates to, and I quote, “Kill all Muslims. Don’t even let an infant or a dog escape.” That would be properly categorised as hate speech?


RA: Yes, clearly a breach of our terms of service.


ET: It was a clear breach, wasn’t it?


RA: Yes.


ET: It was then put up at a time when there were significant tensions between the people in Sri Lanka and Muslims, causing damage to properties, riots, damage to mosques, even deaths, and eventually resulting in the Sri Lankan Government declaring a state of emergency in Sri Lanka. Would you agree?


RA: Yes. 


ET: Would you agree that in the context of those kinds of tensions occurring in Sri Lanka, putting up such a post would invariably travel far, stress those tensions even more, and divide the community?


RA: Yes, that’s high priority content for us to remove.


ET: Yes. If you just look at the page, it was then pointed out by one of your users on Facebook, and subsequently picked up by one Harin Fernando, who was the Communications Minister of Sri Lanka at that time. Why is it that Facebook refused to take it down?


RA: The comment should be down. If it's not down, it should be. What I am seeing in this is that there are two possible reasons why the content was not taken down at that time. One is that it was a simple error by the person who looked at it.


ET: Let me just stop you there. If you go over the page to the second document, you will see the response by Facebook when you asked for it to be taken down. And it says, "Thank you for the report. You did the right thing. We looked over the post." So it was no mistake, Mr Allan. And it says, "it doesn't go against one of our specific Community Standards."


RA: That was a mistake. I just want to be clear that somebody made a mistake.


ET: Mr Allan, this is a very serious, egregious mistake. Would you agree?


RA: I agree.


ET: It goes completely against your own policies – to take it down immediately.


RA: That’s right.


ET: So would you accept that this case illustrates that Facebook cannot be trusted to make the right assessment on what can properly appear on its platform?


RA: No. So, we make mistakes – serious mistakes. Our responsibility is to reduce the number of mistakes. I still think we are best placed to do the content removal. And that's why we are investing very heavily now in artificial intelligence, where we would precisely create a dictionary of hate speech terms in every language. We are working through the languages. The best way to resolve this is a dictionary of hate speech terms in Sinhalese that gets surfaced to a Sinhalese-speaking reviewer, who can make sure that we do the job properly.

ET: Mr Allan, in this case, whilst one excuse might be that your users or reviewers don't understand Sinhalese, you had the Minister of Communications of Sri Lanka telling you that it is hate speech, and to take it down. Your people reviewed it – and you said hundreds and thousands of people review it – but they don't seem to abide by the same philosophy that you have expressed in your own policies.


RA: We make mistakes. Our job is to reduce the number of mistakes. I completely accept, as we discussed previously, that we should be accountable for our performance to you and your colleagues, and to every Parliament and government sat around this table today. I would love to be able to explain to you this part of the process – how we do what we do, the challenges of getting it right, the challenges of doing it at scale – not because I expect sympathy, but because I think it's important to understand, if we are going to solve the problem, what it is that we need to do to solve it. And I think it will be a combination of us getting better at our jobs, using better technology, and, frankly, the accountability of you and your colleagues standing over us and making sure that we do it correctly.


ET: In this case, the spread of the post only came to a halt when the Sri Lankan Government blocked Facebook.


RA: Yes.


ET: Do we have to resort to that kind of measure?


RA: No, we would much prefer not to.


ET: How can governments trust that Facebook will live up to its own promise?


RA: Again, this is where I think the openness has to be there, to an extent – and I hope you have a constructive relationship with my colleagues in Singapore who work on these issues. I want us to be in a position where we share with you the good and the bad about how we think we are doing, in full expectation that you will be pushing us always to be better.


ET: We look forward to that. Because what has happened in Sri Lanka, by way of example, and in several other places as well, should never be allowed to happen.


RA: And as an employee of Facebook, I am ashamed that things like this happen. And they do, and they shouldn't.


Last updated on 28 Nov 2018