Facebook admits 'serious mistake' after Edwin Tong questions failure to remove hate speech post in Sri Lanka




LONDON: A Facebook executive has admitted that the company "made a mistake" by not removing a post inciting racial hatred in Sri Lanka, at an international hearing on fake news and disinformation held in London on Tuesday (November 27).

Richard Allan, Facebook's vice president of policy solutions, was questioned by Singaporean Member of Parliament Edwin Tong over a post, written in Sinhala in March, that called for the killing of all Muslims. Mr Tong asked whether the post violated the social media company's terms of service.

Mr Allan agreed.

"It happened when there were significant tensions between Sri Lankans and Muslims, which caused property damage, even death. Unrest. Damage to the mosque. And finally that resulted in the Sri Lankan government declaring an emergency. Do you agree? "Mr. Tong said.

Mr Allan replied: "Yes."

"Do you agree that in the context of such tensions that occur in Sri Lanka, putting up such posts will always travel far, divide the tension, or emphasize the tension more, and divide the community?" Asked Tong.

Mr Allan replied: "Yes, that is high priority content for us to delete."

When Mr Tong asked why Facebook had declined to delete the post in question, even after it was flagged by Sri Lanka's communications minister, Mr Allan said it was a "simple mistake" on the part of Facebook's employees.

At this point, Mr Tong interjected. He argued that it was not a mistake, as Facebook had replied to the user saying that the post did not breach its community standards.

Mr Allan disagreed. "That was a mistake," he said. "I just want to be clear that someone made a mistake in the review."

He also disagreed with Mr Tong's suggestion that the case showed Facebook "cannot be trusted to make the right judgment" on what can appear on its platform.

"We made a mistake … a serious mistake; our responsibility is to reduce the number of errors, "he said.

"We invest very much in artificial intelligence, where we will make a dictionary of hate words in every language."

He added: "The best way to solve this problem is the dictionary of hate words in the Sinhala language, which appears to speakers of Sinhalese languages, who can ensure that we do the right work."

Mr Tong replied: "Mr Allan, in this case, while one reason might be that your reviewers do not understand Sinhala, when you have the Sri Lankan communications minister telling you that this is hate speech and asking you to take it down, and your people review it, and you say hundreds of thousands of people review such content, they do not seem to apply the same philosophy that you have set out in your own policy."

The post stopped circulating only after the Sri Lankan government blocked Facebook.

When Mr Tong asked whether governments should resort to such measures to deal with the problem of deliberate online falsehoods, Mr Allan said that Facebook would "choose not" to.

"This is where I think openness must be … and I hope that you have a constructive relationship with my colleagues in Singapore who work on this issue," said Mr Allan.

"I want us to be in a position where we share with you the good and the bad, about how we think we are doing, with full hope that you will encourage us to always be better."

To this, Mr Tong replied: "We look forward to it, because what happened, for example in Sri Lanka, and there are a number of other examples too, should not have been possible."

Mr Allan said: "No, and as an employee Facebook I am ashamed that things like this happen and they do and they shouldn't."

ON FACEBOOK AND ELECTIONS

Mr Tong was joined by two fellow Members of Parliament from Singapore, Mr Pritam Singh and Ms Sun Xueling, at the hearing in London. Mr Singh asked Mr Allan what Facebook was doing to prevent elections from being "undermined".

Mr Allan said that for "any significant election", Facebook now sets up a "war room" – a task force of specialists whose job is to understand the risks of that election and deploy the necessary tools and technology to deal with those risks.

Mr Singh asked whether smaller countries would also be covered by Facebook's "war room" concept.

"In an ideal world, every election, anywhere, all the time. Our current resources I think allow us to see all national elections, "Allan said.

"So, if there is a national election in Singapore, for example, it will be closed."

He added: "We have a similar task force around the Latvian election. So we see every election whether the country is big or small, at the national level. And then the question is we can expand it also into regional and local elections. "

Mr Singh also asked whether Facebook would consider working with local electoral authorities and political party representatives to remove or flag posts that would endanger the political process.

"We think it's important. And once again I want to repeat … people who decide whether a free and fair election is you, and your authority, and political parties, "said Mr Allan.

"So we want to do whatever is needed so that everyone has confidence that elections are free and fair – and we cannot do it alone.

"We can make tools, we can work with you, but in the end we need to get involved with you to meet these common goals so that we contribute positively rather than negatively to elections in your country."
