Former U.N. chief Kofi Annan told Facebook Inc on Thursday that it should consider establishing a special team to respond more quickly to threats of sectarian violence in countries such as Myanmar that are at high risk.

Facebook, the world’s largest social network, is under pressure from authorities and rights groups in many countries for its role in spreading hate speech, false stories and government-sponsored propaganda.

Annan, appearing on stage before an audience of Facebook employees, was asked by Facebook Chief Product Officer Chris Cox if he had a recommendation for the company to help protect elections.

He responded that Facebook should look for societies where people are likely to put out “poisonous messages,” and then monitor the language there.

Facebook could “organise sort of a rapid response force, rapid reaction group, who can be injected into a situation, when you see it developing, so that they can try to see what advice they can give the electoral commission or those involved,” Annan said, according to a live broadcast of the event.

Facebook says it has more than 7,500 workers who review posts for compliance with its rule book.

In some countries, though, it acknowledges it is short-handed. It said last month that it needed more people to work on public policy in Myanmar.

U.N. human rights experts investigating a possible genocide in Myanmar said in March that Facebook had played a role in spreading hate speech in the country. Nearly 700,000 Rohingya Muslims have fled Myanmar into Bangladesh since insurgent attacks sparked a security crackdown last August.

Annan headed a commission that last year recommended to the government of Myanmar, a majority Buddhist country, that it avoid excessive force in the crisis.

Since then, social media may have made the crisis worse, he told Facebook employees.

“If indeed that was the case, was there a point somewhere along the line when action could have been taken to disrupt the dissemination of the messages? These are issues that you may need to think through,” Annan said.

Cox replied: “That’s something we’re taking very seriously.”