Facebook says it was 'too slow' to fight hate speech in Myanmar
Facebook has been “too slow” to address hate speech in Myanmar and is acting to remedy the problem by hiring more Burmese speakers and investing in technology to identify problematic content, the company said in a statement on August 16.
The acknowledgment came a day after a Reuters investigation showed why the company has failed to stem a wave of vitriolic posts about the minority Rohingya.
Some 700,000 Rohingya fled their homes last year after an army crackdown that the United States denounced as ethnic cleansing. The Rohingya now live in teeming refugee camps in Bangladesh.
“The ethnic violence in Myanmar is horrific and we have been too slow to prevent misinformation and hate speech on Facebook,” Facebook said.
The Reuters story revealed that for years the social media giant had dedicated scant resources to combating hate speech in Myanmar, a market it dominates and one that has seen repeated eruptions of ethnic violence.
In early 2015, for instance, only two people at Facebook who could speak Burmese were monitoring problematic posts.
In the August 16 statement, posted online, Facebook said it was using tools to automatically detect hate speech and hiring more Burmese-language speakers to review posts, following up on a pledge founder Mark Zuckerberg made to U.S. senators in April.
The company said it had more than 60 “Myanmar language experts” as of June and plans to have at least 100 by the end of the year.
Reuters found more than 1,000 examples of posts, comments, images and videos denigrating and attacking the Rohingya and other Muslims that were on the social media platform as of last week.
Some of the material, which included pornographic anti-Muslim images, has been up on Facebook for as long as six years.
Numerous posts call the Rohingya and other Muslims dogs and rapists, and urge that they be exterminated.
Facebook currently doesn’t have a single employee in Myanmar, relying instead on an outsourced, secretive operation in Kuala Lumpur – called Project Honey Badger – to monitor hate speech and other problematic posts, the Reuters investigation showed.
Because Facebook’s systems struggle to interpret Burmese script, the company is heavily dependent on users reporting hate speech in Myanmar.
Researchers and human rights activists say they have been warning Facebook for years about how its platform was being used to spread hatred against the Rohingya and other Muslims in Myanmar.
In its statement on August 16, Facebook said it had banned a number of Myanmar hate figures and organizations from the platform.