Facebook moderators acting on wrong interpretation of Indian laws: NYT
As Facebook tries to control the "bonfires of hate and misinformation it has helped fuel across the world", its moderators are often "mistakenly" told to take down comments critical of religion in India, with certain guidelines based on an incorrect reading of Indian laws, a US media report has said.
"Every other Tuesday morning, several dozen Facebook employees gather over breakfast to come up with the rules, hashing out what the site's two billion users should be allowed to say. The guidelines that emerge from these meetings are sent out to 7,500-plus moderators around the world," the New York Times report said.
NYT was provided with more than 1,400 pages from the rulebooks by a Facebook employee who said he feared the company was exercising too much power, with too little oversight and making too many mistakes.
The report said an examination of the files revealed numerous gaps, biases and outright errors.
In India, moderators were mistakenly told to take down comments critical of religion, the report said.
It said legal scholar Chinmayi Arun identified mistakes in Facebook's guidelines in India.
"One slide tells moderators that any post degrading an entire religion violates Indian law and should be flagged for removal. It is a significant curb on speech - and apparently incorrect," the report said. Arun however added that Indian law prohibits blasphemy only in certain conditions such as when the speaker intends to inflame violence.
Another slide says that Indian law prohibits calls for an "independent Kashmir", which some legal scholars dispute. The slide instructs moderators to "look out" for the phrase "Free Kashmir", though the slogan, common among activists, is completely legal, the NYT report said.
"Facebook says it is simply urging moderators to apply extra scrutiny to posts that use the phrase. Still, even this could chill activism in Kashmir. And it is not clear that the distinction will be obvious to moderators, who are warned that ignoring violations could get Facebook blocked in India," it added.
The report noted that Facebook's rules for India and Pakistan both include a diagram explaining that the company removes some content to avoid risk of legal challenge or being blocked by governments.
Citing other examples, the report said that moderators were once told to remove fund-raising appeals for volcano victims in Indonesia because a co-sponsor of the drive was on Facebook's internal list of banned groups.
In Myanmar, a paperwork error allowed a prominent extremist group, accused of fomenting genocide, to stay on the platform for months.
"The closely held rules are extensive, and they make the company a far more powerful arbiter of global speech than has been publicly recognised or acknowledged by the company itself," the report said.
The report noted that the Facebook employees who meet to set the guidelines are mostly young engineers and lawyers, who try to distill highly complex issues into simple yes-or-no rules.
The company then outsources much of the actual post-by-post moderation to companies that enlist largely unskilled workers, many hired out of call centres.
Those moderators, at times relying on Google Translate, have mere seconds to recall countless rules and apply them to the hundreds of posts that dash across their screens each day.
"When is a reference to 'jihad', for example, forbidden? When is a 'crying laughter' emoji a warning sign?" the report asks.
The report said that moderators express frustration at rules they say don't always make sense and sometimes require them to leave up posts they fear could lead to violence.
"You feel like you killed someone by not acting," one said, speaking on condition of anonymity as he had signed a nondisclosure agreement.
Facebook executives say they are working diligently to rid the platform of dangerous posts.
"It's not our place to correct people's speech, but we do want to enforce our community standards on our platform," said Sara Su, a senior engineer on the News Feed. "When you're in our community, we want to make sure that we're balancing freedom of expression and safety," she said.
Facebook's head of global policy management Monika Bickert said the primary goal was to prevent harm, and that to a great extent the company had been successful.
"We have billions of posts every day, we're identifying more and more potential violations using our technical systems," Bickert said. "At that scale, even if you're 99 per cent accurate, you're going to have a lot of mistakes," she said.