Size is part of the problem. Facebook has 2.7bn users, many of whom write in foreign languages. Their posts are vetted for hate speech and incitement. But the firm’s 15,000 content moderators struggle to cope. Most do not know Arabic and its dialects. So the firm relies on automated filters, which make mistakes. They screen flagged words, but cannot detect cultural nuance or satire. Facebook rarely explains why it deletes content. “Despite a huge number of users in the global south, they are largely excluded from the conversation,” says Wafa Ben-Hassine, a Tunisian-American human-rights lawyer.
Facebook is bound by American law, which counts some key players in the Middle East as terrorists. Iran’s Islamic Revolutionary Guard Corps, Hizbullah, Hamas and a raft of other armed Islamist groups are banned. Occasionally American media outlets give members of these groups airtime, but Facebook has a rigid interpretation of the law against aiding and abetting terrorists. More troubling is how it bans people sympathetic to these groups—or removes content that simply refers to them. Even Hizbullah’s opponents spell the militia’s name with a space between each letter to prevent Facebook deleting their posts.