In reaction to criticism around the use of Messenger in some countries, particularly Myanmar, Facebook has introduced new tools that allow users of the app to report conversations that violate its community standards.
A new tab inside the Messenger app lets users flag messages under a range of categories, including harassment, hate speech and suicide. The report is then escalated for review, Facebook said, after which it can be addressed. Previously, Messenger users could only flag inappropriate content via the web-based app or Facebook itself, which is clearly insufficient for a service with over a billion users, many of whom are mobile-only.
Facebook said the review team covers 50 languages. The company has been widely criticised for its small team of Burmese-language reviewers, most of whom are based in Ireland, with a six-hour time difference from Myanmar, although it has pledged to staff up on Burmese experts.
Source: TechCrunch