Facebook said on Tuesday it had tightened its policies on self-harm and suicide-related content amid mounting criticism over how social media companies handle violent and dangerous material.

Facebook, which runs the world’s largest social network with more than 2.4 billion monthly active users, said in a blog post that self-harm content would now be harder to find in search on Instagram and would no longer appear among recommendations in the Explore section of the photo- and video-sharing app.

Facebook’s announcement came on World Suicide Prevention Day. About 800,000 people die by suicide each year, or one person every 40 seconds, according to a report by the World Health Organization.

Facebook employs teams of moderators who monitor content, including live streams, to prevent violence as well as suicide. The company works with at least five outside firms in at least eight countries to review content, according to a Reuters report in February.

Governments around the world are struggling to improve how content is policed on social media platforms, which are often blamed for enabling abuse, spreading pornographic images online, and influencing or manipulating voters.

Companies such as Google, Facebook and Twitter display emergency hotline numbers in response to user searches that include the word “suicide”.

Source: Facebook
