Facebook has announced changes to its policies regarding the content it will allow people to post. Photos depicting self-harm or mutilation are now banned. These images had been permitted in the past because it was thought they offered some measure of help to people recovering from suicidal thoughts. However, after consulting mental health experts from around the world, Facebook concluded that these photos serve no positive purpose or redeeming value and therefore need to be removed completely.
Facebook has become much more sensitive about the content of photos posted to its site. The company is concerned about studies showing that suicidal behavior can spread among people exposed to disturbing images. This prompted the changes, which are already in effect: any self-harm or mutilation photo will be removed from the site as soon as it is discovered. Facebook hopes this will improve the overall mental health of its users and has said that further policy changes will be announced down the road, though it did not go into specifics.
The world’s biggest social network has come under fire for a wide range of issues over the past few years. Russian groups posted election ads during the 2016 campaign. Then came the infamous Cambridge Analytica scandal, in which the private data of millions of Facebook users was harvested by outside companies without those users’ consent. The content allowed on Facebook has also been a major source of debate. The company has already banned terrorism photos and other images that promote hate crimes and racism, and the ban on self-harm images is the next logical step.
Facebook believes these changes will make the platform much better for the people who use it daily. Its goal is to weed out as much objectionable content as possible, and it hires third-party companies to monitor the site and remove content that violates its guidelines and policies. This system has mostly worked, though it failed earlier this year when the New Zealand mass shooting was live-streamed on Facebook Live and was not taken down quickly. Facebook says that will never happen again.