Facebook Can’t Monitor Content Because of Languages


Facebook has had more than its fair share of scandals over the past several years. Much of the animosity toward the social media giant involves the fake news and hate speech proliferating on the platform. One of the biggest obstacles preventing Facebook from dealing with these problems is the sheer number of languages it has to keep track of. Facebook currently supports 110 languages. Monitoring all of them, along with the 2.2 billion people who use the site, is a task the company has so far been unable to handle.

Facebook employs 15,000 people to monitor content on the platform. However, these moderators speak only roughly 50 languages, less than half of the languages that Facebook supports. The platform has guidelines that prohibit posting anything that can be considered hateful toward a group of people, and it is also not acceptable to post anything that celebrates violence of any kind. However, these guidelines have not been translated into many of the languages Facebook supports. This means it will be an uphill battle for the company to adequately police the platform it has created.

Facebook has said that it is very concerned about protecting its users and that it goes to great lengths to do so. A spokesperson for the company acknowledged that there have been some bumps along the way but said the company is learning from its mistakes. Meanwhile, the government of Sri Lanka took the drastic step of blocking Facebook for the entire country because it did not want rumors and conspiracies to spread regarding the bombings that happened last Sunday.

Facebook has been using automated software in its battle to detect and delete hate speech, terrorist propaganda, and other prohibited content. The drawback of the software is that it works with only 30 languages. Many of the languages Facebook supports do not have enough available text to train the automated software correctly, which presents a huge problem for the company moving forward. It would seem that Facebook has created a platform that is too big for it to police. It also means that hate speech and fake news will be posted and left up simply because neither the moderators nor the automated software can understand the language in question.
