Facebook is launching major updates to put an end to the proliferation of fake news and hate speech on its site.
The social media giant will reduce the visibility of links suspected to be clickbait or misleading content. Facebook is also strengthening its fact-checking program by bringing in outside experts to evaluate content posted on the platform.
Facebook is also planning to scrutinize its groups, which it says are a primary source of misinformation. Suspicious groups will be given less visibility in users’ news feeds.
Other popular sites, such as Instagram and YouTube, are combating the same dilemma by imposing rigorous monitoring schemes while trying to avoid outright censorship of published content.
Vice President of Integrity Guy Rosen said that balancing the protection of privacy with public safety is “something societies have been grappling with for centuries.” He was referring to Facebook’s struggle to deliver encrypted messaging to its users, which is CEO Mark Zuckerberg’s vision for the site.
In its quest to strengthen private communications, Rosen said, the company is hiring experts to help it meet that objective.
In fact, dedicated teams already monitor content that is sexual or violent in nature.
Karen Courington, who works in product-support operations, said 15,000 employees on the safety and security team are responsible for reviewing content for Facebook.
However, Facebook has faced public backlash for exposing these employees to disturbing content and subjecting them to a toxic working environment.
Courington countered that these moderators undergo rigorous training and receive additional support because of the nature of their work. She also said they receive higher compensation.
For material that falls into a gray area, Facebook’s moderators call the shots. If they cannot decide whether content should be removed, they can instead choose to reduce its visibility in the news feed.
But the site is continually trying to strike a balance between encouraging free expression and fostering a safe online community, according to Facebook News Feed head Tessa Lyons. Facebook aims to make headway on this issue by consulting experts such as journalists, researchers, and fact-checking specialists.