Technology

Why content moderation costs billions and is so difficult for Facebook, Twitter, YouTube and others


After the January 6 riot at the U.S. Capitol, debate swirled over how platforms moderate content and what speech is protected as free expression.

It’s a messy and expensive process: Facebook spends billions of dollars reviewing millions of pieces of content every day. While TikTok employs content moderators directly, Facebook, Twitter and YouTube outsource most of the grueling work to thousands of workers at third-party companies.

Many moderators in the U.S. and overseas say that because of the disturbing material they see while screening hundreds or thousands of posts every day, they need higher wages, better working conditions and stronger mental health support.

In response, some companies are leaning more heavily on algorithms they hope can do most of the dirty work. But experts say machines cannot catch everything, such as the nuances of hate speech and misinformation. There are also alternative social networks, such as Parler and Gab, that grew popular largely because they promised minimal content moderation. That approach got Parler temporarily banned from Apple’s and Google’s app stores and dropped by its web host, Amazon Web Services.

Other platforms, such as Nextdoor and Reddit, rely almost entirely on large numbers of volunteers to moderate content.

Watch the video to find out how big the content moderation business has become and the very real impact social networks’ choices have on the content we can and cannot see.

Katherine Clark