SOCIAL MEDIA MODERATORS: NOT AN EASY JOB
by John Rudy
Ever since the last presidential election, there has been considerable discussion about what limits should be placed on content made available through Facebook, YouTube, and other social media platforms. Facebook’s Mark Zuckerberg and others have been castigated for not doing enough to delete inappropriate content. Of course, not everyone views the word “inappropriate” the same way. And, of course, we take pride in our right to free speech. The mosque shootings in Christchurch, New Zealand, a few months ago added another dimension to this discussion when the government insisted that all video of the murders be deleted immediately.
What has not been discussed is the role that thousands of low-paid employees perform in helping these social media platforms monitor and self-regulate their content. This article helps us better understand what these social media moderators must do on a daily basis.
As everyone knows, it is very easy to post material to any of the social media sites. According to Internet Live Stats (https://www.internetlivestats.com/twitter-statistics/), there are over 500 million Twitter posts per day. A Micro Focus blog post (https://blog.microfocus.com/how-much-data-is-created-on-the-internet-each-day/) reports that “more than 4 million hours of content are uploaded to YouTube every day, with users watching as many as 5.97 billion hours of those videos on a daily basis.” In addition, roughly 67 million Instagram posts are uploaded each day, and Facebook has over 2 billion monthly active users.
Given that volume, it is simply impossible for social media moderators to view all of that material for “inappropriate” content.
Just thought you’d want to know as you think about how this might be constrained.
A long-time technology expert and guide, John provides his helpful hints in this monthly BOLLI Matters feature. In the comment box below, provide John with questions, comments, or suggestions for future tech items to cover.