Are You A Serial Killer?

Facebook has announced that it will hire thousands of "content moderators" around the world to monitor the content uploaded by its users.

But really, who would want to do this?

Facebook already has nearly 5,000 employees who identify violence, hate speech and other generally disturbing material.

There have been a number of disturbing things streamed on Facebook lately, and some of this content remained online for up to 24 hours before it was flagged and removed. Zuckerberg has said, "This is important. Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren't so fortunate."

Facebook is not the only social media company trying to tackle this kind of content. Microsoft also has employees who are responsible for watching questionable content and deciding whether it should be made available to the public.

The idea is great, but it does mean that thousands of people are being exposed to some horrendous material. Two former Microsoft moderators have already begun legal action against the technology giant, claiming post-traumatic stress caused by their exposure to gruesome content.

"This is important. Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t so fortunate."

Mark Zuckerberg

But you can argue the flip side too. What criteria are being used to assign people to this position? One person's horror may be the next person's PG rating. I hope that social media giants like Microsoft and Facebook have clearly defined rules around what constitutes a content violation. Otherwise, Grandma's photo of the grandkids running through the sprinkler in their swimmers may get flagged and removed.