On one hand, social media has turned out to be something of a disaster: it has allowed the dissemination of BS and bad things… Holocaust denial, Apollo hoaxers, antivaxxers, democratic socialists, Nazis, commies, antisemitism, Jihadis, third wave feminists. On the other hand, *some* effort has been made to rein that stuff in. On the gripping hand, those efforts have often been ham-fisted and politically bass-ackwards. And on whatever the frak you call the fourth hand, those efforts to rein in the horribleness have resulted in a secondary cascade of really horrible things. Gentlemen, behold:
Bodies in Seats
Where we read about the truly awful lives of people who moderate the worst things Facebook vomits forth. Every time someone posts a video of, say, animal cruelty (and there are descriptions of several such, so, watch out), some poor slob has to review it.
“They kept reposting it again and again and again,” he said, pounding the table as he spoke. “It made me so angry. I had to listen to its screams all day.”
Yeesh.
Marcus went home on his lunch break, held his dog in his arms, and cried.
Yeeeeeeeeeeeesh.
Content moderation, especially reviewing videos of the worst of humanity in action, is the sort of thing that can mess a person up. These places hire regular schmoes and put on a pretense of being a modern, clean office, but from the reporting, the work environment quickly turns into Lord of the Flies. This is obviously bad for the employees (and their friends, and their families, and society), but it also damages their ability to effectively do their jobs.
So, on the one hand it’s an argument for working real hard to develop AI content moderation. But that can only be done effectively with a lot of human interaction, and will almost certainly involve a lot of false negatives and false positives. On the other hand… perhaps it’s an argument to say “social media was a mistake.”