I think they’ll go after Telegram next. I know a lot of people use it to see uncensored news on Palestine and Ukraine, which is a big no-no in the US. There’ve been a suspiciously high number of news articles linking it to CSAM even though Facebook is a much, much, much bigger offender.
Facebook has a CSAM problem?
Yep. Very big, and they do a dogshit job of addressing the problem. Their underpaid content moderators pore over the worst images you can possibly imagine until their mental health is completely shot. The worst part is that this method barely makes a dent in the amount of CSAM distribution.
https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona
https://www.ft.com/content/afeb56f2-9ba5-4103-890d-91291aea4caa
https://archive.ph/ter4Y
And there’s no one for them to talk to because of how uniquely horrific these videos and images are. Therapists are only affordable to the rich. Can’t talk to family and friends without potentially traumatizing them. Even the people who interview these mods can’t print the details of their experiences because readers would complain.
I heard that sometimes it’ll keep showing the same traumatic video to one person over and over and over because the bot uploader has very slightly edited it thousands of times, and Facebook forces the reviewer to watch the whole thing every time, even though they already know it’s in violation as soon as it starts.
That’s so uniquely cruel in such a calculating way. It’s like they’re intentionally trying to traumatize their workers in a fucked up experiment. I don’t trust the in-house therapists Meta offers…
And Instagram. Presumably others.