• whatup
    ·
    edit-2
    7 months ago

    Yep. Very big, and they do a dog shit job of addressing the problem. Their underpaid content moderators pore over the worst images you can possibly imagine until their mental health is completely shot. The worst part is that this method barely makes a dent in the amount of CSAM distribution.

    https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona

    https://www.ft.com/content/afeb56f2-9ba5-4103-890d-91291aea4caa

    https://archive.ph/ter4Y

    • ComradeSharkfucker@lemmy.ml
      ·
      7 months ago

      Their underpaid content moderators pore over the worst images you can possibly imagine until their mental health is completely shot.

      barely makes a dent


      • whatup
        ·
        7 months ago

        And there’s no one for them to talk to because of how uniquely horrific these videos and images are. Therapists are affordable only to the rich. They can’t talk to family and friends without potentially traumatizing them. Even the people who interview these mods can’t print the details of their experiences, because readers would complain.

        • SerLava [he/him]
          ·
          7 months ago

          I heard that sometimes it'll keep showing the same traumatic video to one person over and over and over, because the bot uploader has very slightly edited it thousands of times. And Facebook forces the reviewer to watch the whole thing every time, even though they already know it's in violation as soon as it starts.

          • whatup
            ·
            edit-2
            7 months ago

            That’s so uniquely cruel in such a calculating way. It’s like they’re intentionally trying to traumatize their workers in a fucked up experiment. I don’t trust the in-house therapists Meta offers…