Mastodon, a decentralized alternative to Twitter, has a serious problem with child sexual abuse material, according to researchers at Stanford University. In just two days, the researchers found 112 instances of known CSAM across roughly 325,000 posts, along with hundreds of posts containing CSAM-related hashtags and links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken offline for a period of time because of CSAM posted to it. The researchers conclude that decentralized networks like Mastodon need more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.
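For context, "known CSAM" refers to media matched against hash databases of previously identified material. The sketch below illustrates that basic flow under simplifying assumptions: it uses exact SHA-256 matching and a hypothetical known_hashes.txt file, whereas production scanners use perceptual hashes such as PhotoDNA that tolerate resizing and re-encoding.

```python
# Minimal sketch of known-hash matching: hash each uploaded file and look
# the digest up in a set of previously identified material. Exact SHA-256
# is used purely for illustration; real pipelines use perceptual hashes.
# known_hashes.txt is a hypothetical placeholder; in practice hash lists
# come from vetted clearinghouses, never from raw material.
import hashlib
from pathlib import Path

KNOWN_HASHES: set[str] = set(Path("known_hashes.txt").read_text().split())

def sha256_of(path: Path) -> str:
    """Stream the file in 1 MiB chunks and return its hex digest."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known(path: Path) -> bool:
    """True if this exact file has been identified before."""
    return sha256_of(path) in KNOWN_HASHES
```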

  • while1malloc0@beehaw.org · 1 year ago

    While the study itself is a good read and I agree with its conclusions (Mastodon, and decentralized social media generally, need better moderation tools), it’s hard not to read the Verge headline as misleading. One of the study authors gives more context here: https://hachyderm.io/@det/110769470058276368. Basically, most of the hits came from a large Japanese instance that no one federates with; the author even calls out that the blunt instrument most Mastodon admins use is to blanket-defederate from instances hosted in Japan, whose laws around CSAM are laxer than those of the US (a sketch of how such a block is applied follows the footnote below). But the headline seems to imply that places like mastodon.social[1] have a giant seedy underbelly rife with abuse material. I suppose that’s a marketing problem for federated software in general.

    1. There is a seedy underbelly of mainstream Mastodon instances, but it’s mostly people telling you how you’re supposed to use Mastodon if you previously used Twitter.
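
    As a sketch of that blunt instrument: assuming Mastodon 4.x’s admin API (POST /api/v1/admin/domain_blocks), a blanket defederation can be applied roughly as below. The instance URL, access token, and target domain are placeholders, not values from the study.

    ```python
    # Hedged sketch: suspending federation with a domain via Mastodon's
    # admin API (POST /api/v1/admin/domain_blocks, Mastodon 4.x). The
    # instance URL, token, and domain are placeholders; the token needs
    # the admin:write:domain_blocks scope.
    import requests

    INSTANCE = "https://example.social"    # placeholder instance
    TOKEN = "YOUR_ADMIN_ACCESS_TOKEN"      # placeholder admin token

    def suspend_domain(domain: str, comment: str) -> dict:
        """Fully defederate from `domain`, rejecting its media and reports."""
        resp = requests.post(
            f"{INSTANCE}/api/v1/admin/domain_blocks",
            headers={"Authorization": f"Bearer {TOKEN}"},
            data={
                "domain": domain,
                "severity": "suspend",    # vs. "silence" or "noop"
                "reject_media": "true",
                "reject_reports": "true",
                "private_comment": comment,
            },
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()

    suspend_domain("example.jp", "blanket CSAM-policy block")  # placeholder domain
    ```

    The "suspend" severity is the bluntest option: it severs all federation with the domain, which is exactly the trade-off the study author describes.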
  • stravanasu@lemmy.ca · 1 year ago

    I’m not fully sure about the logic here, or the conclusion it seems to hint at. The internet itself is a network with major CSAM problems (so maybe we shouldn’t use it?).

  • Jordan Lund@lemmy.one · 1 year ago

    “massive child abuse material problem”

    “112 instances of known CSAM across 325,000 posts”

    While any instance is unacceptable, does 112/325,000 constitute a “massive problem”?

    0.034% of posts are unacceptable! Massive problem!
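
    (For reference, the arithmetic behind that rate:)

    ```python
    # The figures quoted from the study: 112 flagged posts in 325,000 sampled.
    matches, posts = 112, 325_000
    rate = matches / posts
    print(f"{rate:.7f} = {rate * 100:.3f}% of posts")  # 0.0003446 = 0.034%
    ```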