• @Clbull@lemmy.world
    1 year ago

    Not quite how it happened.

    Yahoo banned porn on Tumblr because Apple found a CSAM image on the platform and pulled the Tumblr app from the App Store; as a condition for reinstating the app, Apple issued an ultimatum to ban pornographic content outright. I originally thought Yahoo’s justification was an excuse because they couldn’t be bothered to invest the resources needed to purge illegal content from their platform, but apparently Discord has been hit with similarly puritanical demands from Apple and Google, who between them effectively control what apps go on your smartphone.

    Nevertheless, Tumblr’s porn exodus is the core reason Yahoo sold the platform to Automattic for a fraction of what they originally paid for it. It even sparked a renaissance of lewd artists flocking to Newgrounds - a site that had been on its knees since the death of Flash animations and browser gaming.

    It’s also worth noting that Automattic have since partially dialed back the ban: nudity is allowed on Tumblr again, but graphic sexual content isn’t. They did this in direct response to Elon Musk enshittifying Twitter. Another reason I think Tumblr lives on is that it was known for three things - porn, niche fandoms, and social justice warriors - and Musk has notably burned bridges with any Twitter users who oppose hate speech.

    God forbid if Tim Cook and Sundar Pichai find out there’s porn on Reddit…

    • Spaceman Spiff
      1 year ago

      Do you have a source on that? It doesn’t smell right. Every platform (all of them, every single one, no exceptions) that allows user-submitted images/videos has the problem that some of that content is illegal. CSAM is the most obvious kind, but not the only one. What made Tumblr different from the 20 million+ instances on Facebook (Source1, Source2)? At the time, scrolling through r/All for just a few minutes was nearly certain to show something pornographic, although not CSAM.

      The story I heard (admittedly, I’m having trouble finding a source at the moment) is that Tumblr’s tools to remove CSAM weren’t good enough. While they would remove the offending image when it was reported, they did not delete the connections to other users/groups. Which meant it was easy to find more, even after some had been removed. In turn, that meant that it quickly became the platform of choice for anyone uploading this stuff, creating a higher volume and ratio of illegal content.

      While I know Apple has long been anti-porn, it seems unlikely that they would take such an arbitrary hard line while ignoring countless others.