Other samples:

Android: https://github.com/nipunru/nsfw-detector-android

Flutter (BSD-3): https://github.com/ahsanalidev/flutter_nsfw

Keras (MIT): https://github.com/bhky/opennsfw2

I feel it’s a good idea for those building native clients for Lemmy to integrate projects like these and run offline inference on feed content for the time being, to cover content that isn’t marked NSFW but should be.

What does everyone think about enforcing further censorship on the client side, especially in open-source clients, as long as it pertains to this type of content?

Edit:

There’s also this, though it takes a bit more effort to implement properly. It also provides a hash that can be used for reporting: https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX
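For the reporting use case, a perceptual hash like NeuralHash’s output is typically compared by Hamming distance rather than exact equality, so near-duplicate images still match. A minimal sketch of that comparison, assuming the hashes arrive as hex strings (the actual hash values below are illustrative, not real model output):

```python
# Sketch: comparing perceptual hashes (e.g. from AppleNeuralHash2ONNX)
# by Hamming distance. Real hashes come from running the ONNX model on
# an image; the hex strings here are placeholders for illustration.

def hamming_distance(h1: str, h2: str) -> int:
    """Number of differing bits between two hex-encoded hashes."""
    a, b = int(h1, 16), int(h2, 16)
    return bin(a ^ b).count("1")

def matches(h1: str, h2: str, max_distance: int = 4) -> bool:
    """Treat hashes within max_distance bits as the same image."""
    return hamming_distance(h1, h2) <= max_distance
```

The `max_distance` threshold is a tuning knob: 0 means exact-match only, while a small positive value tolerates re-encodes and minor crops at the cost of some false positives.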

Python package (MIT): https://pypi.org/project/opennsfw-standalone/
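To show what “run offline inference on feed content” could look like in practice, here is a minimal sketch of the client-side gating logic. The model call is stubbed out (packages like opennsfw2 return a probability score for an image); the thresholds and the `"blur+suggest_tag"` action are assumptions for illustration, not anything these libraries prescribe:

```python
# Sketch of client-side gating for feed images. The NSFW score would
# come from an on-device model (e.g. opennsfw2 or opennsfw-standalone);
# here we take it as an input so the decision logic itself is testable.

BLUR_THRESHOLD = 0.5   # assumed: blur behind tap-to-reveal above this
FLAG_THRESHOLD = 0.8   # assumed: also suggest tagging the post NSFW

def moderate(score: float, marked_nsfw: bool) -> str:
    """Decide how to render an image given its local NSFW score."""
    if marked_nsfw:
        return "blur"              # respect the poster's existing tag
    if score >= FLAG_THRESHOLD:
        return "blur+suggest_tag"  # untagged content the model flags
    if score >= BLUR_THRESHOLD:
        return "blur"              # borderline: hide, but don't nag
    return "show"
```

The `"blur+suggest_tag"` branch is the interesting one for this thread: it covers exactly the case of content that isn’t marked NSFW but should be, without the client silently dropping anything.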

    • @WhoRoger@lemmy.world · 5 points · 10 months ago

      I wish there were such detectors for other triggering stuff, like gore, or creepy insects, or any visual-based phobia. Everyone just freaks out about porn.

      • @pexavc@lemmy.world (OP) · 2 points · 10 months ago

        Actually, I am looking at this exact thing: compiling them into an open-source package to use in Swift. Just finished NSFW detection, but everything you mentioned should be in a “ModerationKit” as well, allowing users to toggle filters based on their needs.

        • janAkali · 2 points · 10 months ago

          In many cultures around the world nudity in itself isn’t considered inappropriate or sexual.