There have been a ton of CSAM and CP arrests in the US lately, especially of cops and teachers, along with at least one female teacher seducing boys as young as 12. I cannot understand the attraction to kids. Even teens. Do these people think they are having a relationship, or that it is somehow okay to take away another human being’s innocence? Is it about sex, power, or WTH is it? Should AI-generated CSAM and CP be treated the same as material involving a real person, since it promotes the same issues? I am a grandfather, and I am worried about how far this will go now that the new AI can put anyone’s face into a porno movie too.

It seems to me that a whole new set of worldwide guidelines and laws needs to be put into effect ASAP.

How difficult would it be for AI photo apps to filter out words, so someone cannot make anyone naked?

  • Apepollo11@lemmy.world · 1 year ago

    I’m only going to tackle the tech side of this…

    “How difficult would it be for AI photo apps to filter out words, so someone cannot make anyone naked?”

    Easy. The most popular apps all filter for keywords, and I know that at least some then check the output against certain blacklisted criteria to make sure nothing has slipped through. (There’s a minimal sketch of that kind of keyword filter at the end of this comment.)

    But…

    Anyone can host their own version and disable these features, allowing them to generate whatever they want, in exactly the same way that anyone can write their own story containing whatever they want. All you need is the determination to do it, and a modicum of ability.

    People were creating dodgy doctored photos long before computers. When Photoshop came out, it became easier, and with AI it’s easier still. The current laws about creating and distributing indecent images still apply to these new images, though.
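    To make the keyword-filtering point above concrete, here is a minimal sketch in Python. The blocklist terms and the whole-word matching rule are placeholders made up for illustration; real services maintain far larger curated lists and more sophisticated matching:

    ```python
    import re

    # Placeholder blocklist -- real services use much larger, curated lists.
    BLOCKED_TERMS = {"naked", "nude", "undressed"}

    def prompt_is_allowed(prompt: str) -> bool:
        """Reject the prompt if any blocklisted term appears as a whole word."""
        words = set(re.findall(r"[a-z]+", prompt.lower()))
        return not (BLOCKED_TERMS & words)

    print(prompt_is_allowed("a woman on the beach"))    # True
    print(prompt_is_allowed("make this person naked"))  # False
    ```

    The output-side check works the same way in spirit, but classifies the finished image rather than the prompt text.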

    • Adalast@lemmy.world · 1 year ago

      Technically, the diffusers all have the ability to filter material from the actual outputs using a secondary CLIP analysis, checking whether the generated image kicks out any keywords that indicate a flagged topic is present. From what I have seen, most AI generation sites use this method, as it is more reliable for catching naughty outputs than prompt analysis. AIs are horny; I play with them a lot. All you have to do is generate a woman on the beach and about 20% of the results will be at least topless. Now, “woman on the beach” should not be flagged as inappropriate, and I don’t believe the outputs should be either, because our demonization of the female nipple is an asinine holdover from a bunch of religious outcasts from Europe who were chased out for being TOO restrictive and prudish. But alas, we are stuck with it.
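      For the curious, that kind of secondary CLIP pass can be sketched with the openly available CLIP model on Hugging Face, used as a zero-shot classifier over the finished image. The concept labels and threshold below are assumptions for illustration only, not what any particular site actually runs:

      ```python
      from PIL import Image
      from transformers import CLIPModel, CLIPProcessor

      # Publicly available CLIP checkpoint; real sites may use fine-tuned variants.
      model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
      processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

      # Placeholder concept labels and cutoff -- assumptions for this sketch only.
      LABELS = ["a safe, fully clothed photo", "an explicit nude photo"]
      NSFW_THRESHOLD = 0.5

      def output_is_flagged(path: str) -> bool:
          """Zero-shot check: score the generated image against the concept labels."""
          image = Image.open(path)
          inputs = processor(text=LABELS, images=image,
                             return_tensors="pt", padding=True)
          # logits_per_image has shape (1, len(LABELS));
          # softmax turns the scores into label probabilities.
          probs = model(**inputs).logits_per_image.softmax(dim=-1)[0]
          return probs[1].item() > NSFW_THRESHOLD

      if output_is_flagged("output.png"):
          print("Blocked: image tripped the secondary CLIP check")
      ```

      The advantage over prompt filtering is that this looks at what was actually generated, so an innocuous prompt that happens to produce a flagged image still gets caught.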