A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students, also teen girls, who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.

  • Marxism-Fennekinism@lemmy.ml

    Maybe I’m just naive about how many protections we’re actually granted, but shouldn’t this already fall under CP/CSAM legislation in nearly every country?

        • rchive@lemm.ee

          If you make a picture today of someone based on how they looked 10 years ago, we say it’s depicting that person as the age they were 10 years ago. How is what age they are today relevant?

          • Nyanix@lemmy.ca

            While you’re correct, many of these generators are retaining the source image and only generating masked sections, so the person in the image is still themselves with effectively photoshopped nudity, which would still qualify as child pornography. That is an interesting point that you make though

          • DogMuffins@discuss.tchncs.de

            Of course they exist. If the AI generated image “depicts” a person, a victim in this case, that person “by definition” exists.

            Your argument evaporates when you consider that all digital images are interpreted and encoded by complex mathematical algorithms. All digital images are “fake” by that definition and therefore the people depicted do not exist. Try explaining that to your 9 year old daughter.

      • Fal@yiffit.net

        Won’t somebody think of the make believe computer generated cartoon children?!

        • legios@aussie.zone

          Australia too. Hentai showing underage people is illegal here. From my understanding it’s all a little grey depending on the state and whether the laws are enforced, but if it’s about victimisation the law will be pretty clear.

          • Fal@yiffit.net

            Absolutely absurd. Criminalizing drawings is the stupidest thing in the world.

            This case should already be illegal under harassment or similar laws. There’s no reason to make drawings illegal.

            • Metz@lemmy.world

              In Germany, even a written story about it is illegal. It is considered “textual CSAM” then.

  • Treczoks@lemm.ee

    The problem is how to actually prevent this. What could one do? Make AI systems illegal? Make graphics tools illegal? Make the Internet illegal? Make computers illegal?

  • ZombiFrancis@sh.itjust.works

    In previous generations the kid making fake porn of their classmates was not a well liked kid. Is that reversed now? On the basis of quality of tech?

    • cannache@slrpnk.net

      Oooh that’s bad. Yeah I would never do that but I did hear about the idea floating around back in the day, though I don’t think the tech is there yet. It’s just generally not cool

  • gandalf_der_12te@feddit.de

    Honest opinion:

    We should normalize nudity.

    That’s the only healthy relationship that we can have with our bodies in the long term.

    • SuddenDownpour@sh.itjust.works

      There’s a pretty big fucking difference between normalizing nudity and people putting the faces of 14 year olds in porn video through deepfakes.

    • Basil@lemmings.world

      This isn’t even the problem going on, though? Sure, normalize nudity, whatever, that doesn’t fix deep faked porn of literal children.

      • Thief_of_Crows@sh.itjust.works

        Why is that a problem though? You’re allowed to draw a picture of a specific child naked, so why is it suddenly a crime if you use a computer to do it really well?

    • GiddyGap@lemm.ee

      Having spent many years in both the US and multiple European countries, I can confidently say that the US has the weirdest, most unnatural, and most unhealthy relationship with nudity.

    • ParsnipWitch@feddit.de

      For this to happen people would probably need to stop judging people on their bodies. I am pretty sure there is a connection there. With how extremely superficial media and many relationships are, and with how we value women in particular, this needs a lot of change in people and society.

      I also think it would be a good thing, but we still have to do something about it until we reach that point.

  • virock@lemmy.world

    I studied Computer Science so I know that the only way to teach an AI agent to stop drawing naked girls is to… give it pictures of naked girls so it can learn what not to draw :(

  • leaky_shower_thought@feddit.nl

    reading this, I don’t really know what is supposed to be protected here, or what protections could even apply in the first place.

    the closest reasonable one is the girl’s “identity”, so it could be fraud. but it’s not used to fool people. more likely, those getting the pics already know this is AI generated.

    so it might be defamation?

    the image generation tech is already easily accessible, so the girl’s picture being easily accessible might be the weakest link?

      • DarkGamer@kbin.social

        Thanks for the valuable contribution to this discussion! It does appear this is a question of identity and personality rights, regarding how one wants to be portrayed.

        Reading that article though, it seems like that only applies to commercial purposes. If one is making deep fakes for their own non-commercial private use, it doesn’t appear personality rights apply.

      • Fal@yiffit.net

        Pretty sure it’s illegal to create sexual images of children, photos or not.

        Maybe in your dystopian countries where drawings are illegal. Absolutely absurd that you’re promoting that as a good thing.

  • Aceticon@lemmy.world

    There might be an upside to all this, though maybe not for these girls: with enough of this, people will eventually just stop believing that any “leaked” nude pictures are real. That will be a great thing for people whose real nude pictures were leaked, since those are pretty hard to stop spreading once they’re on the Internet, because everyone else will just presume they’re deepfakes.

    Mind you, it would be a lot better if people in general culturally evolved beyond being preachy monkeys who pass judgment on others because they’ve been photographed in their birthday suit, but that’s clearly asking too much, so I guess people simply assuming all such things are deepfakes until proven otherwise is at least better than the status quo.

  • Snot Flickerman@lemmy.blahaj.zone

    Maybe it is just me, but it’s why I think this is a bigger issue than just Hollywood.

    The rights to famous people’s “images” are bought and sold all the time.

    I would argue that the entire concept should be made illegal. Others can only use your image with your explicit permission and your image cannot be “owned” by anyone but yourself.

    The fact that making a law like this isn’t a priority means this will get worse because we already have a society and laws that don’t respect our rights to control of our own image.

    A law like this would also remove all the questions about youth and sex and instead make it a case of misuse of someone else’s image. In this case it could even be considered defamation for altering the image to make it seem like it was real. They defamed her by making it seem like she took nude photos of herself to spread around.

    • Dark Arc@social.packetloss.gg

      There are genuine reasons not to give people sole authority over their image though. “Oh that’s a picture of me genuinely doing something bad, you can’t publish that!”

      Like, we still need to be able to have a public conversation about (especially) public figures and their actions as photographed.

      • Zachariah@lemmy.world

        Seems like a typical copyright issue. The copyright owner has a monopoly on the intellectual property, but there are fair use exceptions for genuine reasons (journalism, satire, academic use, backup, etc.).

        • lolcatnip@reddthat.com

          Reminder that the stated reason for copyrights to exist at all, per the US Constitution, is “To promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries.”

          Anything that occurs naturally falls outside the original rationale. We’ve experienced a huge expansion of the concept of intellectual property since then, but as far as I can tell there has never been a consensus on what purpose intellectual property rights are supposed to serve beyond the original conception.

      • Snot Flickerman@lemmy.blahaj.zone

        Yeah I’m not stipulating a law where you can’t be held accountable for actions. Any actions you take as an individual are things you do that impact your image, of which you are in control. People using photographic evidence to prove you have done them is not a misuse of your image.

        Making fake images whole cloth is.

        The question of whether this technology will make such evidence untrustworthy is another conversation that sadly I don’t have enough time for right this moment.

    • CleoTheWizard@lemmy.world

      The tools used to make these images can largely be ignored, as can the vast majority of what AI creates of people. Fake nudes and photos have been possible for a long time now. The biggest way we deal with them is to go after large distributors of that content.

      When it comes to younger people, the penalty should be pretty heavy for doing this. But it’s the same as distributing real images of people. Photos that you don’t own. I don’t see how this is any different or how we treat it any differently than that.

      I agree with your defamation point. People in general and even young people should be able to go after bullies or these image distributors for damages.

      I think this is a giant mess that is going to upturn a lot of what we think about society but the answer isn’t to ban the tools or to make it illegal to use the tools however you want. The solution is the same as the ones we’ve created, just with more sensitivity.

        • Chakravanti@sh.itjust.works

          According to what logic? Like I’m ever going to trust some lying asshole to hide his instructions for fucking anything that’s MINE. News Alert: “Your” computer ain’t yours.

          • Olgratin_Magmatoe@startrek.website

            People have been trying to circumvent chatGPT’s filters, they’ll do the exact same with open source AI. But it’ll be worse because it’s open source, so any built in feature to prevent abuse could just get removed then recompiled by whoever.

            And that’s all even assuming there ever ends up being open source AI.

            • Chakravanti@sh.itjust.works

              Your logic is bass ackwards. Having the source open to the public means the shit gets fixed faster. Closed source just doesn’t get fixed 99% of the time, because there’s only one motherfucker who can do the fixing and he usually just doesn’t do it.

              • Olgratin_Magmatoe@startrek.website

                You can’t fix it with open source. All it takes is one guy making a fork and removing the safeguards because they believe in free speech or something. You can’t have safeguards against misuse of a tool in an open source environment.

                I agree that closed source AI is bad. But open source doesn’t magically solve the problem.

                • Chakravanti@sh.itjust.works

                  Forks are productive. You’re just wrong about it. I’ll take FOSS over closed source. I’ll trust the masses reviewing FOSS over the one asshole doing, or rather not doing, exactly that.

  • Gork@lemm.ee

    President Joe Biden signed an executive order in October that, among other things, called for barring the use of generative AI to produce child sexual abuse material or non-consensual “intimate imagery of real individuals.” The order also directs the federal government to issue guidance to label and watermark AI-generated content to help differentiate between authentic and material made by software.

    Step in the right direction, I guess.

    How is the government going to be able to differentiate authentic images/videos from AI generated ones? Some of the AI images are getting super realistic, to the point where it’s difficult for human eyes to tell the difference.
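
    Purely as an illustration (not the actual mechanism the order calls for, and not a real standard like C2PA), here’s a toy sketch of what a machine-readable “this was AI generated” label could look like: hide a short marker in the least significant bits of the pixels and check for it later. It also makes the weakness obvious, since re-encoding, cropping, or screenshotting the image strips the tag.

        # Toy illustration only (not a real watermarking standard): stash a short
        # marker in the least significant bit of the first N pixel values, then
        # read it back. Uses numpy; "AI-GEN" is an arbitrary made-up label.
        import numpy as np

        MARKER = "AI-GEN"

        def embed_marker(pixels, marker=MARKER):
            # One marker bit per pixel value, written into the LSB.
            bits = np.array([int(b) for byte in marker.encode() for b in f"{byte:08b}"],
                            dtype=np.uint8)
            flat = pixels.flatten()                      # flatten() returns a copy
            flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
            return flat.reshape(pixels.shape)

        def read_marker(pixels, length=len(MARKER)):
            # Collect the LSBs back into bytes and decode them.
            bits = pixels.flatten()[:length * 8] & 1
            data = bytes(int("".join(map(str, bits[i:i + 8])), 2)
                         for i in range(0, length * 8, 8))
            return data.decode(errors="replace")

        img = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)  # stand-in image
        tagged = embed_marker(img)
        print(read_marker(tagged))   # -> "AI-GEN"
        print(read_marker(img))      # -> gibberish: an untagged image has no marker

    Real provenance proposals use signed metadata and far more robust watermarks, but they still depend on every generator cooperating and the marker surviving, which a determined poster can simply avoid.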

    • CommanderCloon@lemmy.ml

      I wouldn’t call this a step in the right direction. A call for a step, yeah, but it’s not actually a step until something is actually done.

  • calypsopub@lemmy.world

    So as a grown woman, I’m not getting why teenage girls should give any of this oxygen. Some idiot takes my head and pastes it on porn. So what? That’s more embarrassing for HIM than for me. How pathetic that these incels are so unable to have a relationship with an actual girl. Whatever, dudes. Any boy who does this should be laughed off campus. Girls need to take their power and use it collectively to shame and humiliate these guys.

    I do think anyone who spreads these images should be prosecuted as a child pornographer and listed as a sex offender. Make an example out of a few and the rest won’t dare to share it outside their sick incels club.

    • WoahWoah@lemmy.world

      That’s fine and well. Except they are videos, and it is very difficult to prove they aren’t you. And the internet is forever.

      This isn’t like high school when you went to high school.

      Agreed on your last paragraph.

      • Margot Robbie@lemmy.world

        Then nude leak scandals will quickly become a thing of the past, because now every nude video or picture can be assumed to be AI generated and treated as fake until proven otherwise.

        That’s the silver lining of this entire ordeal.

        Again, this is a content distribution problem more than an AI problem; the liability should be on those who willingly host this deepfake content rather than on the AI image generators.

        • finestnothing@lemmy.world

          That would be great in a perfect world, but unfortunately public perception is significantly more important than facts when it comes to stuff like this. People accused of heinous crimes can and do lose friends, their jobs, and have their life ruined even if they prove that they are completely innocent

          Plus, something I’ve already seen happen is someone saying a nude is fake and then being told they have to prove it’s fake to get people to believe them… which is very hard without sharing an actual nude that has something unique about their body.

          • derpgon@programming.dev

            The rest of the human body has more unique traits than the nude parts: freckles, birthmarks, scars, tattoos. Those are traits that are not possible to replicate unless the person specifically knows about them.

            Now that I think about it, we all probably need a tattoo. That should clear anyone instantly.

            • WoahWoah@lemmy.world

              Yes I’m sure a hiring manager is going to involve themselves that deeply in the pornographic video your face pops up in.

              HR probably wouldn’t even allow a conversation about it. That person just never gets called back.

              And then the worse part is the jobs that DO hire you. Now you have to question why they are hiring you. Did they not see the fake porn video? Or did they see it?

              The entire thing is damaging and ugly.

  • PhantomAudio@lemm.ee

    Gee, here’s a novel idea: don’t let children have access to social media. That would solve a lot of other problems, too.