• BougieBirdie@lemmy.blahaj.zone · 1 month ago

    Y’know, I think the only silver lining to deepfakes is that if my nudes get leaked I could just brush it off and say they’re fake.

    Of course, that’s easy to say until you’re slapped in the face with your own bits.

    • dan1101@lemm.ee · 1 month ago

      Not enough ad impressions when you give the user what they’re looking for too quickly. ChatGPT will probably go that way eventually when the investor money runs out.

  • Sundial@lemm.ee · 1 month ago

    This is a good example of what happens when technology moves faster than our governments can legislate. We don’t live in a world where companies genuinely care about what effect their products will have on people. We rely on the government to protect us. The problem is, even when the government does take action, it’s usually too late.

  • CanadaPlus@lemmy.sdf.org · 1 month ago

    I can only imagine how fucked gender relations would have to already be for this to be a normal thing to consider doing publicly.

  • peopleproblems@lemmy.world · 1 month ago

    You know what’s kind of odd? The deepfake tools people are using are usually online services. As far as I know, there aren’t any public models being used for deepfakes. I’m sure you could make them with the right hardware, but that’s a lot of heavy lifting.

    There are plenty of generative AI models for creating porn. To me, there’s a red flag people are ignoring by using the deepfake apps and websites: everything gets tied to the uploader, whether the service claims it’s private or not.

  • irotsoma@lemmy.world · 1 month ago

    It’s pretty fucked up. But maybe the solution is to make nudity more acceptable in society. It’s not a coincidence that South Korea is such a big target for this, because sex and nudity are so taboo there. The US isn’t much better, but enough so that it doesn’t feel as invasive to as much of the population, and thus the images have less value to people using them to shame and control women. Of course, that kind of change takes generations and would require a lot of women to do unsafe things to effect it. So it’s not something that will happen quickly.

    • I think it’s still fucked up even in a less taboo world.
      Receiving an unsolicited deepfake of yourself getting dicked, instead of just an unsolicited dick pic, is all kinds of fucked up on top of the nudity itself.

  • ChicoSuave@lemmy.world · 1 month ago

    Fight back with porn. Show men with small wieners sobbing uncontrollably, you know, showing non-masculine emotions and traits. Let them deal with their own body problems.

    • JoYo@lemmy.ml · 1 month ago

      Honestly, with the prevalence of massive unrealistic dongs, I wouldn’t mind seeing smaller ones in generated porn.

      I’m horrified by all of this; I’m just trying to find the silver lining.

  • psychothumbs@lemmy.world · 1 month ago

    I’m not sure I understand how deepfake porn is supposed to be ruining lives here. From the article, it seems the issue isn’t any concern that the images would be mistaken for real, but rather people having a horrified reaction to seeing that sort of depiction of themselves. Mostly, the deepfake aspect seems like a trivial distraction from the real issue on display here: gangs of men targeting random women for online harassment. If there were no such thing as deepfakes, other sexually explicit or disturbing images could sub in easily enough.

    Maybe the new deepfake ban will be useful as a way of going after these harassment gangs that previously didn’t face legal consequences? But it’s an inexact tool for that job, given that there are presumably lots of deepfake images out there not used for harassment, and that it’s easy enough for harassers to switch away from deepfakes if using them becomes a major legal vulnerability.