George Carlin Estate Files Lawsuit Against Group Behind AI-Generated Stand-Up Special: ‘A Casual Theft of a Great American Artist’s Work’::George Carlin’s estate has filed a lawsuit against the creators behind an AI-generated comedy special featuring a recreation of the comedian’s voice.

  • cubism_pitta@lemmy.world (+115/−9) · 10 months ago

    If it's wrong to use AI to put genitals in someone's mouth, it should probably be wrong to use AI to put words in their mouth as well.

    • ClamDrinker@lemmy.world (+9/−3) · 10 months ago

      I agree, and I get it's a funny way to put it, but in this case they started the video with a massive disclaimer that they were not Carlin and that it was AI. So it's hard to argue they were putting words in his mouth. If anything, disclosing the AI involvement up front sets a praiseworthy standard, considering the hate mob such a disclosure attracts.

      • CleoTheWizard@lemmy.world (+13/−3) · 10 months ago

        The internet doesn't care, though. If I make fake pictures of people using their likeness and add a disclaimer, people will just repost them without the disclaimer, and the damage will still be done. Whether we can or should stop them is another story.

        • ClamDrinker@lemmy.world (+9/−2) · edited · 10 months ago

          Completely true. But we cannot reasonably push the responsibility of the entire internet onto someone when they did their due diligence.

          Like, some people post CoD footage to YouTube because it looks cool, and someone else, either by mistake or maliciously, takes that footage and recontextualizes it as combat footage from an active warzone to shock people. Then others repost it with a fake explanatory caption on top, furthering the misinformation cycle. Do we now blame the people who shared their CoD footage for what others did with it? Misinformation and propaganda are something society must work together to combat.

          If it really mattered, people would be out there warning others that the pictures being posted are fake. In fact, that's what happened even before AI: after a tragedy, people would post images claiming to show what happened, only for those images to later be confirmed as being from some other tragedy. Or how some video games get fake "leaks" because someone rebranded fanmade content as a leak.

          Eventually it becomes common knowledge, or easy to prove it's fake. Take this picture for instance:

          [image: an original photo (top) and a doctored version of it (bottom)]
          It's been well documented that the bottom image is fake, and as such anyone can now find out what was covered up. It's up to society to speak up when the damage is too great.