Madi Hime is taking a deep drag on a blue vape in the video, her eyes shut, her face flushed with pleasure. The 16-year-old exhales with her head thrown back, collapsing into laughter that causes smoke to billow out of her mouth. The clip is grainy and shaky – as if shot in low light by someone who had zoomed in on Madi’s face – but it was damning. Madi was a cheerleader with the Victory Vipers, a highly competitive “all-star” squad based in Doylestown, Pennsylvania. The Vipers had a strict code of conduct; being caught partying and vaping could have got her thrown off the team. And in July 2020, an anonymous person sent the incriminating video directly to Madi’s coaches.

Eight months later, that footage was the subject of a police news conference. “The police reviewed the video and other photographic images and found them to be what we now know to be called deepfakes,” district attorney Matt Weintraub told the assembled journalists at the Bucks County courthouse on 15 March 2021. Someone was deploying cutting-edge technology to tarnish a teenage cheerleader’s reputation.

But a little over a year later, when Raffaela Spone – the mother accused of creating the fakes – finally appeared in court to face the charges against her, she was told the cyberharassment element of the case had been dropped. The police were no longer alleging that she had digitally manipulated anything: the videos and images were real. Someone had been crying deepfake. A story that generated thousands of headlines around the world was based on teenage lies after all. When the truth finally came out, it was barely reported.

  • theparadox@lemmy.world · 6 months ago

    Honestly, this is what I consider to be the clear danger of generative “AI”. We’ve basically gone several steps backward, toward a world where we can’t prove something happened – as if recording technology didn’t exist – because recordings can be faked well enough, and easily enough, to cast doubt on any of them.

    Sure, experts can analyze it and deem it legit or not legit, but for the average person it’s becoming harder and harder to tell. Word spreads fast, and it’s been demonstrated that lies spread faster than their corrections by orders of magnitude. All it takes is for something to “go viral”, or for someone with authority to lazily confirm or deny that it’s real or fake, and the public at large loses touch with reality.

    This, plus the insane misinformation campaigns and those who cry “fake news” whenever the news contains information they don’t like… I feel like truth and reality are becoming unfashionable.

    • DarkThoughts@fedia.io · 6 months ago

      Sure, experts can analyze it and deem it legit or not legit, but for the average person it’s becoming harder and harder to tell.

      Frankly, I think that’s exactly what will also be its downfall in such cases. Past a certain level of exposure to this material, people will stop caring because of how common it will be. The larger issue is not how to tell what’s fake, but how to tell what’s true. Large players, such as governments or corporations, can and will abuse this tech for whatever narrative they need.

    • EatATaco@lemm.ee · 6 months ago

      Confirmation bias already leads people to believe whatever supports their existing views. Now we can give anyone exactly the material they need to confirm those beliefs.