The San Francisco-based company formerly known as Twitter announced on Friday that it will build a “trust and safety center” in Austin, Texas.

  • Flying Squid@lemmy.world · 10 months ago

    I didn’t hear anything about this Taylor Swift thing but holy fuck is it funny.

    First, AI porn of Tay Tay was shared on Twitter: https://www.businessinsider.com/taylor-swift-fans-furious-graphic-fake-ai-images-on-x-2024-1

    Then today, Elon banned searching for her name entirely because of it: https://www.businessinsider.com/taylor-swift-fake-ai-images-searches-blocked-elon-musk-x-2024-1

    I mean, it obviously sucks for Taylor Swift that people are creating AI porn of her, but the clusterfuck it created is amazing.

    • athos77@kbin.social · 10 months ago

      “Then today, Elon banned searching for her name entirely because of it”

      He says it’s because of the deepfakes. But the right was absolutely furious when Taylor simply told people to register to vote, and neo-fascist Musk is absolutely going to take the opportunity to diminish her reach.

    • MagicShel@programming.dev · 10 months ago

      The AI porn (at least in the collection of Twitter AI fakes I found) wasn’t even very good. I’ve seen far better Photoshop fakes. So I’m not saying it’s cool, but I don’t think AI has moved the bar for fake celeb porn.

      I could generate 30 fake images of TS in an hour on the computer in my basement, and way more if I paid an online service; I don’t need someone else generating them. If I wanted to see fakes of her naked, I could find 100 pics in a Google search, all of which at least have the correct number of fingers and legs. In reality, though, I don’t find fakes of any variety hot at all. I’d rather see a normal person willingly pose nude than a celebrity fake.

      I couldn’t give a shit about the nudity or the AI angle. But when you use someone’s body, fake or real, to harass and humiliate them, or share something that is then used to harass them, you should be held financially and/or criminally liable for that behavior as circumstances dictate.

      • Flying Squid@lemmy.world · 10 months ago

        I don’t disagree, I just think it’s really funny that Musk’s response to this whole incident was, “FINE! NO ONE GETS ANY TAYLOR SWIFT NOW!”

        • originalucifer@moist.catsweat.com · 10 months ago

          exactly! that’s how basic his control of that system now is. it’s a giant piece of garbage run by a few hamsters.

          i’m envisioning him dancing in front of a giant bulletin board with pepe silvia scratched out and tay tay in its place

            • baldingpudenda@lemmy.world · 10 months ago

            It’s like 2 ppl with work visas, chain smoking, on uppers, who haven’t slept in 3 days, just adding an if statement for any search with Taylor Swift to return none. They then tell Musk to search to prove they fixed it, ’cause they still gotta MacGyver the house of cards that is X’s code base.
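
            For illustration only: the quick-and-dirty keyword block being joked about here might look something like the hypothetical Python sketch below. The function names, the blocklist, and the placeholder backend are all invented for the sake of the joke’s premise and have nothing to do with X’s actual code.

            ```python
            # Hypothetical sketch of the "just add an if statement" fix joked about above.
            # Every name and behavior here is invented; nothing reflects X's real search code.

            BLOCKED_TERMS = {"taylor swift", "taylorswift"}


            def run_real_search(query: str) -> list[str]:
                # Stand-in for the actual search backend, so the sketch runs on its own.
                return [f"result for {query}"]


            def search(query: str) -> list[str] | None:
                """Return search results, or None when the query trips the crude blocklist."""
                normalized = query.lower()
                if any(term in normalized for term in BLOCKED_TERMS):
                    # The joke's "return none": any matching search silently comes back empty.
                    return None
                return run_real_search(query)


            if __name__ == "__main__":
                print(search("Taylor Swift new album"))  # None (blocked)
                print(search("football scores"))         # ['result for football scores']
            ```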

    • CoffeeJunkie@lemmy.world · 10 months ago

      A couple of the photos were pretty hot. Flattering. I would say one-third to half of them didn’t quite get her face right for some reason. All the Muppet picture ones were crystal clear, perfect details, and kinda funny just on account of being completely absurd. 😂