• Andy@slrpnk.net · 5 months ago

      Why do you guarantee that? It seems obviously wrong, on a technical level.

The point I’m making is that even if we take it as a given that a sufficiently shrewd AI could correctly distinguish sex at birth (which I think is obviously impossible, given the appearances of many cis women and the nature of statistical prediction), you’d still need a training data set.

If the dataset contains any mislabeled examples, that corrupts the model’s ability, and the whole point of this exercise is trying to find passing trans women. Why would anyone expect that a training set of hundreds of thousands of supposedly cis women wouldn’t include a few trans women?
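The label-noise point above can be sketched with a toy example (this is hypothetical illustration, not any real system): a nearest-centroid classifier in plain Python, where a handful of mislabeled training examples shift the learned decision boundary and borderline cases start getting misclassified.

```python
# Toy sketch of label noise corrupting a classifier (assumed synthetic
# 1-D "features"; nothing here reflects an actual Facebook model).

def centroid(xs):
    """Mean of a list of 1-D feature values."""
    return sum(xs) / len(xs)

# Clean training data: two well-separated classes.
class_a = [1.0, 1.2, 0.9, 1.1]   # examples labeled A
class_b = [3.0, 3.1, 2.9, 3.2]   # examples labeled B

# Nearest-centroid decision boundary: midpoint between the centroids.
boundary_clean = (centroid(class_a) + centroid(class_b)) / 2

# Corrupted training data: two B examples were mislabeled as A.
class_a_noisy = class_a + [3.0, 3.1]
class_b_noisy = [2.9, 3.2]

boundary_noisy = (centroid(class_a_noisy) + centroid(class_b_noisy)) / 2

# The boundary drifts toward class B, so borderline B points
# (e.g. a feature value of 2.2) are now classified as A.
print(round(boundary_clean, 2))   # 2.05
print(round(boundary_noisy, 2))  # 2.38
```

Even a small fraction of mislabeled examples moves the boundary, which is exactly the problem when the labels you need (here, "definitely cis") are the thing you can’t verify.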

  • AlligatorBlizzard@sh.itjust.works · 5 months ago

Because Facebook’s data practices, and how much users volunteered on there, mean that for some percentage of trans users Facebook knows that they’re trans. You also have some percentage of pregnancy photos: if someone identifies as a woman on Facebook and has uploaded photos with a baby bump, she’s cis (or at least a pre-hatching trans person). And at one point in time, a lot of people just volunteered that info to Facebook.

    • BURN@lemmy.world · 5 months ago

Facebook couldn’t build a model with 100% accuracy on whether something is a dog or a cat, let alone whether a woman is trans.

      • Fisch@discuss.tchncs.de · 5 months ago

Especially since you often can’t tell at all from just a picture. There are cis women who look more like a man than some trans women do.