• @[email protected]
      link
      fedilink
      English
      23 • 6 months ago

      Facebook couldn’t build a model that has 100% accuracy on if something is a dog or a cat, let alone if a woman is trans.

      • @[email protected]
        link
        fedilink
        English
        7 • 6 months ago

        Especially since you often can’t tell at all from just a picture. There are cis women who look more like a man than some trans women do.

    • Andy
      link
      fedilink
      English
      14 • 6 months ago

      Why do you guarantee that? It seems obviously wrong, on a technical level.

      The point I’m making is that even if we take it as a given that a shrewd enough AI could correctly distinguish sex at birth – which I think is obviously impossible, given the appearances of many cis women and the nature of statistical prediction – you’d still need a training data set.

      If the dataset has any erroneous input, that corrupts the model’s ability, and the whole point of this exercise is finding passing trans women. Why would anyone expect that a training set of hundreds of thousands of supposed cis women wouldn’t have a few trans women in it?
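      The statistical objection above can be sketched with toy numbers. Everything below is an illustrative assumption (the prevalence, accuracy, and noise figures are made up for the sake of the argument, not real statistics): even a very accurate classifier mostly produces false positives when the target class is rare, and mislabeled training examples put a hard ceiling on what it can learn.

      ```python
      # Toy illustration of the base-rate problem and label noise.
      # All numbers are assumptions chosen for illustration only.

      prevalence = 0.005    # assumed fraction of trans women in the population
      sensitivity = 0.99    # assumed true-positive rate of the classifier
      specificity = 0.99    # assumed true-negative rate of the classifier

      # Bayes' rule: P(trans | flagged) -- precision of the positive calls.
      tp = sensitivity * prevalence
      fp = (1 - specificity) * (1 - prevalence)
      precision = tp / (tp + fp)
      print(f"precision of a '99% accurate' classifier: {precision:.1%}")  # ~33%

      # Label-noise ceiling: if a fraction eps of the "cis" training labels
      # actually belong to passing trans women, a model that fits those labels
      # perfectly is still trained to call exactly those people cis -- the very
      # group the exercise is supposed to find.
      eps = 0.005
      print(f"fraction of 'cis' training labels the model learns wrong: {eps:.1%}")
      ```

      In other words, a dirty training set doesn’t just lose a little accuracy at the margins; the errors are concentrated precisely on the passing trans women the model was built to detect.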

      • @[email protected]
        link
        fedilink
        English
        3 • 6 months ago

        Because Facebook’s data practices, and how much users volunteered on there, mean that for some percentage of trans users Facebook knows that they’re trans. You also have some percentage of pregnancy photos uploaded: if someone identifies as a woman on Facebook and has uploaded photos with a baby bump, she’s cis (or at least a pre-hatching trans person). And at one point in time, a lot of people just volunteered that info to Facebook.

        • Andy
          link
          fedilink
          English
          2 • 6 months ago

          Yeah, but the training set is nowhere near clean. That’s my point. “Close” is nowhere near good enough in this context.