An Asian MIT student asked AI to turn an image of her into a professional headshot. It made her white with lighter skin and blue eyes.

Rona Wang, a 24-year-old MIT student, was experimenting with the AI image creator Playground AI to create a professional LinkedIn photo.

  • @21Cabbage
    9
    1 year ago

    Honestly, news stories about dumb ideas not working out don’t really bother me much. Congrats, the plagiarism machine tried to make you look like you fit into a world that, to the surprise of nobody but idealists, still has a shitload of racial preferences.

    • @[email protected]
      10
      edit-2
      1 year ago

      Honestly, it’s just not being used correctly. I actually believe this is user error.

      These AI image creators rely on the base models they were trained on, and those models were more than likely fed wayyyyy more images of Caucasians than anyone else. You can add weights to what you would rather see in your prompts, and while I’m not experienced with the exact program she used, the basics should be the same.

      You usually have two sections: the main prompt (positive additions) and a secondary prompt for negatives, things you don’t want to see. An example prompt could be “perfect headshot for LinkedIn using supplied image, ((Asian:1.2))”, with a negative prompt of “((Caucasian)), blue eyes, blonde, bad eyes, bad face”, etc.

      If she didn’t have a secondary prompt for negatives I could see this being a bit more difficult, but even then there are better systems to use. If she didn’t like the results from the one she used, instead of jumping to “AI racism!” she could have looked up what other systems exist. Hell, with the model I use with Automatic1111 I have to put Asian in my negatives because it defaults to that often. A rough sketch of the same idea outside a web UI is below.
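
      To make that concrete, here’s a minimal sketch of an img2img pass with a positive and a negative prompt, assuming the Hugging Face diffusers library and the runwayml/stable-diffusion-v1-5 checkpoint. Playground AI’s actual backend, model, and settings aren’t public, and note the ((word:1.2)) weighting syntax above is an Automatic1111-style UI convention that plain diffusers won’t parse.

      ```python
      # Minimal sketch only: model choice, prompts, and strength value are my
      # assumptions, not whatever Playground AI actually runs under the hood.
      # Assumes a CUDA GPU is available.
      import torch
      from PIL import Image
      from diffusers import StableDiffusionImg2ImgPipeline

      pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
      ).to("cuda")

      # The uploaded photo is the starting point; img2img perturbs it
      # rather than generating a face from scratch.
      init_image = Image.open("selfie.jpg").convert("RGB").resize((512, 512))

      result = pipe(
          prompt="professional LinkedIn headshot photo of an Asian woman, "
                 "studio lighting, business attire",
          negative_prompt="Caucasian, blue eyes, blonde hair, bad eyes, bad face",
          image=init_image,
          strength=0.4,          # low strength keeps the output close to the original photo
          guidance_scale=7.5,
          num_inference_steps=30,
      )
      result.images[0].save("headshot.png")
      ```

      The negative prompt and a low denoising strength do most of the work here; a tool that exposes neither will just drift toward whatever the base model thinks a “professional headshot” looks like.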

      Edit: figures I wrote all this, then scrolled down and noticed all the comments saying the same thing lol. At least we’re on the same page.