Emotion recognition systems are finding growing use, from monitoring customer responses to ads to scanning for ‘distressed’ women in danger.

  • @[email protected] · 3 points · 1 year ago

    Ya, I guess I can see some uses for it, but nothing that makes the risks of its existence worth it.

    It seems like every tool/tech will be used by good people to do good things and by bad people to do bad things. Some tools, like a spoon, are handy for getting good things done but not very useful to bad people for doing bad things. Other tools, like mood recognition, might be quite handy for bad people looking to control others, but only moderately useful to good people.

    I think we should be wary of letting tools in that second group exist. Just because something can be done doesn’t mean it should be done, or that it can be called “progress”.

    • SokathHisEyesOpen · 3 points · edited · 1 year ago

      It has already existed for a decade or so. I’m surprised it hasn’t made headlines before. I saw a working demo of it at the Microsoft Visitor Center about 8 years ago. In addition to estimating your mood, it also assigns you a persistent ID and estimates your height, weight, eye color, hair color, ethnicity, and age. It is scarily accurate at all of those things. That ID can be shared across all linked systems at any number of locations. I completely agree with you that there are a lot of concerning, if not downright terrifying, implications of this system. It’s a privacy nightmare.
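To make concrete what a persistent, cross-site profile like the one described above would hold, here is a minimal, purely hypothetical sketch in Python. The VisitorProfile class and its field names are assumptions for illustration only; they mirror the attributes mentioned in the comment and do not reflect Microsoft’s or any other vendor’s actual schema or API. The privacy-critical piece is the persistent ID combined with the sightings log: once the same ID is shared across linked locations, one-off attribute guesses become a tracking dossier.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional, Tuple
from uuid import uuid4


@dataclass
class VisitorProfile:
    """Hypothetical per-visitor record; field names are illustrative only."""
    persistent_id: str = field(default_factory=lambda: uuid4().hex)
    estimated_mood: Optional[str] = None       # e.g. "neutral", "distressed"
    estimated_age: Optional[int] = None
    estimated_height_cm: Optional[float] = None
    estimated_weight_kg: Optional[float] = None
    eye_color: Optional[str] = None
    hair_color: Optional[str] = None
    ethnicity: Optional[str] = None
    sightings: List[Tuple[str, datetime]] = field(default_factory=list)

    def record_sighting(self, location: str) -> None:
        # Re-identification across linked cameras/locations is what turns
        # attribute estimates into a persistent tracking record.
        self.sightings.append((location, datetime.now()))


# Example: the same profile re-identified at two linked sites.
profile = VisitorProfile(estimated_mood="distressed", estimated_age=34)
profile.record_sighting("lobby-camera-01")
profile.record_sighting("retail-store-17")
print(profile.persistent_id, len(profile.sightings))
```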