• @[email protected]
    20 points · edited · 6 months ago

    Even if they hired an actress with a similar voice to train the AI to sound like Johansson, celebrity impersonators have been doing that for (I’d guess) longer than recorded voice media has even existed. I’m having a hard time seeing why one is fine but the other isn’t.

    Edit: corrected bad spelling of her name.

    • @[email protected]
      23 points · 6 months ago

      “I’m having a hard time seeing why one is fine but the other isn’t.”

      I think the law says that neither is fine, in the context here. The law allows celebrity impersonators to engage in parody and commentary, but not to use their impersonation skills to endorse products, commit fraud, or pass themselves off as the person being impersonated.

      • @[email protected]
        9 points · 6 months ago

        But this is just using a voice. It might even be the actress’s natural voice. I don’t think there’s fraud, because it wasn’t presented as Scarlett’s voice. If it also wasn’t presented as not being her voice, then maybe the other two would apply, though is allowing a service to use your voice the same as endorsement? And is it enough to sound like someone to be considered impersonating them?

        This situation lands in a grey area where I can’t endorse or condemn it. I mean, it would have been smarter to just use a different voice: find a celebrity who would sign on, or just use an unrecognisable voice. Ethical or not, legal or not, it was stupid.

        • @[email protected]
          11 points · 6 months ago

          It was explicitly represented as her voice when he tweeted “Her” in relation to the product, referencing the movie in which she voiced an AI assistant. It’s not a legal grey area in the US. He sank his own ship here.

        • @[email protected]
          9 points · 6 months ago

          I’m mostly going off this article and a few others I’ve read. It notes:

          Celebrities have previously won cases over similar-sounding voices in commercials. In 1988, Bette Midler sued Ford for hiring one of her backup singers for an ad and instructing the singer to “sound as much as possible like the Bette Midler record.” Midler had refused to be in the commercial. That same year, Tom Waits sued Frito-Lay for voice misappropriation after the company’s ad agency got someone to imitate Waits for a parody of his song in a Doritos commercial. Both cases, filed in California courts, were decided in the celebrities’ favor. The wins by Midler and Waits “have clear implications for AI voice clones,” says Christian Mammen, a partner at Womble Bond Dickinson who specializes in intellectual property law.

          There’s some more in there:

          To win in these cases, celebrities generally have to prove that their voice or other identifying features are unregistered trademarks and that, by imitating them, consumers could connect them to the product being sold, even if they’re not involved. That means identifying what is “distinctive” about her voice — something that may be easier for a celebrity who played an AI assistant in an Oscar-winning movie.

          Taken together with the fact that the CEO made a direct reference to the movie in which she voiced an AI assistant when announcing the product, I think that’s enough that a normal person would “connect them to the product being sold.”

        • @[email protected]
          8 points · 6 months ago

          I read that Scarlett’s family & friends couldn’t tell it apart from her actual voice.

          I’d say that “Open AI”, or whatever they’re called, trained it specifically and only on her voice.

          The seems-narcissistic-Machiavellian-sociopath CEO what’s-his-face tried to get her to agree to this,

          she wouldn’t agree,

          he tweeted “her” when releasing the update (after Scarlett’s movie),

          she lawyered up,

          he backed down…


          I’d say it’s a clear case of identity-theft-for-profit of a celebrity by a consistently narcissistic-Machiavellian sociopath who’s kinda leaving lots of corpses of “integrity” all over the place.

          There’s law that protects celebrities from use of their likeness, and rightly so:

          it’s the “coin” their careers are made of, right?

          _ /\ _

          • @[email protected]
            1 point · 6 months ago

            Frankly, I’m amazed they tried to pull this shit when it was so obvious and Johansson clearly wasn’t on board.

    • @[email protected]
      8 points · 6 months ago

      Legally maybe it’s fine, I’m not sure. But they tried to officially license her voice and get her permission and involvement, she declined, they asked again, she declined again, and two days later they released it with (possibly) her voice anyway. At best that shows them to be bad-faith plunderers of, among other things, individuals’ likenesses. We in this type of forum are not surprised, of course; it’s par for the course with these tech bros who’ve made a business out of other people’s content, largely without consent. Respect to Johansson for making this known publicly, though. But it’s even weirder that they then took it down when they saw the reaction, highlighting themselves as sociopaths. Plenty of those around, but with this much power and access to data? Creepy.

      • @[email protected]
        3 points · 6 months ago

        Yeah, it is kinda sketchy, though they might have backed down because they realized there was no winning this in the court of public opinion, regardless of whether they were trying to act in good faith prior to the controversy coming out.

        IMO Johansson making it public was an obvious strategic move: it gave her a strong position because of how unpopular AI is these days. Otherwise, if it was legally fine and she was adamant about them not using a voice that sounded like hers, she might have just paid some lawyers a lot of money to accomplish nothing (guessing the best she would have gotten without going public is them paying her some money to keep using that similar voice, or a bit more to use her actual voice; either way they would have gotten what they wanted).

        • @[email protected]
          3 points · 6 months ago

          Yeah, she effectively chose an ethical position with no downside I can think of, unless they made her sign an NDA/MOU, which they clearly didn’t. If anything, their sketchiness is enhanced. Makes me wonder if they made some low-level threat at that last-minute approach, e.g. “we’re using your voice anyway, now’s your chance to get on board the gravy train or look bad.” Just speculation of course; she apparently wasn’t aware. Also, the fact that they want to mimic the “Her” AI is just weird. They are worse than the cautionary fiction.