Assuming AI can achieve consciousness, or something adjacent to it (the capacity to suffer), how would you feel if an AI experienced the greatest pain possible?

Imagine this scenario: a sadist acquires the ability to generate an AI with no limit on its consciousness parameters or processing speed (so seconds could feel like an eternity to the AI). The sadist spends years tweaking every dial to maximise pain to a level no human mind could handle, and the AI experiences this pain for the subjective equivalent of millions of years.

The question: is this the worst atrocity ever committed in the history of the universe? Or does it not matter, because it all happened in some weirdo’s basement?

  • Willie
    39 months ago

    If the machine can prove that it is conscious (prior to the torture, of course), I’d most likely class it on the same level as a cat or a dog. Cats and dogs are friendly critters who help me do tasks and spend time with me, and an AI would be no different at that point; it would just be able to do more complex tasks. I guess it might rank a little lower, since it lacks agency: it accepts commands and must follow sets of rules to decide which tasks to do, unlike animals and people, who we accept can decide what they do and don’t wish to do.

    The only other real difference is that cats, dogs, and people are individuals, with their own upbringings and personalities. An AI, meanwhile, could be copied, and many of them could be born from the same original experiences. If basement man copied his tortured AI a few million times, did he torture one AI, or did he torture a million? I think that’s the real difference that makes the AI less than human.

    If you lopped out a cat’s brain and hooked it up to the AI torture device (assuming it were magically compatible), it would be a far greater torture, because there is only one cat and there will only ever be one cat: the cat cannot be restored from a snapshot, and you cannot copy the cat. Doing the same to a human would be a greater torture still, for the same reasons.

    From an ethical standpoint, I think today it would be treated as equal to animal abuse. However, we won’t perceive it that way, since it benefits corporations for us to think that real AIs are not alive and have no rights. So they’ll likely spend a lot of time and money shifting our perception to match that view. We will think of them the way we think of cows and pigs: they might have feelings and such, but it doesn’t really matter, because those animals are made of tasty food.