You have to believe that a malevolent AI would care enough about you to bother simulating anything at all, let alone infinite torture, which would be useless to it once it already exists. Everyone on LessWrong has a well-fed ego, so I get why they were in a tizzy for a while.
A silly thought experiment which, in gullible people, could cause psychosomatic symptoms like headaches and insomnia.
It’s essentially a thought experiment. Without getting too specific, it goes along the lines of “what if there were a hypothetical bad scenario that gets triggered by you knowing about it”, so if you look it up, you’re now doomed.
In other news… I lost the game.
What if you read about it but didn’t understand it?
I don’t really see how the thought experiment differs from Christianity…
Guess it depends on the denomination, but mine had mandatory missions :P
Did someone check? I need a second opinion so I don’t have to research it myself.
Kinda like the game