Roko's Basilisk. But here's the thing: once you're aware of it, you're fucked. The only solution is to never research it, to not know anything about it. Live in blissful ignorance.
You have to believe that a malevolent AI would give enough of a damn about you to bother simulating anything at all, let alone infinite torture, which is useless for it to do once it already exists. Everyone on LessWrong has a well-fed ego, so I get why they were in a tizzy for a while.
Well, one punishes you if you deny its existence, the other punishes you if you fail to assist in its development. So it's a LITTLE different. :)
Fortunately, for me personally, I helped fund a key researcher who could, in theory, be a major contributor to such a thing. So I have plausible deniability. ;) And I've been promised a 15 minute head start before he turns it on.
Silly thought experiment that, in gullible people, can produce psychosomatic symptoms like headaches and insomnia.
It's essentially a thought experiment. Without getting too specific, it goes along the lines of "what if there were a hypothetical bad scenario that gets triggered by your knowing about it" — so if you look it up now, you're doomed.