I read a story about an AI that was designed to save the world by telling humans what to do. The AI’s sentience was powered by taking in human souls. One of the AI’s moral directives was to feel empathy for any person it hurt. Since it ran the entire world, it would have to feel bad for every person who ever died of a preventable cause. All the world’s pain and suffering, experienced by a single being possessing many souls.
So the AI came up with the solution that it would force the souls of sinners it took in to feel this agony, and that’s how it would remain ethical.
It invented hell.
Because a scientist told it that it ought to feel bad when people are hurt.