Pascal’s Wager is an infamous piece of religious logic. The Wager goes: given the choice between believing in god and not believing in god, there are four possible outcomes. If you believe in god and god exists, you go to heaven. If you believe in god and god doesn’t exist, nothing happens. On the other hand, if you don’t believe in god and god exists, you go to hell. And if you don’t believe in god and god doesn’t exist, nothing happens.
Pascal argued that eternity in heaven is a much better option than an eternity in hell, so it would be rational to believe in god, just in case. What’s the harm, even if god doesn’t exist?
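The four outcomes can be laid out as a simple payoff table. A quick Python sketch (not from the original post; the utilities are placeholders chosen to match Pascal’s framing):

```python
# Pascal's Wager as a payoff table: (your choice, state of the world) -> utility.
# Placeholder utilities: heaven = +inf, hell = -inf, "nothing happens" = 0.
payoffs = {
    ("believe", "god exists"): float("inf"),      # eternity in heaven
    ("believe", "no god"): 0,                     # nothing happens
    ("disbelieve", "god exists"): float("-inf"),  # eternity in hell
    ("disbelieve", "no god"): 0,                  # nothing happens
}

for (choice, world), utility in payoffs.items():
    print(f"{choice:11s} | {world:10s} -> {utility}")
```

Under this table, believing never does worse than disbelieving and sometimes does infinitely better, which is exactly the dominance argument Pascal is making.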
From a decision theory perspective, Pascal is succumbing to an unbounded utility function; in this case, one that allows for infinities. If (following the maxim that 0 and 1 are not probabilities) you must assign some nonzero probability to god existing, then any nonzero probability multiplied by infinity is still infinity. Meaning, even if you assign a 0.000000000000000000000000000000000000000001% chance of god existing, attaching an infinite utility to going to heaven creates a situation where believing in god has infinite expected utility and not believing in god has infinite negative expected utility. This might seem like a win for a theist who subscribes to Pascal’s Wager and/or infinite utility, but it also licenses some other wacky “rational” behavior.
Let’s say you are walking down a dark alley. A figure approaches you from the shadows, demanding money. There’s no weapon visible, and he doesn’t actually seem very threatening. He asks you about Pascal’s Wager, and suppose you agree that it’s rational to believe in god, given the infinite utility it rewards you with. The shady person then claims that he’s not actually from this world but the world above: your reality is really a simulation, and he’s one of the programmers. If you don’t give him 10 dollars, he says, he will use his programming powers to generate 3^^^^3* people and torture them for eternity, in your simulated reality.
Is $10 really worth the lives of 3^^^^3 people being tortured for eternity? If you accept the unbounded utility function behind Pascal’s Wager, then you must accept a similarly unbounded utility function here: $10 certainly isn’t worth 3^^^^3 lives, even given only the slimmest chance that this mugger is telling the truth. In expected-utility terms, 99.99999999999999999999999% * 10 is never going to overcome 0.0000000000000000000000001% * 3^^^^3. This hypothetical decision theory scenario is called Pascal’s Mugging, a sort of reductio ad absurdum critique of Pascal’s Wager.
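The arithmetic of the mugging can be sketched in Python. Since 3^^^^3 is far too large to actually compute, this uses a stand-in huge number and a made-up credence; both are assumptions for illustration only:

```python
p_truth = 1e-25   # tiny credence that the mugger really is a simulator-programmer
lives = 10**100   # stand-in for 3^^^^3, which is hopelessly beyond computation

# Expected lives lost if you refuse, vs. the cost of paying up
ev_refuse_loss = p_truth * lives  # astronomically large despite the tiny probability
ev_pay = 10                       # you lose ten dollars, nothing else

print(ev_refuse_loss > ev_pay)  # True: the unbounded stakes swamp any probability
```

And the real 3^^^^3 is unimaginably larger than 10^100, so making the mugger’s claim more outlandish only makes the expected-value case for paying him *stronger*, which is exactly the absurdity.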
So the solution, then, is to not have unbounded utility functions. There should be some sort of cutoff on utility so that your decision theory algorithm can’t be led into absurd, yet “rational”, choices. Of course, I’m not going to begin to attempt to figure out what sort of decision theory framework satisfies that consistently… that’s for economists and AI researchers to debate and decide 😉
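One crude version of such a cutoff is simply clamping utilities to a finite range. A minimal sketch, assuming an arbitrary cap of 1000 (the cap and numbers are mine, not a worked-out decision theory):

```python
def bounded_utility(raw, cap=1000.0):
    """Clamp raw utility into [-cap, cap], so no outcome can dominate a
    decision purely by being astronomically large."""
    return max(-cap, min(cap, raw))

p_truth = 1e-25                # same tiny credence in the mugger's story
threat = -10**100              # raw (negative) utility of the threatened torture

ev_refuse = p_truth * bounded_utility(threat)  # at worst -1000 * 1e-25
ev_pay = -10                                   # handing over the ten dollars

print(ev_refuse > ev_pay)  # True: with bounded utility, don't pay the mugger
```

With the cap in place, the tiny probability does its job again: a one-in-10^25 chance of a (bounded) disaster is worth far less than ten dollars, so refusing the mugger is the rational choice.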
* 3^3 = 3*3*3 = 27
3^^3 = 3^(3^3) = 3^27 = 7625597484987
3^^^3 = 3^^(3^^3) = 3^^7625597484987 = 3^(3^(3^(… 7625597484987 times …)))
And 3^^^^3 = 3^^^(3^^^3), which is larger still. In other words, a really frickin’ huge number (much larger than 10^80, the estimated number of atoms in the observable universe)
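This is Knuth’s up-arrow notation, and the recursion above translates directly into code. A short Python sketch (it can only evaluate the smallest cases; anything bigger blows past what any computer can hold):

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow a ^...^ b with n arrows: n=1 is ordinary
    exponentiation, n=2 is tetration (a tower of b copies of a), etc."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

print(up_arrow(3, 1, 3))  # 27
print(up_arrow(3, 2, 3))  # 7625597484987
# up_arrow(3, 3, 3) = 3^^7625597484987, a tower of ~7.6 trillion 3s --
# already hopelessly beyond computation, and 3^^^^3 is vastly bigger yet.
```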