# Monthly Archives: March 2018

## Ambidextrous people tend to be less religious, study suggests

Research published in 2004 found that strongly handed individuals were more likely to believe in biblical creationism rather than biological evolution. The original study proposed that strongly handed individuals were less likely to update their beliefs in light of evidence. But Chan wondered if other factors could explain the association.

The new study of 743 U.S. adults confirmed that handedness was correlated with religiosity. The strongly handed participants were more likely to agree with statements such as “There is a personal God” while disagreeing with statements such as “Religion makes people do stupid things.”

Chan also found evidence that authoritarianism mediated the relationship between handedness and religiosity. In other words, strongly handed individuals tended to score higher on a measure of right-wing authoritarianism, which in turn was associated with stronger religious belief.


Posted by on March 20, 2018 in religion

## The Monty Hall Problem Refutes Your Religion

Well, the title of this post is a bit inflammatory. I won’t actually be arguing that the Monty Hall problem “refutes” your religion; I’ll be arguing that it is weak Bayesian evidence against your religion.

So. The Monty Hall problem is an illustration of how our intuitions about probability don’t always match up with reality. In its original formulation, you’re given a choice between three doors. One door has a prize behind it; the other two do not. After you choose a door, the host opens one of the other doors to reveal that it has no prize. You then have the option of staying with the door you chose or switching to the other unopened door.

Most people think either that it doesn’t matter whether you switch or that switching lowers your probability of winning. Neither of those is true!

Your initial probability of winning the prize is 1 out of 3. Once one of the doors is opened, the probability that you picked the correct door stays at 1 out of 3, whereas the other non-picked door now carries the remaining probability of 2 out of 3. This is because you have to perform a Bayesian update once new information is introduced: in this case, the door revealed not to have the prize.
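If the 1/3-versus-2/3 split still feels wrong, it is easy to check empirically. Here is a short Monte Carlo simulation (my own sketch, not from the original post) that plays the game many times under both strategies:

```python
import random

def play(switch, n_doors=3):
    """Simulate one round of the Monty Hall game; return True on a win."""
    prize = random.randrange(n_doors)
    choice = random.randrange(n_doors)
    # The host opens every door except the player's door and one other,
    # never revealing the prize. So the doors left closed are the player's
    # door plus the prize door (or an arbitrary empty door if they coincide).
    if choice != prize:
        closed = {choice, prize}
    else:
        closed = {choice, (choice + 1) % n_doors}
    if switch:
        choice = (closed - {choice}).pop()
    return choice == prize

random.seed(0)
trials = 100_000
stay_rate = sum(play(switch=False) for _ in range(trials)) / trials
switch_rate = sum(play(switch=True) for _ in range(trials)) / trials
print(f"stay: {stay_rate:.3f}, switch: {switch_rate:.3f}")  # stay ≈ 1/3, switch ≈ 2/3
```

Switching wins exactly when your first pick was wrong, which happens 2 out of 3 times, and the simulated frequencies converge on that.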

I’ve gone over this before. But I want to add an additional wrinkle to the problem to bring intuition more in line with Bayesian reasoning.

What if, instead of picking one door out of three to win the prize, you had to pick one door out of 100? Once you’ve made your selection, 98 other doors are opened to show that they have no prize, leaving only your choice and one other unopened door. In this case it seems more obvious that something is suspicious about the only other door that wasn’t opened. And this intuition lines up with a Bayesian update using the same scenario:

P(H): probability that you picked the correct door: 1 out of 100, or 0.01

P(~H): probability that you picked an incorrect door: 99 out of 100, or 0.99

P(E | H): probability that all doors besides yours and one other are opened to reveal no prize, given that you picked the correct door: 100%

P(E | ~H): probability of that same evidence, given that you picked an incorrect door: 100%

This is an easy Bayesian update to do. The conditional probabilities P(E | H) and P(E | ~H) are both 100%, meaning the likelihood ratio is 1 and your posterior probability is the same as your prior probability. So your selection still has a 1 out of 100 chance, and the only other remaining door has a 99 out of 100 chance of hiding the prize! In this case, Bayesian reasoning and intuition line up: there is something suspicious about the only other door that wasn’t opened.

How does this relate to religion? Specifically, the religion that you grew up with?

Following Willy Wonka’s logic, the chance that you just happened to grow up with the correct religion is pretty low. Instead of picking the correct door out of 3, or out of 100, you’ve picked a door out of thousands of religions, many of which no longer exist. They are the “opened doors” that reveal no prize in the analogy.

So a Bayesian update works the same way as it did when picking one door out of 100. Meaning, your religion is probably wrong, and you should probably switch religions. The only reason I call this weak Bayesian evidence is that there are still a few religions to choose from. But their joint probability of being correct is still far higher than the chance that your family’s religion is the correct one.

Analogously, it would be like choosing one door out of 10,000, after which all but 10 of the other doors are opened. Your initial chance of having chosen the correct door is still 1 out of 10,000, but the 10 doors that remain closed after the rest are opened have a joint probability of 9,999 out of 10,000 of hiding the prize: each of those 10 doors individually has (approximately) a 10% chance of being the correct door, as opposed to your original selection’s probability of 1 out of 10,000.
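The arithmetic for this many-doors-remaining variant is a one-liner per quantity (again my own sketch of the numbers in the analogy):

```python
n = 10_000       # total doors (candidate religions in the analogy)
remaining = 10   # doors other than yours left closed at the end

p_yours = 1 / n                       # your original pick: 0.0001
p_each_other = (1 - p_yours) / remaining  # each surviving alternative: ~0.1

print(p_yours)
print(p_each_other)
```

Each surviving alternative is roughly a thousand times more probable than your original pick, which is the post’s point: the evidence against any single inherited choice is strong, even though no single alternative is a sure thing.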

So the Monty Hall problem is weak Bayesian evidence against your religion.

Posted by on March 5, 2018 in Bayes, religion
