Monthly Archives: February 2015

Unfalsifiable Beliefs Are More Attractive When We’re Threatened

[Image: phalanx from the film 300]


When people hear the word “unfalsifiable,” it’s usually in a scientific context. Falsifiability was Karl Popper’s criterion for demarcating science from non-science, which he crafted in opposition to Freud’s method of psychoanalysis. By that standard, religious people aren’t really concerned that their beliefs are unfalsifiable; religion is not science.

But the appeal of the unfalsifiable isn’t restricted to religious belief. It seems to operate across the moral domain generally, and the largest sample of unfalsifiable beliefs outside of religion is found in the realm of politics.

From Friesen, et al. (2014):


We propose that people may gain certain “offensive” and “defensive” advantages for their cherished belief systems (e.g., religious and political views) by including aspects of unfalsifiability in those belief systems, such that some aspects of the beliefs cannot be tested empirically and conclusively refuted. This may seem peculiar, irrational, or at least undesirable to many people because it is assumed that the primary purpose of a belief is to know objective truth. However, past research suggests that accuracy is only one psychological motivation among many, and falsifiability or testability may be less important when the purpose of a belief serves other psychological motives (e.g., to maintain one’s worldviews, serve an identity). In Experiments 1 and 2 we demonstrate the “offensive” function of unfalsifiability: that it allows religious adherents to hold their beliefs with more conviction and political partisans to polarize and criticize their opponents more extremely. Next we demonstrate unfalsifiability’s “defensive” function: When facts threaten their worldviews, religious participants frame specific reasons for their beliefs in more unfalsifiable terms (Experiment 3) and political partisans construe political issues as more unfalsifiable (“moral opinion”) instead of falsifiable (“a matter of facts”; Experiment 4). We conclude by discussing how in a world where beliefs and ideas are becoming more easily testable by data, unfalsifiability might be an attractive aspect to include in one’s belief systems, and how unfalsifiability may contribute to polarization, intractability, and the marginalization of science in public discourse.

As a sort of aside: just because a belief is unfalsifiable doesn’t mean that it’s false. A belief can be unfalsifiable yet still be true. Falsifiability is a problem for epistemology, not ontology. For example, there’s no possible observation I can make in which I’m not alive. So from my point of view, being alive is unfalsifiable.

Anyway, retreating to unfalsifiable beliefs once you feel you’re under attack seems like a pretty good example of the motte-and-bailey tactic. If you recall, motte-and-bailey behavior, as Scott describes it, is when:

I feel like every single term in social justice terminology has a totally unobjectionable and obviously important meaning – and then is actually used a completely different way.

The closest analogy I can think of is those religious people who say “God is just another word for the order and beauty in the Universe” – and then later pray to God to smite their enemies. And if you criticize them for doing the latter, they say “But God just means there is order and beauty in the universe, surely you’re not objecting to that?”

The result is that people can accuse people of “privilege” or “mansplaining” no matter what they do, and then when people criticize the concept of “privilege” they retreat back to “but ‘privilege’ just means you’re interrupting women in a women-only safe space. Surely no one can object to criticizing people who do that?”

So someone presents evidence that the type of god that the average religious person believes in doesn’t exist and a sophisticate rejoins with a completely unfalsifiable version of that god, saying that “of course” no one believes in the first type of god. But you can bet that once the sophisticate feels like they are no longer under attack, they will go back to believing in the “falsifiable” version of that god again.

So what we have here seems to be basic human psychology. When we feel threatened, we retreat to our motte: the unfalsifiable version of our cherished belief(s). Maybe in the future we’ll read psychology articles about the Motte and Bailey Effect, describing people’s tendency to retreat to the unfalsifiable version of their beliefs when they feel like they’re under attack.

Relatedly, if you want to persuade someone, try to make sure they don’t feel like you’re attacking them.

(h/t Epiphenom)


Posted by on February 8, 2015 in cognitive science, religion


What Gambling Monkeys Teach Us About Human Rationality

From the website Mind Hacks:

When we gamble, something odd and seemingly irrational happens.

It’s called the ‘hot hand’ fallacy – a belief that your luck comes in streaks – and it can lose you a lot of money. Win on roulette and your chances of winning again aren’t more or less – they stay exactly the same. But something in human psychology resists this fact, and people often place money on the premise that streaks of luck will continue – the so called ‘hot hand’.

The opposite superstition is to bet that a streak has to end, in the false belief that independent events of chance must somehow even out. This is known as the gambler’s fallacy, and achieved notoriety at the Casino de Monte-Carlo on 18 August 1913. The ball fell on black 26 times in a row, and as the streak lengthened gamblers lost millions betting on red, believing that the chances changed with the length of the run of blacks.


An experiment reported by Tommy Blanchard of the University of Rochester in New York State, and colleagues, shows that monkeys playing a gambling game are swayed by the same hot hand bias as humans. Their experiments involved three monkeys controlling a computer display with their eye-movements – indicating their choices by shifting their gaze left or right. In the experiment they were given two options, only one of which delivered a reward. When the correct option was random – the same 50:50 chance as a coin flip – the monkeys still had a tendency to select the previously winning option, as if luck should continue, clumping together in streaks.

The reason the result is so interesting is that monkeys aren’t taught probability theory at school. They never learn theories of randomness, or pick up complex ideas about chance events. The monkeys’ choices must be based on some more primitive instincts about how the world works – they can’t be displaying irrational beliefs about probability, because they cannot have false beliefs, in the way humans can, about how luck works. Yet they show the same bias.
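The independence the excerpt describes is easy to check numerically. Here’s a quick simulation (my own sketch, not from the Mind Hacks article or the Blanchard study) showing that a fair coin’s chance of heads is unchanged after a streak of heads, and how improbable the 1913 Monte Carlo run of blacks was:

```python
import random

random.seed(1)

# Simulate one million fair coin flips (True = heads).
flips = [random.random() < 0.5 for _ in range(1_000_000)]

# The "hot hand" check: after three heads in a row, what fraction
# of the next flips come up heads? Independence says ~0.5.
after_streak = [flips[i + 3] for i in range(len(flips) - 3)
                if flips[i] and flips[i + 1] and flips[i + 2]]
print(sum(after_streak) / len(after_streak))  # ~0.5, streaks don't "continue"

# The 1913 Monte Carlo streak: 26 blacks in a row on a single-zero
# wheel, where 18 of the 37 pockets are black.
p_streak = (18 / 37) ** 26
print(p_streak)  # roughly 1 in 137 million
```

With a million flips there are roughly 125,000 three-heads streaks to condition on, so the estimated post-streak rate sits tightly around 0.5 – exactly what the gamblers in both fallacies refuse to believe.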

As the writer says, people being bad at probability might have some sort of primitive cause – a module or something that evolved in our brains before Homo sapiens were sapient. If seeing consistency where there is none came about due to our evolutionary heritage, then things like believing in conspiracy theories or the supernatural were, in a sense, bred into us by evolutionary processes. Combine this premise with our highly social brains and we might have another reason why belief in god is so prevalent – the usual go-to example of failing to apply probability correctly.

Another commenter on the site provides some additional common irrationalities between us and other animals:

Humans also succumb to another fallacy that is strikingly irrational from an economic standpoint: They often give greater value to objects of good quality than to the same objects together with objects of lesser quality. This so-called “less is more effect” can be demonstrated when humans are asked to estimate the value of two alternatives, one of which is objectively of greater value than the other. For example, in one study subjects bid at an auction on 10 baseball cards in mint condition and at a different time on the same 10 cards with an additional 3 cards that were judged to be in poorer condition. Although the 3 cards in poorer condition were not worth as much as the cards in mint condition, they were each worth something. Nevertheless, the bid for the 10-card set was on average 59% higher than it was for the 13-card set.

Interestingly, animals, too, appear to experience this kind of sub-optimal judgment. For instance, monkeys willingly ate a piece of sliced vegetable or a grape but when offered a choice between them, showed a clear preference for the grape over the vegetable slice. However, surprisingly, when they were offered a choice between a single grape and a grape plus a slice of vegetable, they reliably preferred the single grape. This is hard to understand, as one would think that the struggle for existence teaches animals “every calorie counts”.

To test the generality of this effect, Zentall conducted a similar experiment with dogs. The dogs showed a preference for a small piece of cheese over a small piece of carrot but would willingly eat the piece of carrot when offered by itself. When, on critical test trials, the researcher offered the dogs a piece of cheese together with a piece of carrot versus a piece of cheese alone, all of the dogs except one preferred the piece of cheese alone.

We should keep stuff like this in mind when we get upset that people are behaving in irrational ways… like not vaccinating their children, which is definitely another example of failing to apply probability correctly (probability is the logic of science). We’re still animals. Social animals, but animals nonetheless.


Posted by on February 4, 2015 in cognitive science
