Monthly Archives: September 2019

When People Close To Us Behave Immorally, We Are Inclined To Protect Them — Even If Their Crimes Are Particularly Heinous

If you saw a stranger break into someone’s house in the middle of the night, you’d probably call the police. But what if it was a friend or family member who was committing the crime? A new study in Personality and Social Psychology Bulletin looks at the tension between wanting to punish people who commit immoral acts and protecting those with whom we have close relationships. And it turns out that if someone close to us behaves immorally, we tend to err on the side of protecting them — even if their crime is especially egregious.

Read more at BPS Research Digest


Posted by on September 30, 2019 in cognitive science


A Cognitive Bias Codex

A list of all known cognitive biases.


Posted by on September 30, 2019 in cognitive science, religion


New Study Finds Majority of Christians Do Not Have Meaningful Contact with Atheists


Posted by on September 29, 2019 in religion


Is One Study as Good as Three? College Graduates Seem to Think So, Even if They Took Statistics Classes


When people interpret the outcome of a research study, do they consider other relevant information such as prior research? In the current study, 251 college graduates read a single brief fictitious news article. The article summarized the findings of a study that found positive results for a new drug. Three versions of the article varied the amount and type of previous research: (a) two prior studies that found the drug did not work, (b) no prior studies of the drug, or (c) two prior studies that found the drug had a positive effect. After reading the article, participants estimated the probability the drug is effective. Average estimates were similar for the three articles, even for participants who reported more statistics experience. Overall, just 4% of participants appeared to use prior research to make probability estimates—most seemed to focus on the latest study, while ignoring or discounting prior studies. Implications for statistics education and reporting are discussed.
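The point of the abstract — that a new positive study should be weighed against earlier null results, not substituted for them — can be made concrete with a toy Bayesian update. All the numbers below (the 50/50 prior and the likelihood ratios for a positive or null study) are my own illustrative assumptions, not figures from the paper.

```python
# Toy Bayesian update: how much should three studies move you vs. one?
# All numbers are illustrative assumptions, not taken from the paper.
prior_odds = 1.0         # assume 50/50 odds the drug works, before any studies
lr_positive = 3.0        # assume a positive study triples the odds
lr_negative = 1 / 3.0    # assume a null study cuts the odds to a third

def posterior_prob(odds, likelihood_ratios):
    """Multiply prior odds by each study's likelihood ratio; return a probability."""
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)

# Condition (a): two prior null studies, then the new positive study
p_three_studies = posterior_prob(prior_odds, [lr_negative, lr_negative, lr_positive])
# Condition (b): only the single positive study
p_one_study = posterior_prob(prior_odds, [lr_positive])

print(p_three_studies, p_one_study)  # → 0.25 0.75
```

Under these assumptions the two earlier null results should drag the estimate well below what the latest study alone supports — exactly the adjustment most participants failed to make.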

Is One Study as Good as Three? College Graduates Seem to Think So, Even if They Took Statistics Classes


Posted by on September 25, 2019 in cognitive science


Extreme, painful ritual appears to have a positive effect on psychophysiological well-being

Painful rituals may improve psychological well-being, according to new research published in Current Anthropology. The findings shed new light on why ritual practices involving pain and suffering are performed by millions of people around the world.

Read more at PsyPost


Posted by on September 23, 2019 in religion


The Univariate Fallacy And Everest Regressions

The Fallacy of Univariate Solutions to Complex Systems Problems


Complex biological systems, by definition, are composed of multiple components that interact non-linearly. The human brain constitutes, arguably, the most complex biological system known. Yet most investigation of the brain and its function is carried out using assumptions appropriate for simple systems—univariate design and linear statistical approaches. This heuristic must change before we can hope to discover and test interventions to improve the lives of individuals with complex disorders of brain development and function. Indeed, a movement away from simplistic models of biological systems will benefit essentially all domains of biology and medicine. The present brief essay lays the foundation for this argument.

The Univariate Fallacy is when someone argues that, because there is no single quality that separates two categories, the two categories do not exist and are actually just one category.

So for example: there’s no one single quality that separates Windows from macOS, therefore Windows and macOS are the same operating system.

Ridiculous, right?

There are multiple differences between Windows and macOS, and also many commonalities. But there’s no single indicator that all Macs have and all Windows machines lack, or vice versa. Concluding from that absence that Windows and macOS are the same would be the Univariate Fallacy.

Another example: There’s no single brain structure that separates left-handedness from right-handedness, therefore left or right handedness does not exist.

Here’s an example from Tw****r:

Another example I like is accent recognition: it’s a lot easier to say “She has a British accent” rather than individually describing all the phoneme-level features that your brain is using to make that judgement

The Univariate Fallacy can be thought of as a type of statistical fallacy, since it reliably surfaces in discussions with laypeople about differing statistical populations.
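A toy numeric version of the fallacy (my own construction, not from the essay quoted above): two categories of two-feature points where no single feature separates them, yet the categories never overlap once the features are considered jointly.

```python
# Two categories of 2-feature points. Checked one feature at a time,
# the categories look identical; considered jointly, they are disjoint.
category_a = {(0, 1), (1, 0)}
category_b = {(0, 0), (1, 1)}

# Univariate view: each single feature takes exactly the same values in both.
for feature in (0, 1):
    values_a = {point[feature] for point in category_a}
    values_b = {point[feature] for point in category_b}
    assert values_a == values_b == {0, 1}  # no single feature separates them

# Multivariate view: the categories share no points at all.
assert category_a.isdisjoint(category_b)

# A simple joint rule (are the two features unequal?) classifies perfectly.
assert all(x != y for (x, y) in category_a)
assert all(x == y for (x, y) in category_b)
print("no univariate separator, perfect multivariate separation")
```

Inferring from the first check that the two categories are “really just one category” is exactly the fallacy: the difference lives in the joint structure, not in any single variable.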

While I’m on the subject of statistics, there’s another statistics fail I see happen pretty regularly. Someone has named it the “Everest Regression”.

The Everest Regression is what happens when you “control” for a fundamental variable when comparing two populations. You might even think of it as the opposite side of, or similar lane to, the Univariate Fallacy. Maybe a sort of multivariate fallacy? I defer to the creator of the Everest Regression.

Basically, “controlling for height, Mount Everest is room temperature”.

Another: Controlling for number of electrons, helium and carbon have the same freezing point.

Controlling for distance from the equator, Alaska and Italy are the same climate.

Controlling for distance from the Sun, Mars and Earth can both support complex life.

You get the point. It’s assuming a multivariate explanation for phenomena that differ along a single variable. That’s understandable when you’re dealing with genuinely new phenomena, but it’s pointless, and frankly sophistry, when applied to concepts and categories we already know differ along one or a few axes, just to prove an ideological point.
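The Everest joke itself can be run as arithmetic. A sketch with rough, assumed numbers — a summit annual mean of about −27 °C, the summit elevation of 8.849 km, and the typical tropospheric lapse rate of about 6.5 °C per km; none of these are precise measurements:

```python
# "Controlling for height, Mount Everest is room temperature."
# All figures below are rough assumptions for illustration.
summit_temp_c = -27.0        # approximate annual mean at the summit (assumed)
summit_elevation_km = 8.849  # summit elevation
lapse_rate_c_per_km = 6.5    # typical tropospheric lapse rate (assumed)

# "Controlling for height" here means adding back the temperature change
# attributable to elevation -- which is the entire phenomenon of interest.
height_adjusted_temp = summit_temp_c + lapse_rate_c_per_km * summit_elevation_km
print(round(height_adjusted_temp, 1))  # → 30.5
```

Adjust away the altitude and the summit comes out around a (warm) room temperature — which tells you nothing about Everest and everything about what you just controlled for.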


Posted by on September 19, 2019 in Bayes, economics/sociology, religion


Is Logic Alone Enough To Become Rational?

Binary true/false Aristotelian logic is not sufficient to guarantee rationality.

Let’s say you go to the doctor due to an annoying mole on your nose. The doctor takes one look at it and says “That’s a cancerous mole. You should get surgery”. What do you do? Is this true or false? What binary major premise-minor premise-conclusion style argument would you formulate from this in order to support your decision?

Let’s say a friend of yours spent $5,000 on a 2 week cruise. Two days before your friend is set to go, he reads two separate stories of cruise liners sinking, and decides to cancel his entire trip. What major-premise-minor-premise-conclusion argument could you use to persuade your friend to keep his cruise? Or would you formulate a syllogism to support his decision?

Let’s say you meet Ned at a party. Ned is 25, majored in Computer Science, and lives in California. Which statement about Ned is more likely? (A) Ned is a software engineer, or (B) Ned is a software engineer who works in Silicon Valley?

What all of these examples have in common is that they’re dealing with incomplete information. That’s the world we live in; every one of our decisions deals with varying levels of uncertainty. We don’t live in a world of Aristotelian logic. Any system that claims rationality has to deal — rationally — with uncertainty.

In the doctor example, it’s common knowledge that the doctor might be wrong, and we have a handy meme for dealing with that uncertainty: getting a second opinion. Formally, though, taking the snap diagnosis at face value commits the base rate fallacy. Cancerous moles are rare, so a judgment made on so little information is more likely to be a false positive than a true one: the mole is probably just a mole. A better rule of thumb is to weigh the likelihood of a true positive against the likelihood of a false positive, while keeping in mind how (un)common cancer (or whatever the claim is) actually is. For multiple competing claims, compare the likelihood of a true positive for each, again weighted by how (un)common each claim is.
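This is just Bayes’ theorem, and it can be sketched in a few lines. The prevalence, sensitivity, and false positive rate below are assumptions chosen for illustration, not real clinical figures.

```python
# Base rates vs. a confident snap diagnosis.
# All rates are assumed for illustration -- not real clinical figures.
prevalence = 0.01          # assume 1% of such moles are cancerous
true_positive_rate = 0.9   # assume the doctor flags 90% of cancerous moles
false_positive_rate = 0.2  # ...but also flags 20% of benign ones

# Probability the mole is cancerous GIVEN the doctor flagged it (Bayes):
p_flagged = (true_positive_rate * prevalence
             + false_positive_rate * (1 - prevalence))
p_cancer_given_flag = true_positive_rate * prevalence / p_flagged

print(round(p_cancer_given_flag, 3))  # → 0.043
```

Even with a fairly sharp-eyed doctor, the low base rate keeps the posterior probability of cancer under 5% — the flagged mole is far more likely to be a false positive than a true one.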

What about Ned? It seems intuitive that Ned is a software engineer who works in Silicon Valley. But no matter how intuitive it feels, that’s wrong: every software engineer in Silicon Valley is also a software engineer in California, so the population described by (A) contains the population described by (B), and (A) must be at least as likely. This brings up a related point: our *feeling* that something is correct is itself subject to uncertainty, even though it doesn’t *feel* that way. There are other illusions besides optical illusions.
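The conjunction rule behind the Ned question — a subset can never be more probable than the set containing it — can be shown with counts. The figures below are hypothetical, picked only to make the arithmetic visible.

```python
# The conjunction rule: P(A and B) can never exceed P(A).
# Hypothetical counts for illustration only.
engineers_in_california = 500_000      # assumed count of software engineers in CA
engineers_in_silicon_valley = 200_000  # assumed; every one also counts toward CA
population = 10_000_000                # assumed pool Ned is drawn from

p_a = engineers_in_california / population      # (A) Ned is a software engineer
p_b = engineers_in_silicon_valley / population  # (B) ...who works in Silicon Valley

# (B) requires everything (A) requires, plus an extra condition,
# so it can only be equally likely or less likely -- never more.
assert p_b <= p_a
print(p_a, p_b)  # → 0.05 0.02
```

Whatever counts you plug in, the Silicon Valley engineers are a subset of the California engineers, so the inequality holds by construction.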


Posted by on September 11, 2019 in Bayes
