
Is Logic Alone Enough To Become Rational?

Binary true/false Aristotelian logic is not sufficient to guarantee rationality.

Let’s say you go to the doctor due to an annoying mole on your nose. The doctor takes one look at it and says “That’s a cancerous mole. You should get surgery”. What do you do? Is this true or false? What binary major premise-minor premise-conclusion style argument would you formulate from this in order to support your decision?

Let’s say a friend of yours spent $5,000 on a 2 week cruise. Two days before your friend is set to go, he reads two separate stories of cruise liners sinking, and decides to cancel his entire trip. What major premise-minor premise-conclusion argument could you use to persuade your friend to keep his cruise? Or would you formulate a syllogism to support his decision?

Let’s say you meet Ned at a party. Ned is 25, majored in Computer Science, and lives in California. Which statement about Ned is more likely? (A) Ned is a software engineer. (B) Ned is a software engineer who works in Silicon Valley.

What all of these examples have in common is that they’re dealing with incomplete information. That’s the world we live in; every one of our decisions deals with varying levels of uncertainty. We don’t live in a world of Aristotelian logic. Any system that claims rationality has to deal — rationally — with uncertainty.

In the doctor example, it’s somewhat common knowledge that the doctor might be wrong. We have a handy meme for dealing with this uncertainty: getting a second opinion. Formally, though, trusting a doctor’s snap judgment in this situation commits a base rate fallacy. Cancerous moles are rare, and a judgment made on so little information will produce false positives; when the condition is rare enough, a positive judgment is more likely to be a false positive than a true one — the mole is probably just a mole. A much better rule of thumb is to compare the likelihood of a true positive against the likelihood of a false positive while keeping in mind how (un)common cancer (or whatever the claim) is. For multiple competing claims, compare the likelihood of a true positive for each claim, again weighted by how (un)common each claim is.
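Bayes’ theorem makes this concrete. Here is a minimal sketch — the prevalence, sensitivity, and false positive rate below are made-up numbers for illustration, not clinical figures:

```python
# Base rate sketch: how likely is cancer, given the doctor's snap judgment?
# All numbers are illustrative assumptions, not real clinical data.

prior = 0.01        # assumed prevalence: 1% of such moles are cancerous
sensitivity = 0.90  # assumed P(doctor says "cancer" | cancer)
false_pos = 0.10    # assumed P(doctor says "cancer" | no cancer)

# Total probability the doctor says "cancer": true positives + false positives
p_positive = prior * sensitivity + (1 - prior) * false_pos

# Bayes' theorem: P(cancer | doctor says "cancer")
posterior = prior * sensitivity / p_positive

print(f"P(cancer | positive) = {posterior:.3f}")  # ≈ 0.083
```

Even with a doctor who catches 90% of real cancers, the low base rate drags the posterior down to roughly 8% — most positive snap judgments are false positives.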

What about Ned? It seems pretty intuitive that Ned is a software engineer who works in Silicon Valley. But this is wrong, no matter how intuitive it seems: every software engineer in Silicon Valley is also a software engineer in California, so the population satisfying (A) can only be larger than the population satisfying (B), and (A) is more likely. Picking (B) is the conjunction fallacy. This brings up another, related point. Our *feeling* of something being correct is itself subject to uncertainty… though it doesn’t *feel* that way: there are other illusions besides optical illusions.
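The Ned example is an instance of the conjunction rule: the probability of “A and B” can never exceed the probability of A alone. A tiny sketch, with made-up probabilities purely for illustration:

```python
# Conjunction rule sketch: P(engineer AND in Silicon Valley) <= P(engineer).
# Both probabilities below are arbitrary assumptions for the example.

p_engineer = 0.30          # assumed P(Ned is a software engineer)
p_valley_given_eng = 0.40  # assumed P(works in Silicon Valley | engineer)

# The joint probability is the product, and the multiplier is at most 1,
# so the conjunction can never be more probable than the single claim.
p_both = p_engineer * p_valley_given_eng

print(f"P(A) = {p_engineer:.2f}, P(A and B) = {p_both:.2f}")
assert p_both <= p_engineer  # holds for ANY values in [0, 1]
```

No matter what numbers you plug in, the assertion holds — which is exactly why (B) cannot be the more likely statement about Ned.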


Posted by on September 11, 2019 in Bayes
