Monthly Archives: July 2013

Science Doesn’t Trust You To Be A Bayesian

The title of this post is basically the idea that I get from a few of Yudkowsky’s posts over at Less Wrong. I was reading Luke’s Sequence on Philosophy (one of those posts I’ve linked to previously) and he quotes some of Yudkowsky’s explanation for why non- or semi-technical explanations (like Evolutionary Psychology) might not be a “science”. But he qualifies it by saying that semi-technical explanations should still be Bayesian and then explains that the social process of science (i.e. the scientific method) was created because people suck at probability, so this social process forces practitioners to be implicit Bayesians.

From A Technical Explanation of Technical Explanation:

People eagerly jump the gun and seize on any available reason to reject a disliked theory. That is why I gave the example of 19th-century evolutionism, to show why one should not be too quick to reject a “non-technical” theory out of hand. By the moral customs of science, 19th-century evolutionism was guilty of more than one sin. 19th-century evolutionism made no quantitative predictions. It was not readily subject to falsification. It was largely an explanation of what had already been seen. It lacked an underlying mechanism, as no one then knew about DNA. It even contradicted the 19th-century laws of physics. Yet natural selection was such an amazingly good post-facto explanation that people flocked to it, and they turned out to be right. Science, as a human endeavor, requires advance prediction. Probability theory, as math, does not distinguish between post-facto and advance prediction, because probability theory assumes that probability distributions are fixed properties of a hypothesis.

The rule about advance prediction is a rule of the social process of science – a moral custom and not a theorem. The moral custom exists to prevent human beings from making human mistakes that are hard to even describe in the language of probability theory, like tinkering after the fact with what you claim your hypothesis predicts. People concluded that 19th-century evolutionism was an excellent explanation, even if it was post-facto. That reasoning was correct as probability theory, which is why it worked despite all scientific sins. Probability theory is math. The social process of science is a set of legal conventions to keep people from cheating on the math.

[…]

But the rule of advance prediction is a morality of science, not a law of probability theory. If you have already seen the data you must explain, then Science may darn you to heck, but your predicament doesn’t collapse the laws of probability theory. What does happen is that it becomes much more difficult for a hapless human to obey the laws of probability theory. When you’re deciding how to rate a hypothesis according to the Bayesian scoring rule, you need to figure out how much probability mass that hypothesis assigns to the observed outcome. If we must make our predictions in advance, then it’s easier to notice when someone is trying to claim every possible outcome as an advance prediction, using too much probability mass, being deliberately vague to avoid falsification, and so on. (my emphasis)

Luke’s main criticism is that if you think modern EP isn’t a science, then neither was 19th-century Darwinism. But both are Bayesian, and that’s why they’re valid explanations.

Take a look at the post I wrote called What Makes A Good Explanation?. In it, I’m basically following the laws of probability theory; the laws of thought. There’s nothing in a good explanation that says “prediction”. The only thing that predictions force you to do — if you’re not already good at it — is restrict the types of data that your hypothesis allows. If you’re not good at having your hypothesis restrict data, then making a prediction is a good heuristic to follow that will implicitly make you do so.
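To make that concrete, here’s a minimal sketch in Python, with entirely made-up numbers, of why a hypothesis that restricts the data gets a bigger Bayesian boost when the data lands where it said it would:

```python
# A hypothesis can be treated as a probability distribution over possible
# observations. A vague hypothesis spreads its mass over everything;
# a sharp one concentrates it on the outcomes it actually allows.
outcomes = list(range(1, 101))                  # 100 possible observations
vague = {o: 1 / 100 for o in outcomes}          # allows anything equally
sharp = {o: 0.97 if o == 42 else 0.03 / 99 for o in outcomes}  # sticks its neck out

observed = 42                                   # the data lands as predicted
bayes_factor = sharp[observed] / vague[observed]
print(bayes_factor)  # ~97 -- the restrictive hypothesis wins big
```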

This is one reason why some historians think that Bayes Theorem, or probability theory in general, is only meant for “future events”. It’s not. As Yudkowsky says, probability theory has no separate rules for “predictions” and “postdictions”. So historians can be Bayesians without being scientists. Moreover, the hoopla about “all knowledge is scientific” vs. “other ways of knowing” is confusion over the science vs. probability theory distinction. The social process of the scientific method is a special case of probability theory. As I’ve written about before, the scientific method is implicitly Bayesian. So, a person who fixes their bike or a plumber who investigates your leaky faucet isn’t doing science. They are both doing probability theory. And someone who knew absolutely nothing about the scientific method and only followed correct probability theory would already be practicing concepts like falsifiability and Occam’s Razor.

This is why, even though EP might not be a science because it makes no predictions (it actually does make predictions), we shouldn’t really care at this point. As long as EP follows correct probability theory and restricts the types of data that we should see then it still succeeds in its job of explaining certain behaviors.

—–

As an addendum, since I’m talking about probability theory being the key to good explanations, there’s one aspect of what makes a good explanation that I left out (even though I’ve referenced it before). The four qualities that I wrote that good explanations have are 1) Mechanism 2) Testability 3) Simplicity 4) Precision. There’s another one that’s embedded in the way many mathematicians write BT, which has an extra variable that I didn’t learn when I was taught BT in undergrad:

P(H | E & B) = [P(E | H & B) x P(H | B)] / P(E | B)
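For concreteness, here’s a minimal sketch of that formula in Python with made-up numbers; every probability below is implicitly conditioned on that extra variable, which I’ll explain in a moment:

```python
# Bayes Theorem with an extra conditioning term B: all terms are "given B".
p_h = 0.3               # P(H | B): prior for the hypothesis
p_e_given_h = 0.8       # P(E | H & B): how strongly H predicts the evidence
p_e_given_not_h = 0.2   # P(E | ~H & B): how well the alternatives predict it

p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)  # P(E | B), total probability
p_h_given_e = p_e_given_h * p_h / p_e                  # P(H | E & B)
print(round(p_h_given_e, 3))  # 0.632
```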

What is that B? It represents background knowledge. And this is another quality that good explanations have. They make use of our background knowledge, or said another way, good explanations make valid analogies. So I wrote in my Why I’m not a Christian post:

Why are there four gospels instead of one? What was the historical situation that produced a fourfold gospel canon? Instead of using traffic accidents to describe religious history, we should use religion to explain religious history. Why is there, for example, one book of Joshua? Trick question; there isn’t just one book of Joshua, there are two. One, the Jewish version which is in the Christian Bible, and another one, the Samaritan version. So using religion as our explanatory example, we see why there are two books of Joshua: Religious sectarianism. Jews don’t consider Samaritans to be the true version of their religion and Samaritans don’t consider Jews to be the true version of their religion. If this explains why there is more than one book of Joshua, this probably also explains why there is more than one gospel. Religious sectarianism; Matthew wasn’t written to corroborate Mark, as the traffic accident explanation assumes, but was written to replace Mark. The same with every other gospel.

Here I used the analogy of Jewish/Samaritan sectarianism to explain why there are four gospels instead of one. If that isn’t clear enough, maybe I should try another example. Since I mentioned Evolutionary Psychology, maybe I should bring up an example from its natural enemy, feminism.

Why do guys harass women on the streets? Why do guys try to pick up women in supermarkets, bookstores, the gym, the DMV, the laundry room, or anywhere you can imagine… to the abject chagrin of many of my female friends? From what I’ve read, the overarching feminist explanation is “male privilege”. However, this explanation doesn’t seem to restrict the types of data we would see, and it doesn’t fit any of our background knowledge. What would the analogous situation to men harassing women anywhere they go outside of their homes be? Where is a similar dynamic located in humanity’s collective background knowledge? Instead of using male privilege to explain it, I think it’s due to economics. And unlike male privilege, economics actually has theories that are mathematically grounded.

As an analogy, let’s say that I’m a rich white tourist backpacking around rural India. In my travels, I get stopped every 30 minutes by a poor Indian beggar saying he can read my fortune for 50 rupees. Of course, I don’t want my fortune told while I’m looking at cool Indian rugs to buy or attempting to go to the bathroom, so I decline these many offers. After about two days of this, being stopped every 30 minutes, I start getting annoyed. I just want to backpack in peace! Eventually I start getting wary of walking around India by myself, since this increases the odds of being harassed by a beggar, and… well, you can see where this is going. Imagine that instead of happening on a three-week trip, this happened every day of my life from the time I was about 13!

In this situation, does it make sense to talk about poor privilege as the reason these beggars are harassing me? Or does it make more sense to say that these poor people are approaching me due to a sort of scarcity mentality? If you read that link, you realize that this isn’t some hypothetical, but a situation that actually happens: poor people in rural India harassing rich white tourists, giving the rich white tourists the analogous creeped-out feeling that women get around men in the USA (this makes sense, since the author of that post is also a Bayesian). Not only is this analogy good for explaining behavior, but it’s also good for getting men to understand a woman’s point of view. Much better than telling a man to check his privilege… especially since the behavior is coming from a scarcity mentality, which is already a position of weakness.

Continuing to think like a Bayesian, what sort of culture would come about due to this scarcity mentality and the continual harassing of rich white tourists by Indian beggars? What would the data look like? Would a beggar who was able to get lots of rupees from rich white tourists be respected by other beggars, and in a larger society controlled by poor people as a whole? Would a particular rich white tourist who always gets their fortune told be respected? Would a subculture of poor Indian beggars spring up that came up with more creative ways of getting money from rich white tourists? Would it lead some beggars to be super pushy, getting into personal spaces, or even resorting to coercion with drugs or alcohol — or even flat out robbery — to get some rupees? Would this poor society blame rich white tourists who get robbed by stating that they looked too rich and too white?

When I was growing up in NYC, there was this scam for a while by homeless people wherein they would come up to a car that was stopped at a stoplight and begin washing the car’s windshield. After they were done, they would demand money. Most of the time this simply wouldn’t work. What did the homeless person do sometimes? They would get mad. Sorry homeless dude, your act of niceness doesn’t create an obligation in me to pay you (though the psychology behind it is easy to anticipate: Halo Effect + Just World Fallacy = ???). These homeless people certainly felt entitled to my money, but telling them to check their privilege, again, is not the correct framework.

The only objection here is that in some cases scarcity can be manufactured. And as one should know, the big three religions are experts at creating (fake) scarcity. Nonetheless, it seems to me that a scarcity mentality is the reason for the behavior. Male privilege has no analogous situation in some other area of life, whereas a scarcity mentality does. It fits into our background knowledge of many other economic situations and dynamics, such as that between rich white tourists and poor Indian beggars, car salesmen and car buyers, and other relationships between advertisers and their target audiences. It explains the emergent lauding/shaming cultures as well; car salesmen who sell a lot of cars are lauded, customers who buy anything without apparent discrimination are seen as shameful… and car salesmen who don’t sell lots of cars are losers, while customers who buy very few — if any — things are respected.

Granted, I haven’t read any scholarly feminist articles, but it doesn’t seem like many (if any) of the feminist memes circulating in popular online articles and blogs are Bayesian in nature. As a sociological theory it doesn’t necessarily have to be scientific (indeed, it probably doesn’t want to be). But it does have to be Bayesian if it’s to maximize its explanatory power.

 
2 Comments

Posted by on July 30, 2013 in Bayes, economics/sociology

 

An… Interesting Interview With Reza Aslan

Reza Aslan has a new book out called Zealot: The Life and Times of Jesus of Nazareth. In this book he argues that the historical Jesus was a Zealot. Like I’ve mentioned a few times, I’m agnostic about the existence of Jesus. But the reconstruction that I feel makes the most sense of his crucifixion is if he were himself a Zealot (owing to the strange translation between “Canaanite” and “Zealot”) or if he had really strong ties to the Zealot party (as opposed to the Pharisees, Sadducees, or Essenes). For example, I wrote:

Simon the Zealot in the gospel narratives is the same Simon the Zealot in Josephus. Josephus’ Simon was executed (along with his brother James [the Zealot]) sometime in the mid 40s CE.

It’s telling that two of Jesus’ disciples share the same names as these two sons of Judas [the Zealot]. Not only that, but these two are also among the “pillars”. While I think that part is coincidence, I do think there’s significance that Simon the Zealot was listed as one of Jesus’ disciples in Mark and the other two Synoptics. I don’t see any reason for Simon the Zealot’s inclusion, either from a wholly literary point of view or from the traditional peace preaching Jesus historical view.

What if Jesus on the other hand was the disciple of Simon the Zealot and not the other way around?

This would mean that not only is Mark’s narrative theology; that Mark’s Jesus is mythical, but that Mark’s narrative is also apology. I think this makes sense of the silence in early Christian writings about the teachings of Jesus – because there were none. This makes sense of why no one talked about any of the Earthly activities of Jesus – because he was a revolutionary, and his actions were disreputable. That’s why they used to think of “christ” from a human point of view (2 Cor 5:16) but no longer. This might mean that Jesus was executed along with the brothers James and Simon, hence the two other criminals on the crosses with Jesus.

So I’m partial to Aslan’s thesis that Jesus was a Zealot.

As for the interview itself, the Fox News correspondent comes across as obtuse and unable to think outside of a very narrowly defined box. She actually sounds eerily like how some NT scholars react to the Jesus Myth hypothesis. Her objections to Aslan being a Muslim writing about Christianity — even though he has the relevant expertise — sound a lot like Bart Ehrman’s objection to anyone writing about the historical Jesus unless they have super-duper specific qualifications. And the fact that the Fox News correspondent’s questions were answered right in the book reminds me of how James McGrath doesn’t read the books he reviews. It’s interesting how bias always looks the same, no matter the medium.

 

Are Decisions Made Without Emotion?

I was reading an interesting blog post that someone shared in a general group on Facebook. It was written by a conservative Christian. Normally I would have dismissed it as the random musings of a lay Christian, but he had some interesting things to say that I might have also dismissed a few years ago.

Obviously, I’ve addressed and refuted his evidences for the existence of god and the resurrection of Jesus over a couple of posts on my blog. His evidences:

* The argument and evidence of the beginning of our universe out of non-being
* The design of our universe in a very finely-tuned manner to support life (not just human life, but life of any sort)
* The existence of certain phenomena that simply cannot be explained or exist within an atheist’s worldview, including:
  * Consciousness
  * Rationality
  * The 1st person perspective
  * Free Will
* The existence of objective moral duties and obligations which seem to span all cultures, geographies and time periods in history
* The historically reliable evidence that Jesus truly did live, teach, die and rise again on the third day

Like I said, I’ve addressed a lot of these things from the framework of the laws of thought; laws of thought that go beyond and are more specific than the run-of-the-mill atheist Traditional Rationality rules of thumb. Not that I’m knocking atheist/Traditional Rationality; it’s just that it isn’t precise enough.

So. The design of the universe? Fine tuning is actually an argument for atheism per the rules of probability theory (most theists grossly misuse probability when attempting to argue for the Earth’s/Solar System’s/Universe’s fine tuning). Or at least, an argument for a non-all-powerful god. And the reason that the fine tuning of the universe is evidence against the Christian god is because the Christian god is unfalsifiable; there are too many other possible “finely tuned” configurations that the Christian god could have gone with for us to definitively rule those out in favor of our current universe’s configuration. The Christian god could have had us survive on Mercury, Neptune, or a comet that orbits the sun every 500 years, or have had us live in a universe where only five stars could form in the entire universe, or one where a star existed every Planck length, if he really wanted to. καθὼς γέγραπται: παρὰ δὲ θεῷ πάντα δυνατά (“as it is written: but with God all things are possible”).

The certain phenomena that can’t be explained from an atheist point of view: these actually make less sense from a supernaturalist point of view, since that point of view requires more metaphysical coin flips. For example, supernatural beings would break the 2nd law of thermodynamics. Supernatural beings as described move around, so unless they are perpetual motion machines they would be generating heat from their movement. But we have no evidence of said heat, and absence of evidence is evidence of absence. So you would have to posit some other physics that allows for perpetual motion machines to prop up that belief, which is an extra (highly unlikely) metaphysical coin flip.

Did Jesus rise from the dead? The history of early Christianity is itself unreliable, and a man rising from the dead is too extraordinary an explanation for the pretty mundane and wholly predictable emergence of Christianity.

But that’s not what I was interested in. What interested me were his examples of groupthink, which, as I’ve been writing about, atheists are not necessarily immune to.

But it seems to me that this is just a dodge since when we get into the evidence, I find that they mostly don’t want to discuss it. They want to insult and jeer and dismiss anything that might possibly disagree with what they want to be true. And as I looked further, I realized that was the key…it boils down to what atheists want to be true.

As Michael Talbot, author of The Holographic Universe, put it:

“But why is science so resistant to the paranormal in particular? This is a more difficult question. In commenting on the resistance he experienced to his own unorthodox views on health, Yale surgeon Dr. Bernie S. Siegel, author of the best-selling book ‘Love, Medicine, and Miracles’, asserts that it is because people are addicted to their beliefs. Siegel says this is why when you try to change someone’s belief they act like an addict.

“There seems to be a good deal of truth to Siegel’s observation, which perhaps is why so many of civilization’s greatest insights and advances have at first been greeted with such passionate denial. We are addicted to our beliefs and we do act like addicts when someone tries to wrest from us the powerful opium of our dogmas. And since Western science has devoted several centuries to not believing in the paranormal, it is not going to surrender its addiction lightly.”

– Michael Talbot, The Holographic Universe, pp. 6-7

My own experiences matched up with what Talbot was saying. People seemed to have adopted a worldview and were addicted to it to the point that they were simply trying to defend it at all costs regardless of whether the evidence supported them or not.

He has a few other examples. It’s ironic that he writes as though these same biases don’t apply to him and his conservative Christian worldview. The fact that he’s writing in English is pretty good evidence that he was raised in a Christian household, and even more so that he was raised in a Christian culture. All of which he probably strongly identifies with. And it’s highly unlikely that he’s studied any cognitive science to learn where these biases come from and how to overcome them. It’s this fact of groupthink which, ironically, is one of the reasons why I have a hard time thinking that free will is a coherent concept; the evidence that he presents against atheism, that atheists engage in groupthink, is one of the strongest evidences against free will. I would go so far as to say that thinking souls exist is a massive cognitive bias based on how our minds are embodied.

So it’s not so much that atheists are “afraid” of religion (though I would bet that a good lot of them are) but that they identify too strongly with atheism, leading to such reactions. As I’ve learned from the cognitive science of rationality (again, rationality that most atheists are unaware of), the first step towards epistemic irrationality is identifying too strongly with a group or an ideology (on the other hand, this might be the instrumentally rational thing to do). This identity will lead to motivated skepticism and biases like the sophistication effect and the introspection bias.

The thing about biases, though, unlike logical fallacies, is that they simply weigh towards an irrational conclusion. They don’t necessitate an irrational conclusion; this fits well within the framework of thinking of rationality in terms of probability.

All of this talk about biases, however, doesn’t mean that atheism is false, nor that conservative Christianity is false; resting a conclusion on its proponents’ biases, as the blogger does, is an actual logical fallacy. But his observation that a lot of atheists react emotionally to critiques of atheism, or feel more at home in atheist groups, makes sense when you know how our brains are wired for groupthink (unless you tend towards the Aspergers/autism spectrum like I do). Indeed, without emotion we wouldn’t be able to place value on the act of being rational. To choose to be rational is itself an emotional decision. And we have no control over our emotions.

 
Comments Off on Are Decisions Made Without Emotion?

Posted by on July 26, 2013 in cognitive science, god, religiosity

 

Blindness

(Three unfalsifiable mice)

It is a mathematical axiom that P(E | H) + P(~E | H) = 100%. That is, the probability of having the evidence at hand given the truth of the hypothesis, plus the probability of not having the evidence at hand given the truth of the hypothesis, equals 100%. The two have to exhaust all possibilities. E and ~E aren’t necessarily binary; they could represent the entire range of the types of evidence we would see given H (like, say, the numbers 1 – 6 for the roll of a die).
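In code, the axiom just says that the probabilities a hypothesis assigns across its whole outcome space must sum to 1; a toy sketch with a (hypothetical) weighted die:

```python
# Whether the outcome space has two cells (E and ~E) or six (a die roll),
# the hypothesis must spend exactly 100% of its probability mass.
weighted_die = {1: 0.1, 2: 0.1, 3: 0.1, 4: 0.1, 5: 0.1, 6: 0.5}
assert abs(sum(weighted_die.values()) - 1.0) < 1e-9  # P(E|H) + P(~E|H) = 100%
```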

Why am I pointing this out? Well, in What Is Evidence, Yudkowsky writes:

This is why rationalists put such a heavy premium on the paradoxical-seeming claim that a belief is only really worthwhile if you could, in principle, be persuaded to believe otherwise. If your retina ended up in the same state regardless of what light entered it, you would be blind. Some belief systems, in a rather obvious trick to reinforce themselves, say that certain beliefs are only really worthwhile if you believe them unconditionally— no matter what you see, no matter what you think. Your brain is supposed to end up in the same state regardless.

What would it look like, in this Bayesian-hypothetical model, to have your retina end up in the same state regardless of what light entered it? P(E | H) and P(~E | H) would reach maximum entropy, meaning that P(E | H) = P(~E | H). Or, if there are 100 types of evidence to be found, P(E | H) = 1/100 and every other possible observation likewise gets probability 1/100. Again, like rolling a die; each number 1 – 6 is equally likely (unless the die is weighted).

As I’ve hammered on multiple times, this is the Bayesian definition of unfalsifiable, as the blog Maximum Entropy also points out. It means that if your hypothesis is unfalsifiable, it is no different from being blind, since there isn’t any type of light (evidence) that would have a different effect on your retina (hypothesis). And this means that to have blind faith is to have a faith that is unfalsifiable. They are metaphorically equal.
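Here’s a minimal sketch of that blindness in Python (my own toy numbers): under a maximum-entropy hypothesis, every one of the 100 possible observations produces exactly the same update, so the posterior can never move away from the prior:

```python
# An unfalsifiable hypothesis assigns the same likelihood to every
# possible observation -- the "retina" ends up in the same state
# no matter what light comes in.
n = 100
flat = [1 / n] * n    # P(e | H) is identical for all e
rival = [1 / n] * n   # an equally vague rival hypothesis
prior = 0.5

def posterior(evidence, h_likes, alt_likes, p):
    # one Bayesian update of H against a single alternative
    num = h_likes[evidence] * p
    return num / (num + alt_likes[evidence] * (1 - p))

# every possible observation yields the same posterior
print({posterior(e, flat, rival, prior) for e in range(n)})  # {0.5}
```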

So if you don’t want to have “blind faith” in some hypothesis (like god), then believe in a hypothesis that can be disproved. Or to reword it, you can’t both believe in a hypothesis that is unfalsifiable and claim to not have blind faith in said hypothesis.

 
Comments Off on Blindness

Posted by on July 17, 2013 in Bayes

 

One-Box Or Two-Box?

So there’s a classic decision theory paradox called Newcomb’s Paradox. You’re presented with an agent that can supposedly predict your actions (on Less Wrong he’s called “Omega”). It descends from its spaceship and presents you with two boxes. One box is transparent and contains $1,000; the other box is closed and may or may not contain $1,000,000. You have the choice of taking only the unknown box or taking both boxes. The rub is that Omega has predicted your choice in advance: if it predicted that you’ll take both boxes, it didn’t put the $1 million in the unknown box, but if it predicted that you’ll take only the unknown box, it did.

Which would you choose?

There is more information in this scenario. Omega has done this game with five other people and all five of those people two-boxed and only got $1,000. Since Omega is gone by this point, your decision that you haven’t made yet, in a way, already affected the contents of the unknown box. And if you decide to one-box, then the transparent box explodes and the $1,000 is burnt up.

This hypothetical unpacks a bunch of assumptions that people have about free will, omniscience, and decision theory, which leads to the paradox. If you’re the type of person to one-box, then this means that deciding to two-box should net you $1,001,000, right? Or maybe Omega already knew you would think like that, so you’ll really only get $1,000? But if Omega really did know you’d think like that, then simply deciding to one-box automatically puts $1 million in the second box? (Seems like I’m arguing with a Sicilian when death is on the line…)

The majority of Less Wrongers choose to one-box, while garden-variety atheists choose to two-box, and theists choose to one-box. What’s going on here? Granted, like that link says, this is a pretty informal survey so it might not reflect the wider populations of those subgroups. But it seems to me that the deciding factor is whether you allow for the possibility of an all-knowing being who is able to predict your move.

Theists, obviously, allow for the possibility of a being that knows everything and can predict their actions. Garden-variety atheists reject such a possibility. These two facts probably account for the tendency for theists to one-box and for atheists to two-box. Even though Less Wrongers are much more likely to be atheists, they are atheists who study rationality a lot more than the average garden-variety atheist (garden-variety atheists probably know a lot of logical fallacies, but logical fallacies are not the be-all, end-all of rationality). It might be Less Wrong’s commitment to rationality and its focus on AI that leads more towards one-boxing, as it is theoretically possible for a superintelligent AI to predict one’s actions. As they say at Less Wrong, just shut up and multiply. Since I am technically a Less Wronger, that’s exactly what I’ll do 🙂

First of all, since I don’t think the concept of free will exists or is even coherent, I don’t see any reason why there can’t be some supersmart being out there that could predict my actions. Given that, it doesn’t mean that this Omega person actually is that supersmart being. The only evidence I have is its say so and the fact that five previous people two-boxed and only got $1,000.

Is the reason that these other five people two-boxed and got $1,000 that Omega accurately predicted their actions? Or is there some other explanation… like Omega not being a supersmart being and never putting $1 million in the second box? Which is more likely, if I were actually presented with this scenario in real life? It seems like the simplest explanation, the one with the fewest metaphysical coin flips, is that Omega is just a being with a spaceship who doesn’t have $1 million to give. If I had some evidence that people had actually one-boxed and gotten the $1 million, then I would put more weight on the idea that he actually has $1 million to spare, and more weight on the possibility that Omega is a good/perfect predictor.

I guess I’m just a garden-variety atheist 😦

But let me continue to shut up and multiply, this time using actual numbers and Bayes Theorem. What I want to find out is P(H | E), the probability that Omega is a perfect predictor given the evidence. To update, I need three variables to calculate Bayes: the prior probability, the success rate, and the false positive rate. Let’s say that my prior for Omega being a supersmart being is 50%; I’m perfectly agnostic about its abilities (even though I think this prior should be a lot less…). To update my prior based on the evidence at hand (five people have two-boxed and gotten $1,000), I need the success rate for the current hypothesis and the success rate of an alternative hypothesis or hypotheses (if it were binary evidence, like the results of a cancer test, I would call the latter the false positive rate).

The success rate asks: what is the probability of the previous five people two-boxing and getting $1,000, given that Omega is a perfect predictor? Assuming Omega’s prediction powers are real, P(E | H) is obviously 100%. The success rate of my alternative asks: what is the probability of the previous five people two-boxing and getting $1,000, given that Omega never even puts $1 million in the second box? Again, assuming that Omega never puts $1 million in the second box, P(E | ~H) would also be 100%. In this case, the Bayes Factor is 1, since the success rate for the hypothesis and the success rate for the alternative hypothesis are equal, meaning that my prior probability does not move given the evidence at hand.
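In odds form, the update I just described looks like this; a quick sketch using my numbers from above:

```python
# Five observations, each predicted with certainty by both hypotheses,
# so the likelihood ratio (Bayes Factor) is 1 and nothing moves.
prior = 0.5
bayes_factor = (1.0 ** 5) / (1.0 ** 5)  # P(E|H)^5 / P(E|~H)^5 = 1

prior_odds = prior / (1 - prior)
posterior_odds = prior_odds * bayes_factor
posterior = posterior_odds / (1 + posterior_odds)
print(posterior)  # 0.5 -- still perfectly agnostic
```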

Now that I have my posterior, which is still agnosticism about Omega’s powers of prediction, I work out my decision theory algorithm. If I two-box, there are two possible outcomes: I either get only $1,000 or I get $1,001,000. Both outcomes have a 50% chance of happening due to my subjective prior, so my decision theory algorithm is 50% * $1,000 + 50% * $1,001,000. This sums to a total utility/cash of $501,000.

If I one-box, there are also two possible outcomes: I either get $1,000,000 or I lose $1,000. Both outcomes, again, have a 50% chance of happening due to my subjective probability about Omega’s powers of prediction, so my decision theory algorithm is 50% * $1,000,000 + 50% * (-$1,000). This sums to $499,500 in total utility.

So even using some rudimentary decision theory, it’s still in my best interest to two-box given the evidence at hand and my subjective estimation of Omega’s abilities. But like I said, if there were evidence that Omega ever puts $1 million in the second box, this would increase my subjective probability that I could win $1 million. According to my rudimentary decision theory algorithm, one-boxing and two-boxing have equal utility when my probability for Omega being a perfect predictor is around 50.1%. Meaning that at any probability I estimate above 50.1% for Omega’s powers of prediction, it makes more sense to one-box. I assume that Less Wrongers and theists have a subjective probability about Omega’s powers of prediction close to 100%, and in that case it overwhelmingly makes more sense to one-box. Things get more complicated, though, if Omega both puts $1 million in a box and someone got $1,001,000! Omega’s success rate would then be less than 100%, but I would know that he does put $1 million in the closed box.
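Setting that complication aside, here’s a minimal sketch in Python of the simple payoff model I’ve been assuming (my numbers; it’s only one way to model the problem):

```python
# p = my subjective probability that Omega is a perfect predictor.
def eu_two_box(p):
    # predicted correctly: $1,000; otherwise both boxes pay out
    return p * 1_000 + (1 - p) * 1_001_000

def eu_one_box(p):
    # predicted correctly: $1,000,000; otherwise the $1,000 burns up
    return p * 1_000_000 + (1 - p) * -1_000

print(eu_two_box(0.5), eu_one_box(0.5))  # 501000.0 499500.0

# Indifference point: solve eu_two_box(p) == eu_one_box(p)
p_star = 1_002_000 / 2_001_000
print(round(p_star, 4))  # 0.5007 -- above it, one-boxing wins
```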

But again, if your subjective estimate of Omega’s powers of prediction falls below that threshold, it makes more sense to two-box. And that’s probably why Vizzini is dead.

 
Comments Off on One-Box Or Two-Box?

Posted by on July 10, 2013 in Bayes, decision theory

 

Rationality Quote June 2013

‘Nother rationality quote!

It is no accident, I would maintain, that quantum mechanics is so wildly counterintuitive. Part of the nature of explanation is that it must eventually hit some point where further probing only increases opacity rather than decreasing it. Consider the problem of understanding the nature of solids. You might wonder where solidity comes from. What if someone said to you, “The ultimate basis of this brick’s solidity is that it is composed of a stupendous number of eensy weensy bricklike objects that themselves are rock-solid”? You might be interested to learn that bricks are composed of micro-bricks, but the initial question – “What accounts for solidity?” – has been thoroughly begged. What we ultimately want is for solidity to vanish, to dissolve, to disintegrate into some totally different kind of phenomenon with which we have no experience. Only then, when we have reached some completely novel, alien level will we feel that we have really made progress in explaining the top-level phenomenon.

This is why scientific explanations must necessarily end up being counterintuitive. As I wrote before, science aims to kill the metaphor while religion aims to keep it alive.

 
Comments Off on Rationality Quote June 2013

Posted by on July 2, 2013 in rationality

 
 