This is the first installment of a series of essays on the difficulties of reasoning under uncertainty. Uncertainty is a fundamentally probabilistic concept; as such, our discussion must begin by examining the very foundations of probability itself. As with any concept as vague yet intuitive as probability, we must conduct our analysis at multiple levels:

• What do we colloquially mean by the word “probability”?
• How may we rigorously translate some colloquial notion into formal, mathematical syntax?
• What are we forced to logically admit about our concept of probability, once we’ve committed to a rigorous translation?

We’ll start with the first point, surveying the various meanings of the word probability, before subdividing each meaning into a set of rigorous translations compatible with it; such a combination of a meaning and a translation is known as an interpretation of probability. Finally, to cover the third point, we will draw out the implications of and paradoxes within each interpretation, as well as compare their strengths and weaknesses relative to one another. As we’ll see, even something as intuitive as probability becomes an extremely thorny concept upon closer examination, with every interpretation being either limited to an extremely narrow domain or prone to internal contradiction; in short, we’ll discover why probability is difficult.

While I try to stay away from arguments via pure mathematics, seeking to translate them into conceptual terms, some parts are unavoidably technical. Some calculus and a basic understanding of sets and set operations are required to follow these.

The Uncertainty Series

Part 1: Probability is Difficult

Part 2: Statistics is Difficult

Part 3: Causality is Difficult (planned)

Part 4: Modeling is Difficult (planned)

Part 5: Prediction is Difficult (planned)

Part 6: Experimentation is Difficult (planned)

Part 7: Science is Difficult (planned)

Epilogue: The Past, Present, and Future of Uncertainty (planned)

Appendix: Uncertain Appendices (expanding)

# What is Probability?

Any introduction to probability will generally begin with a few classical examples — rolling a die, flipping a coin, being dealt a certain hand from a deck, and so on. These intuition pumps allow us to examine the features of probability in the most basic cases. To introduce the main interpretations of probability we’ll cover in this essay, let’s go with a particularly contrived example of an uncertain die roll.

I videotape myself rolling a normal, six-sided die, and show this video to a small focus group of people, each of whom has a different idea of what probability means. As the die leaves my hand in the video, I pause it, and ask them what the probability that the die will land on an even number is.

• Carrie: I notice that there are six possibilities for what the outcome was, but have no knowledge of what the outcome was nor any reason to prefer one possible outcome over another; therefore, I should treat each possibility as having an equal likelihood, whereupon I arrive at one in two by dividing the number of even possibilities by the number of total possibilities.
• Barry: Normally, I'd agree with Carrie, but the fact that you asked whether it was even makes me think that it was probably odd. People just tend to do that. If it were actually even, maybe there's a one in three chance you'd ask whether it was even, and if it were actually odd, a two in three chance. So, I'm going to guess that the chance that the die landed on an even number is one in three.
• Frank: The only question is whether the die did land on an even number, or whether it did not. One of these has already happened, so I can’t give any answer other than zero or one — what would such an answer even physically correspond to, given that the outcome is settled? We can only speak of probability as, well, a statistical phenomenon, in the sense that it emerges from many different iterations of the same event. In my experience, though, dice come up odd just about as often as they come up even, so all I can say is that the frequency of even numbers among a collection of rolled dice is one in two.
• Paul: When a die rapidly rolls across a table before coming to a halt, its final position, including which side ends up on top, depends very sensitively on the way you release it, the weight distribution of the die, the surface of the table, and so on — it's more or less chaotic. So, unless you've tightly controlled the conditions in some way, the die has a roughly equal tendency to come to rest with any given number facing up, giving us a one in two chance of an even number.

Which justification do we take to be the most accurate? Obviously, the answer depends on what we mean by probability, but what exactly we mean can be very difficult to pin down. Many people have tried, coming up with many different answers, and we'll explore those in turn. Each of these "willing" participants has given a unique account: Carrie the Classical account, Barry the Bayesian account, Frank the Frequentist account, Paul the Propensity account.
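Three of these answers amount to small calculations, and it may help to see them side by side. Here is a minimal sketch in Python, using Barry's own assumed likelihoods for the Bayesian update; the frequentist answer is approximated by simulating rolls, which is itself an assumption, since Frank speaks of actual repeated rolls rather than a computer model.

```python
import random

# Carrie (classical): count the even outcomes among equally likely possibilities.
outcomes = [1, 2, 3, 4, 5, 6]
evens = [o for o in outcomes if o % 2 == 0]
p_classical = len(evens) / len(outcomes)  # 0.5

# Barry (Bayesian): start from Carrie's 1/2 prior, then update on the evidence
# that the question "was it even?" was asked, using his assumed likelihoods:
# P(ask "even?" | even) = 1/3, P(ask "even?" | odd) = 2/3.
prior_even = 0.5
like_even, like_odd = 1 / 3, 2 / 3
p_bayes = (prior_even * like_even) / (
    prior_even * like_even + (1 - prior_even) * like_odd
)  # ≈ 1/3

# Frank (frequentist): the long-run frequency of even results across many
# repetitions, approximated here by a simulation with a fair-die model.
random.seed(0)
n = 100_000
p_freq = sum(random.randint(1, 6) % 2 == 0 for _ in range(n)) / n  # ≈ 0.5

print(p_classical, p_bayes, p_freq)
```

Note that the three numbers are computed in entirely different ways: Carrie counts possibilities, Barry conditions on evidence, and Frank tallies repetitions; that they roughly agree here is a feature of this contrived example, not a general fact.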
