A series of writings on reasoning under uncertainty. Each part surveys the foundational issues and main schools of thought in a different area of uncertain reasoning.


The Uncertainty Series

Part 1: Probability is Difficult

Part 2: Statistics is Difficult

Appendix: Uncertain Appendices


I’ve decided not to freeze any of the finished parts: while they are polished and readable as is, I may continue to add bits and pieces to make them more complete (though it’s not my intention that they become all-encompassing; they are meant as broad overviews rather than exhaustive treatments).

1. Probability is Difficult

Link: Probability is Difficult

Summary

An analysis of what “probability” is. We use our exposition of the classical interpretation of probability to introduce probability in the mathematical sense (the Kolmogorov axioms), before showing some fatal flaws of the classical interpretation, such as the fragility of uniform distributions under reparametrization and Bertrand’s paradox, and mentioning some possible solutions to these problems.
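As a concrete illustration of how slippery “uniform” is (my own sketch, not taken from the essay), here is a quick Monte Carlo rendering of Bertrand’s paradox: three natural ways of picking a “random chord” of a circle give three different probabilities that the chord is longer than the side of the inscribed equilateral triangle.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
r = 1.0
side = np.sqrt(3) * r  # side length of the inscribed equilateral triangle

# Method 1: join two points chosen uniformly on the circle.
theta = rng.uniform(0, 2 * np.pi, (n, 2))
chord1 = 2 * r * np.abs(np.sin((theta[:, 0] - theta[:, 1]) / 2))

# Method 2: chord perpendicular to a random radius, at a uniform distance from the centre.
d = rng.uniform(0, r, n)
chord2 = 2 * np.sqrt(r**2 - d**2)

# Method 3: chord whose midpoint is a point chosen uniformly over the disk.
pts = rng.uniform(-r, r, (n, 2))
pts = pts[(pts**2).sum(axis=1) <= r**2]        # rejection sampling inside the disk
chord3 = 2 * np.sqrt(r**2 - (pts**2).sum(axis=1))

for i, chords in enumerate((chord1, chord2, chord3), start=1):
    print(f"method {i}: P(chord > side) = {(chords > side).mean():.3f}")
# Prints roughly 0.333, 0.500 and 0.250: three different "uniform" answers to one question.
```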

Bayesianism follows, with a careful analysis of Dutch books showing that certain conditions on “rational partial beliefs” demand that these beliefs conform to the Kolmogorov axioms as well as Bayes’ law for updating. We then talk about the various degrees of and restrictions on “rationality”, as well as the most problematic part of Bayesianism, the generation of priors, before jumping into a tangent on complexity-based priors and Solomonoff induction. Most of the discussion of prior generation, though, is saved for the next part.
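A minimal sketch of the Dutch book mechanism (a toy example of mine, not taken from the essay): an agent whose credences in A and not-A sum to more than 1 will, by their own lights, accept a pair of bets that together guarantee a loss.

```python
# Toy Dutch book: the agent's credences in A and not-A sum to 1.2, violating the
# Kolmogorov axioms. Selling them a $1 bet on each proposition at their own "fair"
# prices produces a sure loss for the agent in every possible world.
cred_A, cred_not_A = 0.6, 0.6

stake = 1.0                                  # each bet pays out $1 if its proposition is true
price_A = cred_A * stake                     # the price the agent regards as fair
price_not_A = cred_not_A * stake

for A_is_true in (True, False):
    payout = stake                           # exactly one of the two bets pays out
    agent_profit = payout - (price_A + price_not_A)
    print(f"A = {A_is_true}: agent's net profit = {agent_profit:+.2f}")
# Prints -0.20 in both cases: a guaranteed loss, i.e. a Dutch book.
```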

Afterwards, we introduce the propensity interpretation, which attributes probabilities directly to the physical world and takes multiple forms. The Humphreys and Sober paradoxes show that propensities are incompatible with the conditional probabilities arising from the Kolmogorov axioms, and the ambiguities thrown up by the reference class problem are discussed as well.

Our final interpretation is the frequentist interpretation of probability, which comes in finite and hypothetical forms, the two forms being bridged by Bernoulli’s law of large numbers. Most of our discussion, though, is saved for the next part.
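A quick simulation (illustrative only, with an arbitrary choice of p = 0.3) of the bridge that Bernoulli’s law of large numbers provides: the finite relative frequency of an outcome concentrates around its hypothetical limiting frequency as the number of trials grows.

```python
import numpy as np

# Bernoulli's law of large numbers in miniature: relative frequency of "heads"
# in n tosses of a p-coin approaches p as n grows.
rng = np.random.default_rng(1)
p = 0.3
for n in (10, 100, 10_000, 1_000_000):
    heads = rng.random(n) < p            # n Bernoulli(p) trials
    print(f"n = {n:>9}: relative frequency = {heads.mean():.4f}  (limit p = {p})")
```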

Having covered the main interpretations of probability, we go on to talk about how to deal with uncertainty without treating probabilities as first-class objects. The main family of methods for doing this falls under Dempster-Shafer evidence theory, which collates a body of evidence into assignments of belief and plausibility to any given statement. We then come to its main technical issue, the combination of different bodies of evidence, and discuss several proposed combination rules.
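To make the combination issue concrete, here is a minimal Python sketch of Dempster’s rule of combination, applied to Zadeh’s well-known two-doctor example; both the code and the example are my own additions, not taken from the essay.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions given as
    {frozenset: mass} dicts over the same frame of discernment."""
    combined, conflict = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc                  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence; Dempster's rule is undefined")
    return {a: m / (1.0 - conflict) for a, m in combined.items()}

def belief(m, a):
    return sum(mass for b, mass in m.items() if b <= a)

def plausibility(m, a):
    return sum(mass for b, mass in m.items() if b & a)

# Zadeh's example: two doctors, frame {meningitis, concussion, tumour}.
m1 = {frozenset({"meningitis"}): 0.99, frozenset({"tumour"}): 0.01}
m2 = {frozenset({"concussion"}): 0.99, frozenset({"tumour"}): 0.01}
m = combine(m1, m2)
print(m)  # all mass ends up on "tumour", though both doctors rated it at only 0.01
print(belief(m, frozenset({"tumour"})), plausibility(m, frozenset({"tumour"})))
```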

Status

Many parts of this demand rewrites and more detailed, clearer expositions, especially the Dutch book analysis and the subjective/objective division of Bayesianism. The discussion of Bayesianism could also do with reference to some canonical problems, such as the boy/girl paradox, the Monty Hall problem, and the two envelopes problem. Dempster-Shafer theory should also be explained more clearly, perhaps with reference to better examples.

I want to write an appendix detailing the measure-theoretic foundations of mathematical probability, and the more nuanced views of some issues these foundations sometimes offer.

The devil on my shoulder is telling me to write a second, more philosophical appendix concerning the modal foundations of probability, e.g. David Lewis’ work on modality and counterfactual reasoning.

The devil on my other shoulder is telling me to write a case study analyzing the role of probability in the anthropic principle, e.g. in Weinberg’s discussion of the string theory landscape, in the Doomsday argument, and in the Sleeping Beauty problem (with Bostrom’s self-sampling vs. self-indication assumptions), along with a case study on expectation paradoxes, for instance the St. Petersburg paradox and Pascal’s mugging.

Finally, I’d like to add a section covering probability through the information-theoretic lens: surprisal, entropy, coding (in Shannon’s sense), and so on.
