Contents
Introduction.
The Three Revolutions in Parametric Statistical Inference.
James Bernoulli's Law of Large Numbers for the Binomial, 1713, and its Generalization.
De Moivre's Normal Approximation to the Binomial, 1733, and its Generalizations.
Bayes's Posterior Distribution of the Binomial Parameter and His Rule for Inductive Inference, 1764.
Laplace's Theory of Inverse Probability, 1774-1786.
A Nonprobabilistic Interlude: The Fitting of Equations to Data, 1750-1805.
Gauss's Derivation of the Normal Distribution and the Method of Least Squares, 1809.
Credibility and Confidence Intervals by Laplace and Gauss.
The Multivariate Posterior Distribution.
Edgeworth's Genuine Inverse Method and the Equivalence of Inverse and Direct Probability in Large Samples, 1908 and 1909.
Criticisms of Inverse Probability.
Laplace's Central Limit Theorem and Linear Minimum Variance Estimation.
Gauss's Theory of Linear Minimum Variance Estimation.
The Development of a Frequentist Error Theory.
Skew Distributions and the Method of Moments.
Normal Correlation and Regression.
Sampling Distributions Under Normality, 1876-1908.
Fisher's Early Papers, 1912-1921.
The Revolutionary Paper, 1922.
Studentization, the F Distribution and the Analysis of Variance, 1922-1925.
The Likelihood Function, Ancillarity and Conditional Inference.
References.
Subject Index.
Author Index.