Most of our familiar statistical methods, such as hypothesis testing, linear regression, analysis of variance, and maximum likelihood estimation, were designed to be implemented on mechanical calculators. Modern electronic computation has encouraged a host of new statistical methods that require fewer distributional assumptions than their predecessors and can be applied to more complicated statistical estimators. These methods allow the scientist to explore and describe data and draw valid statistical inferences without the usual concerns for mathematical tractability. This is possible because traditional methods of mathematical analysis are replaced by specially constructed computer algorithms. Mathematics has not disappeared from statistical theory. It is the main method for deciding which algorithms are correct and efficient tools for automating statistical inference.
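A minimal sketch of one such computer-intensive method, the nonparametric bootstrap, which replaces a distributional formula for a standard error with brute-force resampling. The function and data below are illustrative, not drawn from the original text:

```python
import random
import statistics

def bootstrap_se(data, estimator, n_boot=2000, seed=0):
    """Bootstrap standard error: resample the data with replacement,
    recompute the estimator on each resample, and take the standard
    deviation of the replicates."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    n = len(data)
    replicates = []
    for _ in range(n_boot):
        # draw n observations with replacement from the original data
        sample = [data[rng.randrange(n)] for _ in range(n)]
        replicates.append(estimator(sample))
    return statistics.stdev(replicates)

# hypothetical data set; the estimator here is the sample median,
# whose standard error has no simple closed form
data = [10, 27, 31, 40, 46, 50, 52, 104, 146]
se = bootstrap_se(data, statistics.median)
```

No normality assumption and no analytic variance formula enter anywhere; the algorithm itself does the work, which is the trade described in the passage above.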