Publication | Closed Access
Resampling: The New Statistics
Citations: 121 · References: 0 · Year: 1995
Keywords: Mathematics Education; Statistical Thinking; Julian L. Simon; Statistical Foundation; Statistical Computing; Sampling Technique; Statistical Evidence; Sampling (Statistics); Statistical Inference; Poker Hand; Statistical Science; Public Health; Mathematical Statistic; Medicine; Sampling Methods; Statistics; New Statistics
Julian L. Simon, Resampling: The New Statistics, Arlington, VA: Resampling Stats, Inc., 1992, pp. 261.

Familiarity with statistical concepts, if not competence with statistical procedures and their application, is a requirement in most university business programs and is increasingly necessary in business practice and research. This large student market ensures an abundance of texts on probability and statistics. The approaches of these texts vary widely, reflecting attempts to serve the different needs and abilities of both students and instructors. Some books are general, some are theoretical and laden with mathematical formulas, and others stress applications and problem-solving. Some require no mathematical background, while others assume knowledge of probability, algebra, or even calculus. Increasingly, texts are accompanied by statistical software that students can use to minimize the computational burden of problem-solving.

Resampling: The New Statistics by Julian Simon defies easy categorization. It is a basic, non-mathematical introduction to the topics typically found in an introductory probability and statistics course: elementary probability theory, sampling, inferential statistics, hypothesis testing, and simple correlation. But the book differs significantly from traditional texts. Rather than present derivations of formulas, the presentation rests on the premise that simulating problems and taking repeated samples (resampling) will lead students intuitively to an understanding of difficult statistical concepts. The author contends that derivations and proofs are too difficult for many students to understand and get in the way of an appreciation of the value of statistics. The general procedure is to structure each problem so that it can be simulated with some random device (a die, a coin, or a random number generator) and then to generate enough repeated samples to yield a reliable estimate of the answer.
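The general procedure described above can be sketched in a few lines of Python. This is a hypothetical illustration, not code from the book or its accompanying software; the particular problem (at least seven heads in ten fair coin flips), function name, and trial count are this reviewer's assumptions:

```python
import random

def resample_probability(trials=10000, flips=10, threshold=7):
    """Estimate P(at least `threshold` heads in `flips` fair coin flips)
    by repeated simulated sampling, rather than by an analytic formula."""
    hits = 0
    for _ in range(trials):
        # One trial: simulate the flips with a random device.
        heads = sum(random.random() < 0.5 for _ in range(flips))
        if heads >= threshold:
            hits += 1
    # The proportion of successful trials estimates the probability.
    return hits / trials

random.seed(1)
print(resample_probability())  # close to the exact binomial answer, about 0.172
```

With enough trials, the simulated proportion settles near the value a formula would give, which is exactly the intuition the book aims to build.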
For example, which is more likely: a poker hand with two pairs or one with three of a kind? The traditional approach would be to derive the answer analytically from a formula. Simon's approach is to take repeated samples of poker hands and compare the proportion with two pairs to the proportion with three of a kind. Again, the premise is that the answer obtained this way will be sufficiently close to the derived answer, and that students will gain a greater understanding of the underlying concept. Of course, repeated sampling can be cumbersome and time-consuming. The text is designed to accompany software that enables numerous random trials to be performed quickly and the results tabulated.

The book is conceptually, though not explicitly, divided into two parts. Chapters one, two, and three introduce the uses of probability and statistics, describe the resampling method of solving problems, and define basic concepts such as conditional and unconditional probability, sample and universe, and independent events. …
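The poker comparison described above can be sketched the same way: deal many random five-card hands and tally how often each pattern appears. A minimal sketch, again an assumption of this review rather than the book's own code (function names and trial counts are invented for illustration):

```python
import random
from collections import Counter

def classify(hand):
    """Label a 5-card hand as 'two pairs', 'three of a kind', or 'other'."""
    rank_counts = sorted(Counter(rank for rank, suit in hand).values())
    if rank_counts == [1, 2, 2]:
        return "two pairs"
    if rank_counts == [1, 1, 3]:
        return "three of a kind"
    return "other"

def deal_trials(trials=20000):
    """Deal `trials` random hands and tally the classifications."""
    deck = [(rank, suit) for rank in range(13) for suit in range(4)]
    tally = Counter()
    for _ in range(trials):
        tally[classify(random.sample(deck, 5))] += 1
    return tally

random.seed(2)
tally = deal_trials()
# Two pairs turns up roughly twice as often as three of a kind.
print(tally["two pairs"], tally["three of a kind"])
```

The simulated counts answer the question directly: two pairs is the more likely hand, matching the ratio of roughly 2.25 to 1 that the analytic formulas give.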