Instituting Statistics Reform

Charles P. Armstrong, Ph.D.

Professor of Management Science

and Information Systems

The University of Rhode Island

Voice: (401) 423-0706
Fax: (401) 423-2079

E-mail:

Introduction

Although statistics reform is in its early stages, the need for change is pressing. If anything, the problem is worse in statistics than it ever was in calculus. Each year about half a million students are required to take one or two courses in statistics. (This is about the same number of students who are taking business or engineering calculus.) Students studying statistics are confused and terrified. The calculus reform movement provides a good model for changing statistics: statistics reform can borrow both content from the calculus and philosophy from the calculus reform movement.

Historical Background

Very little has changed in statistics courses over the past three or four decades. The major exception has been the introduction of statistical software such as Minitab. The welcome arrival of statistical processing software and the personal computer has eased the burden of statistical calculation. Software designed to carry out numerical calculations does not, however, address the task of understanding the underlying mathematical concepts.

Examination of statistics textbooks reveals that neither topics nor pedagogy has changed very much over the post-war era. Typical coverage includes descriptive statistics, probability, discrete distributions, continuous distributions, estimation, tests of hypotheses, regression, and time series. Typically, mathematical explanations, especially those involving the calculus, are not used. Publishers advise authors to avoid using mathematics because the students will not understand it, and reliance on mathematical explanations will raise anxiety levels.

Such advice is rapidly becoming obsolete. Prior to the mid-seventies, tables of squares and square roots were commonly placed in an appendix in statistics texts. The widespread availability of hand calculators put an end to this practice.

Interactive graphics and computer algebra systems are just as much a fact of life today as calculators were twenty years ago. Together with extremely affordable personal computers, they make the setting right for using good mathematical explanations as a tool for teaching statistics.

Using the Calculus Reform Philosophy

The key to calculus reform is providing the student with multiple explanations of calculus concepts. The rule of three says do it graphically, numerically, and algebraically. The same guidelines can be applied to teaching statistical concepts.

Doing it Graphically

The tangent to the normal distribution and the standard deviation

A common way to describe a point of inflection is by showing the tangent line at the point of inflection and on either side of this point. Many functions have points of inflection. By choosing the normal distribution and by animating the tangent line, it is easy to find the points of inflection and show that they lie one standard deviation above and below the mean.
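As one possible sketch of the algebra behind this picture (the text does not name a particular computer algebra system; sympy in Python is assumed here for illustration), the location of the inflection points can be confirmed by setting the second derivative of the normal density to zero:

    import sympy as sp

    x, mu = sp.symbols('x mu', real=True)
    sigma = sp.symbols('sigma', positive=True)

    # Normal probability density function
    f = sp.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * sp.sqrt(2 * sp.pi))

    # Points of inflection occur where the second derivative vanishes
    inflection_points = sp.solve(sp.Eq(sp.diff(f, x, 2), 0), x)
    print(inflection_points)   # the two solutions are mu - sigma and mu + sigma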

The area under the normal distribution

Perhaps nothing causes so much fear in the beginning statistics student as the concept of the area under the normal distribution and the standard normal table. This should not be surprising, because the concept of the area under any function is probably unclear to most students. Even if the statistics student has had calculus, there was probably no attempt in either the calculus course or the statistics course to tie these concepts together and give the student a useful frame of reference. At one time, there may not have been a better way of doing this. Computer algebra systems and interactive graphics now permit a better approach.

I give a graphical explanation built on Riemann sums, limits, and integrals to convey the notion of the area under the normal distribution function. As the last step in the process, I show the relationship between the definite integral and the table of normal areas.
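A minimal numerical sketch of this progression (Python with numpy and scipy assumed; the interval from z = 0 to z = 1 is chosen only as an example) shows the Riemann sums closing in on the tabled value of about 0.3413:

    import numpy as np
    from scipy.stats import norm

    def standard_normal_pdf(z):
        return np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)

    # Midpoint Riemann sums for the area under the standard normal curve
    # between z = 0 and z = 1, using more and more rectangles
    a, b = 0.0, 1.0
    for n in (4, 16, 64, 256):
        width = (b - a) / n
        midpoints = a + (np.arange(n) + 0.5) * width
        print(n, round(np.sum(standard_normal_pdf(midpoints) * width), 6))

    # The limit of the Riemann sums is the definite integral, which is
    # exactly the value read from a standard normal table
    print(round(norm.cdf(1.0) - norm.cdf(0.0), 6))   # about 0.3413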

Doing it Numerically

A good example of a modern application of numerical calculation comes from the subject of quality control. The common concept of three sigma limits produces a probability of 0.9974. If ten components, operating in series, form an assembly, then the probability of a successful assembly is 0.9974 raised to the tenth power, or 0.9743. A microprocessor might have 500 components operating in series, each one of critical importance to the process. Then the probability of a successful assembly is 0.9974 raised to the 500th power, or 0.2721. Many products, such as microprocessors, airplanes, and complex computer software like that used in the Star Wars missile defense system, have hundreds or thousands of interdependent parts. It is easy to show that three sigma limits are not very useful for such systems.
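The arithmetic behind these figures is a one-line calculation; a short Python sketch (using the 0.9974 figure quoted above) reproduces them:

    # Series reliability: the assembly works only if every component works,
    # so the per-component probabilities multiply.
    three_sigma = 0.9974          # per-component probability from the text
    for n in (10, 500):
        print(n, round(three_sigma ** n, 4))
    # 0.9974**10 = 0.9743 and 0.9974**500 = 0.2721, matching the figures above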

In the next step, I show how four, five, or six sigma limits produce greater reliability. The probabilities corresponding to these limits are not even listed in most statistics books, but they are easily found as definite integrals using a computer algebra system.
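As one possible sketch of such a calculation, sympy (assumed here as the computer algebra system) gives the within-k-sigma probabilities as definite integrals; note that the exact three sigma integral evaluates to about 0.99730, which quality control texts commonly round to 0.9974:

    import sympy as sp

    z = sp.symbols('z', real=True)
    phi = sp.exp(-z**2 / 2) / sp.sqrt(2 * sp.pi)   # standard normal density

    # Probability of falling within k standard deviations of the mean,
    # obtained as a definite integral rather than read from a table
    for k in (3, 4, 5, 6):
        print(k, sp.N(sp.integrate(phi, (z, -k, k)), 10))
    # approximately 0.99730, 0.999937, 0.9999994, and 0.999999998 respectively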

Doing it Algebraically

The Binomial Expansion

A very good example of an analytical explanation is the development of the binomial formula by means of the binomial expansion. After doing the square and cube of the binomial by hand, I employ the full power of the computer algebra system and use more realistic exponents (sample sizes) such as 10. It is an easy matter to explain the behavior of the exponents in the expansion and the symmetry of the coefficients. Explaining the coefficients requires more effort, as it always has, no matter how one approaches it. My objective is to make the binomial formula plausible to the students.
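A possible computer algebra sketch of this step (sympy assumed) expands the binomial for a sample size of 10 and displays the pattern in the exponents and coefficients:

    import sympy as sp

    p, q = sp.symbols('p q')

    # Expand (p + q)**10, a more realistic exponent than 2 or 3
    print(sp.expand((p + q)**10))
    # p**10 + 10*p**9*q + 45*p**8*q**2 + 120*p**7*q**3 + 210*p**6*q**4
    #   + 252*p**5*q**5 + 210*p**4*q**6 + 120*p**3*q**7 + 45*p**2*q**8
    #   + 10*p*q**9 + q**10
    # In every term the exponents sum to 10, and the coefficients are
    # symmetric; setting q = 1 - p turns each term into a binomial
    # probability P(X = k) = C(10, k) * p**k * (1 - p)**(10 - k).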

Following this explanation, I present several binomial distributions graphically in order to show the influence of the parameters of the distribution. We calculate the probability of several events. I end the discussion with a practical binomial sampling exercise.
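A small numerical sketch along these lines (scipy assumed; the parameter values n = 10 with p = 0.2 and p = 0.5 are only illustrative) tabulates two binomial distributions and a few event probabilities:

    from scipy.stats import binom

    # Two binomial distributions with the same n but different p,
    # showing how the parameter p shifts and skews the distribution
    n = 10
    for p in (0.2, 0.5):
        print(p, [round(binom.pmf(k, n, p), 4) for k in range(n + 1)])

    # A few event probabilities for n = 10, p = 0.2
    print(round(binom.pmf(3, 10, 0.2), 4))      # exactly 3 successes
    print(round(binom.cdf(2, 10, 0.2), 4))      # at most 2 successes
    print(round(1 - binom.cdf(5, 10, 0.2), 4))  # more than 5 successes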

The Mean and Variance of the Binomial Distribution

Finding the mean and variance of the binomial distribution is another good application of computer algebra systems. It is a relatively simple matter to prove the formulas for the mean and variance of the binomial distribution. This proof, combined with good graphs, facilitates comprehension.
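As a sketch of how the computation might look, assuming sympy as the computer algebra system and a concrete sample size of 10 to keep the summations finite (the symbolic results generalize to n*p and n*p*(1 - p)):

    import sympy as sp

    p = sp.symbols('p')
    k = sp.symbols('k', integer=True)
    n = 10   # concrete sample size, matching the earlier example

    # Binomial probability mass function
    pmf = sp.binomial(n, k) * p**k * (1 - p)**(n - k)

    # Mean and variance as finite sums over k = 0, 1, ..., n
    mean = sp.expand(sp.summation(k * pmf, (k, 0, n)))
    second_moment = sp.expand(sp.summation(k**2 * pmf, (k, 0, n)))
    variance = sp.expand(second_moment - mean**2)

    print(mean)       # 10*p, i.e. n*p
    print(variance)   # equals 10*p - 10*p**2, i.e. n*p*(1 - p)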

Using Statistical Pedagogical Software

Unlike other subjects in mathematics, statistics requires the analysis of data. Usually, using software in statistics means performing data analysis by means of a statistical processor such as Minitab. While this has its place, there is also a need for pedagogical software that assists the student in performing routine statistical calculations such as building confidence intervals and regression equations. An excellent system for statistical tutorials and automated homework grading is the Adventures in Statistics Series by Quant Systems. Students are given tutorial exercises which incorporate various help levels. At the student’s convenience, a test for certification is offered which provides evidence of mastery of a particular technique.

Summary

Difficulties in statistics are every bit as bad as, if not worse than, the problems in the calculus before the calculus reform movement progressed. Reform in statistics can borrow lessons from the calculus reform. The expositions used to explain tangent lines and areas can be applied directly to statistics by adding the normal distribution as another example in a calculus course. The philosophy of the calculus reform, which rests in large part on using alternative explanations, can be applied to statistics. These approaches, combined with good statistical pedagogical software and data analysis software, can go a long way toward improving instruction in and comprehension of statistics.
