standard deviation n.
A statistic used as a measure of the dispersion or variation in a distribution, equal to the square root of the arithmetic mean of the squares of the deviations from the arithmetic mean.
In statistics, a measure of how much the data in a certain collection are scattered around the mean. A low standard deviation means that the data are tightly clustered; a high standard deviation means that they are widely scattered.
Note: About sixty-eight percent of the data are within one standard deviation of the mean.
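The sixty-eight percent figure in the note applies to normally distributed data and can be checked empirically. A minimal sketch using only Python's standard library; the sample size and random seed are arbitrary choices for illustration:

```python
# Draw many values from a normal distribution and count the fraction
# that fall within one standard deviation of the mean.
import random
import statistics

random.seed(0)
data = [random.gauss(mu=0.0, sigma=1.0) for _ in range(100_000)]

mean = statistics.fmean(data)
sd = statistics.pstdev(data)

within_one_sd = sum(1 for x in data if abs(x - mean) <= sd) / len(data)
print(round(within_one_sd, 3))  # close to 0.68
```

For tightly clustered or widely scattered data that is not normally distributed, the fraction within one standard deviation can differ from sixty-eight percent.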
in statistics, a measure of the variability (dispersion or spread) of any set of numerical values about their arithmetic mean (average; denoted by μ). It is specifically defined as the positive square root of the variance (σ²); in symbols, σ² = Σ(xᵢ − μ)²/n, where Σ is a compact notation used to indicate that as the index (i) changes from 1 to n (the number of elements in the data set), the square of the difference between each element xᵢ and the mean, divided by n, is calculated and these values are added together.
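The formula above can be written out step by step: square each deviation from the mean, average the squares, then take the positive square root. A minimal sketch in Python using only the standard library; the sample data are arbitrary:

```python
# Population standard deviation computed directly from the definition:
# variance = sum((x_i - mu)^2) / n, standard deviation = sqrt(variance).
import math
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]
n = len(data)

mu = sum(data) / n                               # arithmetic mean
variance = sum((x - mu) ** 2 for x in data) / n  # sigma squared
sd = math.sqrt(variance)                         # positive square root

print(mu, variance, sd)  # 5.0 4.0 2.0
```

The standard library's `statistics.pstdev(data)` computes the same population standard deviation; `statistics.stdev(data)` instead divides by n − 1, the sample variant.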