standard deviation

noun, Statistics.
1.
a measure of dispersion in a frequency distribution, equal to the square root of the mean of the squares of the deviations from the arithmetic mean of the distribution.
Origin
1920-1925
Dictionary.com Unabridged
Based on the Random House Dictionary, © Random House, Inc. 2014.
Examples from the web for standard deviation
• The standard deviation is essentially the average difference between each data point and the average.
• In the game probability model, this roughly translates into a standard deviation below average.
• But over ten-year holding periods the standard deviation drops to five percentage points.
• The main arguments it takes are the average and the standard deviation of the distribution you want.
• Not every study necessarily has bias; also, bias has nothing to do with standard deviation or p value.
• Your plotted error bars should be the standard deviation of the mean, not the standard deviation.
• There is a kurtosis measure, a fourth moment as standard deviation is a second moment, for this variation.
• The standard deviation is how much a set of data is different from the curve it should make when plotted on a graph.
British Dictionary definitions for standard deviation

standard deviation

noun
1.
(statistics) a measure of dispersion obtained by extracting the square root of the mean of the squared deviations of the observed values from their mean in a frequency distribution
Collins English Dictionary - Complete & Unabridged 2012 Digital Edition
Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012
standard deviation in Medicine

standard deviation n.
Symbol σ
A statistic used as a measure of the dispersion or variation in a distribution, equal to the square root of the arithmetic mean of the squares of the deviations from the arithmetic mean.

The American Heritage® Stedman's Medical Dictionary
standard deviation in Science
standard deviation (stān'dərd)
A statistic used as a measure of the dispersion or variation in a distribution, equal to the square root of the arithmetic mean of the squares of the deviations from the arithmetic mean.
The American Heritage® Science Dictionary
standard deviation in Culture

standard deviation definition

In statistics, a measure of how much the data in a certain collection are scattered around the mean. A low standard deviation means that the data are tightly clustered; a high standard deviation means that they are widely scattered.

Note: About sixty-eight percent of the data are within one standard deviation of the mean.
The American Heritage® New Dictionary of Cultural Literacy, Third Edition
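The sixty-eight percent note above can be checked numerically. The sketch below draws a large sample from a normal distribution using Python's standard library and counts the fraction of points falling within one standard deviation of the mean; the variable names are illustrative.

```python
import random
import statistics

# Draw a large sample from a normal distribution (mean 0, SD 1).
random.seed(42)
sample = [random.gauss(0, 1) for _ in range(100_000)]

mean = statistics.mean(sample)
sd = statistics.pstdev(sample)  # population standard deviation

# Fraction of points within one standard deviation of the mean.
within_one_sd = sum(1 for x in sample if abs(x - mean) <= sd) / len(sample)
print(f"within one SD: {within_one_sd:.3f}")  # close to 0.68
```

For normally distributed data the exact figure is about 68.3%; for other distributions the fraction can differ substantially.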
standard deviation in Technology
statistics
(SD) A measure of the range of values in a set of numbers. Standard deviation is a statistic used as a measure of the dispersion or variation in a distribution, equal to the square root of the arithmetic mean of the squares of the deviations from the arithmetic mean.
The standard deviation of a random variable or list of numbers (denoted by the lowercase Greek letter sigma) is the square root of the variance. The standard deviation of the list x1, x2, x3...xn is given by the formula:
sigma = sqrt(((x1-avg(x))^2 + (x2-avg(x))^2 + ... + (xn-avg(x))^2)/n)
This formula is used when all of the values in the population are known. If the values x1...xn are a random sample chosen from the population, then the sample standard deviation is calculated with the same formula, except that (n-1) is used as the denominator.
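The two formulas above (n in the denominator for a full population, n-1 for a sample) can be written out directly. This is a minimal Python sketch; the function names are illustrative, and the results match the standard library's `statistics.pstdev` and `statistics.stdev`.

```python
import math

def population_sd(xs):
    """Standard deviation with n in the denominator (all population values known)."""
    n = len(xs)
    mean = sum(xs) / n
    return math.sqrt(sum((x - mean) ** 2 for x in xs) / n)

def sample_sd(xs):
    """Standard deviation with n - 1 in the denominator (random sample)."""
    n = len(xs)
    mean = sum(xs) / n
    return math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))

data = [2, 4, 4, 4, 5, 5, 7, 9]  # mean 5, sum of squared deviations 32
print(population_sd(data))  # sqrt(32/8) = 2.0
print(sample_sd(data))      # sqrt(32/7) ≈ 2.138
```

The n-1 denominator (Bessel's correction) compensates for the fact that deviations are measured from the sample mean rather than the true population mean, which would otherwise bias the variance estimate low.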
[dictionary.com (http://dictionary.com/)].
["Barrons Dictionary of Mathematical Terms, second edition"].
(2003-05-06)
The Free On-line Dictionary of Computing, © Denis Howe 2010 http://foldoc.org
Encyclopedia Article for standard deviation

In statistics, a measure of the variability (dispersion or spread) of any set of numerical values about their arithmetic mean (average, denoted by μ). It is specifically defined as the positive square root of the variance (σ²); in symbols, σ² = Σ(x_i − μ)²/n, where Σ is a compact notation indicating that, as the index i runs from 1 to n (the number of elements in the data set), the square of the difference between each element x_i and the mean, divided by n, is calculated and these values are added together.

Encyclopedia Britannica, 2008. Encyclopedia Britannica Online.
