|statistics a measure of dispersion obtained by extracting the square root of the mean of the squared deviations of the observed values from their mean in a frequency distribution|
In statistics, a measure of how much the data in a certain collection are scattered around the mean. A low standard deviation means that the data are tightly clustered; a high standard deviation means that they are widely scattered.
Note: For data that follow a normal distribution, about 68 percent of the values lie within one standard deviation of the mean.
in statistics, a measure of the variability (dispersion or spread) of any set of numerical values about their arithmetic mean (average; denoted by μ). It is specifically defined as the positive square root of the variance (σ²); in symbols, σ² = Σ(xᵢ − μ)²/n, where Σ is a compact notation indicating that, as the index i runs from 1 to n (the number of elements in the data set), the squared difference between each element xᵢ and the mean is computed, these squared differences are summed, and the total is divided by n.
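The formula above can be sketched directly in code. This is a minimal illustration of the population standard deviation (dividing by n, as in the definition given here); the function name and sample data are chosen for the example.

```python
import math

def std_dev(values):
    """Population standard deviation: the positive square root of the
    mean of the squared deviations from the arithmetic mean."""
    n = len(values)
    mu = sum(values) / n                               # arithmetic mean
    variance = sum((x - mu) ** 2 for x in values) / n  # sigma squared
    return math.sqrt(variance)

data = [2, 4, 4, 4, 5, 5, 7, 9]
print(std_dev(data))  # → 2.0
```

Here the mean is 5, the squared deviations sum to 32, and 32/8 = 4, whose square root is 2. Note that many statistics texts instead divide by n − 1 (the sample standard deviation) when estimating dispersion from a sample rather than a full population.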