Comparing Data Sets
Concept

Standard Deviation

The standard deviation is a measure of spread that describes how much the values in a data set differ from the mean. The standard deviation, often represented by the Greek letter $\sigma$ (sigma), is calculated by taking the square root of the variance of the data set. Let $x_1, x_2, \ldots, x_n$ be the data values in a set and $\mu$ their mean.
$$\sigma = \sqrt{\dfrac{(x_1-\mu)^2 + (x_2-\mu)^2 + \cdots + (x_n-\mu)^2}{n}}$$
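For example, for the hypothetical data set $\{2, 4, 4, 4, 6\}$, the mean is $\mu = 4$, so
$$\sigma = \sqrt{\dfrac{(2-4)^2 + (4-4)^2 + (4-4)^2 + (4-4)^2 + (6-4)^2}{5}} = \sqrt{\dfrac{8}{5}} \approx 1.26$$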
The applet below calculates the standard deviation for the data set on the number line. Move the points around to change the data.
Applet that calculates the standard deviation of a set of five numbers
As shown, finding the standard deviation involves calculating the average of the squared differences between each data point and the mean, and then taking the square root of that average. The sum of squares can also be written in sigma notation.
$$\sigma = \sqrt{\dfrac{1}{n}\sum_{i=1}^{n}(x_i-\mu)^2}$$
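As a minimal sketch of this calculation, the Python function below computes the population standard deviation of a list of values. The function name population_std_dev and the sample data are illustrative and not part of the applet.

```python
import math

def population_std_dev(data):
    """Square root of the average squared difference from the mean."""
    n = len(data)
    mean = sum(data) / n                              # mu, the mean of the data set
    squared_diffs = [(x - mean) ** 2 for x in data]   # (x_i - mu)^2 for each value
    variance = sum(squared_diffs) / n                 # average of the squared differences
    return math.sqrt(variance)                        # sigma

# Five values, like the points on the number line above
print(population_std_dev([2, 4, 4, 4, 6]))  # approximately 1.26
```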
The standard deviation is sensitive to outliers because the differences from the mean are squared. It is commonly used when analyzing a data set that exhibits a normal distribution.