Big Ideas Math Algebra 2, 2014
Maintaining Mathematical Proficiency

Exercise 7 Page 593

The standard deviation of a data set tells us how much the data values differ from the mean.


Practice makes perfect
Let's begin by recalling that the standard deviation of a data set is a measure of spread. In other words, this measure tells us how much the data values differ from the mean x̄.

Standard Deviation = sqrt( ((x_1 - x̄)^2 + (x_2 - x̄)^2 + ... + (x_n - x̄)^2) / n )
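The formula above can be sketched directly in code. This is a minimal illustration, not part of the original exercise; the function name `std_dev` and the sample data are our own choices.

```python
import math

def std_dev(data):
    """Population standard deviation: the square root of the
    mean of the squared differences from the mean."""
    mean = sum(data) / len(data)
    squared_diffs = [(x - mean) ** 2 for x in data]
    return math.sqrt(sum(squared_diffs) / len(data))

print(std_dev([2, 4, 4, 4, 5, 5, 7, 9]))  # 2.0
```

Here the mean is 5, the squared differences sum to 32, and sqrt(32 / 8) = 2.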

Now, if the standard deviation is equal to 0, the data values do not differ from the mean at all. This means that all the data values are the same. As an example, consider the number of meals per day we had in the previous week, assuming that we ate 5 meals each day.

{ 5, 5, 5, 5, 5, 5, 5 }

The mean of this data set is 5, and each data value is equal to the mean, so the standard deviation is 0.

Let's take a look at the formula for the standard deviation once again.

Standard Deviation = sqrt( ((x_1 - x̄)^2 + (x_2 - x̄)^2 + ... + (x_n - x̄)^2) / n )

We can see that the standard deviation is the square root of the mean of the squared differences between the data values and the mean. Each squared difference is greater than or equal to 0, so their sum is also greater than or equal to 0, and the principal square root of a nonnegative number is never negative. Therefore, the standard deviation cannot be negative.
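We can check the meals example with Python's standard library, which provides the population standard deviation as `statistics.pstdev`. The variable name `meals` is our own; the data is the constant set from the example.

```python
import statistics

# Seven days, 5 meals each day: every value equals the mean.
meals = [5, 5, 5, 5, 5, 5, 5]
print(statistics.pstdev(meals))  # 0.0
```

Because every value equals the mean, every squared difference is 0, and the standard deviation comes out to exactly 0, never a negative number.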