# Understanding the Statistical Properties of the Normal Distribution

When you understand the properties of the normal distribution, you'll find it easier to interpret statistical data. A continuous random variable *X* has a normal distribution if its values fall into a smooth (continuous) curve with a bell-shaped pattern. Each normal distribution has its own mean, denoted by the Greek letter μ (say "mu"), and its own standard deviation, denoted by the Greek letter σ (say "sigma"). But no matter what their means and standard deviations are, all normal distributions have the same basic bell shape. The following figure shows some examples of normal distributions.

Every normal distribution has certain properties. You use these properties to determine the relative standing of any particular result on the distribution, and to find probabilities. The properties of any normal distribution are as follows:

- Its shape is symmetric (that is, when you cut it in half, the two pieces are mirror images of each other).
- Its distribution has a bump in the middle, with tails going down and out to the left and right.
- The mean and the median are the same and lie directly in the middle of the distribution (due to symmetry).
- Its standard deviation measures the distance on the distribution from the mean to the *inflection point* (the place where the curve changes from an "upside-down-bowl" shape to a "right-side-up-bowl" shape).

Because of its unique bell shape, probabilities for the normal distribution follow the Empirical Rule, which says the following:

- About 68 percent of its values lie within one standard deviation of the mean. To find this range, take the value of the standard deviation, then find the mean plus this amount and the mean minus this amount.
- About 95 percent of its values lie within two standard deviations of the mean. (Here you take 2 times the standard deviation, then add it to and subtract it from the mean.)
- Almost all of its values (about 99.7 percent of them) lie within three standard deviations of the mean. (Take 3 times the standard deviation and add it to and subtract it from the mean.)
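As a quick check on these three percentages, a short Python sketch can compute the exact probability of landing within *k* standard deviations of the mean using the standard normal CDF (the function name here is my own, not from any particular library):

```python
from math import erf, sqrt

def within_k_sd(k):
    """P(|X - mu| <= k * sigma) for any normal distribution.

    By symmetry this equals Phi(k) - Phi(-k), which simplifies
    to erf(k / sqrt(2)) using the error function.
    """
    return erf(k / sqrt(2))

for k in (1, 2, 3):
    print(f"within {k} SD: {within_k_sd(k):.4f}")
# within 1 SD: 0.6827
# within 2 SD: 0.9545
# within 3 SD: 0.9973
```

Note that the exact values (68.27, 95.45, and 99.73 percent) are slightly more precise than the rounded 68-95-99.7 figures the rule is named for.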

Take a look again at the above figure. Comparing and contrasting the distributions shown there, you first see that they are all symmetric, with the signature bell shape. Examples (a) and (b) have the same standard deviation, but their means are different; the mean in Example (b) is located 30 units to the right of the mean in Example (a) because its mean is 120 compared to 90. Examples (a) and (c) have the same mean (90), but Example (a) has more variability than Example (c) due to its higher standard deviation (30 compared to 10). Because of the increased variability, most of the values in Example (a) lie between 0 and 180 (approximately), while most of the values in Example (c) lie only between 60 and 120.

Finally, Examples (b) and (c) differ in both mean and standard deviation: Example (b) has a higher mean, which shifts its graph to the right, while Example (c) has a smaller standard deviation, so its data values are the most concentrated around the mean.

Note that the mean and standard deviation are important for properly interpreting numbers located on a particular normal distribution. For example, you can compare where the value 120 falls on each of the normal distributions in the above figure. In Example (a), the value 120 is one standard deviation above the mean (because the standard deviation is 30, you get 90 + 1(30) = 120). So on this first distribution, the value 120 is the upper value for the range where the middle 68 percent of the data are located, according to the Empirical Rule.

In Example (b), the value 120 lies directly on the mean, where the values are most concentrated. In Example (c), the value 120 is way out on the rightmost fringe, three standard deviations above the mean (because the standard deviation this time is 10, you get 90 + 3(10) = 120). In Example (c), values beyond 120 are very unlikely to occur because they lie beyond the range where the middle 99.7 percent of the values should be, according to the Empirical Rule.
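The "how many standard deviations above the mean" calculation above is just a z-score, and the three comparisons can be sketched in a few lines of Python (the distribution parameters are taken from the figure as described in the text):

```python
def z_score(x, mu, sigma):
    """How many standard deviations the value x lies from the mean mu."""
    return (x - mu) / sigma

# The three distributions from the figure:
# (a) mean 90, SD 30; (b) mean 120, SD 30; (c) mean 90, SD 10
print(z_score(120, 90, 30))   # 1.0 -> one SD above the mean
print(z_score(120, 120, 30))  # 0.0 -> exactly at the mean
print(z_score(120, 90, 10))   # 3.0 -> three SDs above the mean
```

The same value, 120, thus has a very different relative standing on each distribution, which is why the mean and standard deviation must always accompany any interpretation.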