For example, in the case of exam scores, who cares what the mean is, as long as you scored better than most of the class? Who knows, it may have been an impossible exam and 40 points out of 100 was a great score. In this case, your score itself is meaningless, but your percentile tells you everything.

Suppose your exam score is better than 90 percent of the rest of the class. That means your exam score is at the 90th percentile (so *k* = 90), which hopefully gets you an A. Conversely, if your score is at the 10th percentile (which would never happen to you, because you’re such an excellent student), then *k* = 10; that means only 10 percent of the other scores are below yours, and 90 percent of them are above yours; in this case an A is not in your future.

A nice property of percentiles is that they have a universal interpretation: Being at the 95th percentile means the same thing whether you are looking at exam scores or weights of packages sent through the postal service; the 95th percentile always means 95 percent of the other values lie below yours, and 5 percent lie above yours. This also allows you to fairly compare two data sets that have different means and standard deviations (like ACT scores in reading versus math). It levels the playing field and gives you a way to compare apples to oranges, so to speak.
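The apples-to-oranges idea can be sketched in a few lines of code. All the numbers below are invented for illustration: the point is that the same raw score can sit at very different percentiles in two data sets with different means and spreads.

```python
def percentile_rank(scores, value):
    """Percentage of scores in the data set that fall below value."""
    return 100 * sum(s < value for s in scores) / len(scores)

# Invented score distributions with different means and spreads
reading = [20, 22, 24, 25, 26, 27, 28, 29, 31, 33]
math = [12, 15, 17, 18, 20, 21, 23, 26, 28, 30]

# A raw score of 26 is middling in reading but strong in math
print(percentile_rank(reading, 26))  # 40.0
print(percentile_rank(math, 26))     # 70.0
```

Comparing the two percentile ranks, rather than the raw 26s, is what makes the comparison fair.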

A percentile is *not* a percent; a percentile is a value (or the average of two values) in the data set that marks a certain percentage of the way through the data. Suppose your score on the GRE was reported to be the 80th percentile. This doesn’t mean you scored 80 percent of the questions correctly. It means that 80 percent of the students’ scores were lower than yours and 20 percent of the students’ scores were higher than yours.
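The distinction can be made concrete with a small sketch; the exam scores below are made up for illustration. The percentile rank of a value is simply the percentage of values in the data set that fall below it, which has nothing to do with the percentage of questions answered correctly.

```python
def percentile_rank(scores, your_score):
    """Percentage of scores in the data set that fall below your_score."""
    below = sum(1 for s in scores if s < your_score)
    return 100 * below / len(scores)

# Hypothetical class of 20 exam scores (made up for illustration)
scores = [31, 35, 38, 40, 41, 43, 44, 46, 48, 50,
          51, 53, 55, 57, 58, 60, 62, 64, 66, 70]

# A score of 62 out of 100 answered only 62 percent of the questions
# correctly, yet 16 of the 20 scores fall below it
print(percentile_rank(scores, 62))  # 80.0
```

So a 62 on this (apparently brutal) exam sits at the 80th percentile, even though 62 percent correct sounds mediocre on its own.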