If we have a random variable $X$ and any number $a > 0$, what is the probability that $X \geq a$? If we know the mean of $X$, Markov’s inequality can give an upper bound on the probability that $X \geq a$. As it turns out, this upper bound is rather loose, and it can be improved if we know the variance of $X$ in addition to its mean. The improved result is known as Chebyshev’s inequality, after the famous mathematician Pafnuty Lvovich Chebyshev. It was first proved by his student Andrey Markov, who provided a proof in 1884 in his PhD thesis (see Wikipedia).
Let $X$ be a random variable with mean $\mu$ and variance $\sigma^2$. Let $a$ be any positive number. We ask the following question: what is the probability that $X$ will assume a value which is at least $a$ away from its mean? Chebyshev’s inequality says,

$$P(|X - \mu| \geq a) \leq \frac{\sigma^2}{a^2}.$$
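Before deriving it, the inequality is easy to probe numerically. The sketch below uses an exponential distribution purely as an arbitrary test case (my choice, nothing specific to the argument), and compares the empirical frequency of large deviations against the bound $\sigma^2 / a^2$:

```python
import random
import statistics

random.seed(0)

# Sample from an exponential distribution (mean 1, variance 1) and
# compare the empirical frequency of |X - mu| >= a with sigma^2 / a^2.
samples = [random.expovariate(1.0) for _ in range(100_000)]
mu = statistics.fmean(samples)
var = statistics.pvariance(samples, mu)

for a in [1.5, 2.0, 3.0]:
    empirical = sum(abs(x - mu) >= a for x in samples) / len(samples)
    bound = var / a ** 2
    print(f"a = {a:.1f}: empirical = {empirical:.4f} <= bound = {bound:.4f}")
```

The bound holds, but it is visibly loose for this distribution — which is the price of an inequality valid for *every* distribution with the given mean and variance.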
From Markov’s inequality, we know that for a non-negative random variable $Y$ with mean $E[Y]$ and any $b > 0$,

$$P(Y \geq b) \leq \frac{E[Y]}{b}.$$
Now consider the random variable $Y = (X - \mu)^2$. The expected value of this variable is $E[(X - \mu)^2]$ which, by definition, is the variance of $X$. Therefore,

$$E[Y] = \sigma^2.$$
Also, since $(X - \mu)^2 \geq 0$, it follows that $Y$ is a non-negative random variable, so Markov’s inequality applies to it.
According to Markov’s inequality, the probability that $Y \geq b$ satisfies

$$P(Y \geq b) \leq \frac{E[Y]}{b} = \frac{\sigma^2}{b}.$$
Putting $b = a^2$, the above inequality becomes

$$P\left( (X - \mu)^2 \geq a^2 \right) \leq \frac{\sigma^2}{a^2}.$$
Since $(X - \mu)^2 \geq a^2$ occurs precisely when $|X - \mu| \geq a$, it follows that

$$P(|X - \mu| \geq a) \leq \frac{\sigma^2}{a^2},$$

which is exactly Chebyshev’s inequality.
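The chain of steps above can be checked exactly on a toy distribution. The discrete distribution below is my own illustrative choice; it also happens to be the classic example showing that Chebyshev’s bound can hold with equality:

```python
from fractions import Fraction as F

# Exact check of the derivation on a small discrete distribution:
# X takes the values -1, 0, 1 with probabilities 1/4, 1/2, 1/4.
dist = {F(-1): F(1, 4), F(0): F(1, 2), F(1): F(1, 4)}

mu = sum(x * p for x, p in dist.items())               # E[X] = 0
ey = sum((x - mu) ** 2 * p for x, p in dist.items())   # E[Y] = Var(X) = 1/2

a = F(1)
# Markov applied to Y = (X - mu)^2 with b = a^2 gives the Chebyshev bound:
bound = ey / a ** 2                                    # = 1/2

# Exact tail probability P(|X - mu| >= a) = P(X = -1) + P(X = 1) = 1/2
tail = sum(p for x, p in dist.items() if abs(x - mu) >= a)

print(tail, bound)  # prints: 1/2 1/2 -- the bound holds with equality here
```

Because the tail probability equals the bound for this distribution, Chebyshev’s inequality cannot be improved in general without further assumptions.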
What is the use?
Suppose the random variable $X$ takes values between 0 and 1 (inclusive), has mean $0.5$, and variance $1/16$. We ask: what is the probability that $X$ assumes a value $\geq 0.9$?

According to Markov’s inequality,

$$P(X \geq 0.9) \leq \frac{E[X]}{0.9} = \frac{0.5}{0.9} \approx 0.56.$$

However, according to Chebyshev’s inequality,

$$P(X \geq 0.9) \leq P(|X - 0.5| \geq 0.4) \leq \frac{1/16}{(0.4)^2} = \frac{0.0625}{0.16} \approx 0.39,$$

which is a noticeably tighter bound.
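As a quick sanity check, here is a small Monte Carlo simulation. The Beta(1.5, 1.5) distribution is my own arbitrary choice: it lives on $[0, 1]$ with mean $0.5$ and variance $1/16$, so both bounds apply to it:

```python
import random

random.seed(0)

# Beta(1.5, 1.5) lives on [0, 1] with mean 0.5 and variance 1/16.
n = 100_000
samples = [random.betavariate(1.5, 1.5) for _ in range(n)]

markov_bound = 0.5 / 0.9               # E[X] / a
chebyshev_bound = (1 / 16) / 0.4 ** 2  # sigma^2 / (a - mu)^2

empirical = sum(x >= 0.9 for x in samples) / n
print(f"Markov bound:    {markov_bound:.3f}")    # ~0.556
print(f"Chebyshev bound: {chebyshev_bound:.3f}") # ~0.391
print(f"Empirical tail:  {empirical:.3f}")
```

For this particular distribution the true tail probability is far below both bounds — a reminder that Chebyshev’s inequality is worst-case: it must hold for every distribution with the given mean and variance, including much more spread-out ones.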