
In this post I will discuss my own understanding of the spectral sparsification paper by Daniel Spielman and Nikhil Srivastava (STOC 2008). I will assume the following:

  1. The reader is a beginner, like me, and has already glanced through the Spielman-Srivastava paper (from now on, the SS paper).
  2. The reader has, like me, a basic understanding of spectral sparsification and the associated concepts of matrix analysis. I will assume that she has read and understood Section 2 (Preliminaries) of the SS paper.
  3. The reader holds a copy of the SS paper while reading my post.

First, I will state the main theorems (or rather, only what they “roughly” say).

Read the rest of this entry »
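
The heart of the SS paper is a sampling scheme: take q independent samples of edges, picking edge e with probability proportional to w_e R_e (its weight times its effective resistance), and add w_e / (q p_e) to the sampled edge on each hit. The sketch below is entirely my own reading of that scheme, written in Python/NumPy with a dense pseudoinverse, so it is only a toy for small graphs, not the paper's algorithm as implemented.

```python
import numpy as np

def sparsify(n, edges, weights, q, rng=None):
    """Toy effective-resistance sampling (my own sketch, small graphs only)."""
    rng = rng or np.random.default_rng()
    w = np.asarray(weights, dtype=float)

    # Weighted graph Laplacian L = sum_e w_e * b_e b_e^T.
    L = np.zeros((n, n))
    for (u, v), we in zip(edges, w):
        L[u, u] += we; L[v, v] += we
        L[u, v] -= we; L[v, u] -= we
    Lp = np.linalg.pinv(L)  # Moore-Penrose pseudoinverse L^+

    # Effective resistance of edge (u, v): R_e = L^+_uu + L^+_vv - 2 L^+_uv.
    R = np.array([Lp[u, u] + Lp[v, v] - 2 * Lp[u, v] for u, v in edges])

    p = w * R
    p /= p.sum()  # sampling probabilities proportional to w_e * R_e

    # q independent samples; each hit adds w_e / (q p_e) to the sampled edge.
    new_w = np.zeros(len(edges))
    for e in rng.choice(len(edges), size=q, p=p):
        new_w[e] += w[e] / (q * p[e])

    kept = np.nonzero(new_w)[0]
    return [edges[i] for i in kept], new_w[kept]
```

In the paper the number of samples q is on the order of n \log n / \epsilon^2, which is what makes the resulting graph sparse; here it is simply left as a parameter.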

If we have a random variable X and any number a, what is the probability that X \geq a? If we know the mean of X, Markov’s inequality gives an upper bound on this probability. As it turns out, this bound is rather loose, and it can be improved if we know the variance of X in addition to its mean. The improved bound is known as Chebyshev’s inequality, after the famous mathematician Pafnuty Lvovich Chebyshev; his student Andrey Markov gave a proof of it in his 1884 PhD thesis (see Wikipedia).

Read the rest of this entry »
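
To make the “loose versus tight” comparison above concrete, here is a small Python check of my own; the exponential distribution with mean 2 and the threshold a = 6 are arbitrary choices for illustration, not examples from the post.

```python
import numpy as np

# Compare Markov's bound, the Chebyshev-style bound, and a Monte Carlo
# estimate of Pr[X >= a] for X ~ Exponential with mean 2 and a = 6.
rng = np.random.default_rng(0)

mean, a = 2.0, 6.0
var = mean ** 2                        # Var[X] = mean^2 for the exponential

markov = mean / a                      # Pr[X >= a] <= E[X]/a            = 0.333...
chebyshev = var / (a - mean) ** 2      # Pr[X >= a] <= Var[X]/(a - mu)^2 = 0.25

samples = rng.exponential(scale=mean, size=1_000_000)
empirical = np.mean(samples >= a)      # true value is exp(-3), about 0.0498

print(f"Markov bound:      {markov:.4f}")
print(f"Chebyshev bound:   {chebyshev:.4f}")
print(f"Monte Carlo value: {empirical:.4f}")
```

Both bounds hold, but the Chebyshev bound (0.25) is already tighter than the Markov bound (0.333), and neither is close to the true tail probability of about 0.05.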

How likely is it that the absolute value of a random variable X will be at least some specific value, say, a? The Russian mathematician Andrey Andreyevich Markov proved a simple, yet nice, result which lets us bound this probability.

Markov’s Inequality states that if X is a random variable and a > 0, then

Pr[|X| \geq a] \leq \frac{E[|X|]}{a}

Read the rest of this entry »
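
As a quick sanity check of the statement above, here is a short snippet of my own; the standard normal distribution and the grid of thresholds are arbitrary illustrative choices.

```python
import numpy as np

# Empirically verify Pr[|X| >= a] <= E[|X|]/a for a standard normal X.
rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)

e_abs = np.mean(np.abs(x))            # estimate of E[|X|], about sqrt(2/pi) ~= 0.798

for a in (0.5, 1.0, 2.0, 3.0):
    tail = np.mean(np.abs(x) >= a)    # empirical Pr[|X| >= a]
    bound = e_abs / a                 # Markov bound
    print(f"a={a:3.1f}   Pr[|X| >= a] = {tail:.4f}   bound = {bound:.4f}")
```

The bound holds at every threshold, and, as expected from a first-moment inequality, it becomes very loose for large a, where the normal tail decays much faster than 1/a.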