
This is an elementary (yet important) fact in matrix analysis.

**Statement**

Let $A$ be an $n \times n$ complex Hermitian matrix, which means $A = A^*$, where $*$ denotes the conjugate transpose operation. Let $\lambda \neq \mu$ be two different eigenvalues of $A$. Let $x, y$ be two eigenvectors of $A$ corresponding to the two eigenvalues $\lambda$ and $\mu$, respectively.

Then the following is true:

$$\langle x, y \rangle = 0.$$

Here $\langle u, v \rangle$ denotes the usual inner product of two vectors $u, v \in \mathbb{C}^n$, i.e.,

$\langle u, v \rangle = \sum_{i=1}^{n} u_i \overline{v_i}$.
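As a quick sanity check (not a proof), here is a small Python snippet on a concrete $2 \times 2$ example; the matrix, the eigenpairs, and the helper `matvec` are our own illustrative choices, not from the post:

```python
# A = [[2, 1], [1, 2]] is real symmetric, hence Hermitian. Its eigenpairs
# are lambda = 3 with x = (1, 1) and mu = 1 with y = (1, -1).
A = [[2, 1], [1, 2]]

def matvec(M, v):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(row[j] * v[j] for j in range(len(v))) for row in M]

x, y = [1, 1], [1, -1]
assert matvec(A, x) == [3 * xi for xi in x]   # A x = 3 x
assert matvec(A, y) == [1 * yi for yi in y]   # A y = 1 y

# Distinct eigenvalues => orthogonal eigenvectors: <x, y> = 0.
inner = sum(xi * yi for xi, yi in zip(x, y))
print(inner)  # 0
```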

We want to show that for $1 \le k \le n$, the following holds:

$$\binom{n}{k} \le \left(\frac{en}{k}\right)^k.$$

First, recall the power series $e^x = \sum_{i \ge 0} \frac{x^i}{i!}$. For $x = k$, this sum is definitely larger than its $k$th term only. In other words, $e^k > \frac{k^k}{k!}$, which implies $\frac{1}{k!} < \left(\frac{e}{k}\right)^k$. Now,

$$\binom{n}{k} = \frac{n(n-1)\cdots(n-k+1)}{k!} \le \frac{n^k}{k!} < n^k \left(\frac{e}{k}\right)^k = \left(\frac{en}{k}\right)^k.$$
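The bound $\binom{n}{k} \le (en/k)^k$ can also be checked numerically for small values (a sanity check, not part of the proof), using Python's standard-library `math.comb`:

```python
from math import comb, e

# Verify C(n, k) <= (e*n/k)^k for all 1 <= k <= n, n up to 30.
for n in range(1, 31):
    for k in range(1, n + 1):
        assert comb(n, k) <= (e * n / k) ** k
print("bound holds")
```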

This proof is extracted from the StackExchange discussion here. For more inequalities/bounds for binomial coefficients, see Wikipedia.

We will show that the eigenvalues of the Laplacian matrix $L$ of the complete graph $K_n$ are $0$ and $n$, where the eigenvalue $0$ has (algebraic) multiplicity $1$ and the eigenvalue $n$ has multiplicity $n-1$.
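Before the proof, a quick check in plain Python on $K_5$ (the helper `matvec` and the chosen test vectors are ours, for illustration only): the all-ones vector has eigenvalue $0$, and any vector with zero coordinate sum has eigenvalue $n$.

```python
def laplacian_complete(n):
    """Laplacian of K_n: degree n-1 on the diagonal, -1 off-diagonal."""
    return [[n - 1 if i == j else -1 for j in range(n)] for i in range(n)]

def matvec(M, v):
    return [sum(row[j] * v[j] for j in range(len(v))) for row in M]

n = 5
L = laplacian_complete(n)

# The all-ones vector is an eigenvector with eigenvalue 0.
ones = [1] * n
assert matvec(L, ones) == [0] * n

# Any vector summing to 0 is an eigenvector with eigenvalue n,
# e.g. e_1 - e_2; such vectors span an (n-1)-dimensional eigenspace.
v = [1, -1, 0, 0, 0]
assert matvec(L, v) == [n * vi for vi in v]
print("checks pass")
```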

In this post, we provide a proof of Kirchhoff’s Matrix Tree theorem [1], which is quite beautiful in our biased opinion. This is a 160-year-old theorem which connects several fundamental concepts of matrix analysis and graph theory (e.g., eigenvalues, determinants, cofactors/minors, the Laplacian matrix, the incidence matrix, connectedness, graphs, and trees). There are already many expository articles/proof sketches/proofs of this theorem (for example, see [4-6, 8, 9]). Here we compile yet another expository article and present the arguments in a way that should be accessible to beginners.

## Statement

Suppose $L$ is the Laplacian of a connected simple graph $G$ with $n$ vertices. Then the number of spanning trees in $G$, denoted by $t(G)$, is the following:

$$t(G) = \frac{1}{n}\,\lambda_1 \lambda_2 \cdots \lambda_{n-1},$$

the product of all non-zero eigenvalues $\lambda_1, \ldots, \lambda_{n-1}$ of $L$, divided by $n$.
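As a numerical illustration of the statement (assuming NumPy; the graph choice is ours): the $4$-cycle $C_4$ has exactly $4$ spanning trees, one for each edge we delete, and the eigenvalue formula agrees.

```python
import numpy as np

# Laplacian of the 4-cycle C4 (vertices 0-1-2-3-0).
L = np.array([[ 2, -1,  0, -1],
              [-1,  2, -1,  0],
              [ 0, -1,  2, -1],
              [-1,  0, -1,  2]], dtype=float)

eigvals = np.linalg.eigvalsh(L)           # sorted: 0, 2, 2, 4
nonzero = [w for w in eigvals if w > 1e-9]
n = L.shape[0]
t = round(np.prod(nonzero) / n)           # (2 * 2 * 4) / 4 = 4
print(t)  # 4
```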

In this post I will discuss my own understanding of the spectral sparsification paper by Daniel Spielman and Nikhil Srivastava (STOC 2008). I will assume the following:

- The reader is a beginner, like me, and has already glanced through the Spielman–Srivastava paper (from now on, the SS paper).
- The reader has, like me, a basic understanding of spectral sparsification and associated concepts of matrix analysis. I will assume that she has read and understood Section 2 (Preliminaries) of the SS paper.
- The reader holds a copy of the SS paper while reading my post.

First, I will mention the main theorems (actually, I will mention only what they “roughly” say).

Stirling Numbers (of the first and second kind) are famous in combinatorics. There are well known recursive formulas for them, and they can be expressed through generating functions. Below we mention and explain the recursive definitions of the Stirling numbers through combinatorial ideas.

Since the Stirling numbers of the second kind are more intuitive, we will start with them.
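The standard recurrences translate directly into Python (a sketch; the function names `stirling1` and `stirling2` are our own):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def stirling2(n, k):
    """Stirling numbers of the second kind: partitions of an n-set into
    k non-empty blocks.  Recurrence: element n either forms its own block,
    or joins one of the k blocks of a partition of the remaining n-1."""
    if n == k:
        return 1
    if k == 0 or k > n:
        return 0
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)

@lru_cache(maxsize=None)
def stirling1(n, k):
    """Unsigned Stirling numbers of the first kind: permutations of n
    elements with exactly k cycles.  Recurrence: element n either forms
    its own cycle, or is inserted after one of the other n-1 elements."""
    if n == k:
        return 1
    if k == 0 or k > n:
        return 0
    return (n - 1) * stirling1(n - 1, k) + stirling1(n - 1, k - 1)

print(stirling2(4, 2))  # 7
print(stirling1(4, 2))  # 11
```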

Here we give an illustrative proof of Kleene’s recursion theorem, a fundamental theorem in computability/recursion theory. The proof is so simple that it can be stated in a few lines. On the other hand, the proof is a beauty in itself (after you get hold of it). Without further ado, we will first state the preliminary definitions/lemmas/theorems and then give the formal statement.

Here we will show a beautiful application of Fermat’s Little Theorem.

An integer $a$ is a quadratic residue with respect to a prime $p$ if $a \equiv x^2 \pmod{p}$ for some integer $x$.

Given a prime $p$ and an integer $a$, how fast can we decide whether $a$ is a quadratic residue modulo $p$? As it turns out (courtesy of Euler), pretty fast.
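The fast test is Euler's criterion: for an odd prime $p$ and $a$ not divisible by $p$, $a$ is a quadratic residue mod $p$ iff $a^{(p-1)/2} \equiv 1 \pmod{p}$. A minimal Python sketch (the function name is ours) using the built-in three-argument `pow`, which does modular exponentiation by repeated squaring:

```python
def is_quadratic_residue(a, p):
    """Euler's criterion for an odd prime p and gcd(a, p) = 1.
    pow(a, e, p) takes O(log e) multiplications mod p."""
    return pow(a, (p - 1) // 2, p) == 1

# The quadratic residues mod 7 are {1, 2, 4} (the squares of 1..6 mod 7).
print([a for a in range(1, 7) if is_quadratic_residue(a, 7)])  # [1, 2, 4]
```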

This is elementary, yet useful and interesting.

**Statement:** All eigenvalues of a Hermitian matrix are real.

**Proof:**