Notes on Graphs and Spectral Properties
Here is the first in a series of notes that I jotted down over the past two months as I tried to make sense of algebraic graph theory. This one focuses on the basic definitions and some properties of matrices related to graphs. Having all the symbols and main properties on a single page is a useful reference as I delve deeper into applications of the theory, and it also saves me from googling and re-checking the relationships between these objects.
Adjacency Matrix
Let $n$ be the number of vertices and $m$ the number of edges. Then the adjacency matrix $A$ of dimension $n \times n$ is a matrix where $A_{ij} = 1$ if there is an edge from vertex $i$ to vertex $j$ and zero otherwise. For a weighted adjacency matrix $W$, we replace the 1s with the weights $w_{ij}$.
Here we consider the case of undirected graphs. This means that the adjacency matrix is symmetric, which implies it has a complete set of real eigenvalues (not necessarily positive) and an orthogonal eigenvector basis. The set of eigenvalues ($\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_n$) is known as the spectrum of the graph.
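To make this concrete, here is a minimal NumPy sketch (the toy graph and variable names are my own, not from the notes) that builds an adjacency matrix and checks the symmetric-matrix claims:

```python
import numpy as np

# Toy undirected graph on 4 vertices
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4

A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1  # undirected => A is symmetric

# Symmetric matrix: real eigenvalues and an orthogonal eigenvector basis
eigvals, eigvecs = np.linalg.eigh(A)
print("spectrum:", np.round(eigvals, 3))                       # note: some are negative
print("orthogonal basis:", np.allclose(eigvecs.T @ eigvecs, np.eye(n)))
```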
Properties
- The greatest eigenvalue, $\lambda_1$, is bounded above by the maximum degree: $\lambda_1 \le d_{\max}$.
- Given two graphs with adjacency matrices $A$ and $B$, the graphs are isomorphic iff there exists a permutation matrix $P$ such that $B = P A P^\top$ (see the sketch below). This implies the same eigenvalues, determinant, trace, etc. (the eigenvectors are related by the permutation). Note: two graphs may be isospectral (same set of eigenvalues) but NOT isomorphic.
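A quick sketch of the permutation-similarity point (toy example, my own construction): relabelling the vertices with a permutation matrix leaves the spectrum unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Random symmetric 0/1 adjacency matrix
A = np.triu(rng.integers(0, 2, size=(n, n)), k=1)
A = A + A.T

# Relabel the vertices: B = P A P^T for a permutation matrix P
perm = rng.permutation(n)
P = np.eye(n)[perm]
B = P @ A @ P.T

# Isomorphic graphs are isospectral (the converse does not hold)
print(np.allclose(np.linalg.eigvalsh(A), np.linalg.eigvalsh(B)))
```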
Incidence Matrix
An incidence matrix $B$ is of dimension $n \times m$, with $B_{ij} = 1$ if edge $e_j$ leaves vertex $v_i$, $B_{ij} = -1$ if edge $e_j$ enters vertex $v_i$, and zero otherwise. In other words, each column represents an edge and shows the vertex it is emitting from (1) and the vertex it is pointing to (-1).
For an undirected graph, there are two kinds of incidence matrix: oriented and unoriented. In the unoriented incidence matrix, we just put 1 for any vertex that is connected to an edge. The oriented incidence matrix is similar to that of a directed graph (1 and -1, with an arbitrary orientation chosen for each edge) and is unique up to the negation of columns.
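For illustration, a small sketch (path graph chosen arbitrarily) of the two incidence matrices:

```python
import numpy as np

# Path graph 0 - 1 - 2, with an arbitrary orientation for each edge
edges = [(0, 1), (1, 2)]
n, m = 3, len(edges)

B_oriented = np.zeros((n, m))
B_unoriented = np.zeros((n, m))
for k, (i, j) in enumerate(edges):
    B_oriented[i, k], B_oriented[j, k] = 1, -1   # +1 at the tail, -1 at the head
    B_unoriented[i, k] = B_unoriented[j, k] = 1  # just mark both endpoints

print(B_oriented)
print(B_unoriented)
```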
Laplacian Matrix
The Laplacian matrix is defined as $L = D - A$, i.e. the degree matrix $D$ minus the adjacency matrix $A$. Hence, the diagonal entries are the degrees, $L_{ii} = d_i$, while $L_{ij} = -1$ if $v_i$ and $v_j$ are connected, else 0.
Note:
- $L = B B^\top$, where $B$ is the oriented incidence matrix (see the sketch after these notes).
- The degree matrix is defined as $D_{ii} = d_i = \sum_j A_{ij}$, with zeros off the diagonal.
- For a weighted degree matrix, the diagonal element is $D_{ii} = \sum_j w_{ij}$.
- The conventional ordering of eigenvalues is opposite to the adjacency matrix! ($0 = \lambda_1 \le \lambda_2 \le \cdots \le \lambda_n$.)
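Here is the check promised above, a minimal NumPy sketch (4-cycle chosen arbitrarily) verifying $L = D - A = B B^\top$ and the eigenvalue ordering:

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]  # a 4-cycle
n, m = 4, len(edges)

A = np.zeros((n, n))
B = np.zeros((n, m))
for k, (i, j) in enumerate(edges):
    A[i, j] = A[j, i] = 1
    B[i, k], B[j, k] = 1, -1   # oriented incidence matrix

D = np.diag(A.sum(axis=1))     # degree matrix, D_ii = d_i
L = D - A

print(np.allclose(L, B @ B.T))             # L = B B^T
print(np.round(np.linalg.eigvalsh(L), 3))  # 0 = lambda_1 <= lambda_2 <= ...
```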
Walks
A walk on a graph is an alternating sequence of vertices and edges from one vertex to another. A walk between two vertices $u$ and $v$ is called a $uv$-walk. Its length is the number of edges.
Cool fact: take the adjacency matrix and raise it to the $k$-th power; then the entry $(A^k)_{ij}$ gives the number of walks of length $k$ from $v_i$ to $v_j$. Normalising each row by the degree, i.e. using powers of the random walk transition matrix $P = D^{-1}A$, the entry $(P^k)_{ij}$ gives the probability that starting from $v_i$, you will end up at $v_j$ after $k$ steps.
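A small sketch of the walk-counting fact and its random-walk normalisation (toy graph and choice of $k$ are my own):

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
n = 4
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1

k = 3
print(np.linalg.matrix_power(A, k))   # (A^k)_ij = number of walks of length k from i to j

# Transition matrix P = D^{-1} A: each row of A divided by the degree
P = A / A.sum(axis=1, keepdims=True)
Pk = np.linalg.matrix_power(P, k)
print(Pk)              # (P^k)_ij = P(walk started at i is at j after k steps)
print(Pk.sum(axis=1))  # each row sums to 1
```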
Matrices as operators on the vertices
The adjacency and Laplacian matrices can be interpreted as operators on functions on a graph. That is, given $x \in \mathbb{R}^n$, $x$ can be interpreted as a function on the vertices, while $Ax$ is a linear mapping of that function:

$$(Ax)_i = \sum_{j : (i,j) \in E} x_j$$

In other words, it is the sum of the elements of $x$ at the vertices connected to vertex $i$. It can also be viewed as a quadratic form:

$$x^\top A x = 2 \sum_{(i,j) \in E} x_i x_j$$

Similarly, expressing the weighted Laplacian matrix as an operator:

$$(Lx)_i = \sum_j w_{ij} (x_i - x_j)$$

As a quadratic form:

$$x^\top L x = \sum_{(i,j) \in E} w_{ij} (x_i - x_j)^2$$
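To convince myself of the operator and quadratic-form identities, a short NumPy check on a weighted toy graph (weights chosen arbitrarily):

```python
import numpy as np

# Weighted triangle
edges = {(0, 1): 2.0, (1, 2): 0.5, (2, 0): 1.0}
n = 3
W = np.zeros((n, n))
for (i, j), w in edges.items():
    W[i, j] = W[j, i] = w

L = np.diag(W.sum(axis=1)) - W
x = np.array([1.0, -2.0, 0.5])

# (Ax)_i: weighted sum of x over the neighbours of i
print(W @ x)

# Operator form: (Lx)_i = sum_j w_ij (x_i - x_j)
Lx = np.array([sum(W[i, j] * (x[i] - x[j]) for j in range(n)) for i in range(n)])
print(np.allclose(L @ x, Lx))

# Quadratic form: x^T L x = sum over edges of w_ij (x_i - x_j)^2
quad = sum(w * (x[i] - x[j]) ** 2 for (i, j), w in edges.items())
print(np.isclose(x @ L @ x, quad))
```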
The symmetric normalised Laplacian matrix is defined as $L_{sym} = D^{-1/2} L D^{-1/2}$.
Since the degree matrix is a diagonal matrix, $D^{1/2}$ is just the matrix with the diagonal entries square-rooted.
Properties of $L$
- $L$ is symmetric because both $D$ and $A$ are symmetric.
- $\mathbf{1}$ (the all-ones vector) is an eigenvector of the matrix with eigenvalue 0 (the sum of any column is 0), and $x^\top L x \ge 0$ for all $x$, hence 0 is the smallest eigenvalue.
- The eigenvalues are real and non-negative (a quick numerical check follows this list).
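The numerical check promised above (random toy graph; the small tolerance handles floating-point noise):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6

# Random symmetric 0/1 adjacency matrix
A = np.triu(rng.integers(0, 2, size=(n, n)), k=1)
A = A + A.T
L = np.diag(A.sum(axis=1)) - A

print(np.allclose(L @ np.ones(n), 0))           # 1 is an eigenvector with eigenvalue 0
x = rng.standard_normal(n)
print(x @ L @ x >= -1e-12)                      # quadratic form is non-negative
print(np.all(np.linalg.eigvalsh(L) >= -1e-12))  # hence all eigenvalues are >= 0
```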
Laplacian Matrix and Connectedness
Define a path as a walk without any repeated vertices. A graph is connected if any two of its vertices are contained in a path.
In a connected graph, $\lambda_2 > 0$. Proof that the only eigenvector associated with the eigenvalue 0 is $\mathbf{1}$: let $x$ be an eigenvector associated with the eigenvalue 0. From the quadratic form:

$$x^\top L x = \sum_{(i,j) \in E} w_{ij} (x_i - x_j)^2 = 0$$

This implies that $x_i = x_j$ for any edge $(i,j) \in E$. Since there exists a path between any two vertices, for all $i, j$:

$$x_i = x_j, \quad \text{i.e. } x = c\,\mathbf{1}$$

Hence, the multiplicity (number of linearly independent eigenvectors) of the eigenvalue 0 is 1, and $\lambda_2 > 0$.
In fact, the multiplicity of the eigenvalue 0 tells us the number of connected components in the graph. For example, in a graph with two connected components (where the adjacency matrix and the Laplacian matrix have a block-diagonal structure), you will get two eigenvectors associated with the eigenvalue 0, something like $(1, \ldots, 1, 0, \ldots, 0)^\top$ and $(0, \ldots, 0, 1, \ldots, 1)^\top$, the indicator vectors of the two components.
To summarise, the number of connected components is equal to the multiplicity of the eigenvalue 0, which is equal to the dimension of the null space of $L$.
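A quick sketch of the component-counting fact (two-component toy graph of my own):

```python
import numpy as np

# Two components: a triangle {0, 1, 2} and an edge {3, 4}
edges = [(0, 1), (1, 2), (2, 0), (3, 4)]
n = 5
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1
L = np.diag(A.sum(axis=1)) - A

eigvals = np.linalg.eigvalsh(L)
print(np.round(eigvals, 3))
print("multiplicity of eigenvalue 0:", int(np.sum(np.isclose(eigvals, 0))))  # 2 components
```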
Normalised Symmetric Laplacian and Random walk matrix
The normalised symmetric Laplacian is defined as:

$$L_{sym} = D^{-1/2} L D^{-1/2} = I - D^{-1/2} A D^{-1/2}$$

In other words, it has 1 on the diagonals (for vertices of non-zero degree) and $-\frac{1}{\sqrt{d_i d_j}}$ (or $-\frac{w_{ij}}{\sqrt{d_i d_j}}$ in the weighted case) if $v_i$ is adjacent to $v_j$, and 0 otherwise.
The random walk Laplacian is defined as:

$$L_{rw} = D^{-1} L = I - D^{-1} A = I - P,$$

where $P = D^{-1} A$ is the transition matrix of a random walk on the graph.
$L_{sym}$ and $L_{rw}$ are similar matrices:

$$L_{rw} = D^{-1/2} L_{sym} D^{1/2}, \quad \text{and} \quad L_{sym} = D^{1/2} L_{rw} D^{-1/2}$$
Properties of $L$, $L_{sym}$ and $L_{rw}$
- $L$ and $L_{sym}$ are symmetric; all three matrices are positive semi-definite with real, non-negative eigenvalues.
- $L_{sym}$ and $L_{rw}$ share the same eigenvalues. $u$ is an eigenvector of $L_{rw}$ with eigenvalue $\lambda$ iff $w = D^{1/2} u$ is an eigenvector of $L_{sym}$ with eigenvalue $\lambda$.
- $(\lambda, u)$ is a solution of the generalised eigenvalue problem $L u = \lambda D u$ iff $u$ is an eigenvector of $L_{rw}$ for the eigenvalue $\lambda$ iff $D^{1/2} u$ is an eigenvector of $L_{sym}$ for the eigenvalue $\lambda$.
- A similar connection between the connected components and the multiplicity of the eigenvalue 0 can be made with $L_{sym}$ and $L_{rw}$ (see the sketch below).
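And the sketch referenced above, checking the $L_{sym}$ / $L_{rw}$ relations numerically (NumPy plus SciPy's generalised symmetric eigensolver; the graph is an arbitrary toy example):

```python
import numpy as np
from scipy.linalg import eigh  # solves the generalised problem L u = lambda D u

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1
d = A.sum(axis=1)
D = np.diag(d)
L = D - A

D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L_sym = D_inv_sqrt @ L @ D_inv_sqrt   # I - D^{-1/2} A D^{-1/2}
L_rw = np.diag(1.0 / d) @ L           # I - D^{-1} A

# Similar matrices share the same eigenvalues
print(np.allclose(np.sort(np.linalg.eigvals(L_rw).real), np.linalg.eigvalsh(L_sym)))

# Generalised eigenproblem L u = lambda D u gives the eigenpairs of L_rw,
# and w = D^{1/2} u gives the eigenvectors of L_sym
lam, U = eigh(L, D)
print(np.allclose(L_rw @ U, U * lam))   # L_rw u = lambda u
w = np.sqrt(d)[:, None] * U             # w = D^{1/2} u
print(np.allclose(L_sym @ w, w * lam))  # L_sym w = lambda w
```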