
IB Mathematics AI AHL Adjacency theory  MAI Study Notes - New Syllabus


LEARNING OBJECTIVE

  • Adjacency matrices.

Key Concepts: 

  • Adjacency Matrices
  • Transition matrix


ADJACENCY MATRIX

An adjacency matrix, also known as a connection matrix, represents a simple labeled graph. Both the rows and the columns correspond to the graph’s vertices, and the entry in row $v_i$, column $v_j$ is 1 if $v_i$ is joined to $v_j$ by an edge and 0 otherwise. If the graph does not contain self-loops, the diagonal entries are all 0. In the case of an undirected graph, the matrix is symmetric.
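As a minimal sketch, the definition above can be implemented directly; the vertex labels and edge list below are illustrative, not the graph from the example.

```python
# Build the adjacency matrix of a simple undirected graph from an edge list.
# The vertex names and edges here are illustrative examples.
import numpy as np

def adjacency_matrix(vertices, edges):
    """M[i][j] = 1 if vertices[i] and vertices[j] are joined by an edge, else 0."""
    index = {v: i for i, v in enumerate(vertices)}
    M = np.zeros((len(vertices), len(vertices)), dtype=int)
    for u, v in edges:
        M[index[u], index[v]] = 1
        M[index[v], index[u]] = 1  # undirected graph: the matrix is symmetric
    return M

M = adjacency_matrix("ABCD", [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")])
```

Since the graph is simple and undirected, the result has a zero diagonal and is symmetric.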

Example

We can represent the graph G, which has vertices A, B, C, D, E and edges AB, AE, BC, BE, CD, CE and DE, by a matrix:


Solution:

 1s show a connection by an edge.
 0s show no connection.

With the vertices in the order A, B, C, D, E, the matrix

$M=\left(\begin{array}{ccccc}0 & 1 & 0 & 0 & 1 \\ 1 & 0 & 1 & 0 & 1 \\ 0 & 1 & 0 & 1 & 1 \\ 0 & 0 & 1 & 0 & 1 \\ 1 & 1 & 1 & 1 & 0\end{array}\right)$

is called the adjacency matrix (M) of G.

Notice that:

  • The main diagonal contains 0’s (why? Because it’s a simple graph with no loops.)
  • The matrix is symmetric about the main diagonal (why? Because edges are undirected.)
  • The sum in each column (or row) is the degree of the vertex.
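The three properties can be checked numerically; this sketch uses a small five-vertex example graph (the edge list is chosen for illustration).

```python
# Verify the three listed properties of an adjacency matrix with NumPy.
import numpy as np

# Edges of an illustrative simple undirected graph on vertices 0..4.
edges = [(0, 1), (0, 4), (1, 2), (1, 4), (2, 3), (2, 4), (3, 4)]
M = np.zeros((5, 5), dtype=int)
for u, v in edges:
    M[u, v] = M[v, u] = 1

assert (np.diag(M) == 0).all()   # simple graph: zeros on the main diagonal
assert (M == M.T).all()          # undirected: symmetric about the diagonal
degrees = M.sum(axis=0)          # each column (or row) sum is a vertex degree
```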

THE POWERS OF AN ADJACENCY MATRIX

For the adjacency matrix M of the graph above:

$M^2=\left(\begin{array}{lllll}2 & 1 & 2 & 1 & 1 \\ 1 & 3 & 1 & 2 & 2 \\ 2 & 1 & 3 & 1 & 2 \\ 1 & 2 & 1 & 2 & 1 \\ 1 & 2 & 2 & 1 & 4\end{array}\right)$

This matrix shows the number of walks of length 2.

For example:

 there are 2 walks of length 2 from A to A: ABA and AEA
 there is only 1 walk of length 2 from A to B: AEB
 there are 3 walks of length 2 from B to B: BAB, BEB and BCB.

The main diagonal contains the degrees of the vertices. (Why? A walk of length 2 from a vertex back to itself must go out along an edge and return along the same edge, so the number of such walks equals the number of edges at that vertex, i.e. its degree.)

Similarly, $M^3$ shows the numbers of walks of length 3, 

$M^3=\left(\begin{array}{lllll}2 & 5 & 3 & 3 & 6 \\ 5 & 4 & 7 & 3 & 7 \\ 3 & 7 & 4 & 5 & 7 \\ 3 & 3 & 5 & 2 & 6 \\ 6 & 7 & 7 & 6 & 6\end{array}\right)$

Finally, look at the following sums:

 $M+M^{2}$ shows the numbers of walks of length at most 2.
 $M+M^{2}+M^{3}$ shows the numbers of walks of length at most 3.
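The walk counts above can be reproduced numerically. The matrix below is the adjacency matrix of the five-vertex graph G, in vertex order A to E.

```python
# Count walks with powers of the adjacency matrix (vertex order A, B, C, D, E).
import numpy as np

M = np.array([
    [0, 1, 0, 0, 1],
    [1, 0, 1, 0, 1],
    [0, 1, 0, 1, 1],
    [0, 0, 1, 0, 1],
    [1, 1, 1, 1, 0],
])

M2 = M @ M               # entry (i, j): number of walks of length 2 from i to j
M3 = M2 @ M              # entry (i, j): number of walks of length 3 from i to j
at_most_3 = M + M2 + M3  # entry (i, j): number of walks of length at most 3
```

For example, `M2[0, 0] == 2` (the two walks ABA and AEA), and the diagonal of `M2` lists the vertex degrees.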

Weighted Adjacency Tables

 In a weighted graph, each edge has an associated numerical value called its weight. This weight often represents a quantity like distance, cost, or capacity associated with the connection between the vertices. For example, a map showing cities (vertices) and driving distances (weights) between them can be represented as a weighted graph.

Sometimes the edges alone are not enough to represent the information we want. For example, suppose we want to include the distances between 5 cities:

The path ABCD has

$\text{Length = 3 [as it contains 3 edges]}$
$\text{Total weight = 27 [sum of weights 8+9+10=27]}$

We can represent the weighted graph by a weighted adjacency table (note: we say table, not matrix!).
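A weighted adjacency table maps naturally onto a lookup of edge weights. This sketch stores only the three edge weights of path ABCD from the example (8, 9 and 10) and computes the path's length and total weight.

```python
# Weighted edges stored as an (undirected) lookup table; weights from the example.
weights = {("A", "B"): 8, ("B", "C"): 9, ("C", "D"): 10}

def path_stats(path):
    """Return (length, total weight): length counts edges, weight sums them."""
    edges = list(zip(path, path[1:]))
    # Look up each edge in either direction, since the graph is undirected.
    total = sum(weights.get((u, v), weights.get((v, u))) for u, v in edges)
    return len(edges), total

length, total = path_stats("ABCD")  # length 3, total weight 8 + 9 + 10 = 27
```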

Transition Matrices

Transition matrices describe Markov chains, introduced by the Russian mathematician Andrei Markov in the early 20th century. A transition matrix represents the manner in which a system moves from one state to the next. For example, it can represent the chance of a wet or dry day tomorrow, given today’s weather.

Consider the following statement about Traralgon’s weather in spring: “Long-run data suggests that if today is dry, there is a 70% chance that tomorrow will also be dry. Also, if today is wet, there is an 80% chance that tomorrow will also be wet.”

The transition matrix (T) for the above example, with state order (dry, wet) and each column giving tomorrow’s probabilities for a given state today, is:

$T=\left(\begin{array}{cc}0.7 & 0.2 \\ 0.3 & 0.8\end{array}\right)$
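As a sketch, the two conditional probabilities in the statement determine the whole matrix (each column must sum to 1), and iterating it reveals the long-run weather pattern. The column convention T[i][j] = P(tomorrow is state i | today is state j) is assumed here.

```python
# Transition matrix for Traralgon's spring weather, state order (dry, wet).
import numpy as np

T = np.array([
    [0.7, 0.2],   # P(dry tomorrow | dry today), P(dry tomorrow | wet today)
    [0.3, 0.8],   # P(wet tomorrow | dry today), P(wet tomorrow | wet today)
])

state = np.array([1.0, 0.0])   # start from a known dry day
for _ in range(100):
    state = T @ state          # repeatedly apply the transition matrix
# state approaches the steady state (0.4, 0.6): 40% dry days in the long run
```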

Example

In Melbourne there are two major daily newspapers, The Age and the Herald Sun. Readers are fairly
loyal to the newspaper they intend to buy. Records in a particular country town have shown that of
the people who purchase a newspaper every day, 90% of people who buy The Age on one day will
buy it the next day and 80% of people who buy the Herald Sun on one day will buy it the next day.

  • Construct a transition table
  • Construct a transition diagram
  • Construct a transition matrix (T)


Solution:

Transition table

                       Today: The Age   Today: Herald Sun
Tomorrow: The Age            0.9              0.2
Tomorrow: Herald Sun         0.1              0.8

Transition diagram

(Two states, The Age and Herald Sun, with loops labelled 0.9 and 0.8 and arrows between them labelled 0.1 and 0.2.)

Transition matrix (T)

$T=\left(\begin{array}{cc}0.9 & 0.2 \\ 0.1 & 0.8\end{array}\right)$
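A quick numerical check of the long-run market shares implied by the loyalty figures (90% and 80%); the initial 50/50 split below is an arbitrary assumption, since the steady state does not depend on it.

```python
# Newspaper transition matrix, state order (The Age, Herald Sun); columns sum to 1.
import numpy as np

T = np.array([
    [0.9, 0.2],   # buys The Age tomorrow
    [0.1, 0.8],   # buys the Herald Sun tomorrow
])

# Apply T many times to an arbitrary starting split of readers.
shares = np.linalg.matrix_power(T, 100) @ np.array([0.5, 0.5])
# shares approaches (2/3, 1/3): The Age ends up with two thirds of the readers
```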

PageRank

Google’s success is based on an algorithm to rank webpages, PageRank, named after Google co-founder Larry Page.
The basic idea is to determine how likely it is that a web user browsing at random arrives at a given webpage. The webpages are then ranked by these probabilities.

Example

Suppose the internet consisted of only the four webpages A, B, C, D, linked as follows: A links to B, C and D; B links to A and D; C links to A; D links to C.


Solution:

Imagine a surfer following these links at random.

For the probability $\text{PR}_n(A)$ that she is at $A$ (after $n$ steps), we add:

the probability that she was at $B$ (at exactly one time step before), and left for $A$,
→ that’s $\text{PR}_{n-1}(B) \cdot \frac{1}{2}$

the probability that she was at $C$, and left for $A$,
→ that’s $\text{PR}_{n-1}(C) \cdot \frac{1}{1}$

the probability that she was at $D$, and left for $A$,
→ that’s $\text{PR}_{n-1}(D) \cdot \frac{0}{1}$

Hence:

$\text{PR}_n(A) = \text{PR}_{n-1}(B) \cdot \frac{1}{2} + \text{PR}_{n-1}(C) \cdot \frac{1}{1} + \text{PR}_{n-1}(D) \cdot \frac{0}{1}$

We can express this as a Markov matrix (each column shares a page’s rank equally among its outgoing links):

$
\begin{bmatrix}
\text{PR}_n(A) \\
\text{PR}_n(B) \\
\text{PR}_n(C) \\
\text{PR}_n(D)
\end{bmatrix}
=
\begin{bmatrix}
0 & \frac{1}{2} & 1 & 0 \\
\frac{1}{3} & 0 & 0 & 0 \\
\frac{1}{3} & 0 & 0 & 1 \\
\frac{1}{3} & \frac{1}{2} & 0 & 0
\end{bmatrix}
\begin{bmatrix}
\text{PR}_{n-1}(A) \\
\text{PR}_{n-1}(B) \\
\text{PR}_{n-1}(C) \\
\text{PR}_{n-1}(D)
\end{bmatrix}
$
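The recurrence can be run directly as power iteration. The matrix below encodes the link structure of the example, with each page sharing its rank equally among its outgoing links; this sketch omits the damping factor used in the full PageRank algorithm.

```python
# Power iteration for PageRank on the four-page example (order A, B, C, D).
import numpy as np

M = np.array([
    [0,     1 / 2, 1, 0],   # A receives from B (1 of 2 links) and C (only link)
    [1 / 3, 0,     0, 0],   # B receives from A (1 of 3 links)
    [1 / 3, 0,     0, 1],   # C receives from A and from D (D's only link)
    [1 / 3, 1 / 2, 0, 0],   # D receives from A and from B
])

pr = np.full(4, 0.25)       # start the random surfer anywhere with equal chance
for _ in range(200):
    pr = M @ pr             # PR_n = M * PR_(n-1)
# pr converges to the PageRank vector (0.375, 0.125, 0.3125, 0.1875)
```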

The PageRank vector $\begin{bmatrix} \text{PR}(A) \\ \text{PR}(B) \\ \text{PR}(C) \\ \text{PR}(D) \end{bmatrix}$ is the long-term equilibrium.

It is an eigenvector of the Markov matrix with eigenvalue 1.

To find this vector, solve:

$
\left( M - I \right) \vec{v} = \vec{0}
\quad \text{where} \quad
M =
\begin{bmatrix}
0 & \frac{1}{2} & 1 & 0 \\
\frac{1}{3} & 0 & 0 & 0 \\
\frac{1}{3} & 0 & 0 & 1 \\
\frac{1}{3} & \frac{1}{2} & 0 & 0
\end{bmatrix}
$

Perform row reduction on:

$
\begin{bmatrix}
-1 & \frac{1}{2} & 1 & 0 \\
\frac{1}{3} & -1 & 0 & 0 \\
\frac{1}{3} & 0 & -1 & 1 \\
\frac{1}{3} & \frac{1}{2} & 0 & -1
\end{bmatrix}
\xrightarrow{\text{RREF}}
\begin{bmatrix}
1 & 0 & 0 & -2 \\
0 & 1 & 0 & -\frac{2}{3} \\
0 & 0 & 1 & -\frac{5}{3} \\
0 & 0 & 0 & 0
\end{bmatrix}
$

$
\Rightarrow \text{Eigenspace of } \lambda = 1 \text{ is spanned by }
\begin{bmatrix}
2 \\
\frac{2}{3} \\
\frac{5}{3} \\
1
\end{bmatrix}
\text{, or equivalently }
\begin{bmatrix}
6 \\
2 \\
5 \\
3
\end{bmatrix}
$

The PageRank vector is obtained by scaling this eigenvector so that its entries sum to 1:

$
\begin{bmatrix}
\text{PR}(A) \\
\text{PR}(B) \\
\text{PR}(C) \\
\text{PR}(D)
\end{bmatrix}
=
\frac{1}{16}
\begin{bmatrix}
6 \\
2 \\
5 \\
3
\end{bmatrix}
=
\begin{bmatrix}
0.375 \\
0.125 \\
0.3125 \\
0.1875
\end{bmatrix}
$
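The same equilibrium can be extracted with a numerical eigensolver instead of row reduction; a minimal sketch:

```python
# Find the eigenvector of eigenvalue 1 and normalise it into probabilities.
import numpy as np

M = np.array([
    [0,     1 / 2, 1, 0],
    [1 / 3, 0,     0, 0],
    [1 / 3, 0,     0, 1],
    [1 / 3, 1 / 2, 0, 0],
])

vals, vecs = np.linalg.eig(M)
k = int(np.argmin(np.abs(vals - 1)))   # index of the eigenvalue closest to 1
v = np.real(vecs[:, k])
pagerank = v / v.sum()                 # scale so the probabilities sum to 1
```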

Final Ranking:

The corresponding ranking of the webpages is:

$\mathrm{A > C > D > B}$

Connection to Markov Chains

The concepts of adjacency matrices and transition matrices in graph theory have a strong connection to the study of Markov chains in probability theory.

Essentially, a Markov chain can be visually and mathematically represented as a weighted directed graph. In this representation:

  • Vertices correspond to the different states of the Markov chain.
  • Edges illustrate the possible transitions between these states.
  • Edge weights represent the transition probabilities: the likelihood of moving from one state to another along a specific edge.

This graphical representation provides an intuitive way to understand the dynamics and transitions within a Markov chain.
