MATH 371
Graph Theory — Quick-recall theorem reference
Theorem 1.1 — Handshaking Lemma
$$\sum_{v \in V} \deg(v) = 2|E|$$
Every edge has 2 endpoints. When we count degrees, each edge gets counted once at each endpoint — so every edge is counted exactly twice!
Let $S = \sum_{v \in V} \deg(v)$. Each edge $\{u,v\}$ contributes +1 to $\deg(u)$ and +1 to $\deg(v)$, so contributes exactly 2 to $S$.
Therefore $S = 2|E|$. ∎
Corollary: Since $2|E|$ is even, the number of odd-degree vertices must be even.
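The double count is easy to replicate by machine. A minimal Python sketch (the edge list and the helper name `degree_sum` are just for illustration):

```python
from collections import defaultdict

def degree_sum(edges):
    """Sum of all vertex degrees of a simple graph given as an edge list."""
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1  # each edge adds 1 to the degree of one endpoint...
        deg[v] += 1  # ...and 1 to the other, so it contributes exactly 2
    return sum(deg.values())

edges = [("a", "b"), ("b", "c"), ("c", "a"), ("c", "d")]
assert degree_sum(edges) == 2 * len(edges)  # the Handshaking Lemma
```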
Theorem 1.2 — Walk Contains Path
Every $u$-$v$ walk contains a $u$-$v$ path.
If a walk revisits a vertex, you took a detour — cut out the loop and you get a shorter walk. Keep cutting until no vertex repeats.
Walk: a → b → c → b → d → e → d → f (vertices b, d repeat!)
Strong induction on walk length $k$.
Base: $k \le 2$: no room for repeated vertex, so walk is already a path.
Step: If $w_j = w_r$ for $j < r$, skip $w_{j+1}, \ldots, w_r$ to get a shorter walk. By induction, it contains a path. ∎
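The loop-cutting argument runs directly as code. A sketch in Python (`walk_to_path` is a hypothetical helper name): whenever a vertex repeats, we cut the detour back to its first occurrence.

```python
def walk_to_path(walk):
    """Extract a u-v path from a u-v walk by repeatedly cutting out loops,
    mirroring the induction: a repeated vertex means a detour to delete."""
    path = []
    for w in walk:
        if w in path:
            # vertex seen before: drop everything back to its first visit
            path = path[:path.index(w)]
        path.append(w)
    return path

# The walk from the example above: b and d repeat, and both loops get cut.
assert walk_to_path(["a", "b", "c", "b", "d", "e", "d", "f"]) == ["a", "b", "d", "f"]
```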
Theorem 1.3 — Bipartite Characterization
$G$ is bipartite $\iff$ $G$ has no odd cycles.
Walk along a cycle alternating colors: Red, Blue, Red, Blue... Even cycle → you land on your starting color. Odd cycle → you need a DIFFERENT color — clash!
(⇒) Bipartite ⇒ parts $X, Y$, every edge crosses. Any cycle alternates $X, Y, X, Y\ldots$ — to return to the start set, must have even length.
(⇐) No odd cycles ⇒ pick $v$, set $X = \{$even-dist from $v\}$, $Y = \{$odd-dist$\}$. If two vertices of $X$ (or of $Y$) were adjacent, their two paths from $v$ plus that edge would form a closed walk of odd length, which always contains an odd cycle. Contradiction. ∎
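The (⇐) direction is exactly the BFS 2-coloring test. A sketch in Python (graphs as adjacency dicts; `two_color` is a hypothetical helper name):

```python
from collections import deque

def two_color(adj):
    """Try to 2-color a graph (dict: vertex -> list of neighbors).
    Returns a coloring dict if bipartite, None if an odd cycle forces a clash."""
    color = {}
    for start in adj:
        if start in color:
            continue
        color[start] = 0
        q = deque([start])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in color:
                    color[v] = 1 - color[u]   # alternate colors layer by layer
                    q.append(v)
                elif color[v] == color[u]:    # same-color edge => odd cycle
                    return None
    return color

c4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}  # even cycle: bipartite
c3 = {0: [1, 2], 1: [0, 2], 2: [0, 1]}             # odd cycle: clash
assert two_color(c4) is not None and two_color(c3) is None
```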
Theorem 1.4 — Radius and Diameter Bounds
For any connected graph $G$: $$\text{rad}(G) \le \text{diam}(G) \le 2 \cdot \text{rad}(G)$$
The diameter is always between the radius and twice the radius. You can't have a graph where the farthest pair is more than twice the "best center" distance.
$\text{rad}(G) \le \text{diam}(G)$ is true by definition (min $\le$ max).
For $\text{diam}(G) \le 2 \cdot \text{rad}(G)$: let $c$ be a center vertex, $u$ and $v$ the farthest pair:
$\text{diam}(G) = d(u,v) \le d(u,c) + d(c,v) \le 2 \cdot \text{ecc}(c) = 2 \cdot \text{rad}(G)$. ∎
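Both quantities come from one BFS per vertex. A sketch in Python (`eccentricities` is a hypothetical helper; connected graph assumed):

```python
from collections import deque

def eccentricities(adj):
    """BFS from every vertex; ecc(v) = max distance from v.
    rad = min of these, diam = max (connected graph assumed)."""
    ecc = {}
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        ecc[s] = max(dist.values())
    return ecc

# Path graph 0-1-2-3: rad = 2 (centers 1 and 2), diam = 3, and diam <= 2*rad.
path4 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
e = eccentricities(path4)
assert min(e.values()) == 2 and max(e.values()) == 3
```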
Theorem 1.5 — Every Graph is a Center
Every graph $G$ is isomorphic to the center of some graph $H$.
No matter what graph $G$ you have, you can "wrap" it inside a bigger graph $H$ by adding antenna-like vertices on the outside. This forces the outside vertices to have higher eccentricity than anything inside $G$.
Construct $H$ by adding 4 new vertices $w, x, y, z$ to $G$. Connect $w-x$ and $y-z$. Connect $x$ and $y$ to every vertex in $G$.
Now in $H$: any $v \in G$ has $\text{ecc}(v)=2$, $\text{ecc}(x)=\text{ecc}(y)=3$, and $\text{ecc}(w)=\text{ecc}(z)=4$. So the center of $H$ is exactly the copy of $G$. ∎
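The construction is small enough to verify by machine. A sketch, assuming $G$ is an adjacency dict whose vertices are not named w, x, y, z (the helper names are illustrative):

```python
from collections import deque

def ecc(adj, s):
    """Eccentricity of s via BFS (connected graph assumed)."""
    dist = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return max(dist.values())

def embed_as_center(g_adj):
    """Build H: add edges w-x and y-z, then join x and y to every vertex of G."""
    h = {u: set(nbrs) for u, nbrs in g_adj.items()}
    h["w"], h["x"], h["y"], h["z"] = {"x"}, {"w"}, {"z"}, {"y"}
    for u in g_adj:
        h["x"].add(u); h[u].add("x")
        h["y"].add(u); h[u].add("y")
    return h

g = {0: {1}, 1: {0}}            # any graph G works; here a single edge
h = embed_as_center(g)
assert all(ecc(h, u) == 2 for u in g)       # G sits at eccentricity 2
assert ecc(h, "x") == ecc(h, "y") == 3
assert ecc(h, "w") == ecc(h, "z") == 4      # so G alone is the center of H
```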
Theorem 1.6 — Periphery Characterization
A graph $G$ is the periphery of some graph $\iff$ either every vertex has eccentricity $1$, or no vertex has eccentricity $1$.
A graph can be made into the "outermost" rim of a bigger graph exactly when it doesn't have a mix of "adjacent-to-everything" and "not-adjacent-to-everything" vertices.
$(\impliedby)$ If every vertex has ecc 1, $G$ is complete and is its own periphery. If no vertex has ecc 1, add a single universal vertex $w$. Now $w$ has ecc 1 and every vertex of $G$ has ecc 2, making $G$ the periphery.
$(\implies)$ Ruling out the mixed case: if some but not all vertices had ecc 1, the entire graph $G$ could not be pushed out to the same periphery distance, because the ecc-1 vertices will always be closer to the center than the others. ∎
Theorem 1.7 — Adjacency Matrix & Walks
Let $G$ have adjacency matrix $A$. The $(i,j)$ entry of $A^k$ equals the number of walks of length $k$ from $v_i$ to $v_j$.
Matrix multiplication secretly counts walks! Raising $A$ to the $k$-th power tells you exactly how many ways to walk $k$ steps between any two vertices.
Showing A¹ (1-step walks — just the edges!)
Induction on $k$. Base: $k=1$, $A^1=A$, $(i,j)$ is 1 if edge exists, 0 otherwise.
Step: A $k$-walk from $v_i$ to $v_j$ is a $(k-1)$-walk from $v_i$ to some $v_h$, plus the edge $v_h v_j$. Summing $[A^{k-1}]_{ih} \cdot A_{hj}$ over all $h$ is exactly the formula for matrix multiplication $[A^k]_{ij}$. ∎
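The inductive step is literally the matrix-product formula, so the theorem can be checked with a pure-Python matrix power (helper names are illustrative):

```python
def matmul(A, B):
    """Product of two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][h] * B[h][j] for h in range(n)) for j in range(n)]
            for i in range(n)]

def walk_counts(A, k):
    """[A^k]_{ij} = number of walks of length k from v_i to v_j (k >= 1)."""
    P = A
    for _ in range(k - 1):
        P = matmul(P, A)
    return P

# Triangle K3: between two distinct vertices there is exactly one 2-walk
# (through the third vertex); from a vertex back to itself there are two.
K3 = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
assert walk_counts(K3, 2) == [[2, 1, 1], [1, 2, 1], [1, 1, 2]]
```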
Theorem 1.9 — Distance Metrics from Matrices
Define $S_k = I + A + A^2 + \cdots + A^k$.
1. $\text{ecc}(v_j) = k$ (first $S_k$ where row $j$ has no zeros)
2. $\text{rad}(G) = r$ (first $S_r$ with at least one strictly positive row)
3. $\text{diam}(G) = m$ (first $S_m$ where EVERY entry is strictly positive)
$S_k$ accumulates all walks of length $\le k$. Once $[S_k]_{j,i} > 0$, vertex $j$ can reach $i$. When a row fills up, that vertex can reach everyone. When the whole matrix fills up, everyone can reach everyone!
For all $i,j$, $[S_k]_{j,i} > 0$ iff there is a walk from $v_j$ to $v_i$ of length at most $k$ (because $S_k = I + A + \cdots + A^k$ accumulates walk counts).
(1) Eccentricity: The first $k$ where row $j$ has no zeros means $v_j$ can reach every vertex within $k$ steps, and this fails for $k-1$. Hence $\text{ecc}(v_j)=k$.
(2) Radius: Radius is the minimum eccentricity, so it is the first $r$ where some row becomes strictly positive.
(3) Diameter: Diameter is the maximum eccentricity, so it is the first $m$ where every row (equivalently every entry) is strictly positive. ∎
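The three read-offs can be mechanized by accumulating powers of $A$. A sketch in Python (connected graph with at least 2 vertices assumed; the helper name is illustrative):

```python
def ecc_rad_diam(A):
    """Track S_k = I + A + ... + A^k; ecc(v_j) is the first k at which row j
    of S_k has no zeros. Connected graph with >= 2 vertices assumed."""
    n = len(A)
    S = [[1 if i == j else 0 for j in range(n)] for i in range(n)]  # S_0 = I
    P = [row[:] for row in S]                                       # A^0 = I
    ecc = [None] * n
    k = 0
    while any(e is None for e in ecc):
        k += 1
        P = [[sum(P[i][h] * A[h][j] for h in range(n)) for j in range(n)]
             for i in range(n)]                                     # now P = A^k
        for i in range(n):
            S[i] = [S[i][j] + P[i][j] for j in range(n)]
            if ecc[i] is None and all(x > 0 for x in S[i]):
                ecc[i] = k                # row i just filled up
    return ecc, min(ecc), max(ecc)        # eccentricities, radius, diameter

# Path v0-v1-v2-v3: ecc = [3, 2, 2, 3], rad = 2, diam = 3.
A = [[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]]
assert ecc_rad_diam(A) == ([3, 2, 2, 3], 2, 3)
```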
Theorem 1.10 — Tree Edge Count
A tree of order $n$ has exactly $n - 1$ edges.
Remove any edge from a tree → it splits into 2 smaller trees. Add up their edges + 1 = the original count. By induction: always $n-1$.
Tree with 7 vertices, 6 edges (= 7 - 1 ✓)
Strong induction on $n$.
Base: $n=1$: 0 edges. ✓
Step: For $n \ge 2$, remove any edge $e$. The tree splits into $T_1, T_2$ with $n_1 + n_2 = n$ vertices. By hypothesis the pieces have $n_1 - 1$ and $n_2 - 1$ edges, so the total is $(n_1-1) + (n_2-1) + 1 = n - 1$. ∎
Theorem 1.11 — Forest Edge Count
If $F$ is a forest of order $n$ containing $k$ connected components, then $F$ has $n - k$ edges.
A forest is a collection of trees. Each tree uses $n_i - 1$ edges, so we "lose" one edge per component.
Each component $T_i$ is a tree with $n_i$ vertices $\implies n_i - 1$ edges.
Total edges = $\sum (n_i - 1) = (\sum n_i) - k = n - k$. ∎
Theorems 1.12 & 1.13 — Tree Equivalences
A graph of order $n$ is a tree $\iff$ it is connected and has $n - 1$ edges $\iff$ it is acyclic and has $n - 1$ edges.
To check if something is a tree, you don't need to check BOTH "connected" and "no cycles". The magic edge count ($n-1$) lets you swap one check for the other!
1.12 (Connected + $n-1$ edges): If the graph had a cycle, we could delete a cycle edge without breaking connectivity, repeating until a tree remains. That tree would have strictly fewer than $n-1$ edges, contradicting Theorem 1.10. So the graph is acyclic, hence a tree.
1.13 (Acyclic + $n-1$ edges): Acyclic means it's a forest with $k$ components. Edges = $n-k$. But we are given edges = $n-1$. Therefore $k=1$, meaning it's connected. ∎
Theorem 1.14 — Every Tree Has ≥ 2 Leaves
If $T$ is a tree with $n \ge 2$, then $T$ has at least two leaves.
The endpoints of the longest path MUST be leaves — if either had another neighbor, you could extend the path, contradicting "longest."
Let $P=u_0,u_1,\ldots,u_\ell$ be a longest path in $T$.
Suppose $\deg(u_0)\ge 2$. Then $u_0$ has a neighbor $w\neq u_1$. In a tree, $w$ cannot be elsewhere on $P$ (that would create a cycle), so $w\notin V(P)$. Then $w,u_0,u_1,\ldots,u_\ell$ is a path longer than $P$, contradiction.
So $\deg(u_0)=1$, i.e. $u_0$ is a leaf. The same argument at $u_\ell$ shows $\deg(u_\ell)=1$. Therefore $T$ has at least two leaves. ∎
Theorem 1.15 — Center of a Tree
In any tree, the center is a single vertex or two adjacent vertices.
Peel off ALL leaves → peel new leaves → repeat. You always end up with 1 or 2 vertices. That's the center!
In a tree, every vertex's most distant vertex is a leaf, so removing all leaves decreases every remaining vertex's eccentricity by exactly 1. Since all eccentricities shift uniformly, the set of minimizers (the center) is preserved. Eventually you reach $K_1$ or $K_2$. ∎
Theorem 1.16 — Subtrees in High Degree Graphs
Let $T$ be a tree with $k$ edges. If $\delta(G) \ge k$, then $G$ contains $T$ as a subgraph.
If every vertex in $G$ has degree $\ge k$, then any tree with $k$ edges can be found hiding inside $G$. Highly connected graphs contain all small trees.
Proof by induction on $k$:
- Base: $k = 0 \Rightarrow T = K_1$ (single vertex), trivially a subgraph. $k = 1 \Rightarrow T = K_2$, and $\delta(G) \ge 1$ means $G$ has at least one edge.
- Inductive step: Assume true for trees with $k-1$ edges.
- $T$ has a leaf $v$ (by Theorem 1.14). Let $w$ be its neighbor.
- $T - v$ has $k-1$ edges. Since $\delta(G) \ge k \ge k-1$, by induction $T - v$ is a subgraph of $G$.
- In $G$, vertex $w$ (from the copy of $T-v$) has degree $\ge k$, but only $k-1$ vertices of $T-v$ are using its edges. So $w$ has a neighbor $u$ outside the copy.
- Attach $u$ as the leaf $\Rightarrow$ we've found $T$ inside $G$. ∎
Theorem 1.17 — Kruskal's Algorithm is Optimal
Kruskal's algorithm computes a minimum weight spanning tree.
Greedy actually works here! Always picking the cheapest non-cycle-forming edge builds the globally cheapest tree.
Proof by contradiction:
- Let $T$ = Kruskal's result, with edges added as $e_1, e_2, \dots, e_{n-1}$.
- Assume $T$ is not an MST. Pick the MST $T'$ that agrees with $T$ for the most edges.
- Let $e_{k+1}$ be the first edge where they differ ($T'$ has $e_1, \dots, e_k$ but not $e_{k+1}$).
- Adding $e_{k+1}$ to $T'$ creates a cycle $C$. This cycle has an edge $e' \notin T$ (since $T$ has no cycles).
- Kruskal's chose $e_{k+1}$ while $e'$ was still available (adding $e'$ to $e_1, \dots, e_k$ creates no cycle, since all of these edges lie in $T'$), so $w(e_{k+1}) \le w(e')$.
- Swap: $T' + e_{k+1} - e'$ is a spanning tree with weight $\le$ weight of $T'$, and it agrees with $T$ on one more edge — contradicting our choice of $T'$! ∎
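A compact sketch of Kruskal's in Python with union–find (the "same root" test replaces an explicit cycle check; names are illustrative):

```python
def kruskal(n, edges):
    """edges: list of (weight, u, v) with vertices 0..n-1.
    Greedily add the cheapest edge joining two different components."""
    parent = list(range(n))

    def find(x):                    # union-find root with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst, total = [], 0
    for w, u, v in sorted(edges):   # cheapest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                # same root would mean a cycle: skip
            parent[ru] = rv
            mst.append((u, v))
            total += w
    return mst, total

edges = [(1, 0, 1), (2, 1, 2), (3, 0, 2), (4, 2, 3)]
mst, total = kruskal(4, edges)
assert len(mst) == 3 and total == 1 + 2 + 4   # the weight-3 edge closes a cycle
```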
Theorem 1.18 — Cayley's Tree Formula
There are $n^{n-2}$ distinct labeled trees of order $n$.
If you label $n$ vertices uniquely, you can wire them into a tree in exactly $n^{n-2}$ ways. E.g., 3 vertices $= 3^1 = 3$ trees.
Proof via Prüfer sequences (bijection).
Tree $\to$ sequence: Repeat $n-2$ times: choose the smallest labeled leaf, record its neighbor, then delete that leaf.
Sequence $\to$ tree: Start with labels $A=\{1,\dots,n\}$. Reading the sequence left to right, each step connects the smallest label in $A$ not appearing later in the sequence to the current sequence entry, then removes that leaf label from $A$. When the sequence ends, connect the last two labels in $A$.
These constructions are inverse to each other, so labeled trees on $n$ vertices are in bijection with sequences of length $n-2$ over $\{1,\dots,n\}$. Therefore the count is $$n^{n-2}.$$ Also, each vertex $v$ appears exactly $\deg(v)-1$ times in the Prüfer sequence, so sequence length is $$\sum_{v}(\deg(v)-1)=2(n-1)-n=n-2.$$ ∎
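The tree $\to$ sequence direction as a sketch in Python (labels $1,\dots,n$; `prufer` is a hypothetical helper name):

```python
def prufer(tree_edges, n):
    """Tree -> Prüfer sequence: repeat n-2 times: record the neighbor of the
    smallest-labeled leaf, then delete that leaf."""
    adj = {v: set() for v in range(1, n + 1)}
    for u, v in tree_edges:
        adj[u].add(v)
        adj[v].add(u)
    seq = []
    for _ in range(n - 2):
        leaf = min(v for v in adj if len(adj[v]) == 1)
        nbr = next(iter(adj[leaf]))
        seq.append(nbr)              # record the leaf's unique neighbor
        adj[nbr].remove(leaf)        # then delete the leaf
        del adj[leaf]
    return seq

# Star K_{1,3} centered at 1: every deleted leaf reports neighbor 1,
# and the center (degree 3) appears deg - 1 = 2 times, as the proof predicts.
assert prufer([(1, 2), (1, 3), (1, 4)], 4) == [1, 1]
```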
Theorem 1.19 — Matrix Tree Theorem
The number of spanning trees of $G$ is any cofactor of its Laplacian matrix $L = D - A$.
You can count spanning trees purely with linear algebra! Build the Laplacian, delete one row and column, and take the determinant.
The recipe:
- Build $A$ (adjacency matrix) and $D$ (diagonal matrix of degrees)
- Compute $L = D - A$ (called the Laplacian)
- Delete any one row and matching column from $L$
- The determinant of what's left = number of spanning trees
Let $G$ have $n$ vertices and $k$ edges. Define the incidence matrix $N$ ($n \times k$): $[N]_{i,j} = 1$ if vertex $v_i$ touches edge $f_j$, else 0. Each column has exactly two 1s.
Create $M$ from $N$ by flipping the top 1 in each column to $-1$ (directed edges).
Claim A: $MM^T = D - A$.
- Diagonal ($i=j$): counts degrees.
- Off-diagonal (edge): $+1 \cdot -1 = -1$.
- Off-diagonal (no edge): 0.
Claim B: Take any subgraph $H$ with $n$ vertices and $n-1$ edges. Delete row $p$ from $M$, keep only columns for $H$'s edges to get $M'$.
- If $H$ is a tree: $|\det(M')| = 1$
- If $H$ is not a tree: $\det(M') = 0$
Putting it together:
Since all row/column sums of $D-A$ are 0, all cofactors are equal. Pick the $(1,1)$ cofactor:
$\det((D-A)_{(1|1)}) = \det(M_1 M_1^T)$, where $M_1$ is $M$ with row 1 deleted.
By the Cauchy–Binet formula, this equals the sum of $\det(M')^2$ over all $(n-1)$-column subsets. Each spanning tree contributes $1^2 = 1$ and non-trees contribute 0. $\Rightarrow$ Cofactor = number of spanning trees. ∎
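The recipe can be checked numerically. A sketch in Python using exact rational arithmetic for the determinant (helper names are illustrative):

```python
from fractions import Fraction

def det(M):
    """Determinant via Gaussian elimination over exact rationals."""
    M = [[Fraction(x) for x in row] for row in M]
    n, sign = len(M), 1
    prod = Fraction(1)
    for c in range(n):
        piv = next((r for r in range(c, n) if M[r][c] != 0), None)
        if piv is None:
            return 0                       # singular: a zero column remains
        if piv != c:
            M[c], M[piv] = M[piv], M[c]    # row swap flips the sign
            sign = -sign
        prod *= M[c][c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            M[r] = [a - f * b for a, b in zip(M[r], M[c])]
    return int(sign * prod)

def spanning_trees(adj):
    """Matrix-Tree recipe: delete row/column 0 of L = D - A, take the det."""
    vs = sorted(adj)
    L = [[len(adj[u]) if u == v else -(v in adj[u]) for v in vs] for u in vs]
    minor = [row[1:] for row in L[1:]]
    return det(minor)

k4 = {i: {j for j in range(4) if j != i} for i in range(4)}
assert spanning_trees(k4) == 16   # matches Cayley: 4^{4-2} = 16
```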
Theorem 1.20 — Eulerian Characterization
Let $G$ be a connected graph. The following are equivalent:
1. $G$ is Eulerian (has a closed trail containing all edges)
2. Every vertex of $G$ has even degree
3. The edge set $E(G)$ can be partitioned into cycles
Euler circuit means every time you enter a vertex you must leave it, so degree parity is even. Even degrees let you peel cycles, and cycle pieces can be spliced into one big circuit.
(1 $\Rightarrow$ 2) In an Eulerian circuit, each visit to a vertex uses one entering and one leaving edge. Edges at every vertex are paired, so every $\deg(v)$ is even.
(2 $\Rightarrow$ 3) Induct on $|E|$. If all degrees are even and $G$ is connected, $G$ contains a cycle $C$. Remove edges of $C$. Remaining components still have all even degrees; by induction each component's edges partition into cycles. Add back $C$ to get a cycle partition of $E(G)$.
(3 $\Rightarrow$ 1) Take a longest circuit formed by splicing cycles from the partition. If unused edges remain, connectivity gives an unused cycle meeting the circuit at some vertex; splice it in to obtain a longer circuit, contradiction. Hence one circuit uses every edge, so $G$ is Eulerian. ∎
Corollary (Euler trail): A connected graph has an Euler trail iff it has at most two odd-degree vertices.
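The splicing idea in (3 ⇒ 1) is exactly Hierholzer's algorithm for constructing the circuit: walk until stuck (even degrees guarantee you get stuck only back at the start), then splice in detours. A sketch in Python (simple connected graph with all degrees even assumed):

```python
def euler_circuit(adj):
    """Hierholzer's algorithm. adj: dict vertex -> list of neighbors.
    Assumes a connected simple graph whose degrees are all even."""
    rem = {u: list(vs) for u, vs in adj.items()}   # unused edges
    start = next(iter(adj))
    stack, circuit = [start], []
    while stack:
        u = stack[-1]
        if rem[u]:
            v = rem[u].pop()
            rem[v].remove(u)       # consume edge u-v from both endpoints
            stack.append(v)        # keep walking
        else:
            circuit.append(stack.pop())   # stuck: back out, recording vertices
    return circuit[::-1]

# "Bowtie": two triangles sharing vertex 0 — all degrees even, 6 edges.
bowtie = {0: [1, 2, 3, 4], 1: [0, 2], 2: [0, 1], 3: [0, 4], 4: [0, 3]}
c = euler_circuit(bowtie)
assert len(c) == 7 and c[0] == c[-1]   # 6 edges -> 7 vertices in the circuit
```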
Theorem 1.22 — Dirac's Theorem
Let $G$ be a graph of order $n \ge 3$. If $\delta(G) \ge \frac{n}{2}$, then $G$ is Hamiltonian.
If every individual vertex is connected to at least half the graph, the graph is so dense that a Hamiltonian cycle must exist.
Using proof by contradiction:
- Assume the Opposite: Suppose we have a graph where every vertex has degree $\ge n/2$, but there is no Hamiltonian cycle.
- The Longest Path: Let $P$ be the longest possible path in this graph, starting at $v_1$ and ending at $v_p$.
- Endpoints Must Stay on the Path: Because $P$ is already the "longest," the neighbors of start ($v_1$) and end ($v_p$) must be vertices that are already on that path. If they connected outside, the path could be longer!
- The "Folding" Trick: Since $v_1$ and $v_p$ each have at least $n/2$ neighbors on this path, pigeonhole logic proves there must be a spot in the middle where $v_1$ is connected to one vertex ($v_{j+1}$) and $v_p$ is connected to the vertex right before it ($v_j$).
- Creating the Cycle: This "crossing" allows you to re-route the path into a cycle.
- The Contradiction: If this cycle misses a vertex, note that $\delta(G) \ge n/2$ forces $G$ to be connected, so some missed vertex is adjacent to a cycle vertex. Breaking the cycle open at that attachment point gives a path longer than our "longest path." This is impossible, so the cycle uses every vertex: a Hamiltonian cycle exists. ∎
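For small graphs the theorem can be sanity-checked by brute force. A sketch in Python (exponential search, illustration only; helper names are hypothetical):

```python
from itertools import permutations

def hamiltonian(adj):
    """Brute force: fix the first vertex, try every order of the rest,
    and check that consecutive vertices (cyclically) are adjacent."""
    vs = sorted(adj)
    first = vs[0]
    for perm in permutations(vs[1:]):
        cycle = (first,) + perm
        if all(cycle[(i + 1) % len(cycle)] in adj[cycle[i]]
               for i in range(len(cycle))):
            return cycle
    return None

def dirac(adj):
    """Does the graph satisfy Dirac's hypothesis delta(G) >= n/2, n >= 3?"""
    n = len(adj)
    return n >= 3 and all(len(nbrs) >= n / 2 for nbrs in adj.values())

# C4 plus one chord: n = 4, every degree >= 2 = n/2, so Dirac applies.
g = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1, 3}, 3: {0, 2}}
assert dirac(g) and hamiltonian(g) is not None
```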
Theorem 1.23 — Ore's Theorem
Let $G$ be a graph of order $n \ge 3$. If $\deg(x) + \deg(y) \ge n$ for all pairs of nonadjacent vertices $x, y$, then $G$ is Hamiltonian.
A stronger version of Dirac's. You don't need EVERY vertex to have degree $\ge n/2$; you just need every pair of nonadjacent vertices to have enough combined neighbors (at least $n$) to pull the graph together into a cycle.
The Goal: Prove that if $\deg(u)+\deg(v) \ge n$ for all nonadjacent vertices, $G$ is Hamiltonian.
Step 1: Build a Maximal Non-Hamiltonian Graph: Assume some graph $G$ satisfies the condition but is not Hamiltonian. Add edges one at a time until adding any further edge would create a Hamiltonian cycle; call the result $G^*$. (Adding edges only helps the degree condition, and $G^*$ is still not Hamiltonian.)
Step 2: Find the Hamiltonian Path: Since $G^*$ is not Hamiltonian, it has two nonadjacent vertices $u$ and $v$. By maximality, adding the edge $uv$ would complete a Hamiltonian cycle, so $G^*$ already contains a Hamiltonian path from $u$ to $v$. Label it: $u=v_1, v_2, \dots, v_n=v$.
Step 3: The Impossible Connection (Lemma): If $u$ connects to $v_{i+1}$, then $v$ cannot connect to $v_i$. Why? We could walk $u \to v_{i+1} \dots v \to v_i \dots u$, forming a cycle!
Step 4: Counting the Roads:
- Let $U$ be the set of spots $v_i$ where $u$ connects to $v_{i+1}$. $|U| = \deg(u)$.
- Let $V$ be the spots where $v$ connects to $v_i$. $|V| = \deg(v)$.
By Step 3, $U$ and $V$ are disjoint, and both are subsets of $\{v_1, \dots, v_{n-1}\}$. Hence
$\deg(u)+\deg(v) = |U| + |V| \le n-1$
Logical Conclusion: We assumed $\deg(u)+\deg(v) \ge n$, creating a contradiction. ∎
Theorem 1.24 — Connectivity & Independence
Let $G$ be a connected graph of order $n \ge 3$. If $\kappa(G) \ge \alpha(G)$, then $G$ is Hamiltonian.
If the graph is "hard to break apart" (connectivity is high) and doesn't have large scattered subsets (independence number is low), it must be tightly knit enough to trace a cycle.
Look at the longest cycle $C$ and an excluded vertex $v$. The component $H$ containing $v$ connects to $C$ via $r$ "gatekeeper" vertices. $r$ must be $\ge \kappa(G)$. The vertices immediately following these gatekeepers on $C$ form a set $S$ with $v$. Size of $S$ is $\ge \kappa(G) + 1 \ge \alpha(G) + 1$. Thus, two vertices in $S$ must be adjacent (since $|S| > \alpha$). This creates a detour that makes the cycle longer, contradicting that $C$ was the longest! ∎
Theorem 1.25 — Goodman–Hedetniemi
If $G$ is 2-connected AND has no induced $K_{1,3}$ (claw) AND has no induced $Z_1$, then $G$ is Hamiltonian.
You need a graph that can't be easily split (2-connected), but also doesn't contain awkward 3-way forks (claws) that would force a path to "choose" a branch and abandon the others.
Assume $C$ is the longest cycle, but not Hamiltonian. Because $G$ is 2-connected, there's a vertex $v \notin C$ attached to some $w \in C$. Look at $w$'s neighbours on the cycle: $a$ (before) and $b$ (after). Neither $a$ nor $b$ can connect to $v$, otherwise we get a longer cycle detour.
If $a$ is not connected to $b$, the vertices $\{w, v, a, b\}$ induce a claw $K_{1,3}$ centered at $w$. Contradiction!
If $a$ is connected to $b$, the vertices $\{w, v, a, b\}$ form a $Z_1$ graph. Contradiction!
Therefore, $C$ must have been Hamiltonian all along. ∎
Planarity Theorems
Theorem 1.31 — Euler's Formula
For any connected planar graph: $$V - E + F = 2$$ where $V$ is vertices, $E$ is edges, and $F$ is faces (regions).
Any connected planar graph drawn on a sphere (or plane) creates regions. The relationship between the number of dots, lines, and regions is an unbreakable law of 2D topology.
Induction on $F$. Base ($F=1$): If there's only 1 face, there are no cycles. The graph is a tree. By Thm 1.10, $E = V - 1$. So $V - (V - 1) + 1 = 2$. Checks out.
Step: If $F > 1$, there must be a cycle. Pick any edge $e$ on a cycle and remove it. The graph remains connected, but two faces merge into one ($F \to F-1$), and we lost one edge ($E \to E-1$). Vertices $V$ remain the same. The quantity $V - E + F$ stays perfectly balanced! By induction, the new graph satisfies it, so the old one must too. ∎
Theorem 1.32 — $K_{3,3}$ is nonplanar
The complete bipartite graph $K_{3,3}$ cannot be drawn in the plane without edge crossings.
Imagine 3 houses and 3 utilities (gas, water, electricity). Each house must connect to each utility. You cannot draw all 9 connections without two crossing.
Suppose $K_{3,3}$ is planar. Then $n=6, q=9$. By Euler's Formula:
$r = 2 - n + q = 2 - 6 + 9 = 5 \text{ faces}$
Now count boundary edges. Let $C$ = total count of (face, edge) pairs where the edge borders the face. Each edge borders at most 2 faces, so:
$C \le 2q = 2 \times 9 = 18$
Since $K_{3,3}$ is bipartite, it has no triangles. Every face is bounded by at least 4 edges (shortest cycle in a triangle-free graph is 4). So:
$C \ge 4r = 4 \times 5 = 20$
But $C \le 18$ and $C \ge 20$ is impossible. Contradiction. $K_{3,3}$ is nonplanar! ∎
Theorem 1.33 — The Planar Edge Bound
If $G$ is a connected planar graph with $n \ge 3$ vertices and $q$ edges, then $q \le 3n - 6$.
There's a hard mathematical cap on how dense a planar graph can be. If you squeeze in more than $3V-6$ edges, you are mathematically guaranteed to cause a crossing.
Let $C$ = total count of (face, edge) boundary pairs.
Since every face is bounded by at least 3 edges (no self-loops, $n \ge 3$, minimum cycle is 3):
$C \ge 3r$
Since each edge borders at most 2 faces:
$C \le 2q$
Combining: $3r \le 2q$. Substitute Euler's formula $r = 2 - n + q$:
$3(2 - n + q) \le 2q$
$6 - 3n + 3q \le 2q$
$q \le 3n - 6$ ∎
Bonus: Equality holds exactly when every face is a triangle (maximal planar/triangulation).
Theorem 1.34 — $K_5$ is nonplanar
The complete graph $K_5$ cannot be drawn in the plane without edge crossings.
5 mutual friends cannot all hold hands with each other without some arms crossing.
$K_5$ has $n = 5$ and $q = 10$.
By Theorem 1.33, a planar graph on 5 vertices has bounded edges: $q \le 3(5) - 6 = 9$.
Since $10 > 9$, $K_5$ is nonplanar. ∎
Why the bound matters: without Thm 1.33, we'd have to rule out every possible drawing of $K_5$ one at a time. The edge bound turns nonplanarity into a one-line count.
Theorem 1.35 — Planar Low Degree
If $G$ is planar, then $\delta(G) \le 5$ (some vertex has degree at most 5).
No matter how you draw a planar network, there must be at least one "lonely" vertex with 5 or fewer connections. A city can't have EVERY building being a massive $\ge 6$-way intersection without roads crossing.
Suppose for contradiction that every vertex of $G$ has degree $\ge 6$. Then:
$\sum_{v \in V} \deg(v) \ge 6n$
But $\sum \deg(v) = 2q$ (Handshaking Lemma), so $2q \ge 6n$, meaning $q \ge 3n$.
But this contradicts Theorem 1.33: planar graphs must have $q \le 3n - 6$.
Therefore some vertex must have degree $\le 5$. ∎
Theorem 1.37 — Small Faces on Polyhedra
For any convex polyhedron $P$, the number of sides of its smallest face satisfies $3 \le \rho(P) \le 5$.
You can NEVER have a 3D convex shape made entirely of hexagons (or larger shapes). The surface would lay perfectly flat like honeycomb and never curve to close up. You always need triangles, squares, or pentagons (like on a soccer ball) to create curvature.
Lower bound ($\rho(P) \ge 3$): A polygon needs at least 3 edges to enclose an area.
Upper bound ($\rho(P) \le 5$): Assume for contradiction every face has $\ge 6$ edges. Total boundary count $\sum \text{sides} \ge 6F$. This total equals $2E$ (each edge borders 2 faces), so $6F \le 2E \Rightarrow E \ge 3F$.
Similarly, each vertex has degree $\ge 3$ (must be a 3D corner), so sum of degrees $\ge 3V$. This equals $2E$, giving $V \le 2E/3$.
Substitute into Euler's formula $V - E + F = 2$:
$2 = V - E + F \le \frac{2}{3}E - E + \frac{1}{3}E = 0$
$2 \le 0$ — contradiction! ∎
Theorem 1.38 — Only 5 Platonic Solids
The only regular convex polyhedra are the tetrahedron, cube, octahedron, dodecahedron, and icosahedron.
A regular polyhedron needs identical faces and identical corners. Thanks to Euler's formula, the math restricts this to exactly 5 workable combinations. The universe doesn't allow a 6th perfect solid.
Let $P$ have $V$ vertices, $E$ edges, $F$ faces, with $k$ edges per face and $r$ faces per vertex.
Two counting facts:
- Each face has $k$ edges, and each edge borders 2 faces: $kF = 2E$
- Each vertex has $r$ edges, and each edge joins 2 vertices: $rV = 2E$
Substitute $V = 2E/r$ and $F = 2E/k$ into Euler's Formula $V - E + F = 2$:
$2E/r - E + 2E/k = 2$
Divide by $2E$:
$1/r - 1/2 + 1/k = 1/E$
Since $1/E > 0$, we get $1/r + 1/k > 1/2$, or equivalently $2/r + 2/k > 1$.
From Thm 1.37: $3 \le k \le 5$. And $r \ge 3$ for a 3D corner. Enumerate all combinations:
- $k=3, r=3$: sum = $4/3 > 1$ (Tetrahedron)
- $k=3, r=4$: sum = $7/6 > 1$ (Octahedron)
- $k=3, r=5$: sum = $16/15 > 1$ (Icosahedron)
- $k=4, r=3$: sum = $7/6 > 1$ (Cube)
- $k=5, r=3$: sum = $16/15 > 1$ (Dodecahedron)
All other combinations give $\le 1$. Exactly 5 solutions exist. ∎
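The case check above is easy to automate. Here is a small Python sketch (variable names are just illustrative) that enumerates all $(k, r)$ pairs with $2/k + 2/r > 1$ and recovers $V$, $E$, $F$ from the identities in the proof:

```python
# Enumerate regular polyhedra from the inequality 2/k + 2/r > 1.
from fractions import Fraction

solids = []
for k in range(3, 10):            # sides per face (range is generously large)
    for r in range(3, 10):        # faces meeting at each vertex
        if Fraction(2, k) + Fraction(2, r) > 1:
            # From 1/r - 1/2 + 1/k = 1/E, then V = 2E/r and F = 2E/k:
            E = 1 / (Fraction(1, r) + Fraction(1, k) - Fraction(1, 2))
            solids.append((k, r, int(2 * E / r), int(E), int(2 * E / k)))

for k, r, V, E, F in solids:
    print(f"k={k}, r={r}: V={V}, E={E}, F={F}")
```

This prints exactly five rows: the tetrahedron $(3,3)$, octahedron $(3,4)$, icosahedron $(3,5)$, cube $(4,3)$, and dodecahedron $(5,3)$, each with its $V$, $E$, $F$.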
Theorem 1.40 — Kuratowski's Theorem
A graph is planar $\iff$ it contains no subdivision of $K_5$ or $K_{3,3}$.
Every nonplanar graph, no matter how wildly complex, traces its "nonplanarity" back to one of two hidden culprits (a $K_5$ or $K_{3,3}$), possibly with edges stretched out by extra vertices.
Showing K₅, the clique of 5 with an unavoidable crossing
What is a "subdivision"? Take a graph and "stretch" edges by inserting degree-2 vertices. Subdividing doesn't change planarity.
The two directions:
- Easy ($\Rightarrow$): If $G$ is planar, it obviously can't contain a subdivision of $K_5$ or $K_{3,3}$, since those are nonplanar and a planar graph can't contain a nonplanar subgraph.
- Hard ($\Leftarrow$): If $G$ has no subdivision of $K_5$ or $K_{3,3}$ hidden in it, then $G$ must be planar. This is the remarkable and deep part — you just need to check for these two obstructions.
Application: To prove the Petersen graph is nonplanar, pick 6 specific vertices and find 9 internally vertex-disjoint paths connecting them exactly like a $K_{3,3}$! By Kuratowski's theorem, it is nonplanar. ∎
Quiz 4 — Graph Coloring
Theorem 1.42 — Greedy Coloring Bound
$$\chi(G) \le \Delta(G) + 1$$
Line up vertices. Each has at most $\Delta$ neighbors, so at most $\Delta$ colors "taken." With $\Delta+1$ crayons, there's always one free!
Click "Color Next Vertex" to step through the greedy algorithm
Order vertices arbitrarily as $v_1,\ldots,v_n$, and color greedily in that order.
When coloring $v_i$, it has at most $\Delta(G)$ neighbors total, so among already-colored vertices, at most $\Delta(G)$ colors are forbidden. Using a palette of $\Delta(G)+1$ colors, at least one color remains available for $v_i$.
Proceeding through all vertices yields a proper coloring with at most $\Delta(G)+1$ colors. Therefore $\chi(G)\le \Delta(G)+1$. ∎
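The greedy argument translates directly into code. A minimal sketch, assuming an adjacency-dict representation (names are illustrative):

```python
# Greedy coloring: give each vertex the smallest color not used by an
# already-colored neighbor. Uses at most Δ(G) + 1 colors.
def greedy_coloring(adj):
    """adj: dict vertex -> set of neighbors."""
    color = {}
    for v in adj:                       # arbitrary order v_1, ..., v_n
        taken = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in taken:               # at most deg(v) <= Δ colors are taken
            c += 1
        color[v] = c
    return color

# C5 (odd cycle, Δ = 2): greedy needs at most 3 = Δ + 1 colors
adj = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}
coloring = greedy_coloring(adj)
assert all(coloring[u] != coloring[v] for u in adj for v in adj[u])
print(max(coloring.values()) + 1)       # number of colors used
```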
Theorem 1.43 — Brooks's Theorem
If $G$ is connected and is neither $K_n$ nor an odd cycle, then $\chi(G) \le \Delta(G)$.
The "+1" from greedy is only needed for complete graphs ($K_n$) and odd cycles ($C_{2k+1}$). Everyone else can be cleverly ordered so greedy uses only $\Delta$ colors.
Two exceptions → need $\Delta+1$:
• $K_n$: everyone connected to everyone → need $n = \Delta+1$ colors.
• Odd cycle: alternate 2 colors around ring, odd length forces a 3rd = $\Delta+1$.
Everyone else: find non-adjacent $u, w$ sharing a neighbor $v$. Give $u, w$ the SAME color. Process $v$ last → its $\Delta$ neighbors only use $\Delta-1$ distinct colors → free slot!
Case 1: $G$ not 2-connected. Then $G$ has a cut vertex; color each block with at most $\Delta(G)$ colors and match colors at shared cut vertices. This gives a global $\Delta(G)$-coloring.
Case 2: $G$ 2-connected, not complete, not an odd cycle. Choose a vertex $v$ of degree $\Delta$, and two nonadjacent neighbors $u,w$ of $v$. Order vertices so $u=v_1$, $w=v_2$, $v=v_n$, and every intermediate vertex has a later neighbor. Color greedily in this order, first giving $u,w$ the same color.
For each intermediate vertex, at least one neighbor is uncolored, so at most $\Delta-1$ neighbor colors are forbidden; one of $\Delta$ colors is free. At the last vertex $v$, its $\Delta$ neighbors include $u,w$ with the same color, so at most $\Delta-1$ distinct colors appear around $v$; one color is free for $v$. Hence $\chi(G)\le\Delta(G)$. ∎
Theorem 1.45 — Independence Number Bounds
$$\frac{n}{\alpha(G)} \le \chi(G) \le n + 1 - \alpha(G)$$
Lower: Each color class is independent (≤ $\alpha$ vertices), so need ≥ $n/\alpha$ classes.
Upper: Color biggest independent set with 1 color, give everyone else their own → $1 + (n-\alpha) = n+1-\alpha$.
C₅: n=5, α=2 → bounds: ⌈5/2⌉=3 ≤ χ ≤ 5+1-2=4. Actual: χ=3
Lower bound: In any proper coloring with $\chi(G)$ colors, each color class is an independent set, so each class has size at most $\alpha(G)$. Therefore
$$n \le \chi(G)\,\alpha(G)\quad\Rightarrow\quad \chi(G)\ge \frac{n}{\alpha(G)}.$$
Upper bound: Let $S$ be a maximum independent set with $|S|=\alpha(G)$. Color all vertices of $S$ with one color. Color each of the remaining $n-\alpha(G)$ vertices with a distinct fresh color. Total colors used:
$$1+(n-\alpha(G))=n+1-\alpha(G).$$
Hence $\chi(G)\le n+1-\alpha(G)$. ∎
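Both bounds can be sanity-checked by brute force on the $C_5$ example above (exponential code, for illustration only):

```python
# Brute-force α(G) and χ(G) on the 5-cycle, then check n/α <= χ <= n+1-α.
from itertools import combinations, product

n = 5
edges = [(i, (i + 1) % n) for i in range(n)]

def independent(S):
    return all(not (u in S and v in S) for u, v in edges)

# largest independent set size
alpha = max(len(S) for r in range(n + 1)
            for S in combinations(range(n), r) if independent(S))

def chromatic_number():
    for k in range(1, n + 1):           # try palettes of increasing size
        for c in product(range(k), repeat=n):
            if all(c[u] != c[v] for u, v in edges):
                return k

chi = chromatic_number()
print(alpha, chi)                       # 2 3
assert n / alpha <= chi <= n + 1 - alpha
```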
Theorem 1.47 — Five Color Theorem
Every planar graph is 5-colorable.
Every planar graph has a vertex with deg ≤ 5 (Thm 1.35). Remove it, color the rest by induction, put it back. If stuck → Kempe chain swap!
Step through the Kempe chain argument for 5-coloring
Induction on $n$. By Thm 1.35, $\exists v$ with $\deg(v)\le 5$. Remove $v$, 5-color $G-v$.
If $\deg(v)\le 4$: at most 4 colors used → 5th is free.
If $\deg(v)=5$, all 5 colors used: Kempe chain swap on colors 1-3. If $v_1, v_3$ in different components → swap frees color 1. If same → the 1-3 path separates $v_2$ from $v_4$ → swap 2-4 frees color 2. ∎
Theorem 1.48 — Deletion-Contraction
$$C_G(k) = C_{G-e}(k) - C_{G/e}(k)$$
Colorings of $G-e$ split into: (1) those giving $u, v$ different colors = colorings of $G$, and (2) those giving $u, v$ the same color = colorings of $G/e$.
So $C_{G-e} = C_G + C_{G/e}$. Rearrange!
Triangle K₃: click to see deletion and contraction
Let $e=\{u,v\}$. Consider all proper $k$-colorings of $G-e$, and split them into two disjoint classes:
• Class A: $u\ne v$. These are exactly the proper colorings of $G$ (since adding edge $e$ enforces $u\ne v$).
• Class B: $u=v$. These are exactly the proper colorings of $G/e$ (contracting $e$ forces $u,v$ to share one color).
So $$C_{G-e}(k)=C_G(k)+C_{G/e}(k),$$ and rearranging gives $$C_G(k)=C_{G-e}(k)-C_{G/e}(k).$$ ∎
Theorem 1.49 — Chromatic Polynomial Properties
$C_G(k)$ has degree $n$, leading coeff $1$, coeff of $k^{n-1}$ is $-q$, constant term $0$, alternating signs.
Sanity checklist for your answer:
$$C_G(k) = k^n - q\,k^{n-1} + \cdots$$ Constant term $0$; signs alternate: $+, -, +, -, \ldots$
$C_G(0) = 0$ always (can't color with 0 colors!)
Use induction via deletion-contraction:
$$C_G = C_{G-e} - C_{G/e}.$$
Deletion keeps $n$ vertices; contraction has $n-1$ vertices. This gives:
(1) degree is $n$.
(2) leading coefficient is $1$ (from repeated deletions down to edgeless graph).
(3) coefficient of $k^{n-1}$ is $-q$ (each edge contributes one subtraction at that level).
(4) $C_G(0)=0$ since no graph has a proper 0-coloring.
(5) signs alternate by the recursive subtraction pattern in deletion-contraction. ∎
Claim — Cycle Graph Chromatic Polynomial
$$C_{C_n}(k) = (k-1)^n + (-1)^n(k-1)$$
Delete the closing edge → path: $k(k-1)^{n-1}$. Contract it → shorter cycle: $C_{n-1}$. Apply deletion-contraction and simplify.
C₄ with k=3 colors: (3-1)⁴ + (−1)⁴(3-1) = 16+2 = 18 colorings
Induction on $n$.
Base ($n=3$): $C_3=K_3$, so $$C_{C_3}(k)=k(k-1)(k-2)=(k-1)^3-(k-1),$$ matching $(k-1)^n+(-1)^n(k-1)$ at $n=3$.
Step: Assume true for $C_{n-1}$. In $C_n$, let $e=\{v_1,v_n\}$. By deletion-contraction: $$C_{C_n}(k)=C_{C_n-e}(k)-C_{C_n/e}(k).$$ Now $C_n-e=P_n$, so $C_{P_n}(k)=k(k-1)^{n-1}$, and $C_n/e=C_{n-1}$. Thus $$C_{C_n}(k)=k(k-1)^{n-1}-\Big((k-1)^{n-1}+(-1)^{n-1}(k-1)\Big)$$ $$=(k-1)^n+(-1)^n(k-1).$$ Hence the formula holds for all $n\ge 3$. ∎
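The formula can be checked against brute-force counting for small cycles (a quick sanity script, not part of the proof):

```python
# Verify C_{C_n}(k) = (k-1)^n + (-1)^n (k-1) by counting proper colorings.
from itertools import product

def cycle_colorings(n, k):
    edges = [(i, (i + 1) % n) for i in range(n)]
    return sum(all(c[u] != c[v] for u, v in edges)
               for c in product(range(k), repeat=n))

for n in range(3, 7):
    for k in range(5):
        assert cycle_colorings(n, k) == (k - 1) ** n + (-1) ** n * (k - 1)
print("cycle formula verified for n = 3..6, k = 0..4")
```

In particular `cycle_colorings(4, 3)` returns 18, matching the $C_4$ example above.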
Quiz 5 — Matchings & Ramsey Theory
1.50 — Berge's Theorem
A matching $M$ is maximum $\iff$ there is no $M$-augmenting path.
Imagine you're organizing a dance. You pair people up — each person can only have one partner. A matching is a valid set of pairs.
Now, you want the most pairs possible. How do you know if you've hit the max? Look for an augmenting path: a chain of people that goes "unpaired person → paired person → their partner → paired person → ... → unpaired person", alternating between non-dance-partners and dance-partners.
If you find one, you can swap everyone along the chain — the pairs become un-paired and vice versa — and you end up with one extra pair. If you can't find any such chain, congratulations — your matching is already the best possible.
Why does this work? Two directions:
"If there's an augmenting path, your matching isn't max" — obvious. You found a chain, flipped it, got +1 pair. So you weren't at the max before.
"If there's NO augmenting path, you ARE at the max" — Proof by contradiction. Pretend your matching $M$ isn't max, and some better matching $M^*$ exists. Overlay both: look at edges that are in one but not the other. Each person touches at most 1 edge from $M$ and 1 from $M^*$, so you get a bunch of paths and cycles. Since $M^*$ is bigger, one of those paths must have more $M^*$-edges, meaning it starts and ends at people not in $M$ — that's an augmenting path. But we said there are none! Contradiction.
A matching is a set of edges with no shared endpoints — like pairing people for a dance.
An augmenting path is a secret shortcut: a path alternating between "not in $M$" and "in $M$" edges, starting and ending at unmatched vertices.
If you find one, flip every edge along it (matched becomes unmatched and vice versa). You gain +1 pair.
The theorem says two things:
• Found an augmenting path? Flip it. $|M|$ grows. So $M$ wasn't maximum.
• No augmenting path anywhere? Then $M$ is already the best you can do.
Step through: find the augmenting path, then flip it.
($\Rightarrow$ "aug path $\Rightarrow$ not max"):
If augmenting path $P$ exists, flip all edges along $P$. The path has one more non-$M$ edge than $M$-edge (both endpoints are free), so flipping gives $|M|+1$ edges. $M$ wasn't maximum.
($\Leftarrow$ "no aug path $\Rightarrow$ max"):
Suppose no $M$-augmenting path exists but $M$ is not maximum. Let $M^*$ be a bigger matching. Look at the symmetric difference $M \triangle M^*$ (edges in one but not both). Every vertex touches at most one $M$-edge and one $M^*$-edge, so each connected component is either:
• An alternating cycle (equal $M$ and $M^*$ edges — doesn't help), or
• An alternating path.
Since $|M^*| > |M|$, at least one path has more $M^*$-edges. That path starts and ends at vertices unmatched by $M$ — that's an augmenting path! Contradiction. $\square$
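The "flip" is literally a symmetric difference of edge sets. A tiny sketch on a 4-vertex path (labels are illustrative):

```python
# Augmenting along a path: edges on the path swap in/out of the matching.
M = {frozenset("bc")}                      # matching on path a-b-c-d (one edge)
aug_path = ["a", "b", "c", "d"]            # alternates: not-in-M, in-M, not-in-M

path_edges = {frozenset(p) for p in zip(aug_path, aug_path[1:])}
M_new = M ^ path_edges                     # matched and unmatched edges swap roles
print(len(M), "->", len(M_new))            # 1 -> 2
```

The new matching is $\{ab, cd\}$: one edge bigger, exactly as the proof predicts.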
1.51 — Hall's Marriage Theorem
In bipartite $G=(X,Y)$: $X$ can be fully matched $\iff$ $|N(S)| \ge |S|$ for every $S \subseteq X$.
You have a group of students on the left, and a group of projects on the right. Each student is interested in some projects. Can every student get a unique project?
Hall says: Yes, if and only if there's no "bottleneck." A bottleneck is when some group of $k$ students are all only interested in fewer than $k$ projects. If 3 students only like 2 projects, someone's out of luck — no way around it.
The amazing part: if there's no bottleneck anywhere (every group of students has enough project options), then you can always find an assignment. The condition isn't just necessary — it's sufficient.
Easy direction: If everyone already has a project, then any group of $k$ students has $k$ distinct projects, so $|N(S)| \ge |S|$. Obvious.
Hard direction (two cases):
Case 1 — "Everyone has options to spare": Every group of students has strictly more projects than people. So just pick any student, give them any project. Remove both. The leftovers still have no bottleneck (we had room to spare). Repeat until everyone's assigned.
Case 2 — "Some group is exactly tight": Some group of $k$ students has exactly $k$ project options — no wiggle room. Handle them first: assign those $k$ students to those $k$ projects (by induction, it works). Now the remaining students still satisfy the condition in the remaining projects (you can check by combining any leftover group with the tight group). So assign the rest by induction too. Done!
The dating analogy: $X$ = people, $Y$ = potential partners, edges = "would be ok with."
Everyone in $X$ gets a unique partner $\iff$ no group of people is stuck fighting over too few options.
If 3 applicants only connect to 2 jobs, someone is left out. Hall says: this bottleneck condition is not only necessary, but sufficient.
Step through: verify Hall's condition, then build matching.
Necessity (easy direction):
If a matching pairs everyone in $X$, then for any subset $S \subseteq X$, those $|S|$ people are matched to $|S|$ distinct partners in $N(S)$. So $|N(S)| \ge |S|$. Done.
Sufficiency (hard direction — induction on $|X|$, two cases):
Assume $|N(S)| \ge |S|$ for every $S \subseteq X$.
Case 1 — "Lots of room everywhere":
Every proper nonempty $S \subset X$ has $|N(S)| \ge |S|+1$ (strict inequality).
Pick any $x \in X$, match it to any neighbor $y$. Remove $x$ and $y$ from the graph. The remaining graph still satisfies Hall (we had room to spare). By induction, match the rest. Add back $x$-$y$. Everyone matched!
Case 2 — "Some group is tight":
There exists $S \subset X$ with $|N(S)| = |S|$ exactly (no room to spare).
Match $S$ into $N(S)$ by induction (they form their own bipartite graph satisfying Hall).
Now match the leftovers $X_2 = X \setminus S$ into $Y_2 = Y \setminus N(S)$. Does Hall hold there?
For any $T \subseteq X_2$, apply Hall in the original graph to $T \cup S$:
$|N(T \cup S)| \ge |T \cup S| = |T| + |S|$. Subtract the $|S|$ neighbors already used: $|N(T) \cap Y_2| \ge |T|$.
So Hall holds in the leftover graph too. Induction matches $X_2$. Combine both matchings. $\square$
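Both the condition and the matching it promises can be checked by brute force on small instances (exponential code, for illustration only; names like `hall_holds` are my own):

```python
# Check |N(S)| >= |S| for every subset S, and look for a full matching of X.
from itertools import combinations, permutations

def hall_holds(N):
    """N: dict x -> set of neighbors in Y."""
    X = list(N)
    return all(
        len(set().union(*(N[x] for x in S))) >= len(S)
        for r in range(1, len(X) + 1)
        for S in combinations(X, r)
    )

def full_matching_exists(N):
    X = list(N)
    Y = sorted(set().union(*N.values()))
    return any(all(y in N[x] for x, y in zip(X, p))
               for p in permutations(Y, len(X)))

# three students, three projects, no bottleneck anywhere
N = {"s1": {"p1", "p2"}, "s2": {"p2"}, "s3": {"p2", "p3"}}
print(hall_holds(N), full_matching_exists(N))   # True True
```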
1.52 — SDR (System of Distinct Representatives)
$S_1,\dots,S_k$ has an SDR $\iff$ $|\bigcup_{i\in I} S_i| \ge |I|$ for every $I \subseteq \{1,\dots,k\}$.
You have several bags, each containing some items. You need to pick one item from each bag, and no two picks can be the same item. That's an SDR — a System of Distinct Representatives.
When can you do it? Exactly when there's no shortage: any $k$ bags together must contain at least $k$ different items total. If 3 bags combined only have 2 unique items, you can't pick 3 different things from them.
The punchline: This is just Hall's theorem in disguise. Bags = left vertices, items = right vertices, "bag contains item" = edge. Picking a distinct rep from each bag = finding a matching. Done. Same theorem, different words.
What's an SDR? Pick one element from each set so that all picks are different.
Example: $S_1=\{a,b\}$, $S_2=\{b,c\}$, $S_3=\{a,c\}$. An SDR: pick $a, b, c$. Works!
This is literally Hall's theorem wearing a disguise.
Step through: pick one distinct element from each set.
The translation:
• Left vertices = sets $S_1, \dots, S_k$
• Right vertices = all elements
• Edge $S_i$-$x$ iff $x \in S_i$
• SDR = matching that covers all left vertices
• For any index set $I$: $N(I) = \bigcup_{i \in I} S_i$
So Hall's condition $|N(I)| \ge |I|$ is exactly $|\bigcup_{i \in I} S_i| \ge |I|$.
SDR exists $\iff$ Hall's condition holds. One theorem, two languages. $\square$
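The same brute force, phrased in SDR language (a sketch; `find_sdr` is an illustrative name):

```python
# Try to pick one distinct representative per set; None if impossible.
from itertools import permutations

def find_sdr(sets):
    elems = sorted(set().union(*sets))
    for picks in permutations(elems, len(sets)):
        if all(p in S for p, S in zip(picks, sets)):
            return list(picks)
    return None

print(find_sdr([{"a", "b"}, {"b", "c"}, {"a", "c"}]))   # ['a', 'b', 'c']
print(find_sdr([{"a"}, {"a"}, {"a", "b"}]))             # None: sets 1, 2 share only 'a'
```

The second call fails exactly because of a Hall violation: two sets whose union has only one element.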
Lecture 27 — Tree Matching + Degree Bound
(1) Every tree has at most one perfect matching.
(2) $|V|=2k$ and $\delta(G)\ge k$ $\Rightarrow$ $G$ has a perfect matching.
Claim 1: A tree is a connected graph with no cycles. A perfect matching pairs up every vertex. The claim says: in a tree, there's at most one way to do this. Why? Think about a leaf — it has only one neighbor, so you must pair it with that neighbor. No choice. Remove that pair, and now new leaves appear. They're forced too. Every step is forced, so the whole matching is forced. There's only one possibility (or none at all).
Claim 2: If your graph has an even number of vertices and every vertex is connected to at least half the graph, then a perfect matching must exist. Why? Because Dirac's theorem already tells us such a graph has a Hamiltonian cycle (a cycle visiting every vertex). Since the number of vertices is even, just take every other edge of the cycle — boom, perfect matching.
Claim 1 — simple proof: Suppose two different PMs exist: $M_1$ and $M_2$. Look at edges where they disagree. Every vertex is paired once in each, so in the "disagreement" graph every vertex has degree 0 or 2. That means you get cycles. But trees have no cycles. Contradiction — two different PMs can't exist.
Claim 2 — simple proof: $n = 2k$ vertices, min degree $\ge k = n/2$. Dirac says: Hamiltonian cycle exists. Walk around: $v_1, v_2, v_3, \dots, v_{2k}, v_1$. Pick edges $v_1v_2$, $v_3v_4$, ..., $v_{2k-1}v_{2k}$. That's $k$ pairs covering everyone. Done.
Claim 1: Trees have no cycles. If two different PMs existed, their symmetric difference would create alternating cycles — impossible in a tree.
Another way to see it: Leaves have only one neighbor, so their matching edge is forced. Remove those pairs, get a smaller tree, repeat. Every step is forced $\Rightarrow$ at most one PM.
Claim 2: This is a one-line corollary of Dirac's theorem!
Claim 1 — Leaves Force Choices
Each leaf has only one option — forced matching.
Claim 2 — High Degree $\Rightarrow$ PM
Step: dense graph → Hamiltonian cycle → alternate edges = PM.
Claim 1 proof:
Suppose $M_1 \neq M_2$ are both perfect matchings. In $M_1 \triangle M_2$, each vertex has degree exactly 0 or 2 (matched once in each), so every nontrivial component is a cycle. But trees are acyclic. Contradiction. $\square$
Claim 2 proof:
$n = 2k$ and $\delta(G) \ge k = n/2$. By Dirac's Theorem, $G$ has a Hamiltonian cycle. Since $n$ is even, take every other edge of that cycle:
$$v_1v_2,\; v_3v_4,\; \dots,\; v_{n-1}v_n$$ That's $n/2 = k$ disjoint edges covering all $n$ vertices. Perfect matching! $\square$
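The leaf-stripping argument for Claim 1 doubles as an algorithm. A sketch that finds the unique perfect matching of a tree, or reports none (assumes an adjacency-dict input; names are illustrative):

```python
# Leaf-stripping: a leaf's matching edge is forced; repeat on the smaller tree.
def tree_perfect_matching(adj):
    """adj: dict vertex -> set of neighbors, describing a tree."""
    adj = {v: set(ns) for v, ns in adj.items()}   # work on a copy
    matching = []
    while adj:
        leaf = next((v for v, ns in adj.items() if len(ns) == 1), None)
        if leaf is None:
            return None                            # a stranded vertex: no PM
        (partner,) = adj[leaf]                     # only one neighbor: forced pair
        matching.append((leaf, partner))
        for x in (leaf, partner):
            for nb in adj.pop(x):
                adj.get(nb, set()).discard(x)
    return matching

# Path a-b-c-d: the unique perfect matching is {ab, cd}
path = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
print(tree_perfect_matching(path))                 # [('a', 'b'), ('c', 'd')]
```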
1.53 — König-Egerváry Theorem
In any bipartite graph: max matching = min vertex cover
($\alpha'(G) = \beta(G)$)
Two different problems, same answer (in bipartite graphs):
Problem 1 — Matching: Pick as many edges as you can, but no two can share a vertex. Like hiring: each job filled by one person, each person takes one job. How many can you fill?
Problem 2 — Vertex Cover: Pick the fewest vertices so that every edge has at least one of its endpoints picked. Like placing security cameras at intersections: every road must be watched.
König says: in a bipartite graph (left side + right side), the answer to Problem 1 = the answer to Problem 2. The most jobs you can fill equals the fewest cameras you need. This only works for bipartite graphs!
Why are they equal? Two parts:
"Matching $\le$ Cover" (true in ANY graph): If you have $k$ matched edges, each one needs at least one guard. You can't use the same guard for two matched edges (they don't share endpoints). So you need at least $k$ cameras. Therefore max matching $\le$ min cover.
"Cover $\le$ Matching" (the bipartite trick): Start with a max matching $M$. Find the unmatched vertices on the left side. From those, trace alternating paths (non-matched edge, matched edge, non-matched, ...) and mark everything reachable. Now build a cover by picking: marked vertices on the right side + unmarked vertices on the left side. Each matched edge contributes exactly one vertex to this cover, so it has $|M|$ vertices. And you can check every edge is covered. So min cover $\le |M|$ = max matching. Done — they're equal!
What are these?
• Matching = set of edges with no shared endpoints (pairing up)
• Vertex cover = set of vertices that touches every edge (guarding)
• In general, max matching $\le$ min cover (each matched edge needs its own guard). König says in bipartite graphs, they're equal.
Easy direction: $\alpha'(G) \le \beta(G)$ always. Each matching edge needs a distinct cover vertex.
Hard direction — build a cover of size $|M|$:
Let $M$ be a maximum matching in bipartite $G = (X, Y)$.
Step 1: Let $U$ = vertices in $X$ not touched by $M$ (the unmatched ones).
Step 2: From $U$, explore all alternating paths (unmatched edge, matched edge, unmatched, matched...). Let $Z$ = all vertices reachable this way.
Step 3: Build the cover: $$C = \underbrace{(X \setminus Z)}_{\text{matched X not reachable}} \cup \underbrace{(Y \cap Z)}_{\text{reachable Y vertices}}$$ Why $C$ works:
• Every edge has at least one endpoint in $C$ (if not, we'd find an augmenting path, contradicting $M$ being maximum).
• Each matching edge puts exactly one vertex into $C$: either its $X$-endpoint (if not in $Z$) or its $Y$-endpoint (if in $Z$), never both.
So $|C| = |M|$, giving $\beta(G) \le \alpha'(G)$. Together: $\alpha'(G) = \beta(G)$. $\square$
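The proof above is fully constructive, so it can be sketched in a few lines of Python. This is a minimal sketch on a made-up bipartite graph; the names `adj`, `try_augment`, `match_x`, `Zx`, `Zy`, and `cover` are illustrative, not from the text.

```python
from collections import deque

# Hypothetical bipartite graph: X = {0, 1, 2}, Y = {'a', 'b'}.
adj = {0: ['a'], 1: ['a', 'b'], 2: ['b']}
X = list(adj)

# Maximum matching via augmenting paths (Hungarian-style DFS).
match_x, match_y = {}, {}              # x -> y and y -> x for matched edges

def try_augment(x, seen):
    for y in adj[x]:
        if y not in seen:
            seen.add(y)
            # y is free, or its current partner can be rematched elsewhere
            if y not in match_y or try_augment(match_y[y], seen):
                match_x[x], match_y[y] = y, x
                return True
    return False

for x in X:
    try_augment(x, set())

# Steps 1-2: alternating-path reachability Z from the unmatched X-vertices U.
U = [x for x in X if x not in match_x]
Zx, Zy, queue = set(U), set(), deque(U)
while queue:
    x = queue.popleft()
    for y in adj[x]:
        if y not in Zy and match_x.get(x) != y:   # leave X on a NON-matching edge
            Zy.add(y)
            if y in match_y:                      # come back on a matching edge
                Zx.add(match_y[y])
                queue.append(match_y[y])

# Step 3: the cover C = (X \ Z) ∪ (Y ∩ Z).
cover = (set(X) - Zx) | Zy
assert len(cover) == len(match_x)                 # |C| = |M|
assert all(x in cover or y in cover for x in X for y in adj[x])
```

On this toy graph the matching has size 2 and the construction returns the cover {'a', 'b'}, also of size 2, exactly as König promises.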
Theorem 1.61 — $R(3,3) = 6$
$R(3,3) = 6$
Invite 6 people to a party. Between every pair, their relationship is either "friends" (red) or "strangers" (blue). No matter what, there will always be 3 people who are all mutual friends, or 3 people who are all mutual strangers. You can't avoid it with 6 people.
With only 5 people, you can avoid it (there's a clever arrangement). So 6 is the magic number.
Why 6 is enough: Pick any person (call them Pat). Pat has 5 connections. By pigeonhole, at least 3 are the same color — say Pat is friends with Alice, Bob, and Carol.
Now look at Alice, Bob, and Carol:
• If any two of them are friends → they form a friend-triangle with Pat.
• If none of them are friends → Alice, Bob, Carol are all strangers to each other = stranger-triangle.
Either way, you get a monochromatic triangle. Can't escape it!
Why 5 isn't enough: Sit 5 people in a circle. Make adjacent people friends, non-adjacent people strangers. Check: no 3 mutual friends, no 3 mutual strangers. So $R(3,3) = 6$.
Color every edge of $K_6$ red or blue. You must get a monochromatic triangle. No way to avoid it.
Upper bound ($R(3,3) \le 6$) — the "pivot vertex" argument:
Pick vertex $v$ in $K_6$. It has 5 edges.
By pigeonhole: at least 3 have the same color (say red to $a,b,c$).
Look at the triangle $\{a,b,c\}$:
• Any red edge among them $\Rightarrow$ red triangle with $v$.
• All three blue $\Rightarrow$ blue triangle $\{a,b,c\}$.
Either way: monochromatic triangle!
Lower bound ($R(3,3) > 5$):
$K_5$: color a 5-cycle red, the remaining 5 edges (a pentagram) blue. Neither color has a triangle. So 5 vertices aren't enough.
Therefore $R(3,3) = 6$. $\square$
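Both halves of this proof are small enough to verify exhaustively. In the quick sketch below, `has_mono_triangle` is a hypothetical helper, and the $K_6$ loop tries all $2^{15} = 32768$ edge colorings.

```python
from itertools import combinations, product

def has_mono_triangle(n, color):
    """color maps each pair (i, j) with i < j to 'R' or 'B'."""
    return any(color[(a, b)] == color[(a, c)] == color[(b, c)]
               for a, b, c in combinations(range(n), 3))

# Upper bound: every red/blue coloring of K_6's 15 edges has a mono triangle.
edges6 = list(combinations(range(6), 2))
assert all(has_mono_triangle(6, dict(zip(edges6, coloring)))
           for coloring in product('RB', repeat=len(edges6)))

# Lower bound: on K_5, color the 5-cycle red and the pentagram blue.
c5 = {(i, j): 'R' if (j - i) % 5 in (1, 4) else 'B'
      for i, j in combinations(range(5), 2)}
assert not has_mono_triangle(5, c5)
```

The second assertion is exactly the "5 people in a circle" coloring: differences 1 and 4 are the cycle (red), differences 2 and 3 are the pentagram (blue), and neither color contains a triangle.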
Theorem 1.62 — $R(3,4) = 9$
$R(3,4) = 9$
Same party game, but now the goal is harder: you need either 3 mutual friends or 4 mutual strangers.
With 8 people, you can arrange things to dodge both. With 9, it's impossible — one of those must happen. The "$= 9$" tells us exactly where the threshold is.
Upper bound: We use the recursion from Theorem 1.64 with the parity improvement (1.64*): $R(3,4) \le R(2,4) + R(3,3) - 1 = 4 + 6 - 1 = 9$. We get the $-1$ because both 4 and 6 are even.
Lower bound: Someone constructed a specific red/blue coloring of $K_8$ that avoids both a red triangle and a blue $K_4$. (It exists — you don't need to memorize it.)
9 people at a party. Color all friendships red or blue. You must get either a red triangle or a blue $K_4$ (4 mutual blue friends).
Lower bound ($R(3,4) > 8$):
An explicit coloring of $K_8$ exists with no red $K_3$ and no blue $K_4$. (You can construct one; not expected to memorize the exact edges.)
Upper bound ($R(3,4) \le 9$) — uses the improved recursion:
$$R(3,4) \le R(2,4) + R(3,3) - 1 = 4 + 6 - 1 = 9$$ The "$-1$" comes from Theorem 1.64* (parity trick), since $R(2,4) = 4$ and $R(3,3) = 6$ are both even.
Therefore $R(3,4) = 9$. $\square$
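The $K_8$ coloring is not given in the text, but one standard candidate (an assumption worth checking rather than memorizing) is the circulant on $\mathbb{Z}_8$ with red differences $\{1, 4, 7\}$: an 8-cycle plus its four long diagonals. A brute-force check confirms it avoids both a red $K_3$ and a blue $K_4$:

```python
from itertools import combinations

n = 8
# Red edges: the 8-cycle (differences 1 and 7) plus the 4 diagonals (difference 4).
red = {(i, j) for i, j in combinations(range(n), 2) if (j - i) % n in (1, 4, 7)}

# No red triangle among any 3 vertices:
assert not any({(a, b), (a, c), (b, c)} <= red
               for a, b, c in combinations(range(n), 3))

# No blue K_4: no 4 vertices with all six pairs non-red.
assert not any(all(pair not in red for pair in combinations(four, 2))
               for four in combinations(range(n), 4))
```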
Theorem 1.64 — Recursive Ramsey Bound
$R(p,q) \le R(p-1,q) + R(p,q-1)$
This is the master formula for bounding Ramsey numbers. It says: to guarantee a red $K_p$ or blue $K_q$, you need at most $R(p-1,q) + R(p,q-1)$ people.
The idea: Pick any person at the party. Their connections split into "friends" (red) and "strangers" (blue). One of those groups must be big enough. If friends are big enough, look inside them for a slightly smaller red clique (then add the person back). If strangers are big enough, same idea for blue. Either way, you find what you need.
The proof in plain English:
Put $N = R(p-1,q) + R(p,q-1)$ people in a room. Pick any person, Pat.
Pat has $N-1$ connections. Split them: red-friends ($R$) and blue-friends ($B$). Since $|R| + |B| = N-1$, by pigeonhole one of these is true:
Scenario A: Pat has $\ge R(p-1,q)$ red-friends. Look at just those people. By definition of $R(p-1,q)$, among them there's either a red group of $p-1$ (add Pat back → red group of $p$!) or a blue group of $q$. Either way, done.
Scenario B: Pat has $\ge R(p,q-1)$ blue-friends. Same logic, flipped. Either a red $K_p$ among them, or a blue group of $q-1$ (add Pat back → blue $K_q$). Done.
The master technique — "pick a vertex, split by color":
Pick any vertex $v$. Its edges split into red-neighbors ($R$) and blue-neighbors ($B$). One group is big enough to recurse into.
Let $N = R(p-1,q) + R(p,q-1)$. Pick any vertex $v$ in $K_N$. It has $N-1$ neighbors, colored red or blue.
$$|R| + |B| = N - 1 = R(p-1,q) + R(p,q-1) - 1$$ By pigeonhole, either $|R| \ge R(p-1,q)$ or $|B| \ge R(p,q-1)$.
If $|R| \ge R(p-1,q)$: Among the red neighbors, by definition of $R(p-1,q)$, there's either:
• A red $K_{p-1}$ $\Rightarrow$ add $v$ (connected to all of them by red) $\Rightarrow$ red $K_p$. Done!
• A blue $K_q$. Done!
If $|B| \ge R(p,q-1)$: Symmetric. Either red $K_p$ or blue $K_{q-1}$ plus $v$ gives blue $K_q$.
Every case gives a monochromatic clique. So $R(p,q) \le N$. $\square$
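The recursion turns directly into a memoized calculator for these upper bounds. A sketch: `ramsey_upper` is a hypothetical name, and it returns the Theorem 1.64 bound, not exact Ramsey numbers.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def ramsey_upper(p, q):
    # Base cases: R(2, q) = q and R(p, 2) = p hold exactly.
    if p == 2:
        return q
    if q == 2:
        return p
    return ramsey_upper(p - 1, q) + ramsey_upper(p, q - 1)

print(ramsey_upper(3, 3))   # 6, matches the exact value
print(ramsey_upper(3, 4))   # 10, which Theorem 1.64* sharpens to 9
print(ramsey_upper(4, 4))   # 20, while the true value is 18
```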
Theorem 1.64* — Improved Bound (Parity Trick)
If $R(p-1,q)$ and $R(p,q-1)$ are both even: $R(p,q) \le R(p-1,q)+R(p,q-1)-1$.
This is a bonus upgrade to the recursion above. When both sub-values are even, you can squeeze out one fewer person needed. Instead of $a + b$, you only need $a + b - 1$.
How? Assume you could dodge with $a + b - 1$ people (the "bad" coloring). The math forces every person to have an odd number of red connections. But you have an odd number of people. So the total red connections = odd $\times$ odd = odd. But total connections must be even (every handshake has two hands). Contradiction — no bad coloring can exist!
When does this help? Whenever both sub-values of the recursion are even, it saves 1 off the basic bound.
Used for $R(3,4)$: since $R(2,4)=4$ and $R(3,3)=6$ are both even, we get $R(3,4) \le 4+6-1 = 9$ instead of 10.
Suppose the bound is wrong: a "bad" coloring exists on $N = R(p-1,q) + R(p,q-1) - 1$ vertices (no red $K_p$, no blue $K_q$).
Then for every vertex $v$:
• Red-degree $< R(p-1,q)$ (otherwise red neighbors contain red $K_{p-1}$, giving red $K_p$ with $v$)
• Blue-degree $< R(p,q-1)$ (similarly)
• But red-deg + blue-deg $= N-1$, and the two upper bounds sum to exactly $N-1$.
So both must be equalities: every vertex has red-degree $= R(p-1,q)-1$.
Now the parity punch:
$R(p-1,q)$ is even, so $R(p-1,q)-1$ is odd.
Every vertex has odd red-degree. There are $N$ vertices, and $N$ is odd (even + even $- 1$).
So sum of red degrees = odd $\times$ odd = odd.
But degree sums are always even (handshake lemma)!
Contradiction $\Rightarrow$ no bad coloring exists $\Rightarrow$ $R(p,q) \le N$. $\square$
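As a sketch, the parity improvement can be combined with the Theorem 1.64 recursion in a memoized calculator. One caveat, hedged loudly: applying the $-1$ to computed bounds is only justified when those bounds equal the true Ramsey numbers, which happens to hold in the small cases below. The name `bound` is hypothetical.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def bound(p, q):
    if p == 2:
        return q        # R(2, q) = q exactly
    if q == 2:
        return p        # R(p, 2) = p exactly
    a, b = bound(p - 1, q), bound(p, q - 1)
    # Theorem 1.64* shaves one vertex when both sub-values are even.
    # (Valid here because a and b are exact for these small cases.)
    return a + b - 1 if a % 2 == 0 and b % 2 == 0 else a + b

print(bound(3, 4))   # 9, as in Theorem 1.62
print(bound(3, 5))   # 14, also the exact value of R(3,5)
print(bound(4, 4))   # 18, also the exact value of R(4,4)
```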
Erdős–Szekeres — Binomial Bound
$$R(r,s) \le \binom{r+s-2}{r-1}$$
All the Ramsey bounds above use recursion: $R(p,q) \le R(p-1,q) + R(p,q-1)$. But if you keep expanding, you get a tree of values — and that tree looks exactly like Pascal's triangle!
So instead of recursing, you can just compute a single binomial coefficient. For example, $R(3,3) \le \binom{4}{2} = 6$ — one formula, no recursion needed.
Base cases: $R(2,s) = s$ (you need $s$ people to guarantee a red edge or a blue $K_s$) and $R(r,2) = r$. These match $\binom{s}{1} = s$ and $\binom{r}{r-1} = r$.
Inductive step: By the recursion (Thm 1.64), $R(r,s) \le R(r-1,s) + R(r,s-1)$. By induction, each piece is bounded by a binomial coefficient. Add them:
$$\binom{r+s-3}{r-2} + \binom{r+s-3}{r-1} = \binom{r+s-2}{r-1}$$ That last step is just Pascal's identity: two adjacent entries in one row add to the entry below them. The recursion and Pascal's triangle have the same structure!
Gives a closed-form upper bound — no recursion needed. The Ramsey recursion mirrors Pascal's identity, so it collapses to a single binomial coefficient.
Proof (induction on $r+s$):
Base cases: $R(2,s) = s = \binom{s}{1}$ and $R(r,2) = r = \binom{r}{r-1}$.
(A $K_2$ is just a single edge: with $s$ vertices, either some edge is red, or every edge is blue and the whole graph is a blue $K_s$; an all-blue $K_{s-1}$ shows $s-1$ vertices aren't enough. The $R(r,2)$ case is symmetric.)
Inductive step ($r,s \ge 3$): By Theorem 1.64:
$$R(r,s) \le R(r-1,s) + R(r,s-1)$$ Apply induction hypothesis to both:
$$\le \binom{r+s-3}{r-2} + \binom{r+s-3}{r-1} = \binom{r+s-2}{r-1}$$ The last step is just Pascal's identity: $\binom{n}{k-1} + \binom{n}{k} = \binom{n+1}{k}$. $\square$
Sanity check: $R(3,3) \le \binom{4}{2} = 6$. Matches the exact value!
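Since the base cases $R(2,s) = s$ and $R(r,2) = r$ already equal their binomial coefficients, the Theorem 1.64 recursion doesn't just stay below the binomial, it reproduces it exactly. A quick sketch, with the hypothetical name `rec_bound`:

```python
from math import comb
from functools import lru_cache

@lru_cache(maxsize=None)
def rec_bound(r, s):
    # The Theorem 1.64 recursion, seeded with the exact base cases.
    if r == 2:
        return s
    if s == 2:
        return r
    return rec_bound(r - 1, s) + rec_bound(r, s - 1)

# The recursion collapses to a single binomial coefficient, as the proof says.
for r in range(2, 9):
    for s in range(2, 9):
        assert rec_bound(r, s) == comb(r + s - 2, r - 1)

print(rec_bound(3, 3), comb(4, 2))   # 6 6
```

This is Pascal's identity in executable form: the two recursive calls are the two adjacent entries, and their sum is the entry below.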