The notation "MN" refers to the Matoušek--Nešetříl text (second edition). The first edition is also entirely adequate for this class. Note that a new Chapter 2 was inserted in the second edition, so for $i\ge 3$, material that appears in Chapter $i$ of the second edition appears in Chapter $i-1$ of the first edition.
The notation "LIN" refers to the instructor's Linear Algebra notes, currently in progress.
The notation "PZ" refers to the puzzle problem set from the instructor's 2011 REU course.
"DO" problems are strongly recommended to check your understanding of the concepts; do not hand them in.
Problems labeled "HW" must be handed in at the beginning of the next class unless expressly stated otherwise.
BONUS PROBLEMS are not mandatory but the same due date applies as to ordinary homework.
CHALLENGE PROBLEMS have no point value (but they earn you the instructor's attention) and no deadline (except they expire when discussed in class). Make sure you hand them in on a separate sheet; state your name, the fact that this is a challenge problem solution, and which problem is being solved. One further request: if you hand in a solution to a challenge problem, send email to the instructor giving the problem number and a brief indication of the problem (copied from the problem announcement, like "n+1 numbers" or "another limit related to the def of 'e'" -- no detailed description is needed; this suffices to remind me which problem it was). Such email will create an easy record and help avoid mistakes in bookkeeping.
REWARD PROBLEMS are challenge problems that you should NOT hand in. The solution is its own reward. No deadline. Of course you are welcome to discuss your thoughts with the TA, the instructor, and your peers.
If you hand in material before its due date, make sure you put it on a separate sheet since solutions are graded in batches by their due dates.
15.1 DO: Recall that a legal coloring of a directed graph ignores the orientation of the edges: if two vertices are adjacent (in either direction), they must receive different colors. Prove: a strongly connected directed graph is not 2-colorable if and only if it has an odd (directed!) cycle.
15.2 Notation: $\fff$ denotes the "field" of scalars. In our discussion, $\fff$ denotes either $\rrr$ or $\ccc$. The space of functions $\Omega\to\fff$ is denoted by $\fff^{\Omega}$. This is a vector space over $\fff$, i.e., one can create linear combinations of functions $\Omega\to\fff$ with coefficients from $\fff$.
15.3 DO: Study the axioms of vector spaces (e.g., look up "vector space" on Wikipedia). Verify that for every set $\Omega$, the function space $\fff^{\Omega}$ is a vector space over $\fff$. In our context, $V$ will always denote a vector space over $\fff$.
15.4 Definition. If $V$ is a vector space over the field $\fff$ and
$W\subseteq V$ then we say that $W$ is a subspace of $V$ if
$W$ is
(0) not empty (1) closed under addition
(2) closed under multiplication by any scalar.
More formally, (0) $W\neq\emptyset$
(1) $(\forall u,v\in W)(u+v\in W)$   (2)
$(\forall \alpha\in\fff)(\forall w\in W)(\alpha w\in W)$.
Notation: $W\le V$ denotes that $W$ is a subspace of $V$.
15.5 DO: A particularly important vector space over $\fff$ is the space $\fff[x]$ of polynomials in the variable $x$ over $\fff$. From the point of view of taking linear combinations, we can view a polynomial $f(x) = a_0 + a_1x + a_2x^2+\dots+a_nx^n$ as the sequence of its coefficients (with infinitely many zeros added at the end): $(a_0, a_1, a_2,\dots, a_n, 0, 0, 0, \dots)$. With this view, prove: $\fff[x]\le \fff^{\nnn}$ where $\nnn$ denotes the set of non-negative integers.
15.6 DO: Prove: if $V$ is a vector space over $\fff$ then $(\forall \alpha\in\fff)(\forall v\in V)(\alpha v=0 \Leftrightarrow \alpha = 0$ or $v=0)$.
15.7 DO: (a) Prove: if $W\le V$ then $0\in W$. (b) Prove that a subspace of a vector space is a vector space.
15.8 DO: $C[0,1]$ denotes the space of continuous functions $[0,1]\to\rrr$. Prove that this is a subspace of $\rrr^{[0,1]}$.
15.9 DO: The span of a set $S\subseteq V$, denoted $\span(S)$, is the set of all linear combinations of vectors in $S$. (If $S$ is infinite, we only consider finite linear combinations.) Prove: (a) $\span(\emptyset)=\{0\}$. (b) $(\forall S\subseteq V)(\span(S)\le V)$. (c) $\span(S)\supseteq S$. (d) $\span(S) = S \Leftrightarrow S\le V$.
15.10 DO: Let Fib denote the set of those sequences $(a_0, a_1, \dots)$ of real numbers that satisfy the recurrence $(\forall n\ge 2)(a_n = a_{n-1}+a_{n-2})$. (a) Prove: Fib is a subspace of $\rrr^{\nnn}$, the space of sequences of real numbers. (b) Prove that the space Fib is spanned by the Fibonacci sequence $(0,1,1,2,3,5,\dots)$ and the sequence $(1,3,4,7,11,\dots)$.
15.11 Definition. A list $v_1,\dots, v_k$ of vectors is linearly independent if only their trivial linear combination is zero, i.e., $(\forall \alpha_1,\dots,\alpha_k\in\fff)(\sum_{i=1}^k \alpha_iv_i = 0 \Rightarrow \alpha_1=\dots = \alpha_k=0)$.
15.12 DO: Consider a list $L$ of vectors. Prove: (a) if $0$ is on the list then $L$ is dependent (not independent). (b) If two vectors on the list $L$ are equal then $L$ is dependent.
15.13 DO: Prove: $v_1,\dots,v_k$ are linearly dependent if and only if $(\exists i)(v_i\in\span(v_1,\dots,v_{i-1},v_{i+1},\dots,v_k))$.
15.14 HW: Let $A\in M_n(\fff)$ be an $n\times n$ matrix. Let $v_1,\dots, v_k$ be eigenvectors of $A$ belonging to distinct eigenvalues. Prove: $v_1,\dots, v_k$ are linearly independent. (12 points)
15.15 DO: Let $\alpha_1,\dots,\alpha_n$ be $n$ distinct scalars. Let $f(x)=\prod_{i=1}^n (x-\alpha_i)$ and let $g_i(x) = f(x)/(x-\alpha_i)$. (So each $g_i$ is a polynomial of degree $n-1$.) Prove: $g_1,\dots,g_n$ are linearly independent over $\fff$.
15.16 Definition. If $S\subseteq V$ then the rank of $S$, denoted $\rank(S)$, is the maximum number of linearly independent vectors in $S$. If $S$ is a subspace then $\dim(S):= \rank(S)$ is called the dimension of $S$.
15.17 Definition. $v_1,\dots,v_k$ is a set of generators of $V$ if $V=\span(v_1,\dots,v_k)$. A basis of $V$ is a linearly independent set of generators.
15.18 DO: (a) Find a very simply described basis in $\fff[x]$ (the space of polynomials). (b) Let $f_0,f_1,f_2,\dots$ be a sequence of polynomials in $\fff[x]$ such that $\deg(f_n)=n$ (one polynomial for each degree). Prove: this sequence forms a basis of $\fff[x]$.
15.19 DO: Let $A\in M_n(\fff)$ be an $n\times n$ upper triangular matrix with all diagonal elements non-zero. Prove: the columns of $A$ form a basis of $\fff^n$.
15.20 DO: Let $S$ be a set of generators of $V$ and let $B$ be a maximal independent set in $S$ (i.e., $B$ is independent, but if $B$ is a proper subset of some $B'\subseteq S$ then $B'$ is not independent). Prove: $B$ is a basis of $V$. Note in particular that every maximal independent set in $V$ is a basis of $V$.
15.21 DO: Prove: every linearly independent set in $V$ can be extended to a basis. (Note that this is immediate from the preceding exercise.)
15.22 DO: Prove: if $v_1,\dots,v_k$ are linearly independent and $v_1,\dots,v_{k+1}$ are linearly dependent then $v_{k+1}$ depends on $v_1,\dots,v_k$ (i.e., $v_{k+1}\in\span(v_1,\dots,v_k)$).
15.23 DO (Steinitz exchange principle) Prove: if $v_1,\dots,v_k$ are linearly independent and $(\forall i)(v_i\in \span(w_1,\dots,w_{\ell}))$ then $(\forall i)(\exists j)$ such that if we replace $v_i$ by $w_j$ then the new list $v_1,\dots,v_{i-1},w_j,v_{i+1},\dots,v_k$ will also be linearly independent.
15.25 First miracle of linear algebra. Suppose $v_1,\dots,v_k$ are linearly independent and $(\forall i)(v_i\in \span(w_1,\dots,w_{\ell}))$. Then $k\le \ell$.
15.26 DO: Prove the First Miracle. Use the Steinitz Exchange Principle.
15.27 DO: Prove that the First Miracle is equivalent to the following statement: $(\forall S\subseteq V)(\dim(\span(S))=\rank(S))$ ("impossibility of boosting linear independence").
15.28 DO: Prove that the First Miracle is equivalent to the statement that all bases have the same size. It is also equivalent to the statement that all maximal linearly independent sets are maximum. (Maximal: nothing can be added to it so that it would still remain independent. Maximum: largest. In class we saw an example that a maximal independent set in a graph is not necessarily maximum. The First Miracle guarantees that this cannot happen with linear independence.)
15.29 DO: Prove: $\dim(\fff^n)=n$. (You need to use the First Miracle.)
15.30 DO: Let $v_1,\dots, v_k$ be a list of vectors. An elementary operation on this list selects subscripts $i\neq j$ and a scalar $\alpha\in\fff$ and replaces $v_i$ by $v_i-\alpha v_j$. It does not change any other member of the list, including $v_j$. Prove: an elementary operation does not change the rank of the list of vectors.
15.31 Definition (rank of a matrix). The column-rank of a matrix is the rank of the list of its column vectors. The row-rank is defined analogously: the rank of the list of row vectors.
15.32 DO: It follows from Exercise 15.30 that an elementary column operation does not change the column-rank of a matrix. Prove: An elementary column operation does not change the row-rank of a matrix.
15.33 DO (Gaussian elimination): Show that starting from any matrix and making elementary row- and column-operations, we can reach a matrix that has at most one non-zero element in every row and at most one non-zero element in every column.
15.34 DO (Second miracle of linear algebra): Prove that the row-rank of a matrix is equal to its column-rank. This common value is called the rank of the matrix. (Hint: Gaussian elimination; use the preceding exercises.)
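The following Python sketch is only an illustration, not part of the problem set (real scalars, a floating-point tolerance, and an ad-hoc function name are assumed): it carries out the elimination of 15.33 and then reads off the rank as the number of surviving non-zero entries, which is both the row-rank and the column-rank.

```python
import numpy as np

def diagonalize_by_elimination(A, tol=1e-9):
    """Apply elementary row and column operations (plus row swaps, which can be
    simulated by elementary operations up to sign) to a copy of A until every row
    and every column has at most one non-zero entry (cf. 15.33)."""
    M = np.array(A, dtype=float)
    rows, cols = M.shape
    r = 0                                              # next pivot row
    for c in range(cols):
        pivots = [i for i in range(r, rows) if abs(M[i, c]) > tol]
        if not pivots:
            continue
        M[[r, pivots[0]]] = M[[pivots[0], r]]          # bring a pivot into row r
        for i in range(rows):                          # clear the rest of column c (row ops)
            if i != r:
                M[i, :] -= (M[i, c] / M[r, c]) * M[r, :]
        for j in range(cols):                          # clear the rest of row r (column ops)
            if j != c:
                M[:, j] -= (M[r, j] / M[r, c]) * M[:, c]
        r += 1
    return M

A = [[1, 2, 3],
     [2, 4, 6],
     [1, 0, 1]]
D = diagonalize_by_elimination(A)
print(int(np.sum(np.abs(D) > 1e-9)),          # 2 non-zero entries survive: the rank
      np.linalg.matrix_rank(np.array(A)),     # 2 (column-rank)
      np.linalg.matrix_rank(np.array(A).T))   # 2 (row-rank)
```

Since elementary operations change neither the row-rank nor the column-rank (15.30, 15.32), and in the final form both ranks are visibly the number of non-zero entries, the Second Miracle follows.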
15.35 DO: (a) Prove: If $A$ and $B$ are $k\times \ell$ matrices over $\fff$ then $\rank(A+B)\le \rank(A)+\rank(B)$. (b) Prove: If $A$ is a $k\times \ell$ matrix and $B$ is an $\ell\times m$ matrix over $\fff$ then $\rank(AB) \le \min\{\rank(A),\rank(B)\}$. (Hint: use the following fact: if $b_i$ denotes the $i$-th column of $B$ then the $i$-th column of $AB$ is $Ab_i$. Use further that $(AB)^{\dagger} = B^{\dagger} A^{\dagger}$ where the $\dagger$ indicates the transpose matrix.)
15.36 HW: (a) Determine the dimension of the space Fib defined in Ex. 15.10. Find a basis. (b) Let the shift operator $\sigma : \rrr^{\nnn}\to \rrr^{\nnn}$ be defined by $\sigma(a_0,a_1,a_2,\dots) = (a_1, a_2, a_3,\dots)$. Determine all eigenvalues and eigenvectors of the shift operator in $\rrr^{\nnn}$. (c) Find a basis of Fib that consists of eigenvectors of the shift operator. (d) Represent the Fibonacci sequence as a linear combination of the basis of Fib found in part (c). Savor the result. (3+10+6+4 points)
15.37 DO: Let $T$ be the set of real functions $T=\{\cos(x+\alpha) \mid \alpha\in\rrr\}$. Determine the dimension of $\span(T)$. Find a basis.
15.38 DO: Review the definition of the determinant from Wikipedia, with the following correction: (i) $\sgn(\sigma)$ is called the sign (and not the "signature") of $\sigma$. Also, Wikipedia gives a different definition of $\sgn(\sigma)$ from what I used in class; the two definitions are equivalent. The definition used in class was this: let $\inv(\sigma)$ denote the number of inversions of $\sigma$, i.e., the number of pairs $(i,j)$ such that $i < j$ and $\sigma(i)>\sigma(j)$. The sign of $\sigma$ is 1 if the number of inversions is even, and $-1$ if the number of inversions is odd.
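As an illustration of this definition (not an assigned exercise; plain Python, ad-hoc names), the sign of a permutation can be computed by counting inversions, and the determinant can then be evaluated straight from the permutation-sum formula (exponential time, so only for tiny matrices):

```python
from itertools import permutations

def sign(sigma):
    """+1 if the number of inversions of sigma is even, -1 if it is odd (the class definition)."""
    inv = sum(1 for i in range(len(sigma))
                for j in range(i + 1, len(sigma))
                if sigma[i] > sigma[j])
    return 1 if inv % 2 == 0 else -1

def det(A):
    """det(A) = sum over permutations sigma of sign(sigma) * prod_i A[i][sigma(i)]."""
    n = len(A)
    total = 0
    for sigma in permutations(range(n)):
        term = sign(sigma)
        for i in range(n):
            term *= A[i][sigma[i]]
        total += term
    return total

print(det([[2, 1, 0],
           [0, 3, 1],
           [1, 0, 1]]))    # 7
```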
15.39 DO: Prove: (a) If a column of $A$ is zero then $\det(A)=0$. (b) If we obtain $A'$ by swapping two columns of $A$ then $\det(A') = -\det(A)$. (c) If $A$ has two equal columns then $\det(A)=0$. (d) The determinant of a triangular matrix is the product of the diagonal elements. (e) $\det(A^{\dagger}) = \det(A)$.
15.40 DO: Prove that elementary row- and column-operations don't change the value of the determinant.
15.41 DO: Let $A\in M_n(\fff)$. Prove: $\det(A)\neq 0$ if and only if the columns of $A$ are linearly independent.
15.42 DO: We say that an $n\times n$ matrix $A\in M_n(\fff)$ is non-singular if $\det(A)\neq 0$. Prove that for any matrix $A\in M_n(\fff)$, the following are equivalent:
15.43 DO: Prove: if we multiply a column of a matrix by $\alpha$, the determinant is also multiplied by $\alpha$.
15.44 DO: Review the other properties of the determinant listed on the Wikipedia page.
15.45 DO (characteristic polynomial): Let $A\in M_n(\fff)$. Let $f_A(t) = \det(tI-A)$ where $t$ is an unknown scalar and $I$ is the identity matrix. (a) Prove that $f_A(t)$ is a polynomial of degree $n$ with leading coefficient 1 in the variable $t$. (b) Prove: $\lambda$ is a right eigenvalue of $A$ if and only if $f_A(\lambda)=0$. (c) Prove (third miracle): the right and the left eigenvalues are the same.
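A quick numerical illustration of 15.45(a)-(b), not part of the problems, assuming NumPy (whose np.poly returns the coefficients of $\det(tI-A)$ when given a square matrix): the roots of the characteristic polynomial coincide with the eigenvalues.

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])

coeffs = np.poly(A)              # [1., -4., 3.]: f_A(t) = t^2 - 4t + 3, monic of degree n
print(np.roots(coeffs))          # [3., 1.]: the roots of f_A ...
print(np.linalg.eigvals(A))      # ... are exactly the eigenvalues of A
```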
15.46 HW (rotation matrix): Let $$R_{\theta} = \left(\matrix{\cos \theta & -\sin \theta \\ \sin \theta & \cos \theta}\right).$$ This is called the matrix of the rotation of the plane by $\theta$. Find the complex eigenvalues and eigenvectors of this matrix. Savor the result. (Hint: compute the characteristic polynomial. Use it to find the eigenvalues. For each eigenvalue, find the corresponding eigenvectors.) (10 points)
15.47 HW: Given a matrix $A\in M_n(\fff)$, we say that a basis of $\fff^n$ is an eigenbasis of $A$ if the basis consists of eigenvectors of $A$. (a) Consider the matrix $A = \left(\matrix{1 & 1 \\ 0 & 1}\right)$. Compute the characteristic polynomial of this matrix. Find its eigenvalues and eigenvectors. Prove: this matrix does not have an eigenbasis. (b) Consider the matrix $B = \left(\matrix{1 & 1 \\ 0 & 2}\right)$. Find its eigenvalues. Prove, without any computation, that this matrix has an eigenbasis. You may use one of the problems above without proof. (6+6 points)
15.48 HW: (a) Consider the $n\times n$ matrix $M = \left(\matrix{a & b & b & \cdots & b \\ b & a & b & \cdots & b \\ \vdots & \vdots & & \ddots & \vdots \\ b & b & b & \cdots & a}\right)$.
(All diagonal elements are equal to $a$ and all off-diagonal elements
are equal to $b$.)
Determine $\det(M)$. Your answer should be a simple closed-form expression.
(Hint. Use Gaussian elimination to obtain a triangular matrix.
Your first step should be to add each of the first $(n-1)$ rows to the
last row.)
(b) Let $J$ be the $n\times n$ all-ones matrix (every entry is 1).
(b1) Find the characteristic polynomial of $J$. (Hint: use part (a).)
(b2) Determine the eigenvalues of $J$ and their multiplicities
(in the characteristic polynomial). (b3) Find an eigenbasis
of $J$.
(8+3+4+6 points)
14.1 DO: LN 8.2.1 (page 89): analyse the Markov Chain given in Figure 8.3.
14.2 DO: Let $T$ be the transition matrix of a Markov Chain. Suppose the limit $L=\lim_{t\to\infty} T^t$ exists. Prove: every row of $L$ is a stationary distribution.
14.3 Definition. Let $G$ be a digraph and $x$ a vertex of $G$. The period of $x$, denoted $\period(x)$, is the g.c.d. of the lengths of all closed walks starting at $x$. If there is no such closed walk then we set $\period(x)=0$. We say that $x$ is aperiodic if $\period(x)=1$.
14.4 HW: Prove: If vertices $x$ and $y$ belong to the same strong component of the digraph $G$ (i.e., if they are mutually accessible from one another) then $\period(x)=\period(y)$. (Hint: "Euclid's lemma.") (8 points)
14.5. Definition: Let $G$ be a strongly connected digraph. The period of $G$, denoted $\period(G)$, is the period of any of the vertices of $G$. (They are all equal by the previous exercise.) $G$ is aperiodic if $\period(G)=1$.
14.6 DO: Prove: the period of a strongly connected digraph is the g.c.d. of the lengths of all cycles in $G$. (Note: a cycle has no repeated vertices. In particular, $G$ has a finite number of cycles, in contrast to the infinite number of closed walks used in the definition of the period.)
14.7 DO: (a) Let $G$ be a strongly connected digraph and $d$ a positive integer. Prove: $d\mid\period(G)$ if and only if the vertices of $G$ can be split into $d$ blocks $V = V_0\cup\dots\cup V_{d-1}$ such that every edge points from a block to the next block in the cyclic order, i.e., if $x\in V_i$ and $y\in V_j$ and $x\to y$ is an edge then $j\equiv i+1 \pmod d$.
(b) Look at the MC diagram in Figure 8.3 in LN (page 89). Show that the digraph is strongly connected and has period 3. Divide up the vertices (states) into 3 blocks as described in part (a).
14.8 DO: Prove: If $G$ is a connected undirected graph then (a) its period is 1 or 2; (b) $\period(G)=2$ if and only if $G$ is bipartite.
14.9 Definition. A MC is irreducible if the transition digraph is strongly connected. An irreducible MC is aperiodic if the transition digraph is aperiodic. A MC is ergodic if it is irreducible and aperiodic.
14.10 Theorem (convergence of ergodic Markov Chains). (a) Let $T$ be the transition matrix of an ergodic Markov Chain. Then $\lim_{t\to\infty} T^t$ exists. (b) It follows that for any initial distribution $q(0)$, the sequence of distributions $q(t)=q(0)T^t$ converges to the unique stationary distribution. (These results follow from the Perron-Frobenius theorem, to be discussed later.)
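The following sketch is only a numerical illustration of Theorem 14.10, not part of the problems; it assumes NumPy, and the 3-state chain is ad hoc but ergodic (strongly connected, and the self-loop at state 0 makes it aperiodic).

```python
import numpy as np

T = np.array([[0.5, 0.5, 0.0],     # ergodic: strongly connected, aperiodic (self-loop at state 0)
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])

L = np.linalg.matrix_power(T, 200)
print(L)                           # every row is (numerically) the same vector pi
pi = L[0]
print(pi @ T - pi)                 # ~ 0: the common row is a stationary distribution (cf. 14.2)
```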
14.11 DO: (a) Factor the polynomial $f(x)=x^4+1$ into linear factors over $\ccc$. Describe the roots in canonical form, i.e., in the form $a+bi$ where $a,b$ are real numbers. Prove that all the roots are 8th roots of unity. (b) Factor the polynomial $f(x)=x^4+x^2+1$ into linear factors over $\ccc$. Describe the roots in canonical form. Prove that all the roots are 6th roots of unity.
14.12 DO: Prove the Division Theorem for polynomials: If $f,g\in \ccc[x]$ and $g\neq 0$ then $(\exists q,r\in\ccc[x])(f=gq+r$ and $\deg(r)<\deg(g))$.
14.13 DO: (a) Define divisibility and greatest common divisors of polynomials analogously to the same concepts for integers. (b) Show that the greatest common divisor is unique up to nonzero constant factors. We make the notation $\gcd(f,g)$ unique by requiring its leading coefficient to be 1. (c) Show that the gcd of polynomials can be found by Euclid's algorithm. (d) (gcd as linear combination) Prove: $(\forall f,g\in\ccc[x])(\exists u,v\in \ccc[x])(\gcd(f,g)=uf+vg)$.
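Not part of the problem set: a minimal Python sketch of the Division Theorem (14.12) and of Euclid's algorithm for the polynomial gcd (14.13(c)). Polynomials are stored as coefficient lists, highest degree first, over the reals in floating point; the helper names are ad hoc.

```python
def strip(f, tol=1e-12):
    """Drop leading (highest-degree) zero coefficients."""
    i = 0
    while i < len(f) - 1 and abs(f[i]) < tol:
        i += 1
    return f[i:]

def poly_divmod(f, g):
    """Return (q, r) with f = g*q + r and deg(r) < deg(g), as in 14.12."""
    f, g = strip([float(c) for c in f]), strip([float(c) for c in g])
    q = [0.0] * max(len(f) - len(g) + 1, 1)
    r = f[:]
    while len(r) >= len(g) and any(abs(c) > 1e-12 for c in r):
        c = r[0] / g[0]                      # leading coefficient of the next quotient term
        d = len(r) - len(g)                  # its degree
        q[len(q) - 1 - d] = c
        r = strip([rc - c * gc for rc, gc in zip(r, g + [0.0] * d)])
    return q, r

def poly_gcd(f, g):
    """Monic gcd by Euclid's algorithm (14.13)."""
    f, g = strip([float(c) for c in f]), strip([float(c) for c in g])
    while any(abs(c) > 1e-9 for c in g):
        f, g = g, poly_divmod(f, g)[1]
    return [c / f[0] for c in f]             # normalize the leading coefficient to 1

print(poly_gcd([1, 0, 0, -1], [1, 0, -1]))   # [1.0, -1.0]: gcd(x^3 - 1, x^2 - 1) = x - 1
```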
14.14 DO: Prove: if $f\in\ccc[x]$ and any $\alpha\in\ccc$ then $(x-\alpha)\mid f(x) - f(\alpha)$. We gave two proofs of this important fact in class. The first one was based on the Division Theorem. The second one was based on the identity $x^n-\alpha^n = (x-\alpha)(x^{n-1} + \alpha x^{n-2} +\alpha^2 x^{n-3} + \dots + \alpha^{n-1})$ and the fact that every polynomial is a linear combination of powers of $x$. Complete the second proof.
14.15 Fundamental Theorem of Algebra. If $f\in\ccc[x]$ and $\deg(f)\ge 1$ then $(\exists\alpha\in\ccc)(f(\alpha)=0)$.
14.16 DO: Infer from the FTA that every polynomial $f\in\ccc[x]$ of degree $n$ can be written as $f(x)=\gamma(x-\alpha_1)^{m_1}\cdots(x-\alpha_k)^{m_k}$ where $\gamma\in\ccc$ is the leading coefficient of $f$; the $\alpha_i$ are distinct complex numbers (the roots of $f$); the multiplicities $m_i$ are positive integers, and $\sum_{i=1}^k m_i = n$.
14.17 DO: Prove that the representation of $f$ given in the preceding exercise is unique (up to the ordering of the terms).
14.18 DO: Let $f\in\zzz[x]$ be a polynomial with integer coefficients: $f(x)=a_0+a_1x+\dots+a_nx^n$ where $a_0a_n\neq 0$. Suppose the rational number $\alpha=k/\ell$ is a root: $f(\alpha)=0$, where $\gcd(k,\ell)=1$. Prove: $k\mid a_0$ and $\ell \mid a_n$.
14.19 HW (multiple roots): Let $f\in\ccc[x]$. We say that "$f$ has multiple roots" if at least one of the multiplicities $m_i$ in problem 14.16 is $m_i\ge 2$. For instance, $x^2-1$ has no multiple roots but $(x+1)(x^2-1)$ does. (a) Prove: $f$ has multiple roots if and only if $\gcd(f,f')\neq 1$ (where $f'$ is the derivative of $f$). (b) Prove that the polynomial $g(x)=x^{100}+x+1$ has no multiple roots. Use only hand calculations; show all your work. (10 + 8 points)
14.20 DO: Let $f(x)=a_0+a_1x+\dots+a_nx^n\in\ccc[x]$. Assume $a_n=1$; then by the FTA, $f(x)$ can be written as $f(x)=\prod_{i=1}^n(x-\alpha_i)$. (The $\alpha_i$ need not be distinct.) Prove:
14.21 Definition. $\omega\in\ccc$ is a complex $n$-th root of unity if $\omega^n=1$. These are the roots of the polynomial $x^n-1$, so $x^n-1=\prod_{i=0}^{n-1}(x-\omega_i)$ where $\omega_0,\dots,\omega_{n-1}$ are the $n$-th roots of unity: $\omega_0=1$, $\omega_1=\cos(2\pi/n)+i\sin(2\pi/n)$, and $\omega_k=\omega_1^k= \cos(2k\pi/n)+i\sin(2k\pi/n)$ ($k=0,\dots,n-1$).
14.22 DO: Let $S_n$ denote the sum of the $n$-th roots of unity. Prove: $S_1=1$ and $S_n=0$ for all $n\ge 2$.
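A quick numerical check of 14.21 and 14.22 (not an assigned exercise; Python's cmath module assumed):

```python
import cmath

n = 8
roots = [cmath.exp(2j * cmath.pi * k / n) for k in range(n)]   # omega_k = cos(2k*pi/n) + i*sin(2k*pi/n)

print(all(abs(w**n - 1) < 1e-12 for w in roots))   # True: each omega_k satisfies omega^n = 1
print(abs(sum(roots)))                             # ~ 0, in line with S_n = 0 for n >= 2
```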
14.23 Definition. The order of a complex number $\alpha$ is the smallest positive $n$ such that $\alpha^n=1$. If no such $n$ exists, we say that the order of $\alpha$ is infinite. For instance, the order of $i=\sqrt{-1}$ is 4 and the order of $(1/2)+(\sqrt{3}/2)i$ is 6. What is the order of $1$ and the order of $-1$? If the order of $\alpha$ is $n$, we say that $\alpha$ is a primitive $n$-th root of unity.
14.24 DO: (a) Prove that the number of complex numbers of order $n$ is $\vf(n)$ where $\vf$ denotes Euler's $\vf$ function. (b) Use this to prove that $\sum_{d\mid n} \vf(d) = n$.
14.25 DO: Let $\alpha$ be a complex number of finite order $k$. Prove: $\alpha^n =1$ ($\alpha$ is an $n$-th root of unity) if and only if $k\mid n$. (Stated in class as HW; downgraded to "DO" exercise.)
14.26 DO: Let $\alpha, \beta\in\ccc$ be complex numbers of finite
order: $\alpha$ has order $k$ and $\beta$ has order $\ell$. Let $m$
denote the order of $\alpha\beta$. Prove:
(a) $m\mid \lcm(k,\ell)$
(b) If $\gcd(k,\ell)=1$ then $m=k\ell$
(Stated in class as HW; downgraded to "DO" exercise.)
(c) CHALLENGE: $\lcm(k,\ell)/\gcd(k,\ell) \mid m$.
14.27 Definition. $M_n(\ccc)$ denotes the set of $n\times n$ matrices and $\ccc^n$ the set of $n \times 1$ matrices ("column vectors") over $\ccc$. We say that $x\in\ccc^n$ is a right eigenvector for the matrix $A\in M_n(\ccc)$ with eigenvalue $\lambda\in\ccc$ if (i) $x\neq 0$ and (ii) $Ax = \lambda x$. We say that $\lambda$ is a (right) eigenvalue of $A$ if there exists an eigenvector for $A$ with eigenvalue $\lambda$. - Left eigenvectors and eigenvalues are defined analogously: Assume $y$ is a row vector (i.e., its transpose $y^{\dagger} \in \ccc^n$); we say that $y$ is a left eigenvector with eigenvalue $\mu$ if $y\neq 0$ and $yA=\mu y$.
14.28 DO later (one of the numerous miracles of linear algebra): The left eigenvalues of $A$ are the same as the right eigenvalues of $A$.
14.29 HW: Let $T$ be a stochastic matrix. (a) Prove that the all-ones vector $(1,1,\dots,1)^{\dagger} \in \ccc^n$ is a right eigenvector for $T$. What is the corresponding eigenvalue? (b) Let $\pi$ be a stationary distribution for the Markov Chain described by $T$. Note that $\pi$ is a row vector. Prove that $\pi$ is a left eigenvector of $T$. What is the corresponding eigenvalue? (The answer to each question should be just one line.) (3 + 3 points)
14.30 DO: Let $T$ be the transition matrix of an irreducible Markov Chain with period $d$. Let $\lambda$ be an eigenvalue of $T$ and let $\omega$ be a $d$-th root of unity (i.e., $\omega^d=1$). Prove that $\lambda\omega$ is also an eigenvalue of $T$. (Hint: first solve the case when $d=2$ and $\omega=-1$.) (Stated in class as HW; downgraded to "DO" exercise.)
14.31 HW: Let $T$ be a stochastic matrix. Prove: if $\lambda\in\ccc$ is an eigenvalue of $T$ then $|\lambda|\le 1$. (10 points)
13.0 DO: Review the definitions and concepts for graphs and digraphs from the instructor's online lecture notes (LN), especially the terminology defined on pages 56 and 58.
13.1 DO: LN Exercises 6.2.1 - 6.2.5
13.2 Definition: Let us say that a vertex $x$ of a tournament $T$ dominates a subset $A$ of the vertices if for every $a\in A$ there is an edge $x\to a$ in $T$. Let us say that $T$ is $k$-paradoxical if every $k$-subset of the vertex set is dominated by some vertex.
13.3 DO: For all sufficiently large $n$ find a 2-paradoxical tournament on $n$ vertices.
13.4 DO: Prove: (a) every tournament has a Hamilton path. (b) Every strongly connected tournament has a Hamilton cycle.
13.5 DO: Let $s,t$ be vertices of a digraph $G$. Prove: if $t$ is accessible from $s$ by a (directed) walk then it is also accessible by a (directed) path.
13.6 DO: Prove that accessibility is a transitive relation on the vertices of a digraph.
13.7 DO (strong components): (a) Prove: mutual accessibility is an equivalence relation on the vertices of a digraph. The equivalence classes are called the strong components of $G$. (b) LN 6.2.24 (find strong components in picture)
13.8 NOTATION. Let $G=(V,E)$ be a digraph and $A,B\subseteq V$. Let $E(A,B)=\{(a,b) \mid a\in A, b\in B, (a,b)\in E\}$.
13.9 Definition (cut). A cut of a digraph $G=(V,E)$ is a partition of the vertex set into two parts: $V=A\dot\cup B$. (The $\dot\cup$ indicates that $A\cap B=\emptyset$.) Let $s,t$ be vertices. An $(s,t)$-cut of $G$ is a cut $V=A\dot\cup B$ such that $s\in A$ and $t\in B$. We say that this cut separates $t$ from $s$ if $E(A,B)=\emptyset$.
13.10 HW (accessibility vs. cuts): Let $G$ be a digraph and $s,t$ two vertices of $G$. Prove: $t$ is not accessible from $s$ if and only if there exists an $(s,t)$-cut that separates $t$ from $s$. (5 points)
13.11 Definition. A digraph $G=(V,E)$ is Eulerian if $(\forall x\in V)(\deg^+(x) = \deg^-(x))$ (the out-degree of each vertex is equal to its in-degree).
13.12 HW: Let $G=(V,E)$ be an Eulerian digraph. Prove: (a) If $(A,B)$ is a cut (i.e., $V=A\dot\cup B$) then $|E(A,B)|=|E(B,A)|$. (b) If $G$ is weakly connected (connected if we ignore orientation) then it is strongly connected. (7+5 points)
13.13 DO (topological sort): A labeling of a set $V$ of $n$ elements is a bijection $V\to [n]$. A topological sort of the vertices of a digraph $G=(V,E)$ is a labeling $f : V\to [n]$ such that every edge is increasing, i.e., if $i\to j$ is an edge then $f(i) < f(j)$. Prove: a topological sort of $G$ exists if and only if $G$ is a DAG (directed acyclic graph).
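Not an assigned exercise: one standard way to produce a topological sort, or to detect a directed cycle, is to repeatedly remove vertices of in-degree 0 (Kahn's algorithm). A minimal Python sketch, with vertices $0,\dots,n-1$ and an ad-hoc edge-list input format:

```python
from collections import deque

def topological_sort(n, edges):
    """Return a labeling f with f[i] < f[j] for every edge i -> j,
    or None if the digraph is not a DAG (i.e., it has a directed cycle)."""
    out_neighbors = [[] for _ in range(n)]
    indegree = [0] * n
    for i, j in edges:
        out_neighbors[i].append(j)
        indegree[j] += 1
    queue = deque(v for v in range(n) if indegree[v] == 0)
    f, label = [0] * n, 0
    while queue:
        v = queue.popleft()
        label += 1
        f[v] = label
        for w in out_neighbors[v]:
            indegree[w] -= 1
            if indegree[w] == 0:
                queue.append(w)
    return f if label == n else None   # if some vertices never get a label, G has a directed cycle

print(topological_sort(4, [(0, 1), (0, 2), (1, 3), (2, 3)]))   # [1, 2, 3, 4]
print(topological_sort(3, [(0, 1), (1, 2), (2, 0)]))           # None: contains a directed cycle
```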
13.14 DO: Let $A_G$ denote the adjacency matrix of the digraph $G$. Prove: the $(i,j)$ entry of $A_G^t$ counts the walks of length $t$ in $G$ from $i$ to $j$.
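A small sanity check of 13.14 (not an assigned exercise; NumPy assumed; the 3-vertex digraph is ad hoc): the entries of $A_G^t$ agree with a brute-force count of the walks of length $t$.

```python
import numpy as np
from itertools import product

A = np.array([[0, 1, 1],      # adjacency matrix of the digraph with edges
              [0, 0, 1],      # 0->1, 0->2, 1->2, 2->0
              [1, 0, 0]])

def count_walks(A, i, j, t):
    """Brute force: count sequences i = v_0 -> v_1 -> ... -> v_t = j that follow edges."""
    n = len(A)
    return sum(1 for mid in product(range(n), repeat=t - 1)
                 if all(A[a][b] for a, b in zip((i,) + mid, mid + (j,))))

t = 3
print(np.linalg.matrix_power(A, t))
print([[count_walks(A, i, j, t) for j in range(3)] for i in range(3)])   # the same numbers
```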
13.15 HW: Let $G$ be a digraph on $n$ vertices. Prove: $G$ is a DAG if and only if $A_G^n = 0$. (7 points)
Warning: starting with Problem 13.16, the problem set was reorganized, the problem numbers were changed on 11/19, 8:20pm. The reorganization does not affect any of the HW problems due Thursday, Nov 21.
13.16 HW (due Tuesday, Nov 26): Prove: if every vertex in a digraph $G$ has out-degree $\le k$ then $\chi(G)\le 2k+1$. (Legal colorings of a digraph are defined ignoring orientation: if there is an edge between two vertices, they must have different colors.) (9 points)
13.17 DO: Prove that the $2k+1$ bound in the previous exercise is best possible: for every $k$ there exists a digraph such that each outdegree is $\le k$ and the chromatic number is $2k+1$.
FINITE MARKOV CHAINS
13.18 Definition. We say that a vector $(q_1,\dots,q_n)$ is a probability distribution if $(\forall i)(q_i\ge 0)$ and $\sum_{i=1}^n q_i =1$. The uniform distribution is the vector $(1/n,\dots,1/n)$.
13.19 Definition. We say that an $n\times n$ matrix is stochastic if its entries are non-negative and the sum of each row is 1. In other words, every row of a stochastic matrix is a probability distribution.
The transition matrices of finite Markov Chains are precisely the stochastic matrices. If the transition matrix of a finite MC is $T=(p_{ij})$ then $p_{ij} = P(X_{t+1}=j \mid X_t=i)$ (the bar indicates conditional probability) where $X_t$ is the state of the particle at time $t$.
13.21 DO ($t$-step transition matrix): Let $p_{ij}^{(t)} = P(X_{s+t}=j \mid X_s = i)$. Prove that $p_{ij}^{(t)}$ is the $(i,j)$ entry of $T^t$. (Hint: induction on $t$. Use the "Theorem of complete probability.")
13.22 DO (Evolution of Markov Chains): Let $q(t)=(q_1(t),\dots,q_n(t))$ be the distribution of the particle at time $t$, i.e., $P(X_t = i) = q_i(t)$. Prove: $$ q(t+1) = q(t)T.$$ Infer from this that $q(t) = q(0) T^t$.
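A minimal illustration of 13.21 and 13.22, not part of the exercises (NumPy assumed; the 2-state transition matrix is ad hoc): the distribution evolves as $q(t+1)=q(t)T$, hence $q(t)=q(0)T^t$.

```python
import numpy as np

T = np.array([[0.9, 0.1],       # a 2-state stochastic matrix (rows sum to 1)
              [0.4, 0.6]])

q = np.array([1.0, 0.0])        # q(0): start in state 1 with probability 1
for t in range(5):
    q = q @ T                   # q(t+1) = q(t) T
print(q)                                                      # q(5), computed step by step
print(np.array([1.0, 0.0]) @ np.linalg.matrix_power(T, 5))    # q(0) T^5: the same vector
```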
13.23 Definition: A distribution $\pi=(\pi_1,\dots,\pi_n)$ is a stationary distribution for the MC with transition matrix $T$ if $\pi T = \pi$.
13.24 THEOREM: Every finite MC has a stationary distribution.
13.25 DO: Find a finite Markov Chain with more than one stationary distribution. Your example should have as few states as possible. Draw a diagram of your MC and state the associated transition matrix.
13.26 DO: Prove: if a MC has more than one stationary distribution then it has infinitely many.
13.27 Definition: Consider a MC defined by the $n\times n$ transition matrix $T=(p_{ij})$. The adjacency matrix $A_T=(a_{ij})$ of the transition digraph $G_T$ associated with this Markov Chain is defined by the rule $a_{ij}=1$ $\Leftrightarrow$ $p_{ij} > 0$: the edges of $G_T$ correspond to the possible transitions in the Markov Chain (transition with nonzero probability).
13.28 TERMINOLOGY. A MC is irreducible if its transition digraph is strongly connected; otherwise the MC is reducible.
13.29 THEOREM: If a MC is irreducible then its stationary distribution is unique.
13.30 DO: Find a reducible Markov Chain which has a unique stationary distribution.
13.31 DO: Prove that the product of two stochastic matrices is stochastic.
13.32 HW (due Tuesday, Nov 26): Let $T$ be a stochastic matrix and suppose the limit $L = \lim_{k\to\infty} T^k$ exists. Prove: each row of $L$ is a stationary distribution. (6 points)
13.33 HW (due Tuesday, Nov 26): Suppose $\pi=(\pi_1,\dots,\pi_n)$ is a stationary distribution for an irreducible Markov Chain. Prove: $(\forall i)(\pi_i > 0)$. (8 points)
13.34 DO: Let $T$ be a stochastic matrix. Prove: $T$ is irreducible if and only if for all sufficiently large $k$ the matrix $(T+I)^k$ is positive (i.e., all of its entries are positive). ($I$ denotes the identity matrix.)
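Not an assigned exercise: a small Python sketch of the criterion in 13.34, assuming NumPy. It works with the 0/1 pattern of $T+I$ (equivalent for this purpose, since all entries are non-negative), and it relies on the standard fact that the exponent $k=n-1$ already suffices once every vertex has a loop.

```python
import numpy as np

def is_irreducible(T):
    """Test strong connectivity of the transition digraph via the pattern of (T+I)^(n-1)."""
    T = np.array(T)
    n = len(T)
    A = (T > 0).astype(int) + np.eye(n, dtype=int)      # 0/1 pattern of T + I
    return bool(np.all(np.linalg.matrix_power(A, n - 1) > 0))

print(is_irreducible([[0.0, 1.0], [0.5, 0.5]]))   # True:  the two states reach each other
print(is_irreducible([[1.0, 0.0], [0.5, 0.5]]))   # False: state 1 never reaches state 2
```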
13.35 HW (due Tuesday, Nov 26) (convergence of a finite Markov Chain): LN 8.1.8. (Prove the convergence of the powers of a certain 2 × 2 stochastic matrix.) State the limit matrix. -- Your proof should be short. Elegance counts. (Hint: you may use an online matrix multiplication tool to guess the limit. Subtract the suspected limit from the powers of the transition matrix. Prove that the entries of the difference rapidly converge to zero.) (8 points)
13.36 DO: LN 8.1.17. ((a) Prove that the powers of a certain 2 × 2 stochastic matrix do not converge; (b) find a stationary distribution.)
13.37 DO: A matrix $A$ is doubly stochastic if both $A$ and its transpose are stochastic, i.e., $A$ is a nonnegative matrix whose row sums as well as column sums are all 1. Let $T$ be the transition matrix of a Markov Chain. Prove: the uniform distribution is stationary if and only if $T$ is doubly stochastic.
13.38 DO: Review the definition of matrix multiplication. Verify: if the columns of the matrix $B$ are $b_1,\dots,b_m$ and $A$ is a matrix then the columns of $AB$ are $Ab_1,\dots,Ab_m$. (When writing the matrix product $AB$, we assume that the number of columns of $A$ is the same as the number of rows of $B$ so the matrix product $AB$ is defined.)
13.39 DO: Verify: $(AB)^{\dagger}=B^{\dagger}A^{\dagger}$. (Here $A,B$ are matrices and $A^{\dagger}$ denotes the transpose of $A$.)
13.40 DO: Let $A=(\alpha_{i,j})$ be an $n\times n$ matrix. The trace of $A$ is the sum of the diagonal entries: $\trace(A)=\sum_{i=1}^n \alpha_{i,i}$. Prove: if $A$ is a $k\times n$ matrix and $B$ is an $n\times k$ matrix then $\trace(AB)=\trace(BA)$.
12.1 DO: Prove the Inclusion-Exclusion formula for probabilities using indicator variables. Prove and use the identity $$\prod_{i=1}^k (1-x_i) = \sum_{I\subseteq [k]} (-1)^{|I|}\prod_{i\in I} x_i.$$
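Not a proof, just a numerical sanity check of the identity in 12.1 (plain Python; the function name is ad hoc). Taking each $x_i$ to be the indicator variable of an event and then taking expectations of both sides turns this identity into the Inclusion-Exclusion formula.

```python
from itertools import combinations
import random

def expansion_matches(xs, tol=1e-9):
    """Compare prod_i (1 - x_i) with sum over subsets I of (-1)^|I| * prod_{i in I} x_i."""
    lhs = 1.0
    for x in xs:
        lhs *= (1 - x)
    rhs = 0.0
    for size in range(len(xs) + 1):
        for I in combinations(range(len(xs)), size):
            term = (-1) ** size
            for i in I:
                term *= xs[i]
            rhs += term
    return abs(lhs - rhs) < tol

print(expansion_matches([random.random() for _ in range(6)]))   # True
```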
12.2 HW:
12.3 HW: Count the $n$-digit integers of which all digits are odd and in which every odd digit actually occurs. For instance, for $n=6$, 797153 is such a number, and 797113 is not such a number. Your answer should be a closed-form expression. (6 points)
12.4 DO: Prove Euler's formula $n-m+r=2$ for trees.
12.5 DO: We proved in class that if a planar graph has $n\ge 3$ vertices and $m$ edges then $m\le 3n-6$. Review the proof.
12.6 HW: Prove: if a planar graph has $n\ge 3$ vertices and $m$ edges and no triangles then $m\le 2n-4$. (8 points)
12.7 HW (due Thursday, Nov. 21) Prove: every planar graph has a vertex of degree $\le 5$. (Your proof should be short, no longer than two lines.) (6 points)
12.8 HW (due Thursday, Nov. 21)("6-color Theorem") Prove: every planar graph is 6-colorable. Your proof should not use any results other than those proved in class or assigned as homework. Your proof should be short (only a few lines) and convincing. (8 points)
12.9 DO: An Eulerian trail in a graph is a closed walk that passes through every edge exactly once. Prove: a graph has an Eulerian trail if and only if it is connected and every vertex has even degree.
12.10 HW: Prove that Petersen's graph is not planar. Draw Petersen's graph and highlight (with a different color) a Kuratowski subgraph in it. (A Kuratowski subgraph is a topological $K_5$ or a topological $K_{3,3}$.) State which kind of Kuratowski subgraph you found. (5 points)
12.11 HW (due Thursday, Nov 21): Prove: the probability that a random graph is planar is extremely small: $o(2^{-0.49n^2})$. (9 points)
12.12 DO: (a) For all sufficiently large $n$, find a graph with $n$ vertices and $\le n+2$ edges that is not planar. What is your threshold of "sufficiently large"? (b) Prove: If $G$ is a connected graph with $n$ vertices and $\le n+2$ edges then $G$ is planar.
11.0 NOTATION: $\{a_n\}$ is a shorthand for the sequence $(a_1,a_2,a_3,\dots)$ (or $(a_0,a_1,a_2,\dots)$ if the sequence starts with $a_0$).
11.1 DO: Prove: under the new definition of asymptotic equality, a sequence $\{a_n\}$ is asymptotically equal to the all-zero sequence $(0,0,0,\dots)$ if and only if the $a_n$ are eventually zero, i.e., $a_n=0$ for all sufficiently large $n$, i.e., $(\exists n_0)(\forall n \ge n_0)(a_n = 0)$.
11.2 DO: Prove that under the new definition, asymptotic equality is an equivalence relation on the set of all sequences.
FROM NOW ON, WE USE THE "NEW" DEFINITION of asymptotic equality and the little-oh notation.
11.3 DO: Let $\{a_n\}$ and $\{b_n\}$ be sequences. Prove: the relations $a_n\sim b_n$ and $a_n = o(b_n)$ hold simultaneously if and only if both the $a_n$ and the $b_n$ are eventually zero.
11.4 DO: Prove: $a_n\sim b_n$ if and only if $a_n = b_n(1+o(1))$. The latter means that there exists a sequence $c_n$ such that $a_n = b_n(1+c_n)$ where $c_n = o(1)$, i.e., $c_n \to 0$.
11.5 DO: True or false? "If $a_n=o(1)$ and $b_n=o(1)$ then $a_n+b_n=o(1)$." Or more generally, "If $a_n=o(c_n)$ and $b_n=o(c_n)$ then $a_n+b_n=o(c_n)$."
11.6 DO: Consider the statement "If $a_n=o(b_n)$ and $a_n=o(c_n)$ then
$a_n = o(b_n+c_n)$."
(a) Prove that this statement is FALSE. (b) Prove that this statement is
TRUE under a simple and natural additional assumption.
11.7 DO: (a) Prove that the little-oh relation is transitive, i.e., if $a_n = o(b_n)$ and $b_n=o(c_n)$ then $a_n=o(c_n)$. (b) Prove that the big-Oh relation is transitive.
11.8 DO: (a) Prove: if $a_n = o(b_n)$ then $a_n = O(b_n)$. (b) Prove that the converse is false.
11.9 DO: Prove: if $a_n \sim b_n$ then $a_n = \Theta(b_n)$.
11.10 DO: Prove that the big-Omega relation $a_n=\Omega(b_n)$ is transitive.
11.11 DO: Prove that the big-Theta relation $a_n=\Theta(b_n)$ is an equivalence relation on the set of all sequences.
11.12 DO: Find two sequences $a_n$ and $b_n$ of positive numbers such that $a_n = \Theta(b_n)$ but the sequence $a_n/b_n$ has no limit.
11.13 DO: Study the chapter on "Asymptotic notation" in the online Lecture Notes.
11.14 HW: Suppose $a_n \to \infty$ and $b_n > 0$ and $b_n=\Theta(a_n)$. Prove: (a) $b_n\to\infty$ (b) $\ln a_n \sim \ln b_n$. (3+5 points)
11.15 HW (due Thu, Nov 15): Let $G$ be a graph. Suppose every vertex $x\in V(G)$ has degree $\le k$. Prove: $\chi(G)\le k+1$, i.e., $G$ is $k+1$-colorable. (6 points) Note: In class, this problem was stated as a "DO" exercise. It has been upgraded to "HW."
11.16 HW (due Tue, Nov 19): Let $G$ be a triangle-free graph, i.e., $G$ does not contain $K_3$ as a subgraph. Prove: $\chi(G) = O(\sqrt{n})$. More specifically, prove that $\chi(G) \le 2\sqrt{n}+1$. (10 points) Note: In class, this problem was given with a due date of Nov 15. The revised due date is Nov 19.
11.17 HW: Let $X_n$ denote the number of triangles in a random graph on $n$ vertices. (Adjacency is decided by flipping a fair coin $\binom{n}{2}$ times.) (a) State the size of the sample space in this experiment. (b) Compute $E(X_n)$. (Your answer should be a very simple expression.) Prove your answer. (c) Compute $\var(X_n)$. (Your answer should be a well-organized, closed-form expression. Sloppily written formulas will not be graded since checking them may require unreasonable effort on the side of the grader.) (d) Give an asymptotic formula for $\var(X_n)$ in the form $\var(X_n)\sim an^b$. Evaluate the constants $a$ and $b$. (e) DO (do not hand in): Interpret and prove the Weak Law of Large Numbers for $X_n$. (2+2+8+4 points)
11.18 DO (Erdös: Cliques in random graphs):
In class we proved the following theorem:
Almost all graphs have no clique of size $\ge 1+2\log_2 n$.
Review the proof.
11.19 DO: (a) Prove: for $1\le k\le n$ we have $\binom{n}{k}\ge (n/k)^k$. (b) Prove: for $0\le k\le n$ we have $\binom{n}{k}\le n^k/k!$. (We used this inequality in the proof of Erdös's theorem above.) (c) Prove: for $1\le k\le n$ we have $\binom{n}{k}\le (\eee n/k)^k$. -- Do not use Stirling's formula. Hint: use the exercise that $n! \ge (n/\eee)^n$.
11.20 CHALLENGE: Prove: for $1\le k\le n$ we have $\sum_{i=0}^k \binom{n}{i} \le (\eee n/k)^k$. Your proof should be short and elegant.
11.21 HW: Prove: almost all graphs have chromatic number $\Omega(n/\ln n)$. The meaning of the big-Omega notation in the context of an "almost all" statement is that "There exists a constant $c>0$ such that almost all graphs have chromatic number $\ge cn/\ln n$." So the full statement is the following:
There exists a constant $c>0$ such that if $p_n$ denotes the probability that a random graph on $n$ vertices has chromatic number $\ge c n/\ln n$ then $\lim_{n\to\infty} p_n = 1$.
Hint: use HW 10.15 (also due Nov 12) that says $\alpha(G)\chi(G)\ge n$. Your proof should be very short -- only a few lines (based on a result proved in class). (8 points)
11.22 DO: Let $a_n > 1$. Suppose $a_{n+1} = O(a_n)$. Prove: $\ln a_n = O(n)$.
11.23 DEFINITION: We say that the sequence $\{a_n\}$ is polynomially bounded if $|a_n| \le n^{O(1)}$, i.e., $(\exists C)(\exists n_0)(\forall n\ge n_0)(|a_n| \le n^C)$.
11.24 DO: Suppose $a_n \ge 1$. Prove: the sequence $\{a_n\}$ is polynomially bounded if and only if $\ln a_n = O(\ln n)$.
11.25 DO: Prove: $(\ln n)^{100} = o(\sqrt{n})$. Do not use L'Hôpital's rule. Use the fact that $\ln x = o(x)$; beyond this, use only substitution of variables.
11.26 DO (When L'Hôpital does not apply): Give an example of a pair $f$, $g$ of differentiable functions such that $\lim_{x\to\infty} f(x) = \lim_{x\to\infty} g(x) = \infty$ and $f\sim g$ (i.e., $\lim_{x\to\infty} f(x)/g(x) = 1$) but $\lim_{x\to\infty} f'(x)/g'(x)$ does not exist.
10.1 DO: Prove: a graph is bipartite if and only if it has no odd cycles.
10.2 HW: Prove: If a bipartite graph has $n$ vertices and $m$ edges then $m\le n^2/4$. (4 points) (Note: this problem was assigned as "HW" in class but mistakenly posted as a "DO" exercise. If you were confused by the posting and did not hand in your solution for this reason, please send email to the instructor asap.)
10.3 CHALLENGE (Mantel - Turán Theorem) Prove: If a graph has $n$ vertices, $m$ edges, and no triangles, then $m\le n^2/4$. (A "triangle" is a cycle of length 3.)
10.4 CHALLENGE (Köváry - Turán - Sós Theorem) Prove: If a graph has $n$ vertices, $m$ edges, and no cycles of length 4, then $m = O(n^{3/2})$.
10.5 HW: Find all pairs $(k,\ell)$ of positive integers so that the $k\times\ell$ grid is Hamiltonian (has a Hamilton cycle). Prove your answer. Your proof for the non-Hamiltonicity case should be just one line (with reference to a DO exercise above). (8 points)
10.6 HW: Recall that a graph is self-complementary if it is isomorphic to its complement. Prove: if there exists a self-complementary graph on $n$ vertices then $n\equiv 0$ or $1\pmod 4$. (5 points)
10.7 REWARD PROBLEM (no credit, do not hand in: your reward is in the solution). Prove the converse of the previous statement: if $n\equiv 0$ or $1\pmod 4$ then there exists a self-complementary graph on $n$ vertices.
10.8 CHALLENGE. For a positive integer $x$ and a graph $G$, let $f_G(x)$ denote the number of legal colorings of $G$ where the colors are taken from the set $[x]=\{1,2,\dots,x\}$. For instance, for the empty graph ${\bar K}_n$ we have $f_{{\bar K}_n}(x) = x^n$ (all the $x^n$ colorings are legal). Prove: for every graph $G$, the function $f_G(x)$ is a polynomial of degree $n$. This is called the chromatic polynomial of $G$.
10.9 HW (due Tue, Nov 12): Give simple formulas for the chromatic polynomials of the following graphs: (a) $K_n$ (b) $P_n$ (the path of length $n-1$; the subscript $n$ refers to the number of vertices). (2+3 points)
10.10 DO: Give a simple formula for the chromatic polynomial of $C_n$ (the cycle of length $n$).
$\newcommand{\grid}{\mathrm{Grid}}$ 10.11 HW (due Tue, Nov 12): Let $\grid(k,\ell)$ denote the $k\times \ell$ grid graph. (It has $n=k\ell$ vertices.) Determine $\alpha(\grid(k,\ell))$ (the independence number of $\grid(k,\ell)$, i.e., the maximum size of an independent set in this graph). As always, prove your answer. Hint: First, solve the problem for the case when $n$ is even. (9 points) (Note: it seems there was a confusion about the HW status of this problem. Originally posted with the due date of Nov 7, the problem is now due Nov 12.)
10.12 DO: Find a triangle-free graph (a graph with no triangles) that is not 3-colorable (i.e., its chromatic number is at least 4). Let your graph have 11 vertices and a 5-fold symmetry. (Grötzsch's graph)
10.13 CHALLENGE. For every $k$, find a triangle-free graph that is not $k$-colorable.
10.14 DO: Let $G$ be a graph and for $x,y\in V(G)$, let $d(x,y)$ denote the distance of $x$ and $y$, i.e., the length of the shortest path between $x$ and $y$. Prove that the function $d$ is a metric on $V(G)$. (You need to prove that $d$ satisfies the triangle inequality.)
10.15 HW (due Tue, Nov 12): Prove: $\chi(G)\alpha(G)\ge n$. (Recall: $\chi(G)$ denotes the chromatic number and $\alpha(G)$ denotes the independence number of the graph $G$.) (5 points)
10.16 DO: We proved in class that almost all graphs have diameter 2. Review the proof. The last step of the proof was the statement that $\lim_{n\to\infty} \binom{n}{2}(3/4)^{n-2} = 0$. Prove this statement.
10.17 DO: Prove: if $G$ is a graph with $m$ edges then $G$ has a bipartite subgraph with $\ge m/2$ edges. Hint: Color the vertices red and blue at random. What is the expected number of illegal (red-red or blue-blue) edges?
10.18 DO: Prove: if every vertex of $G$ has degree $\le k$ then $\chi(G)\le k+1$.
10.19 DO: Prove: if $G$ is triangle-free then $\chi(G) = O(\sqrt{n})$.
10.20 DO: What is the minimum number of edges an $n$-vertex graph of diameter 2 can have?
9.1 DO: Let $x,y\in V(G)$ be two vertices of the graph $G$. Prove: if there is a walk from $x$ to $y$ then there is a path between $x$ and $y$.
9.2 HW: Count the paths of length $k$ in the complete graph $K_n$. Remember that we view paths as subgraphs, so a path of length $\ge 1$ corresponds to two walks (one each way). So don't count them twice. Remember also that a path of length $k$ has $k$ edges (makes $k$ steps) and therefore it has $k+1$ vertices. (5 points)
9.3 HW: Let $g_n$ denote the number of non-isomorphic graphs
with $n$ vertices. Let
$\displaystyle{N = 2^{\binom{n}{2}}} .$
(a) Prove: $N/n! \le g_n \le N$.
(b) Prove: $\log_2 g_n \sim n^2/2$.
(6+5 points)
9.5 DO: Prove: a graph is connected if and only if it has a spanning tree. As a consequence, a connected graph with $n$ vertices has at least $n-1$ edges.
9.4 DO: Prove that for a graph $G$ with $n$ vertices,
the following conditions are equivalent:
(a) $G$ is a tree.
(b) $G$ is connected and has $n-1$ edges.
(c) $G$ has no cycles and has $n-1$ edges.
(d) There is a unique path between every pair
of vertices of $G$.
9.5 DO: Prove: if $G$ has $c$ connected components then $m\ge n-c$ ($n$ is the number of vertices and $m$ is the number of edges). Moreover, $m=n-c$ exactly if $G$ is a forest (has no cycles).
9.6 HW: Count the graphs with $m$ edges on a given set of $n$ vertices. Your answer should be a simple expression involving binomial coefficients. (4 points)
9.7 DO: Prove: every tree $T$ has a vertex $x$ such that all longest paths in $T$ pass through $x$.