**The notation "MN" refers to the Matoušek--Nešetříl
text.**

**The notation "LIN" refers to the instructor's
Linear Algebra notes,** currently in progress.

"DO" problems are strongly recommended to check your understanding of the concepts; do not hand them in.

Problems labeled "HW" must be handed in at the beginning of the next class unless expressly stated otherwise.

BONUS PROBLEMS are not mandatory, but the same deadline applies as for ordinary homework.

**16.1 DO:** Study Inclusion-Exclusion from MN, including the
chapter "Hatcheck Lady & Co."

16.2 DO (Inclusion-Exclusion, probability version): Let
*A*_{1},...,*A*_{n} be events
and let *B* be the complement of
*A*_{1} ∪ ... ∪ *A*_{n}.
Prove: *P(B) = S_{0} - S_{1} + S_{2} - ... + (-1)^{n}S_{n}*,
where *S*_{j} denotes the sum of the probabilities
*P*(*A*_{i_{1}} ∩ ... ∩ *A*_{i_{j}})
over all *j*-element subsets
{*i*_{1},...,*i*_{j}} of {1,...,*n*}
(so *S*_{0} = 1).

**16.3 HW:** (Truncated inclusion-exclusion: Bonferroni's
inequalities): With the notation of the previous exercise,
let R_{i} denote the truncation of the right hand side
of the Inclusion-Exclusion formula at item *i*, i.e.,

R_{0} = S_{0} ;

R_{1} = S_{0} - S_{1} ;

R_{2} = S_{0} - S_{1} + S_{2} ; etc.

Prove: if *i* is even then P(B) ≤ R_{i}; and if
*i* is odd then P(B) ≥ R_{i}. **(12 points)**
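The identity in 16.2 and the alternating Bonferroni bounds in 16.3 can be spot-checked numerically. Below is a self-contained sketch (not part of the assignment; the uniform sample space and the three events are made up for illustration):

```python
# Hypothetical check of 16.2/16.3: uniform sample space {0,...,59},
# A_i = multiples of the primes 2, 3, 5.
from itertools import combinations
from fractions import Fraction

omega = range(60)
events = [set(x for x in omega if x % p == 0) for p in (2, 3, 5)]
n = len(events)

def prob(s):
    return Fraction(len(s), len(omega))

# S_j = sum of P(A_{i1} ∩ ... ∩ A_{ij}) over all j-subsets; S_0 = 1
S = []
for j in range(n + 1):
    total = Fraction(0)
    for idx in combinations(range(n), j):
        inter = set(omega)
        for i in idx:
            inter &= events[i]
        total += prob(inter)
    S.append(total)

# B = complement of the union of the events
B = set(omega) - set().union(*events)
pB = prob(B)

# Inclusion-Exclusion: P(B) = S_0 - S_1 + S_2 - S_3
assert pB == S[0] - S[1] + S[2] - S[3]

# Bonferroni: the truncations R_i alternate around P(B)
R = [sum((-1)**j * S[j] for j in range(i + 1)) for i in range(n + 1)]
for i, r in enumerate(R):
    assert (pB <= r) if i % 2 == 0 else (pB >= r)
```

Here *P(B)* counts the elements of {0,...,59} coprime to 30, so exact rational arithmetic makes the check rigorous for this one example.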

16.4 NOTATION: Let *V* be a vector space over the field *F* and
let
__e__=(*e*_{1} , ... , *e*_{k})
be a basis of *V*. Then every vector *x ∈ V*
can be expressed uniquely as a linear combination
*x = ∑ α _{i} e_{i}*.
The coefficients α_{i} are called the *coordinates*
of *x* with respect to the basis __e__; we write
[*x*]_{e} for the column vector
(α_{1} , ... , α_{k})^{T} of these coordinates.

16.5 NOTATION: Let *V* and *W* be vector spaces over the same field
*F*. Recall that a *linear map* *φ : V → W*
is a function that satisfies *φ(x+y) = φ(x) + φ(y)*
and *φ(αx) = αφ(x).* Let now
__e__ = (*e*_{1} , ... , *e*_{k})
be a basis in *V* and
__f__ = (*f*_{1} , ... , *f*_{m})
be a basis in *W*. We associate with *φ*
an *m × k* matrix *A* which we denote
by *A = *[*φ*]_{e,f} ;
the *j*-th column of *A* is
[*φ*(*e*_{j})]_{f} , the coordinate vector of
*φ*(*e*_{j}) with respect to the basis __f__.

If

**16.6 HW** (a) Let *V* be the real plane and
*ρ _{α}* the rotation of

16.7 DO **(prescribing a linear map on a basis)**:
Let *V, W* be vector spaces over the field *F*.
Let __e__=(*e*_{1} , ... , *e*_{k})
be a basis of *V*. Prove:
(∀ *w*_{1} ,..., *w*_{k} ∈ *W*)
(∃! a linear map *φ : V → W*)(∀ *i*
∈ [*k*])
(*φ*(*e*_{i}) = *w*_{i}).
(The symbol "(∃! φ)" reads "there exists a unique φ
such that ...")

(b) Prove that the correspondence
*φ* ↦ [*φ*]_{e,f}
described in 16.5 is a bijection between the
*V → W* linear maps and the *m × k*
matrices over *F*.

**16.8 HW (Fibonacci shift):** (a) Consider the space
*V* consisting of all infinite sequences
* a = (a_{0}, a_{1}, ...)*
of real numbers. Let

16.9 DO: Let φ : *V → W* be a linear map. Let
__e__=(*e*_{1} , ... , *e*_{k})
be a basis in *V* and
__f__=(*f*_{1} , ... , *f*_{m})
be a basis in *W*. Prove: for all *v ∈ V*
we have [φ(*v*)]_{f} =
[φ]_{e,f}[v]_{e}.

16.10 DO: Let *A,B* be *m × k* matrices. Prove:
(a) The conditions *x* ∈ *F*^{k}, *x ≠ 0*,
and *Ax = Bx* do not imply *A = B*.
(b) If (∀ *x* ∈ *F*^{k} )(*Ax = Bx*)
then *A = B*.

16.11 DO **(change of basis: change of coordinates)**:
Let __e__=(*e*_{1} , ... , *e*_{k})
and __e'__=(*e*_{1}' , ... , *e*_{k}')
be two bases of *V* (the "old" basis and the "new" basis).
We define the *basis change transformation*
σ : *V → V* by setting σ(*e*_{i}) =
*e*_{i}'. Let *S* = [σ]_{e}
(the "basis change matrix"). (a) Prove that *S* is invertible.
(b) Prove: (∀ *v ∈ V*)([*v*]_{e'} =
*S*^{-1}[*v*]_{e}).

16.12 DO **(change of bases: change of matrix)**: Let
φ : *V → W*. Let __e__, __e'__ be two bases
of *V* and __f__, __f'__ be two bases of *W*.
Let *S* be the basis change matrix corresponding to the
basis change __e__ ↦ __e'__ in *V* and let *T*
be the basis change matrix corresponding to the basis change
__f__ ↦ __f'__ in *W*. Let *A* be the matrix
of φ in the "old bases"
(i.e., *A* = [φ]_{e,f}) and
*A'* the matrix of φ in the "new bases"
(i.e., *A'* = [φ]_{e',f'}). Prove:
*A' = T*^{-1}*AS*. *Hint*.
Use Exx. 16.10 and 16.11. Start by writing φ(*v*) = *w*.
Translating this to matrix form you get matrix equations of the form
* Ax = y* (in the old bases) and *A'x' = y'* (in the new bases)
(explain the meaning of *x, x', y, y'*). Now
*x' = S*^{-1}*x* and *y' = T*^{-1}*y*
(by Ex. 16.11).
Therefore
*A'S*^{ -1}*x* = *T*^{-1}*y*,
so *TA'S*^{-1}*x* = *y = Ax*. Now use
Ex. 16.10.
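The formula *A' = T*^{-1}*AS* can also be verified concretely. In the sketch below (all data invented for illustration), the old bases are the standard bases, so *S* and *T* simply have the new basis vectors as columns; the result of the formula is compared against the matrix of φ computed column by column in the new bases:

```python
# Hypothetical 2x2 check of A' = T^{-1} A S (map and bases are made up)
from fractions import Fraction as Fr

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def inv2(M):                       # inverse of a 2x2 matrix
    (a, b), (c, d) = M
    det = Fr(1, a * d - b * c)
    return [[d * det, -b * det], [-c * det, a * det]]

def coords(basis, v):              # solve [b1 b2] c = v by Cramer's rule
    (a, c), (b, d) = basis
    det = a * d - b * c
    x, y = v
    return (Fr(x * d - b * y, det), Fr(a * y - x * c, det))

def phi(v):                        # sample linear map phi(x, y) = (x + y, 2y)
    x, y = v
    return (x + y, 2 * y)

eprime = ((1, 1), (0, 1))          # new basis of V
fprime = ((2, 1), (1, 1))          # new basis of W

A = [[1, 1], [0, 2]]               # [phi]_{e,f} in the old (standard) bases
S = [[1, 0], [1, 1]]               # basis change matrix: columns are e'_j
T = [[2, 1], [1, 1]]               # basis change matrix: columns are f'_j

Aprime = matmul(inv2(T), matmul(A, S))          # the claimed formula

# independent computation: column j of [phi]_{e',f'} is [phi(e'_j)]_{f'}
direct = [coords(fprime, phi(ej)) for ej in eprime]
direct_matrix = [[direct[j][i] for j in range(2)] for i in range(2)]
assert Aprime == direct_matrix
```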

16.14 DO **(similar matrices)**: Recall: the *n* × *n*
matrices *A* and *B* are *similar* if
(∃ nonsingular matrix *S*)(*B = S*^{-1}*AS*).
Let us fix a basis __e__ in the *n*-dimensional space *V*.
Prove: *A* and *B* are similar if and only if there exists
a basis __e'__ of *V* and a transformation
φ : *V → V* such that
[φ]_{e} = *A* and
[φ]_{e'} = *B*.

16.15 DO **(multiplicativity of determinants)**: Prove:
det(*AB*) = det(*A*)det(*B*). *Hint.*
(a) Prove the result when *A* is a diagonal matrix.
(b) Prove the result when one of the rows of *A* is all zero.
(c) Perform elementary row operations on *A* and the same
row operations on *AB* simultaneously until we reach either
situation (a) or situation (b).
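Before attempting the proof, the identity can be sanity-checked numerically. The sketch below (matrices chosen arbitrarily) computes determinants directly from the permutation expansion, so it is independent of the row-operation argument in the hint:

```python
# numeric check of det(AB) = det(A) det(B) on a 3x3 integer example
from itertools import permutations

def det(M):
    # permutation expansion; sign = parity, found by counting inversions
    n = len(M)
    total = 0
    for perm in permutations(range(n)):
        sign = 1
        for i in range(n):
            for j in range(i + 1, n):
                if perm[i] > perm[j]:
                    sign = -sign
        prod = 1
        for i in range(n):
            prod *= M[i][perm[i]]
        total += sign * prod
    return total

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[2, 0, 1], [1, 3, 1], [0, 1, 4]]
B = [[1, 2, 0], [0, 1, 1], [2, 0, 1]]
assert det(matmul(A, B)) == det(A) * det(B)
```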

15.1 DO: Prove: if the vector spaces *F ^{n}* and

15.2 DO: Let *A* be a *k × n* matrix and
*B* be an *n × k* matrix over the field *F*.
Prove: Tr(*AB*) = Tr(*BA*). Here "Tr" denotes the
*trace* of a square matrix (sum of the diagonal elements).
What are the dimensions of *AB* and of *BA*?
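Note that *AB* is *k × k* while *BA* is *n × n*, so the two traces are sums of different lengths, yet they agree. A quick spot check (matrices chosen arbitrarily):

```python
# spot check of Tr(AB) = Tr(BA) with a 2x3 / 3x2 pair
A = [[1, 2, 3], [4, 5, 6]]            # 2 x 3
B = [[1, 0], [2, 1], [0, 3]]          # 3 x 2

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def tr(M):
    return sum(M[i][i] for i in range(len(M)))

AB, BA = matmul(A, B), matmul(B, A)   # AB is 2x2, BA is 3x3
assert tr(AB) == tr(BA)
```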

15.3 DO: Recall that the matrices *A, B ∈ M _{n}(F)*
are

15.4 DO: Let *A ∈ M _{n}(F)* and let

(a)

**15.5 HW:** Let *λ _{1}, ... , λ_{n}*
be the eigenvalues of

15.6 DO: Prove: for every eigenvalue λ of *A*,
the geometric multiplicity of λ
is not greater than its algebraic multiplicity.

15.7 DO: Let *A ∈ M _{n}(F)* be an upper triangular matrix
(all elements under the diagonal are zero). Prove that the eigenvalues
of *A* are exactly its diagonal entries.

15.8 DO: A *diagonal matrix* is a square matrix with zeros everywhere
outside the diagonal (diagonal entries may or may not be zero).
For instance the zero matrix and the identity matrix are diagonal
matrices. A matrix *A ∈ M _{n}(F)* is

**15.9 HW:** Diagonalize *J _{n}* ,
the

15.10 DO: Prove: the matrix *A ∈ M _{n}(F)*
is diagonalizable if and only if the sum of the geometric
multiplicities of its eigenvalues is

**15.11 HW:** Show a 2 × 2 matrix over *C*
that is not diagonalizable. Prove.
State the eigenvalues of your matrix
along with their algebraic and geometric multiplicities.
Prove. (Hint: make your matrix triangular. This will lead
to very short proofs.) **(10 points)**
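One possible triangular choice (an illustration of the hint, not "the" answer) is *A* = [[2, 1], [0, 2]]: its characteristic polynomial is (*x* - 2)^{2}, so the eigenvalue 2 has algebraic multiplicity 2, while (*A* - 2*I*)*v* = 0 forces the second coordinate of *v* to vanish, so the eigenspace is spanned by (1, 0) alone. The sketch below confirms the geometric multiplicity by brute force over a small integer grid:

```python
# candidate for 15.11 (one possible choice): A = [[2, 1], [0, 2]]
A = [[2, 1], [0, 2]]
N = [[A[i][j] - (2 if i == j else 0) for j in range(2)]
     for i in range(2)]                                 # N = A - 2I

# eigenvectors for eigenvalue 2: nonzero v with N v = 0,
# searched over a small integer grid
eigvecs = [(x, y) for x in range(-3, 4) for y in range(-3, 4)
           if (x, y) != (0, 0)
           and all(sum(N[i][j] * (x, y)[j] for j in range(2)) == 0
                   for i in range(2))]

# every eigenvector found is a multiple of (1, 0):
# geometric multiplicity 1 < algebraic multiplicity 2, so no eigenbasis
assert all(y == 0 for (x, y) in eigvecs)
```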

**14.1 HW:** Let *J _{n}* denote the

**14.2 HW:** Let *a _{1},...,a_{n}* be
distinct elements of the field

**14.3 HW:** (a) Study Gaussian elimination. (b)
Let *B = (b _{ij})* denote the

**14.4 HW:** A (0,1)-matrix is a matrix whose entries are zeros
and ones. Construct a 3 × 3 (0,1)-matrix whose columns are
linearly independent over **R** but linearly dependent over
**F _{2}**. Prove your answers.
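A useful observation here is that the determinant of a (0,1)-matrix over **F**_{2} is just its integer determinant mod 2, so one candidate (a hedged illustration, not the only answer) is a matrix with determinant 2: nonzero over **R**, zero over **F**_{2}.

```python
# one candidate for 14.4: det = 2, so columns are independent over R
# but dependent over F_2 (in fact the three columns sum to 0 mod 2)
from itertools import permutations

M = [[1, 1, 0],
     [0, 1, 1],
     [1, 0, 1]]

def det(A):
    # permutation expansion of a 3x3 determinant
    total = 0
    for p in permutations(range(3)):
        sign = 1
        for i in range(3):
            for j in range(i + 1, 3):
                if p[i] > p[j]:
                    sign = -sign
        prod = 1
        for i in range(3):
            prod *= A[i][p[i]]
        total += sign * prod
    return total

assert det(M) == 2                   # nonzero: columns independent over R
assert det(M) % 2 == 0               # zero in F_2: columns dependent
assert all(sum(row) % 2 == 0 for row in M)   # col1 + col2 + col3 = 0 mod 2
```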

14.5 DO (due Tuesday, Dec 1, before the tutorial): (a) Let *A* be
an *n × n* matrix over the field *F*. Let
*v _{1}, ... ,v_{k}* be right eigenvectors
of

14.6 DO (due Tuesday, Dec 1, before the tutorial): Prove the second miracle: for every matrix (any shape, over any field), the row-rank and the column-rank are equal. (Hint: Gaussian elimination.)

14.7 DO (due Tuesday, Dec 1, before the tutorial): Let
*A* and *B* be *k × n* matrices
over the field *F*. Prove:
rank(* A + B*) ≤ rank(* A*) + rank(* B*).
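The inequality can be spot-checked over **Q** with a small rank routine based on Gaussian elimination (the matrices below are chosen arbitrarily):

```python
# numeric spot check of rank(A + B) <= rank(A) + rank(B) over Q
from fractions import Fraction as Fr

def rank(M):
    # Gaussian elimination with exact rational arithmetic
    M = [[Fr(x) for x in row] for row in M]
    rows, cols, r = len(M), len(M[0]), 0
    for c in range(cols):
        piv = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(rows):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

A = [[1, 2, 3], [2, 4, 6]]      # rank 1 (second row = 2 * first row)
B = [[0, 0, 1], [1, 0, 0]]      # rank 2
AplusB = [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]
assert rank(AplusB) <= rank(A) + rank(B)
```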

14.8 DO (due Tuesday, Dec 1, before the tutorial):
Let *F* be a field. Prove: (a) if *F* has characteristic zero
then *F* contains a subfield isomorphic to **Q**;
(b) if *F* has characteristic *p*
then *F* contains a subfield isomorphic to
**F _{p}**, the field of residue classes mod *p*.
14.9 CHALLENGE PROBLEM.
Let *a _{0},...,a_{n-1} ∈ *

14.10 **(irreducibility over Q)**. (a) REWARD PROBLEM:
Use the Gauss Lemma (about primitive
polynomials) to prove that if an integral polynomial can be
factored nontrivially over the rationals (i.e., into factors of
lower degrees) then it can be factored nontrivially over the integers.
(b) CHALLENGE PROBLEM: Let

**13.1 HW:** Let *A* be an *n × n* complex matrix
and let *G = ([n], E)* be the associated digraph (* i → j*
exactly if *a*_{ij} ≠ 0).
Suppose *G* is strongly connected and has period *d*. Prove:
if λ is an eigenvalue of *A* and ω is a complex
*d*-th root of unity (i.e., ω^{d} = 1) then
λω is also an eigenvalue. **(8 points)**

**13.2 HW:** Prove: if *R* is a finite integral domain then
*R* is a field. Recall that an integral domain is a commutative
ring with an identity element that is distinct from zero (so *R*
has at least two elements) and with no zero-divisors, i.e., if
*ab = 0* then *a = 0* or *b = 0*. The only thing
you need to verify is that under the conditions stated, every
nonzero element of *R* has a (multiplicative) inverse.
**(8 points)**

**12.1 HW (a Ramsey bound):** (a) Define the Erdős--Rado arrow
symbol *n → (r,s,t)*. (b) Prove:
17 → (3,3,3). **(3+8 points)**

(c) CHALLENGE PROBLEM: Prove that (b) is tight, i.e.,
*16 ↛ (3,3,3)* (16 does not arrow (3,3,3)).
Hint: GF(16).

**12.2 HW (convergence of a finite Markov Chain):** LN 8.1.8.
(Prove the convergence of the powers of
a certain 2 × 2 stochastic matrix.) State the limit
matrix. -- Your proof should be short. Elegance counts.
**(8 points)**
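The specific matrix of LN 8.1.8 is not reproduced on this page; as a generic illustration of the phenomenon, the powers of the (made-up) 2 × 2 stochastic matrix below converge to a rank-one matrix whose rows are the stationary distribution:

```python
# illustration only: powers of a stochastic matrix converging to
# the rank-one matrix with the stationary distribution in each row
P = [[0.9, 0.1],
     [0.3, 0.7]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

Pk = P
for _ in range(200):
    Pk = matmul(Pk, P)

# stationary distribution of THIS example: pi = (0.75, 0.25),
# since pi P = pi; the second eigenvalue is 0.6, so convergence is fast
for row in Pk:
    assert abs(row[0] - 0.75) < 1e-9 and abs(row[1] - 0.25) < 1e-9
```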

**12.3 HW:** LN 8.1.17. (Prove that the powers of certain
2 × 2 stochastic matrix do not converge; find
a stationary distribution.)
**(2 + 4 points)**

**12.4 HW:** LN 8.1.24 (if a nonnegative matrix with strongly
connected associated digraph has a nonnegative eigenvector
then that eigenvector is positive). (A matrix is "nonnegative"
if all entries are nonnegative. A matrix is "positive" if
all entries are positive. The same definitions apply to
vectors [which are 1 × *n* matrices].)
**(10 points)**

**12.5 HW (due Monday, Nov 23):** Solve problem 2(e) of the Third Quiz.
(Go to the Course home page, then click "Grading, tests" on the
banner, then click "third quiz.") The problem asks you to prove
a Chernoff-style inequality for the number of occurrences of
the substring ABA in a random string over the alphabet {A,B}.
State explicitly the values of the constants *c,d*. --
Your solution should be short and elegant. If you cannot
describe the essence of the solution in three lines, you are
on the wrong track.
**(12 points)**

**12.6 HW (due Monday, Nov 23):** Prove: for almost all graphs
*G* we have χ(*G*) > (α(*G*))^{100}.
(χ denotes the chromatic number, α the independence
number. -- The entire solution should take no more than 6 lines.)
**(8 points)**

**12.7 HW (due Monday, Nov 23):** (a) A graph is *planar* if
it can be drawn in the plane so that the edges don't cross.
Study planar graphs from MN (MN2 Chapter 6; MN1, Chapter 5)
(MNi: i-th edition of MN). Learn Kuratowski's Theorem and
Euler's formula. (b) Understand MN2, Proposition 6.3.3
(MN1, Prop. 5.3.3) - that a planar graph with *n ≥ 3* vertices
has *m ≤ 3n-6* edges. (c) Use (b) to prove that
every planar graph has a vertex of degree ≤ 5.
(d) Use (c) to prove the
"six-color theorem:" every planar graph is 6-colorable.
Your proof should be a very short induction.
**(0 + 0 + 3 + 8 points)**

12.8 DO (due Monday, Nov 23): Prove that for all sufficiently large *n*,
the probability that a random graph is planar
is less than 2^{-0.49n^{2}}.

12.9 CHALLENGE PROBLEM. Let *S*_{3}(*n*) denote
the number of those subsets of an *n*-set whose size
is divisible by 3. Prove:
|*S*_{3}(*n*) - 2^{n}/3| < 1.
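The claimed bound is easy to verify empirically for small *n* before hunting for a proof (which does not, of course, replace one):

```python
# empirical check of 12.9 for small n (math.comb needs Python 3.8+)
from math import comb

for n in range(1, 40):
    # S_3(n) = number of subsets of an n-set whose size is divisible by 3
    s3 = sum(comb(n, k) for k in range(0, n + 1, 3))
    assert abs(s3 - 2**n / 3) < 1, n
```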

**11.1 HW:** LN 6.2.27 (in a digraph, vertices
in the same strong component have the same period)
**(8 points)**

**11.2 HW:** Prove: if *C* is a strong component of a
digraph *G* then the period of any vertex in
*C* is equal to the gcd of the lengths of all
cycles contained in *C*. (Note: a cycle has
no repeated vertices.)
**(8 points)**

Note: 11.1 follows from 11.2. A *full*
solution to 11.2 also earns you full credit for 11.1
(but no extra credit, if you had a separate solution
to 11.1). A partial solution to 11.2 may earn you
zero credit for 11.1. On the other hand, it is
possible to solve 11.2 *using* 11.1. Naturally,
such a solution will earn you full credit for 11.2
but no credit for 11.1 (unless you separately solve 11.1).

11.3 DO: A digraph *G=(V,E)* is *Eulerian* if
(∀ *x ∈ V*)(
deg^{+} (*x*) = deg^{-} (*x*))
(the out-degree of *x* is the same as the in-degree
of *x*). A digraph is *weakly connected*
if by ignoring orientation we get a connected graph.

Prove: if *G* is an Eulerian digraph and
*G* is weakly connected then *G* is
strongly connected.
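The statement can be illustrated on a small example (the digraph below is made up): every vertex has equal in- and out-degree, and a search along the edges and along the reversed edges from vertex 0 reaches everything, confirming strong connectivity in this instance.

```python
# illustration of 11.3 on a small made-up Eulerian digraph
E = [(0, 1), (1, 2), (2, 0), (1, 3), (3, 1)]
V = {v for edge in E for v in edge}

outdeg = {v: sum(1 for a, b in E if a == v) for v in V}
indeg  = {v: sum(1 for a, b in E if b == v) for v in V}
assert outdeg == indeg                            # the Eulerian condition

def reachable(start, edges):
    # iterative depth-first search over an edge list
    seen, stack = {start}, [start]
    while stack:
        v = stack.pop()
        for a, b in edges:
            if a == v and b not in seen:
                seen.add(b)
                stack.append(b)
    return seen

forward  = reachable(0, E)
backward = reachable(0, [(b, a) for a, b in E])   # reversed edges
assert forward == backward == V                   # strongly connected
```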

Here is the link to homework sets 1-10.