Homework due in writing is marked "HW." Do not hand in solutions to "DO," "Reward," "Puzzle" problems. The "DO" exercises are part of the regular course material, do solve them. The solution to a "Reward" or "Puzzle" problem is its own reward; these problems are meant to be challenging; "Puzzle" problems tend to have an "AH-HA" solution (which may not be easy to find). If you have solved or plan to solve a Challenge, Reward, or a Puzzle problem, let the instructor know (by email). If you hand in a solution to a Challenge Problem, put it on a separate sheet (do not mix it with regular homework).
07-14.1 DO: Let $G$ be a graph of maximum degree $\deg_{\max}$.
Prove: $\chi(G)\le 1+\deg_{\max}$. ($\chi$ denotes the
chromatic number.) (Hint: greedy coloring.)
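The greedy bound of 07-14.1 is easy to check by machine. A minimal Python sketch (the helper `greedy_coloring` is an illustrative name, not part of the course materials), run here on the 5-cycle:

```python
def greedy_coloring(adj, order):
    """Color vertices in the given order; each vertex gets the smallest
    color not already used on one of its neighbors."""
    color = {}
    for v in order:
        used = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return color

# the 5-cycle C_5: maximum degree 2, so greedy never needs more than 3 colors
n = 5
adj = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}
coloring = greedy_coloring(adj, list(range(n)))
num_colors = len(set(coloring.values()))
max_deg = max(len(s) for s in adj.values())
```

On $C_5$ greedy uses 3 colors, matching the bound $1+\deg_{\max}=3$ (and here $\chi(C_5)=3$, so the bound is attained).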
07-14.2 DO: Construct a triangle-free graph with chromatic number
$\ge 4$. (Hint: 11 vertices, symmetry of order 5: Grötzsch's graph.)
07-14.3 DO: Prove: The vertices of every graph can be sorted so
that the greedy coloring is optimal.
07-14.4 HW (Greedy coloring dismal) (due Monday, July 18):
For all even values of $n$,
find a bipartite graph with $n$ vertices on which the greedy coloring
algorithm uses $n/2$ colors.
07-14.5 HW (Due Tuesday, July 19): Prove: if $G$ is a
triangle-free graph then $\chi(G)=O(\sqrt{n})$, i.e., there is
a constant $C$ such that for all sufficiently large $n$ we have
$\chi(G)\le C\sqrt{n}$. Estimate your $C$. ($n$ is the number
of vertices.)
07-14.6 DO: Prove: $\alpha(G)\chi(G)\ge n$ where $\alpha$ is the
independence number (size of largest independent set).
Spectral graph theory.
In the next several problems (until the separating asterisks),
$G$ is a graph with $n$ vertices and
$\lambda_1\ge\lambda_2\ge\dots\ge\lambda_n$ are the eigenvalues
of its adjacency matrix.
07-14.7 DO: Prove: $\sum_{i=1}^n \lambda_i = 0$.
07-14.8 HW (due Monday, July 18):
Prove: $\sum_{i=1}^n \lambda_i^2 = 2m$ where $m$
is the number of edges.
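A numerical sanity check of the two trace identities (a check, not a proof), assuming numpy is available:

```python
import numpy as np

# adjacency matrix of the 5-cycle C_5
n = 5
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1

eigs = np.linalg.eigvalsh(A)      # real eigenvalues of the symmetric matrix A
m = int(A.sum()) // 2             # number of edges, here m = 5
sum_eigs = eigs.sum()             # trace of A, should be 0
sum_squares = (eigs ** 2).sum()   # trace of A^2, should be 2m = 10
```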
07-14.9 HW (due Monday, July 18):
Prove:
$(\forall i)(|\lambda_i|\le \lambda_1)$.
(Hints: 1. First prove this for connected graphs.
2. Rayleigh quotient)
07-14.10 DO: Prove: If $G$ is connected then $\lambda_2 < \lambda_1$.
07-14.11 DO: Prove: If $G$ is $r$-regular then $\lambda_1 = r$.
07-14.12 DO: Prove: If $G$ is regular then $G$ is connected if and
only if $\lambda_2 < \lambda_1$.
07-14.13 HW (spectral upper bound on the chromatic number):
Prove: $\chi(G) \le \lambda_1 +1$.
07-14.14 DO: Prove: If $G$ is bipartite then its spectrum is symmetric
about the origin, i.e., $\lambda_n = -\lambda_1$, $\lambda_{n-1} =
-\lambda_2$, etc.
07-14.15 DO: Prove the following strong converse of this statement:
If $G$ is connected and $\lambda_n = -\lambda_1$ then $G$ is
bipartite.
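A quick numpy check of the symmetric spectrum on the bipartite graph $K_{2,3}$ (its eigenvalues are $\pm\sqrt{6}$ and $0,0,0$); a numerical illustration, not a proof:

```python
import numpy as np

# adjacency matrix of the complete bipartite graph K_{2,3}
A = np.zeros((5, 5))
for i in range(2):          # side {0, 1}
    for j in range(2, 5):   # side {2, 3, 4}
        A[i, j] = A[j, i] = 1

eigs = np.linalg.eigvalsh(A)                      # ascending order
symmetric_spectrum = np.allclose(eigs, -eigs[::-1])
lambda_1 = eigs[-1]                               # should be sqrt(6)
```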
07-14.16 DO: Recall the definition of a complex Hermitian
inner product space:
a vector space over $\ccc$, endowed with a positive definite Hermitian
sesquilinear inner product. (What do these terms mean?)
07-14.17 DO: Prove: Every finite-dimensional Hermitian space has
an ONB; and every ON system can be extended to an ONB.
(Hint: Adapt Gram-Schmidt to these spaces.)
07-14.18 DEF (unitary transformation):
The linear transformation $\vfi : V\to V$ is unitary
if it preserves the inner product:
$(\forall v,w\in V)(\langle \vfi v,\vfi w\rangle =
\langle v, w\rangle)$.
The unitary group $U(V)$
is the group of unitary transformations acting on $V$.
07-14.19 DEF (Hermitian transformation):
The linear transformation $\vfi : V\to V$ is Hermitian
if $(\forall v,w\in V)(\langle v,\vfi w\rangle =
\langle \vfi v, w\rangle)$.
07-14.20 DEF (unitary matrix): The matrix $A\in M_n(\ccc)$
is unitary if its columns form an ONB of $\ccc^n$
wrt the standard Hermitian dot product $\langle u,v\rangle = u^*v$.
07-14.21 DO: Prove that this is equivalent to the rows forming an ONB
of $\ccc^n$.
07-14.22 DEF (Hermitian matrix): The matrix $A\in M_n(\ccc)$
is Hermitian if $A^*=A$.
07-14.23 DO: Let $\ul{b}=(b_1,\dots,b_n)$ be an ONB of $V$
and let $\vfi : V\to V$ be a linear transformation.
Let $A=[\vfi]_{\ul{b}}$ be the matrix of $\vfi$ wrt $\ul{b}$.
Prove: (a) $\vfi$ is unitary if and only if $A$ is unitary.
(b) $\vfi$ is Hermitian if and only if $A$ is Hermitian.
07-14.24 DO: Prove: (a) All eigenvalues of a unitary matrix have
absolute value 1. (b) All eigenvalues of a Hermitian
matrix are real.
07-14.25 DEF (normal matrix): The matrix $A\in M_n(\ccc)$
is normal if it commutes with its adjoint, i.e., if
$AA^* = A^*A$.
07-14.26 DEF: We say that the matrices $A,B\in M_n(\ccc)$ are
unitarily similar if there exists a unitary matrix $S$
such that $B=S^{-1}AS$.
07-14.27 DEF: We say that a matrix $A\in M_n(\ccc)$ is
unitarily diagonalizable if it is unitarily similar
to a diagonal matrix.
07-14.28 DO: Prove: a matrix $A\in M_n(\ccc)$ is
unitarily diagonalizable if and only if it has an
orthonormal eigenbasis. -- A major goal
we reach below is the characterization of these
matrices; this will be the Complex Spectral Theorem.
07-14.29 DO: Prove: If the matrices $A$ and $B$ are unitarily
similar and $A$ is normal then so is $B$.
07-14.30 DO: Prove: all diagonal matrices are normal. Therefore,
by the preceding exercise, all unitarily diagonalizable matrices
are normal.
07-14.31 COMPLEX SPECTRAL THEOREM: A matrix $A\in M_n(\ccc)$ is
unitarily diagonalizable if and only if $A$ is normal.
The proof will follow from the following two exercises.
07-14.32 THEOREM (Issai Schur): Every matrix $A\in M_n(\ccc)$ is
unitarily similar to an upper triangular matrix.
07-14.33 HW (due Tuesday, July 19):
Prove: if a triangular matrix $A\in M_n(\ccc)$ is normal then it is
diagonal. (This completes the proof of the Complex Spectral Theorem.)
07-14.34 HW (due Tuesday, July 19): Let $A\in M_n(\ccc)$.
Prove: (a) $A$ is Hermitian if and only if $A$ is normal and all
eigenvalues of $A$ are real. (b) $A$ is unitary if and only
if $A$ is normal and all eigenvalues of $A$ have unit absolute
value.
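A numerical illustration of the Complex Spectral Theorem, using the 90-degree rotation matrix, which is normal but neither Hermitian nor symmetric (assuming numpy; a check, not a proof):

```python
import numpy as np

# the 90-degree rotation: normal (it is orthogonal) but not Hermitian
A = np.array([[0.0, -1.0], [1.0, 0.0]])
is_normal = np.allclose(A @ A.conj().T, A.conj().T @ A)

w, V = np.linalg.eig(A)   # eigenvalues +-i, eigenvectors as columns of V
# for a normal matrix with distinct eigenvalues, the unit eigenvectors
# returned by eig are orthonormal, so V is unitary
V_unitary = np.allclose(V.conj().T @ V, np.eye(2))
diagonalized = np.allclose(V @ np.diag(w) @ V.conj().T, A)
```

Note also that $A$ here has no real eigenvalues, consistent with 07-14.34(a): it is normal but not Hermitian.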
07-13.1 HW: Let $\dfrac{d}{dt}: P_{n}(\rrr[t]) \to P_{n}(\rrr[t])$
be the derivative linear transformation of the space of real polynomials
of degree $\le n$. Prove that the number of
$\dfrac{d}{dt}$-invariant subspaces is $n+2$,
and find all of them; give a very simple description of each.
(This count includes the trivial invariant subspaces.)
07-13.2: Consider the following $n\times n$ circulant matrix:
$B=$
$
\begin{bmatrix}
0 & 0 & 0 & \dots & 0 & 1 \\
1 & 0 & 0 & \dots & 0 & 0 \\
0 & 1 & 0 & \dots & 0 & 0 \\
0 & 0 & 1 & \dots & 0 & 0 \\
\vdots & \vdots & \vdots& \ddots& \vdots & \vdots \\
0 & 0 & 0 & \dots & 1 & 0 \\
\end{bmatrix}
$.
(a) DO: Count the invariant subspaces of $B$ over $\ccc$ (i.e., count
the $B$-invariant subspaces of $\ccc^n$). Prove there
are exactly $2^{n}$ (including the trivial ones). Find
all the minimal invariant subspaces.
(A minimal invariant subspace is an invariant subspace that is not
$\{0\}$ but the only invariant subspace properly contained in it is
$\{0\}$.)
Hint:
Find an eigenbasis of $B$ over $\ccc$ and find the eigenvalues.
(b) HW (due Monday, July 18):
Find a 2-dimensional minimal invariant subspace for $B$ over $\rrr$.
(c) CH: Let $n=p$ be a prime. Prove that $B$ has exactly 4
invariant subspaces over $\qqq$ (including the trivial ones).
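A pure-Python check that the candidate vectors built from roots of unity are indeed eigenvectors of $B$ (a numerical sketch of the hint, not a complete solution; the helper `shift` is an illustrative name):

```python
import cmath

n = 6
omega = cmath.exp(2j * cmath.pi / n)   # primitive n-th root of unity

def shift(v):
    """Apply the circulant matrix B: (Bv)_i = v_{i-1 (mod n)}."""
    return [v[(i - 1) % n] for i in range(n)]

# candidate eigenvectors: v_j = (1, omega^j, omega^{2j}, ...),
# with eigenvalue omega^{-j}; the n eigenvalues are the n-th roots of unity
all_check = True
for j in range(n):
    v = [omega ** (j * i) for i in range(n)]
    Bv = shift(v)
    lam = omega ** (-j)
    all_check = all_check and max(abs(Bv[i] - lam * v[i]) for i in range(n)) < 1e-9
```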
07-13.3 DO: Let $f(t)=t^{4} + at^{3}+bt^{2}+ct-15$ where
$a,b,c \in \zzz$. Let $\alpha \in \zzz$ be a root of $f$.
Prove: $\alpha\mid 15$. (Hint: the proof is one line.)
(Note: this was a lemma to the proof of the Hoffman-Singleton
Theorem.)
07-13.4 DO*: (a) The number of automorphisms of Petersen's graph is 120.
(b) The group of automorphisms of Petersen's graph
is isomorphic to $S_5$, the symmetric group of degree 5.
07-13.5 DO: Prove the following extension of the Spectral Theorem:
Let $V$ be a finite-dimensional Euclidean space and $\vfi : V\to V$
a symmetric transformation. Prove: any orthonormal system of
eigenvectors of $\vfi$ can be extended to an orthonormal eigenbasis.
07-13.6 DO: Let $A\in M_n(\fff)$, $v\in \fff^n$, and $\lambda\in \fff$
such that $Av=\lambda v$. (a) Prove: for every $k\ge 0$
we have $A^kv=\lambda^k v$. (b) Prove: for any polynomial
$f\in\fff[t]$ we have $f(A)v=f(\lambda)v$.
07-13.6 DEF: Let $A\in M_n(\fff)$. A minimal polynomial of $A$ is
a polynomial $f$ of smallest positive degree such that $f(A)=0$.
Such a polynomial exists by the Cayley-Hamilton theorem.
07-13.7 DO: Prove: The minimal polynomial is unique up to nonzero
scalar factors. We denote the unique monic minimal
polynomial of $A$ by $m_A$ and refer to it as the
minimal polynomial.
07-13.8 DO: Let $A \in M_{n}(\fff)$. Prove: For all polynomials
$g \in \fff[t]$ we have that $g(A)=0$ if and only if $m_{A}\mid g$.
07-13.9 DO: Let $A \in M_{n}(\fff)$ and $f \in \fff[t]$. Prove:
$\ker f(A)$ is $A$-invariant.
07-13.10 HW (due Monday, July 18): Let $A \in M_{n}(\fff)$ and
let $f,g \in \fff[t]$ such that $fg = m_A$. Prove:
If $f,g$ are relatively prime, i.e., $\gcd(f,g) = 1$, then
$\fff^n = \ker(f(A)) \oplus \ker(g(A))$.
07-13.11 HW: Let $A\in M_n(\fff)$. Prove:
(a) Every eigenvalue of $A$ is a root of the minimal polynomial $m_A$.
(b) Prove: $\lambda\in\fff$ is an eigenvalue of $A$ if and only if
$m_A(\lambda)=0$.
07-13.12 HW:
Suppose $A$ is a block-diagonal matrix with diagonal blocks
$A_1, \dots, A_k$. (The diagonal blocks are square matrices.)
Give a very simple expression of $m_A$
in terms of the minimal polynomials of the $A_i$.
07-13.13 DO $\heartsuit$ (discussion problem for Friday, July 15):
Let $P_0, P_1,\dots,P_{n-1}$ be the vertices of a regular $n$-gon
inscribed in the unit circle. Prove:
$$\overline{P_0P_1} \cdot \overline{P_0P_2} \dots \overline{P_0P_{n-1}} = n.$$
Here $\overline{P_iP_j}$ denotes the length of the segment connecting the
points $P_i$ and $P_j$.
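A numerical check of this identity (not a proof), using Python's cmath with the vertices $P_k = e^{2\pi i k/n}$:

```python
import cmath

def chord_product(n):
    """Product of the distances from P_0 = 1 to the other n-1 vertices
    of the regular n-gon inscribed in the unit circle."""
    prod = 1.0
    for k in range(1, n):
        P_k = cmath.exp(2j * cmath.pi * k / n)
        prod *= abs(1 - P_k)
    return prod

products = [chord_product(n) for n in range(2, 13)]   # should be 2, 3, ..., 12
```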
07-12.1 DEF: Let $G$ be a graph. The girth of $G$ is the length of
its shortest cycle. If $G$ is cycle-free, we define its girth to
be infinite.
07-12.2 DO: Find infinitely many 3-regular planar graphs of girth 5.
07-12.3 CH: (a) Prove: Every 3-regular finite planar graph has
girth $\le 5$. (b) Find infinitely many 3-regular toroidal
graphs of girth 6.
07-12.4 HW: Prove: If a $k$-regular graph has girth
$\ge 5$ then $n\ge k^2+1$ where $n$ is the number of vertices.
07-12.5 DO: (Preceding exercise continued)
Prove that equality $(n=k^2+1)$ is possible for $k=1,2,3$.
(Note: equality is also possible for $k=7$, realized by the
so-called Hoffman-Singleton graph.)
07-12.5 THEOREM (Hoffman-Singleton, 1960):
If there is a $k$-regular graph of girth $\ge 5$ with $n = k^2+1$
vertices then $k\in \{1,2,3,7,57\}$. (To be proved in the next class.)
Note: the case $k=57$ is still unresolved.
07-12.6 DO: Let $A,B\in\fff^{k\times n}$. Prove:
$A=B$ if and only if $x^TAy = x^TBy$ for all $x\in \fff^k$ and
all $y\in\fff^n$.
In the remainder of this section, $V$ is a finite-dimensional
Euclidean space.
07-12.7 DO: Let $\vfi : V\to V$ be a linear transformation.
Prove: if $\vfi$ preserves the norm then it
preserves the inner product, i.e., if
$(\forall v\in V)(||\vfi(v)||=||v||)$ then
$(\forall v,w\in V)(\langle \vfi(v), \vfi(w)\rangle=\langle v,w\rangle)$.
(Note: the converse is obvious.) Linear transformations
satisfying the conclusion are called orthogonal transformations.
07-12.8 DO: Prove: the orthogonal transformations of the plane
are the rotations about the origin and the reflections in axes passing
through the origin.
07-12.9 DO: Let $\ul{b}=(b_1,\dots,b_n)$ be an ONB of $V$.
Prove: the linear transformation $\vfi : V\to V$ is orthogonal
if and only if $\vfi$ takes $\ul{b}$ to an ONB, i.e., if
$(\vfi(b_1),\dots,\vfi(b_n))$ is an ONB.
07-12.10 DO (orthogonal group): Prove that the orthogonal
transformations of $V$ are closed under composition and inverses,
and that the identity is an orthogonal transformation. This means
that the set $O(V)$ of orthogonal transformations of $V$ is
a group; it is called the orthogonal group on $V$.
07-12.11 DEF (orthogonal matrices):
We say that a matrix $A\in M_n(\rrr)$ is orthogonal
if $A^TA=I$. The set of $n\times n$ orthogonal matrices
is denoted $O(n)$.
07-12.12 DO: Prove: $A\in M_n(\rrr)$ is orthogonal if and only if
the columns of $A$ form an ONB of $\rrr^n$.
07-12.13 DO (Third miracle of linear algebra):
Prove: the columns of a matrix $A\in M_n(\rrr)$ are orthonormal
if and only if the rows are orthonormal, i.e., if $AA^T = I$.
07-12.14 DO: Prove: $A$ is orthogonal if and only if $A^T=A^{-1}$.
07-12.15 DO: Let $\ul{b}=(b_1,\dots,b_n)$ be an ONB of $V$.
Prove: the linear transformation $\vfi : V\to V$ is an orthogonal
transformation if and only if the matrix $[\vfi]_{\ul{b}}$ is
orthogonal.
07-12.16 DO: Use the preceding exercise and an exercise further up
to prove that $O(n)$ is a group under multiplication.
This is called the orthogonal group.
07-12.17 DEF: We say that two matrices $A,B\in M_n(\rrr)$ are
orthogonally similar if $(\exists S\in O(n))(B=S^{-1}AS)$.
07-12.18 DO (Spectral Theorem restated): Prove that the following
statement is equivalent to the Spectral Theorem:
every symmetric matrix $A\in M_n(\rrr)$ is orthogonally similar
to a diagonal matrix.
07-12.19 DEF: For a matrix $A\in \ccc^{k\times n}$, we write $A^*$ to denote
the conjugate-transpose of $A$, i.e., $A^*\in \ccc^{n\times k}$
and if $A=(a_{ij})$ and $A^*=(b_{ij})$ then
$b_{ij} = \overline{a_{ji}}$ where
$\overline{z}$ denotes the complex conjugate of $z\in \ccc$.
The conjugate-transpose $A^*$ is also called the adjoint of $A$.
A matrix $A\in M_n(\ccc)$ is called Hermitian or self-adjoint
if $A^*=A$.
07-12.20 DO: For complex matrices $A,B$ of appropriate dimensions,
prove: $(AB)^* = B^*A^*$.
07-12.21 DO (Eigenvalues of self-adjoint matrices): Prove:
all (complex) eigenvalues of a self-adjoint matrix are real.
Hint: Let $v$ be an eigenvector, so $Av=\lambda v$.
Prove that $\lambda={\overline{\lambda}}$ by comparing the expression
$v^*Av$ with its conjugate-transpose.
07-12.22 DO (Spectral Theorem reproved): Recall that
the hard part of the proof of the Spectral Theorem was the
proof of the lemma that every symmetric transformation
of a (real) Euclidean space has a real eigenvector,
or equivalently, that every real symmetric matrix
has a real eigenvalue. Infer this fact directly
from the preceding exercise.
07-12.23 HW (Converse of the Spectral Theorem): Let
$A\in M_n(\rrr)$. Prove: if $A$ is orthogonally similar
to a diagonal matrix then $A$ is symmetric.
07-12.24 HW (eigenvalues of orthogonal matrices):
Prove: The complex eigenvalues of the orthogonal matrices have
unit absolute value.
07-11.1 DEF: Let $G$ be an abelian group (such as the additive structure of
a vector space, or the set $\zzz$ of integers). If $S,T\subseteq G$ then
we write $S+T = \{s+t\mid s\in S, t\in T\}$.
07-11.2 DO: Show: (a) If $|S|=k$ and $|T|=\ell$ then $|S+T|\le k\ell$.
(b) Show that this bound is tight for subsets of $\rrr^2$.
(c) Show that this bound is tight for subsets of $\zzz$.
07-11.3 DO: Show: (a) If $|S|=k$ then $|S+S|\le \binom{k+1}{2}$.
(b) Show that this bound is tight for subsets of $S\subset\zzz$.
07-11.4 DO: Show: (a) If $|S|=k$ then $|S+S+S|\le \binom{k+2}{3}$.
(b) Show that this bound is tight for subsets of $S\subset\zzz$.
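The stated bounds for double and triple sums can be seen to be tight experimentally; a Python sketch using powers of 4 (an illustrative choice whose small base-4 digit sums keep all such sums distinct, not necessarily the intended construction):

```python
from itertools import combinations_with_replacement
from math import comb

def r_fold_sumset(S, r):
    """All sums of r elements of S, repetition allowed."""
    return {sum(c) for c in combinations_with_replacement(S, r)}

k = 5
S = [4 ** i for i in range(k)]        # 1, 4, 16, 64, 256
size2 = len(r_fold_sumset(S, 2))      # attains the bound C(k+1, 2) = 15
size3 = len(r_fold_sumset(S, 3))      # attains the bound C(k+2, 3) = 35
```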
07-11.5 HW (due Friday, July 15): (a) Let $|S|=k$.
Find the tight upper bound for
$|S+\dots+S|$ where we add up $r$ copies of $S$. Your answer should
be a very simple formula in terms of $k$ and $r$.
(b) Show that your bound is tight for subsets $S\subset \zzz$.
07-11.6 DO: Let $V$ be a vector space and $W_1,W_2\le V$ subspaces.
Prove: $W_1+W_2 = \span(W_1\cup W_2)$.
07-11.7 HW (modular equation): Let $U_1,U_2\le V$. Prove:
$\dim(U_1+U_2)+\dim(U_1\cap U_2)=\dim(U_1)+\dim(U_2)$.
(Hint: Start with a basis of $U_1\cap U_2$.)
07-11.8 DEF: Let $U_1,\dots,U_k\le V$ and let $W=U_1+\dots+U_k$.
We say that $W$ is the direct sum of the $U_i$, denoted
$W=U_1\oplus\dots\oplus U_k$, if for all $i$ we have
$$ U_i \cap \left(\sum_{j\neq i} U_j\right) = \{0\}.$$
07-11.9 DO: Let $W=\sum_{i=1}^k U_i$. Prove that the following are
equivalent: (a) $W=\bigoplus_{i=1}^k U_i$;
(b) $\dim(W)=\sum_{i=1}^k \dim(U_i)$;
(c) every $w\in W$ has a unique representation $w=u_1+\dots+u_k$
with $u_i\in U_i$.
07-11.10 DO: Prove: $\dim\left(\bigoplus_{i=1}^k U_i\right)
=\sum_{i=1}^k \dim(U_i)$.
07-11.11 DO: Prove: $u_1,\dots,u_k\in V$ are
linearly independent if and only if
$\sum_{i=1}^k \span(u_i) = \bigoplus_{i=1}^k \span(u_i)$.
07-11.12 DEF: Let $V$ be a vector space over the field $\fff$,
$\vfi : V\to V$ a linear transformation, and $\lambda\in\fff$.
The subspace $U_{\lambda}=\{v\in V\mid \vfi(v)=\lambda v\}$
is called the eigensubspace of $\vfi$ corresponding to the
value $\lambda$. Note that $\lambda$ is an eigenvalue of $\vfi$
if and only if $\dim(U_{\lambda})\ge 1$.
07-11.13 DO: Prove:
$\sum_{\lambda} U_{\lambda} = \bigoplus_{\lambda} U_{\lambda}$.
(Here the summation is over all $\lambda\in\fff$.
In effect this is a finite sum since $U_{\lambda}=\{0\}$ unless
$\lambda$ is an eigenvalue; we could have written that the
summation is over all eigenvalues $\lambda$.)
07-11.14 HW (due Fri, July 15): Let
$\vfi : V\to V$ be a linear transformation with eigensubspaces
$U_{\lambda}$. Prove: $\vfi$ is diagonalizable (has an eigenbasis)
if and only if $V=\sum_{\lambda} U_{\lambda}$.
07-11.15 DO: Let $f(t) = \dfrac{at^{2}+bt+c}{dt^{2}+e}$ with
$a,b,c,d,e \in \rrr$ and $d^{2}+e^{2} \neq 0$. Show that if
$f(0) \geq f(t)$ for all $t\in\rrr$ then $b=0$. (Hint: $f'(0)$).
In the remaining exercises in this section, $V$ is a
Euclidean space.
07-11.16 DEF: Let $v\in V$ and $S\subseteq V$.
We say that $v$ is orthogonal to $S$, notation: $v\perp S$, if
$v\perp s$ for all $s\in S$. We write $S^{\perp}=\{v\in V\mid v\perp S\}$.
We say that the subsets $S,T\subseteq V$ are orthogonal, notation:
$S\perp T$, if $(\forall s\in S)(\forall t\in T)(s\perp t)$.
07-11.17 DO: Prove: If $W\le V$ then $W\cap W^{\perp} = \{0\}$.
07-11.18 DO: Prove: If $U_1,\dots,U_k$ are pairwise orthogonal subspaces
of $V$ then $\sum_{i=1}^k U_i = \bigoplus_{i=1}^k U_i$.
07-11.19 DO: Show: (a) $V^{\perp}=\{0\}$ (b) $\{0\}^{\perp}=V$
(c) If $S\subseteq T\subseteq V$ then $S^{\perp}\supseteq T^{\perp}$
(d) What is the perp of the empty set? (Note that your answer needs to
be consistent with (a), (b), (c).)
07-11.20 DO: (a) Prove: $(\forall S\subseteq V)(S^{\perp}\le V)$.
(b) Prove: $(\forall S\subseteq V)(\span(S)^{\perp}=S^{\perp})$.
07-11.21 DO: Prove: (a) $(\forall S\subseteq V)(S\subseteq S^{\perp\perp})$
(b) $\span(S)\subseteq S^{\perp\perp}$.
07-11.22 CH: Find a Euclidean space $V$ and a subspace $W\le V$
such that $W^{\perp\perp}\neq W$. (Hint: make $W$ a proper subspace
of an infinite-dimensional Euclidean space $V$ such that
$W^{\perp} = \{0\}$.)
In the remaining exercises in this section, $V$ is a
finite-dimensional Euclidean space
and $\vfi : V\to V$ is a linear transformation of $V$.
07-11.23 DO (Dimension complementarity):
Prove: For any subspace $W\le V$
we have $\dim W + \dim W^{\perp}=\dim V$.
07-11.24 DO: Prove: If $W\le V$ then $V=W\oplus W^{\perp}$.
07-11.25 DO: Prove: If $W\le V$ then $W^{\perp\perp}=W$.
07-11.26 DEF: Let $\vfi : V\to V$ be a linear transformation. We say
that $\vfi$ is symmetric if
$(\forall v,w\in V)(\langle v,\vfi w\rangle = \langle \vfi v, w\rangle)$.
07-11.27 DO: Let $\vfi : V\to V$ be a linear transformation of the
finite-dimensional Euclidean space $V$. Let $\ul{b}=(b_1,\dots,b_n)$
be an ONB of $V$. Let $A=[\vfi]_{\ul{b}}$ be the matrix of $\vfi$ with
respect to $\ul{b}$. Prove: $\vfi$ is a symmetric linear transformation
if and only if $A$ is a symmetric matrix.
07-11.28 DEF: $W\le V$ is a $\vfi$-invariant subspace
if $(\forall w\in W)(\vfi(w)\in W)$.
07-11.29 DO (Eigenvectors vs. invariant subspaces):
Let $v\in V$, $v\neq 0$.
Prove: $\span(v)$ is $\vfi$-invariant if and only
if $v$ is an eigenvector of $\vfi$.
07-11.30 DO: Find all invariant subspaces of our 3-dimensional geometry
with respect to the following transformations: (a) rotation by $\theta$
about a line through the origin (b) projection on a plane
that includes the origin (c) reflection in a plane that includes
the origin (d) the identity map (e) the zero map.
07-11.31 HW: Prove: If $\vfi$ is symmetric and $W$ is
$\vfi$-invariant then $W^{\perp}$ is $\vfi$-invariant.
07-11.32 DEF:
Let $\vfi$ be a symmetric linear transformation with eigenvalues
$\lambda_1\ge \dots \ge \lambda_n$. The Rayleigh quotient
of $\vfi$ is a function $R_{\vfi} : V\setminus \{0\}\to\rrr$
defined by
$$ R_{\vfi}(v) = \dfrac{\langle v,\vfi(v)\rangle}{\langle v,v\rangle}
\qquad (v\in V, v\neq 0) $$
07-11.33 DO: Verify: For all $\lambda\neq 0$ we have
$R_{\vfi}(\lambda v) = R_{\vfi}(v)$.
07-11.34 DO (Rayleigh's principle):
Let $\vfi$ be a symmetric linear transformation with eigenvalues
$\lambda_1\ge \dots \ge \lambda_n$. Then
$\lambda_1 = \max_{v} R_{\vfi}(v)$ and
$\lambda_n = \min_{v} R_{\vfi}(v)$.
07-11.35 HW (Courant-Fischer theorem) (due Wed, July 13):
Let $\vfi$ be a symmetric linear transformation with eigenvalues
$\lambda_1\ge \dots \ge \lambda_n$. Prove:
$$ \lambda_i =
\max_{\stackrel{U\le V}{\dim U=i}}\min_{v\in U\setminus\{0\}} R_{\vfi}(v).$$
07-11.36 HW (Interlacing Theorem) (due Thu, July 14):
Let $A\in M_n(\rrr)$ be an $n\times n$ symmetric real matrix.
Let $B$ be the $(n-1)\times (n-1)$ matrix obtained by deleting the $i$-th
column and the $i$-th row of $A$ (so $B$ is also symmetric).
Prove: the eigenvalues of $A$ and $B$ interlace, i.e.,
if the eigenvalues of $A$ are $\lambda_1\ge \dots \ge \lambda_n$
and the eigenvalues of $B$ are $\mu_1\ge \dots \ge \mu_{n-1}$
then $\lambda_1\ge\mu_1\ge\lambda_2\ge\mu_2 \ge \dots
\ge \lambda_{n-1}\ge \mu_{n-1}\ge \lambda_n$.
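A numerical illustration of the interlacing statement on a random symmetric matrix (a check, not a proof; assuming numpy):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = (M + M.T) / 2                                    # random symmetric matrix
B = np.delete(np.delete(A, 2, axis=0), 2, axis=1)    # delete row 2 and column 2

lam = np.sort(np.linalg.eigvalsh(A))[::-1]           # descending: lambda_1 >= ...
mu = np.sort(np.linalg.eigvalsh(B))[::-1]            # descending: mu_1 >= ...
interlaces = all(lam[i] >= mu[i] >= lam[i + 1] for i in range(5))
```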
07-07.1 DO: If $(f_0,f_1,\dots)$ is an infinite sequence of polynomials
such that $\deg(f_i)=i$ then this sequence is a basis of the space
$\fff[t]$ of polynomials.
07-07.2 DO: Let $\rho :\rrr\to\rrr$ be a continuous real function
satisfying the following conditions: (a) $\rho(t)\ge 0$;
(b) $\rho$ is not identically zero;
(c) $(\forall n)\left(\int_{-\infty}^{\infty} \rho(t)t^{2n}dt <\infty\right)$.
Prove: $\langle f,g\rangle = \int_{-\infty}^{\infty} \rho(t)f(t)g(t)dt$
defines an inner product on the space of real polynomials.
(We call such a $\rho$ a weight function.)
07-07.3 DEF: Given the weight function $\rho$ as in the preceding
exercise, let us apply Gram-Schmidt orthogonalization to the
sequence $(1,t,t^2,\dots)$ of polynomials. The resulting sequence
$(f_0,f_1,f_2,\dots)$ is called a sequence of orthogonal
polynomials. (So every weight function uniquely defines the
corresponding sequence of orthogonal polynomials.) Note that
$\deg(f_i)=i$. (Why?)
07-07.4 CH (due Fri, July 15): Let
$(f_0,f_1,f_2,\dots)$ be a sequence of orthogonal polynomials.
Prove the following remarkable facts:
(a) All complex roots of each $f_n$ are real.
(b) The roots of $f_n$ and $f_{n-1}$ interlace, i.e., if
the roots of $f_n$ are $\lambda_1 \le \lambda_2 \le\dots\le \lambda_n$
and the roots of $f_{n-1}$ are $\mu_1\le\mu_2\le\dots\le\mu_{n-1}$
then $\lambda_1 \le \mu_1\le \lambda_2 \le\mu_2 \le\dots\le
\mu_{n-1}\le \lambda_n$.
07-07.5 DO: Prove: the sequence $(1,\cos(t),\sin(t),\cos(2t),\sin(2t),\dots)$
of functions in $C[0,2\pi]$ is orthogonal wrt the inner product
$\langle f,g \rangle = \int_{0}^{2\pi} f(t)g(t)dt$. Infer that these
functions are linearly independent (see next exercise).
07-07.6 HW: Let $V$ be a real Euclidean space. Assume the nonzero
vectors $v_1,\dots,v_n$ are orthogonal. Prove that they are linearly
independent.
07-07.7 HW (reassigned): Let $V$ be a real Euclidean space and
$v_1,\dots,v_k\in V$. Recall that the Gram matrix of this list of
vectors is the $k\times k$ matrix $G=G(v_1,\dots,v_k)=
(\langle v_i,v_j\rangle)$. Prove: $v_1,\dots,v_k$ are linearly independent
if and only if $\det(G)\neq 0$.
07-07.8 DEF: A polynomial $h$ is a greatest common divisor of the
polynomials $f_1,\dots,f_k$ if (i) $h$ is a common divisor of the $f_i$,
and (ii) $h$ is divisible by all common divisors of the $f_i$.
07-07.9 DO: Show: if a greatest common divisor of the polynomials
$f_1,\dots,f_k$ exists then it is unique up to nonzero scalar factors.
07-07.10 DEF: A subset $A\subseteq \fff[t]$ is called an ideal
(notation: $A\triangleleft \fff[t]$)
if the following three conditions hold:
(o) $0\in A$ (i) $(\forall f,g\in A)(f+g\in A)$
(ii) $(\forall f\in A)(\forall g\in \fff[t])(fg\in A)$.
07-07.11 DO: Prove: every principal ideal is an ideal.
(DEF: the principal ideal generated by $f\in\fff[t]$ is the set
$(f)=\{fg\mid g\in\fff[t]\}$.)
07-07.12 DO: Prove: $(f)\subseteq (g)$ if and only if $g\mid f$.
07-07.13 HW (Principal Ideal Theorem): Prove: every ideal in
$\fff[t]$ is principal. (Hint: Division Theorem for polynomials.)
07-07.14 DO: Infer from the preceding exercise the existence of greatest
common divisors, with the following amendment: if all the $f_i$ are zero,
then their gcd is $0$.
07-07.15 DEF: We define the formal derivative of a polynomial
$f=a_0+a_1t+a_2t^2+a_3t^3+\dots$ as the polynomial
$f'=a_1+2a_2t+3a_3t^2+\dots$. (Note that this definition makes sense even
over finite fields where limits would make no sense.)
07-07.16 DO: Prove: differentiation is a linear map, i.e.,
$(f+g)'=f'+g'$ and $(\alpha f)'=\alpha f'$.
07-07.17 DO: Prove that $(fg)'=f'g+fg'$.
07-07.18 DO: We define the composition $f\circ g$ of the formal
polynomials $f=a_0+a_1t+a_2t^2+a_3t^3+\dots$ and $g\in\fff[t]$
as the polynomial $f\circ g= a_0+a_1g+a_2g^2+a_3g^3+\dots$.
Prove the Chain Rule: $(f\circ g)'=(f'\circ g)\cdot g'$.
07-07.19 HW (multiple roots): We say that $\alpha\in\fff$ is a
multiple root of the polynomial $f\in\fff[t]$ if $(t-\alpha)^2\mid f$.
(a) Prove: $\alpha$ is a multiple root of $f$ if and only if
$f(\alpha)=f'(\alpha)=0$. (b) Prove: $f\in\qqq[t]$ has a multiple root
in $\ccc$ if and only if $\gcd(f,f')\neq 1$ (i.e., $\gcd(f,f')$ is not
a nonzero constant).
07-07.20 DO (insensitivity to field extensions): Let $\fff$ be
a subfield of the field ${\mathbb G}$ and let $f_1,\dots,f_k\in \fff[t]$.
Prove: the gcd of the $f_i$ over $\fff$ is the same as
the gcd of the $f_i$ over ${\mathbb G}$.
In the next sequence of exercises we learn another way to prove the
existence of gcd.
07-07.21 NOTATION:
Let $\Div(f)$ denote the set of divisors of the polynomial $f$
and let $\Div(f,g)$ denote the set of common divisors of the polynomials
$f,g$, so $\Div(f,g)=\Div(f)\cap\Div(g)$.
07-07.22 DO: Prove: the polynomial $h$ is a greatest common divisor of
the polynomials $f$ and $g$ if and only if $\Div(f,g)=\Div(h)$.
07-07.23 DO (Euclid's Lemma):
$(\forall f,g,q\in\fff[t])(\Div(f,g)=\Div(f-gq,g))$.
07-07.24 DO: $\Div(f,0)=\Div(f)$.
07-07.25 DO: Recall Euclid's algorithm for polynomials. The algorithm
takes a pair $(f,g)$ of polynomials as input and returns a polynomial $h$.
Prove that $\Div(h)=\Div(f,g)$. (Use the preceding three exercises.)
This proves that $h$ is a gcd of $f$ and $g$, giving an alternative
(algorithmic) proof of the existence of gcd.
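Euclid's algorithm for polynomials fits in a few lines of Python (coefficients as lists, low degree first; the helper names are illustrative). The example also previews 07-07.19: a multiple root shows up in $\gcd(f,f')$:

```python
def trim(f):
    """Drop (numerically) zero leading coefficients; low degree first."""
    while f and abs(f[-1]) < 1e-12:
        f.pop()
    return f

def poly_mod(f, g):
    """Remainder of f modulo g (long division of polynomials)."""
    f, g = trim(list(f)), trim(list(g))
    while f and len(f) >= len(g):
        c = f[-1] / g[-1]
        shift = len(f) - len(g)
        for i, gi in enumerate(g):
            f[i + shift] -= c * gi
        trim(f)
    return f

def poly_gcd(f, g):
    """Euclid's algorithm: replace (f, g) by (g, f mod g) until g = 0."""
    f, g = trim(list(f)), trim(list(g))
    while g:
        f, g = g, poly_mod(f, g)
    return [c / f[-1] for c in f]          # normalize to a monic gcd

# f = (t-1)^2 (t+2) = t^3 - 3t + 2 has the multiple root 1, and indeed
# gcd(f, f') = t - 1
f = [2.0, -3.0, 0.0, 1.0]                  # t^3 - 3t + 2
df = [-3.0, 0.0, 3.0]                      # f' = 3t^2 - 3
h = poly_gcd(f, df)                        # expect [-1.0, 1.0], i.e. t - 1
```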
07-07.26 DO: Let $S,D\in M_n(\fff)$ where $D=\diag(\lambda_1,\dots,\lambda_n)$
is a diagonal matrix with the $\lambda_i$ being the diagonal entries.
Let $S=[s_1,\dots,s_n]$ where the $s_i$ are the columns of $S$. Verify that
$SD=[\lambda_1s_1,\dots,\lambda_ns_n]$.
07-07.27 DO: Use the preceding exercise to give a very simple proof of the
fact that a matrix is diagonalizable if and only if it has an eigenbasis.
07-07.28 DO (coordinates):
Let $V$ be a vector space and $\ul{b}=(b_1,\dots,b_n)$ be
a list of vectors in $V$. Prove that $\ul{b}$ is a basis if and
only if for every $v\in V$ there exists a unique list of coefficients
$\alpha_1,\dots,\alpha_n$ such that $v=\sum_{i=1}^n \alpha_ib_i$.
The $\alpha_i$ are called the coordinates of $v$
wrt $\ul{b}$. We denote the column vector consisting of the
coordinates of $v$ by $[v]_{\ul{b}}$.
07-07.29 DO: Fix a basis $\ul{b}=(b_1,\dots,b_n)$ of the vector space $V$.
Prove: the correspondence $v\mapsto [v]_{\ul{b}}$ is an isomorphism
and therefore $V\cong \fff^n$. Consequently, any two vector spaces
over the same field and of the same dimension are isomorphic.
07-07.30 DO: Let $V,W$ be vector spaces (over the same field).
Let $\ul{b}=(b_1,\dots,b_n)$ be a basis of $V$ and let
$w_1,\dots,w_n$ be arbitrary vectors in $W$. Prove: there exists a
unique linear map $\vfi : V\to W$ such that $(\forall i)(\vfi(b_i)=w_i)$.
07-07.31 DEF (Coordinatization of a linear map):
Let $V,W$ be vector spaces (over the same field). Let
$\ul{e}=(e_1,\dots,e_n)$ be a basis of $V$ and
$\ul{f}=(f_1,\dots,f_k)$ be a basis of $W$, and let
$\vfi : V\to W$ be a linear map.
Let $A$ be the $k\times n$ matrix of which the $j$-th column
is $[\vfi(e_j)]_{\ul{f}}$, the column vector of coordinates
of the vector $\vfi(e_j)$ wrt the basis $\ul{f}$. We write
$A=[\vfi]_{\ul{e},\ul{f}}$.
07-07.32 DEF (Coordinatization of a linear transformation):
If $V=W$, we speak of a linear transformation
$\vfi : V\to V$. In this case we write $[\vfi]_{\ul{e}}$
for $[\vfi]_{\ul{e}, \ul{e}}$ (the same basis is used for
$V$ as the domain and as the target space of $\vfi$).
TO BE CONTINUED.
07-05.1 DO: Study the problems on the instructor's
Linear Algebra Puzzles sheet. Solve Problems 1 to 11
from the Puzzle sheet by Monday, July 11.
Some of these will be assigned as HW.
07-05.2 HW (due Mon July 11) (commuting matrices)
Solve Problem 3 on the Lin Alg Puzzles sheet.
07-05.3 CH (due Mon July 11) (Hilbert matrix)
Solve Problem 7 on the Lin Alg Puzzles sheet.
07-05.4 DO: Prove: $\det(AB)=\det(A)\det(B)$. Hint: perform elementary
column operations on $B$ until either $B$ has an all-zero column or
$B$ has exactly one nonzero entry in each row and in each column.
Verify that $AB$ undergoes the same elementary operations so
neither $\det(B)$ nor $\det(AB)$ changes along the way.
07-05.5 DO: Review cofactors. State and prove the determinant expansion by
the $i$-th row.
07-05.6 DO: Prove: (a) A matrix has a right inverse if and only if
it has full row rank. (b) Infer from this (do not repeat
the argument) that a matrix has a left inverse if and only if
it has full column rank.
07-05.7 DO: Prove: If a matrix has a right inverse and a left inverse
then it has a unique two-sided inverse, and it has no right inverse
and no left inverse other than this two-sided inverse.
07-05.8 DO: Let $A\in\rrr^{k\times n}$. Prove: (a) If $A$ has a right
inverse then $k\le n$. (b) If $A$ has a right
inverse and $k < n$ then $A$ has infinitely many right inverses.
07-05.9 HW (rank via nonsingular submatrix):
Prove that the rank of a matrix $A$ is the largest value $r$
such that $A$ has a nonsingular $r\times r$ submatrix.
07-05.10 HW (rank vs. Gram--Schmidt):
Let $A\in M_n(\rrr)$. Let us obtain the matrix
$A'\in M_n(\rrr)$ by applying Gram--Schmidt orthogonalization
to the columns of $A$. Prove: $\det(A)=\det(A')$.
07-05.11 HW (Gram matrix):
Let $V$ be a Euclidean space (real vector space endowed with a
positive definite symmetric inner product). Let $v_1,\dots,v_k\in V$.
The Gram matrix of this list of vectors is the $k\times k$
matrix $G=G(v_1,\dots,v_k)=(\langle v_i,v_j\rangle)$ (the $(i,j)$ entry
is the inner product of the $i$-th and the $j$-th vector).
Prove: $v_1,\dots,v_k$ are linearly independent if and only if
$\det G\neq 0$.
07-05.12 DO: Prove: For any list of vectors, $\det(G)\ge 0$.
07-05.13 HW (due Monday, July 11) (Hadamard's Inequality):
Let $A\in M_n(\rrr)$ with columns $A=[a_1,\dots,a_n]$.
(a) Prove: $|\det(A)|\le \prod_{i=1}^n ||a_i||$.
(b) Prove: equality holds in Hadamard's inequality if and only if
either one of the columns of $A$ is zero or the columns of $A$ are
orthogonal.
07-05.14 DEF: An Hadamard matrix is a square matrix with
$\pm 1$ entries such that its columns are orthogonal.
07-05.15 DO: Given an $n\times n$ Hadamard matrix, construct
a $(2n)\times (2n)$ Hadamard matrix. In particular,
there exists a $2^k\times 2^k$ Hadamard matrix for every $k\ge 1$.
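The doubling construction can be verified mechanically; a numpy sketch (the helper `double` is an illustrative name):

```python
import numpy as np

def double(H):
    """Sylvester doubling: if H is Hadamard, so is [[H, H], [H, -H]]."""
    return np.block([[H, H], [H, -H]])

H = np.array([[1.0]])
for _ in range(3):
    H = double(H)                     # 2x2, 4x4, then 8x8

n = H.shape[0]
entries_pm1 = bool(np.all(np.abs(H) == 1))
# orthogonal +-1 columns of norm sqrt(n) means H^T H = n I
columns_orthogonal = np.allclose(H.T @ H, n * np.eye(n))
```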
07-05.16 DO: Let $\calH$ denote the set of those integers $n$
for which there exists an $n\times n$ Hadamard matrix.
Prove: if $n\in\calH$ and $n>2$ then $4\mid n$.
07-05.17 DO: Prove: if $k,\ell\in\calH$ then $k\ell\in \calH$.
07-05.18 CH: Prove: If $p$ is a prime number and $4\mid p+1$
then $p+1\in\calH$.
07-05.19 OPEN QUESTION: Does $\calH$ have positive density
among the positive integers? (Recall the meaning of this
statement.)
07-05.20 DEF: Let $v_1,\dots,v_k\in V$ where $V$ is a real vector
space. The parallelepiped spanned by $v_1,\dots,v_k\in V$
is the set $\{\sum_{i=1}^k \alpha_i v_i \mid 0\le\alpha_i\le 1\}$.
Notation: $\vol_k(v_1,\dots,v_k)$ denotes the $k$-dimensional
volume of the parallelepiped spanned by the $v_i$.
07-05.21 DO: Prove: If $v_1,\dots,v_n\in\rrr^n$ then
$\vol_n(v_1,\dots,v_n) = |\det[v_1,\dots,v_n]|$.
07-05.22 DO: Prove: If $v_1,\dots,v_k\in V$ where $V$ is a
Euclidean space then $\vol_k(v_1,\dots,v_k) = \sqrt{\det(G)}$
where $G=G(v_1,\dots,v_k)$ is the Gram matrix of the $v_i$.
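For two vectors the Gram-determinant formula reduces to a one-liner; a small numerical sketch (ad hoc names, the Euclidean space being $\rrr^n$ with the standard dot product):

```python
import math

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def parallelogram_area(v1, v2):
    # 2-dimensional volume via sqrt(det G) for the 2x2 Gram matrix
    g11, g12, g22 = dot(v1, v1), dot(v1, v2), dot(v2, v2)
    return math.sqrt(g11 * g22 - g12 * g12)

# Unit square sitting in the xy-plane of R^3: area 1
print(parallelogram_area((1, 0, 0), (0, 1, 0)))  # 1.0
```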
07-05.23 DEF. The characteristic polynomial of the
matrix $A\in M_n(\fff)$ is the polynomial $f_A(t)=\det(tI-A)$.
07-05.24 DO: Prove: $\lambda\in\fff$ is an eigenvalue of $A$
if and only if $\lambda$ is a root of the characteristic polynomial,
i.e., $f_A(\lambda)=0$.
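For a $2\times 2$ matrix, expanding $\det(tI-A)$ gives $t^2-(a+d)t+(ad-bc)$; a small sketch (ad hoc names, illustrative matrix chosen arbitrarily) checking that the eigenvalues are exactly the roots:

```python
def charpoly_2x2(A):
    # det(tI - A) = t^2 - (a+d) t + (ad - bc); coefficients, highest degree first
    (a, b), (c, d) = A
    return (1, -(a + d), a * d - b * c)

def eval_poly(coeffs, t):
    # Horner evaluation, highest-degree coefficient first
    value = 0
    for c in coeffs:
        value = value * t + c
    return value

A = [[2, 1],
     [0, 3]]             # triangular, so the eigenvalues are 2 and 3
f = charpoly_2x2(A)      # (1, -5, 6), i.e. t^2 - 5t + 6 = (t-2)(t-3)
print(eval_poly(f, 2), eval_poly(f, 3))  # 0 0
```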
07-05.25 HW: Let $J$ be the $n\times n$ all-ones matrix
$J =$ \(
\begin{bmatrix}
1 & 1 & \dots & 1 \\
1 & 1 & \dots & 1 \\
\vdots & \vdots & \ddots & \vdots \\
1 & 1 & \dots & 1
\end{bmatrix}
\).
Find the characteristic polynomial
of $J$ and find all eigenvalues of $J$ with multiplicity.
(DEF: The eigenvalue $\lambda$ has multiplicity $s$ if $(t-\lambda)^s$
divides the polynomial $f_A(t)$ but $(t-\lambda)^{s+1}$
does not divide it.)
07-05.26 DO: Let $A=$ \(
\begin{bmatrix}
1 & 1 \\
1 & 0
\end{bmatrix} \).
(a) Find the eigenvalues of $A$ and relate them to universal ideals
in human culture.
(b) Determine the matrix $A^n$ for every $n$.
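Independently of part (a): for large $n$, the power $A^n$ is best computed by repeated squaring rather than $n-1$ successive multiplications. A generic sketch for $2\times 2$ integer matrices (names ad hoc):

```python
def matmul2(A, B):
    # Product of two 2x2 matrices
    return [[A[0][0]*B[0][0] + A[0][1]*B[1][0], A[0][0]*B[0][1] + A[0][1]*B[1][1]],
            [A[1][0]*B[0][0] + A[1][1]*B[1][0], A[1][0]*B[0][1] + A[1][1]*B[1][1]]]

def matpow(A, n):
    # A^n by repeated squaring: O(log n) matrix multiplications
    result = [[1, 0], [0, 1]]  # identity
    while n > 0:
        if n & 1:
            result = matmul2(result, A)
        A = matmul2(A, A)
        n >>= 1
    return result
```

Computing a few small powers of the particular $A$ above and watching the entries is one way to start on part (b).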
07-05.27 DEF: Let $E_{ij}$ be the $n\times n$ matrix with 1 in position
$(i,j)$ and zero everywhere else. We call the matrices of the
form $I+\lambda E_{ij}$ ($i\neq j$) elementary matrices.
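A throwaway constructor for these matrices (ad hoc name, 0-based indices), handy for experimenting with the next exercise:

```python
def elementary(n, i, j, lam):
    # The n x n matrix I + lam * E_ij  (requires i != j)
    assert i != j
    M = [[1 if r == c else 0 for c in range(n)] for r in range(n)]
    M[i][j] = lam
    return M

E = elementary(3, 0, 2, 5)
print(E)  # [[1, 0, 5], [0, 1, 0], [0, 0, 1]]
```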
07-05.28 DO: Prove: (a) If $A\in \fff^{k\times n}$ and we multiply $A$
by an $n\times n$ elementary matrix to the right, the effect is
an elementary column operation on $A$.
(b) Infer from this (do not repeat the argument) that
if we multiply $A$ by a $k\times k$ elementary matrix to the left,
the effect is an elementary row operation on $A$.
07-05.29 DO:
Use the preceding exercise to give an elegant proof of the following
statement. If the columns
of a matrix $A=[a_1,\dots,a_n]$ satisfy a linear relation
$\sum \alpha_i a_i = 0$ then after an elementary row operation,
the columns of the modified matrix satisfy the same linear relation.
(Recall that this was one of the two main lemmas to the proof
of the second miracle: that column rank equals row rank.)
06-30.1 DO: Let $V$ be a vector space over the field $\fff$.
Based on the axioms of vector spaces, prove:
$(\forall\alpha\in\fff)(\forall v\in V)(\alpha v=0 \iff\alpha = 0$
or $v=0)$. (Note that we have two different zeros
in play here: $0\in \fff$ and $0\in V$.)
06-30.2 DO: Prove: the empty list is linearly independent.
06-30.3 DO: Prove: every sublist of a linearly independent list is
linearly independent.
06-30.4 DO: (a) Find $n+1$ vectors in $\rrr^n$ such that every $n$ of them
are linearly independent.
(b) Find infinitely many vectors in $\rrr^n$ such that every $n$ of them
are linearly independent.
(c) Find a continuous curve $\rrr\to\rrr^n$ such that any $n$ points on
the curve are linearly independent. (Define the curve by a very simple
formula.)
06-30.5 DO: If $v_1, \dots, v_k$ are linearly independent and
$v_1, \dots, v_k, w$ are linearly dependent, then
$w \in \span(v_1, \dots, v_k)$.
06-30.6 DEF: A basis of a vector space $V$ is a
linearly independent list of generators of $V$.
06-30.7 DO: Prove: The following are equivalent for a list $B$
of vectors: (a) $B$ is a basis;
(b) $B$ is a maximal linearly independent list. (Recall: "maximal"
means it cannot be extended, i.e., if we add one more vector
to the list, it becomes linearly dependent.)
06-30.8 DEF: The dimension of vector space $V$ is
the maximum number of linearly independent vectors in $V$.
06-30.9 DO: Prove: every linearly independent list can be extended to
a basis.
(a) Prove this for finite-dimensional spaces.
(b) Prove it for the general case using Zorn's lemma.
06-30.10 DO: Prove: every list of generators contains a basis as a sublist.
06-30.11 DO (First miracle of linear algebra):
Let $v_1,\dots,v_k$ be linearly independent vectors and
$w_1,\dots,w_m\in V$ be vectors such that
$v_1,\dots,v_k\in\span(w_1,\dots,w_m)$. Then $k\le m$.
(Hint: Steinitz exchange principle, discussed in the first class.)
06-30.12 DEF: The rank of a list of vectors is the size of the largest
linearly independent sublist. The column-rank of a matrix
is the rank of its list of columns; row-rank is defined analogously.
(Note that the rank of a list of vectors equals the dimension of its span.)
06-30.13 DO: Prove: Starting from any matrix, using elementary row and
column operations, one can reach a matrix in which each row and each column
has at most one non-zero entry.
06-30.14 DO: All bases have the same cardinality (same number of elements).
This shared number is the dimension.
06-30.15 DO: Prove: (a) Elementary column-operations do not change the
column-rank of a matrix. (b)
Elementary row-operations do not change the column-rank of a matrix.
06-30.16 DO: Infer that the same is true for row-rank.
06-30.17 DO (Second miracle of linear algebra):
For every matrix, row-rank $=$ column-rank. In other words,
the column-rank of a matrix is equal to the column-rank of its transpose.
(This is immediate
from the preceding three exercises.)
06-30.18 DEF: The rank of a matrix is its column-rank (or, equivalently,
its row-rank).
06-30.19:
Let $A$ be an integral matrix, i.e., $A\in \zzz^{k\times\ell}$.
We can view $A$ as a matrix over any of the fields $\qqq$, $\rrr$, $\ccc$.
This results in three notions of rank which we denote by $\rk_{\qqq}(A)$,
$\rk_{\rrr}(A)$, and $\rk_{\ccc}(A)$, respectively. Prove:
$\rk_{\qqq}(A)\ge \rk_{\rrr}(A) \ge \rk_{\ccc}(A)$.
06-30.20 HW, due Tuesday, July 5 (insensitivity to field extensions):
Continuing the preceding exercise, prove:
$\rk_{\qqq}(A)= \rk_{\rrr}(A) = \rk_{\ccc}(A)$.
Your solution should be elegant, just one or two lines,
based on material covered in Friday's class (July 1).
(10 points)
06-30.21 DO: Let $A$ again be an integral matrix,
$A\in \zzz^{k\times\ell}$. Viewing
the entries of $A$ modulo $p$ ($p$ a prime number), we
can ask the rank of $A$ over the field $\fff_p$.
We denote this rank by $\rk_p(A)$.
(a) Prove: $\rk_p(A)\le \rk_{\rrr}(A)$.
(b) Find a $3\times 3$ (0,1)-matrix $A$
such that $\rk_2(A) < \rk_{\rrr}(A)$.
(c) For every prime $p$, find a (0,1)-matrix $A$
such that $\rk_p(A) < \rk_{\rrr}(A)$.
(Note the strict inequalities; by part (a), $\le$ always holds.)
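The quantity $\rk_p(A)$ can be computed by Gaussian elimination carried out modulo $p$. A sketch (ad hoc names; it assumes $p$ is prime so that Fermat inversion applies), illustrating the notion without giving away the (0,1)-matrices asked for above:

```python
def rank_mod_p(A, p):
    # Rank of an integer matrix over F_p, by row reduction mod p (p prime)
    M = [[x % p for x in row] for row in A]
    rank, rows, cols = 0, len(A), len(A[0])
    for c in range(cols):
        pivot = next((r for r in range(rank, rows) if M[r][c] != 0), None)
        if pivot is None:
            continue
        M[rank], M[pivot] = M[pivot], M[rank]
        inv = pow(M[rank][c], p - 2, p)  # Fermat inverse, valid since p is prime
        M[rank] = [(x * inv) % p for x in M[rank]]
        for r in range(rows):
            if r != rank and M[r][c]:
                f = M[r][c]
                M[r] = [(x - f * y) % p for x, y in zip(M[r], M[rank])]
        rank += 1
    return rank

# All entries even: the matrix vanishes mod 2, though its real rank is 2
print(rank_mod_p([[2, 4], [6, 8]], 2))  # 0
```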
July 14
Homework due the next day unless otherwise stated.
(Hint: Induction on $n$. Let $A=[\vfi]_{\ul{b}}$ for some
ONB of a Hermitian inner product space. Change the basis so
that your first basis vector is an eigenvector of $\vfi$.)
July 12
Note that the July 11 homework set has also been posted (below);
it includes several problems due Wednesday or later this week.
Every symmetric real matrix is orthogonally similar to a diagonal
matrix. In other words, every symmetric real matrix is orthogonally
diagonalizable.
July 11
Note that this homework set includes several problems due later
this week.
(a) $W$ is the direct sum of the $U_i$
(b) for all $u_1\in U_1,\dots,u_k\in U_k$, if none of the $u_i$ is zero then
$u_1,\dots,u_k$ are linearly independent
(c) Every element $w\in W$ can uniquely be represented as
$w=u_1+\dots+u_k$ where $u_i\in U_i$.
Part (c) shows that the direct sum concept generalizes the concept of
linear independence. This connection will be reinforced by Ex. 07-11.11 below.
For this reason, we refer to $W^{\perp}$
as the orthogonal complement of $W$.
July 7
Let us define the inner product with respect to the weight function
$\rho$ on the space $\rrr[t]$ of real polynomials by setting
$\langle f,g\rangle = \int_{-\infty}^{\infty} \rho(t)f(t)g(t)dt$
for $f,g\in\rrr[t]$.
Prove that (A) under the assumptions (a1),(a2),(b) this integral converges
for all pairs $(f,g)$ of real polynomials, and (B) this integral defines
a positive definite symmetric bilinear inner product, and thereby it
turns $\rrr[t]$ into a Euclidean space.
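Worked example with a concrete (hypothetical) choice of weight, not part of the exercise: for the Gaussian weight $\rho(t)=e^{-t^2}$ one gets
$\langle 1,1\rangle=\int_{-\infty}^{\infty}e^{-t^2}\,dt=\sqrt{\pi}$,
$\langle 1,t\rangle=\int_{-\infty}^{\infty}t\,e^{-t^2}\,dt=0$ (odd integrand), and
$\langle t,t\rangle=\int_{-\infty}^{\infty}t^2e^{-t^2}\,dt=\sqrt{\pi}/2$,
so the polynomials $1$ and $t$ are already orthogonal in this inner product.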
The principal ideal generated by the polynomial $f$,
denoted $(f)$, is the set of multiples of $f$, i.e.,
$(f)=\{fg\mid g\in\fff[t]\}$.
Let $f_1,\dots,f_k\in\fff[t]$ be polynomials. Then there exist
polynomials $u_1,\dots,u_k\in\fff[t]$ such that the polynomial
$h=f_1u_1+\dots+f_ku_k$ is a greatest common divisor of the $f_i$.
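The coefficients $u_i$ come from the extended Euclidean algorithm; the plain Euclidean algorithm for the gcd itself can be sketched as follows (ad hoc names; polynomials as coefficient lists, highest degree first, over $\qqq$ via exact fractions):

```python
from fractions import Fraction

def polymod(f, g):
    # Remainder of f divided by g; coefficient lists, highest degree first
    f = [Fraction(c) for c in f]
    g = [Fraction(c) for c in g]
    while len(f) >= len(g):
        q = f[0] / g[0]
        for i in range(len(g)):
            f[i] -= q * g[i]
        f.pop(0)                  # leading coefficient is now zero
        while f and f[0] == 0:    # strip any further leading zeros
            f.pop(0)
    return f

def polygcd(f, g):
    # Euclidean algorithm; result normalized to be monic
    while g:
        f, g = g, polymod(f, g)
    return [c / f[0] for c in f]

# gcd of (t-1)(t-2) and (t-1)(t-3) is t-1
print(polygcd([1, -3, 2], [1, -4, 3]))  # [Fraction(1, 1), Fraction(-1, 1)]
```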
$[v]_{\ul{b}}$ = \(
\begin{bmatrix}
\alpha_1 \\
\alpha_2 \\
\vdots \\
\alpha_n
\end{bmatrix}
\).
$[\vfi]_{\ul{e},\ul{f}} = A$, indicating that this matrix,
associated with the linear map $\vfi$, depends on the
choice of the bases $\ul{e}$ and $\ul{f}$.