Next: About this document ... Up: Discrete Math, Second Problem Previous: Number Theory

Linear Algebra

Let $ F$ be a field. Note that the ring $ F[x]$ of univariate polynomials over $ F$ is an infinite-dimensional vector space over $ F$ with a nice (standard) basis: $ \{1,x,x^2,x^3,x^4, \dots\}$.

Example 2.1   Let $ F_n[x]$ denote the set of polynomials of degree $ \le n$. This is an $ (n+1)$-dimensional subspace of $ F[x]$. We calculate the matrix representing the map $ \frac{d}{dx}:F_n[x]\to F_n[x]$ with respect to the standard basis $ {\mathbf{B}}:=\{1,x,\dots,x^n\}$.

\begin{displaymath}
\left[\frac{d}{dx}\right]_{{\mathbf{B}}}=
\left[\begin{array}{cccccc}
0 & 1 & 0 & 0 & \ldots & 0 \\
0 & 0 & 2 & 0 & \ldots & 0 \\
\vdots & & & \ddots & & \vdots \\
0 & 0 & 0 & 0 & \ldots & n \\
0 & 0 & 0 & 0 & \ldots & 0 \\
\end{array}\right]
\end{displaymath}
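To make the example concrete, here is a short computational sketch (not part of the original notes; `diff_matrix` and `apply` are ad-hoc helper names). It builds the $(n+1)\times(n+1)$ matrix of $\frac{d}{dx}$ in the basis $\{1,x,\dots,x^n\}$ and applies it to a coefficient vector.

```python
# Sketch: the matrix of d/dx on F_n[x] in the basis {1, x, ..., x^n}.
# A polynomial a_0 + a_1 x + ... + a_n x^n is identified with the
# coefficient column (a_0, ..., a_n); since d/dx(x^k) = k x^{k-1},
# the entry k goes in row k-1, column k.

def diff_matrix(n):
    """(n+1) x (n+1) matrix of d/dx with respect to {1, x, ..., x^n}."""
    size = n + 1
    M = [[0] * size for _ in range(size)]
    for k in range(1, size):
        M[k - 1][k] = k   # basis vector x^k maps to k * x^(k-1)
    return M

def apply(M, v):
    """Matrix-vector product over plain Python lists."""
    return [sum(row[j] * v[j] for j in range(len(v))) for row in M]

# d/dx (1 + 2x + 3x^2) = 2 + 6x, i.e. coefficients [2, 6, 0]
print(apply(diff_matrix(2), [1, 2, 3]))
```

The only nonzero entries sit on the superdiagonal, matching the displayed matrix above.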

Example 2.2   Consider the rotation of the Euclidean plane $ {\mathbb{E}}^2$ by angle $ \alpha$ about the origin. We denote this linear map by $ R_\alpha:\mathbb{E}^2\to\mathbb{E}^2$. Choose as the basis a pair of perpendicular unit vectors $ {\mathbf{e}}$ and $ {\mathbf{f}}$. Then we have that $ R_\alpha({\mathbf{e}})=\cos(\alpha){\mathbf{e}}+\sin(\alpha){\mathbf{f}}$, and $ R_\alpha({\mathbf{f}})=-\sin(\alpha){\mathbf{e}}+\cos(\alpha){\mathbf{f}}$. Therefore the matrix for $ R_\alpha$ in the basis $ {\mathbf{e}},{\mathbf{f}}$ is

\begin{displaymath}
[R_\alpha]_{{\mathbf{e}},{\mathbf{f}}}=
\left[\begin{array}{rr}
\cos(\alpha)&-\sin(\alpha)\\
\sin(\alpha)&\cos(\alpha)\\
\end{array}\right].
\end{displaymath}
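A quick numerical sanity check (an illustrative sketch, not part of the notes; `rotation` and `matmul` are ad-hoc helpers): composing two rotations should give the rotation by the sum of the angles, $R_\alpha R_\beta = R_{\alpha+\beta}$, which follows from the angle-addition formulas.

```python
import math

def rotation(alpha):
    """Matrix of R_alpha in the basis e, f (column-vector convention)."""
    return [[math.cos(alpha), -math.sin(alpha)],
            [math.sin(alpha),  math.cos(alpha)]]

def matmul(A, B):
    """Product of two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

a, b = 0.3, 1.1
lhs = matmul(rotation(a), rotation(b))   # rotate by b, then by a
rhs = rotation(a + b)                    # rotate once by a + b
assert all(math.isclose(lhs[i][j], rhs[i][j], abs_tol=1e-12)
           for i in range(2) for j in range(2))
```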

Example 2.3   Consider the reflection of the Euclidean plane $ {\mathbb{E}}^2$ through the line which makes an angle $ \alpha$ with the horizontal (the direction of $ {\mathbf{e}}$) and passes through the origin. We denote this linear map by $ F_\alpha:\mathbb{E}^2\to\mathbb{E}^2$. Then we have $ F_\alpha({\mathbf{e}})=\cos(2\alpha){\mathbf{e}}+\sin(2\alpha){\mathbf{f}}$, and $ F_\alpha({\mathbf{f}})=\sin(2\alpha){\mathbf{e}}-\cos(2\alpha){\mathbf{f}}$. Therefore the matrix for $ F_\alpha$ in the basis $ {\mathbf{e}},{\mathbf{f}}$ is

\begin{displaymath}
[F_\alpha]_{{\mathbf{e}},{\mathbf{f}}}=
\left[\begin{array}{rr}
\cos(2\alpha)&\sin(2\alpha)\\
\sin(2\alpha)&-\cos(2\alpha)\\
\end{array}\right].
\end{displaymath}
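Reflecting twice returns every vector to itself, so $F_\alpha\circ F_\alpha$ should be the identity; equivalently, the matrix above squares to $I$ (indeed $\cos^2(2\alpha)+\sin^2(2\alpha)=1$ and the off-diagonal terms cancel). A small check (sketch only, not part of the notes; helper names are ad hoc):

```python
import math

def reflection(alpha):
    """Matrix of F_alpha in the basis e, f."""
    c, s = math.cos(2 * alpha), math.sin(2 * alpha)
    return [[c, s],
            [s, -c]]

def matmul(A, B):
    """Product of two 2x2 matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Reflecting twice is the identity: F_alpha o F_alpha = id.
M = matmul(reflection(0.7), reflection(0.7))
assert all(math.isclose(M[i][j], float(i == j), abs_tol=1e-12)
           for i in range(2) for j in range(2))
```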

Example 2.4   If in the previous example, we had chosen the basis $ {\mathbf{u}}$ and $ {\mathbf{v}}$, where $ {\mathbf{u}}$ is a vector along the line of reflection, and $ {\mathbf{v}}$ is perpendicular to it, then the matrix for $ F_\alpha$ would be:

\begin{displaymath}
[F_\alpha]_{{\mathbf{u}},{\mathbf{v}}}=
\left[\begin{array}{rr}
1 & 0 \\
0 & -1 \\
\end{array}\right].
\end{displaymath}
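The two matrices for the same reflection are related by conjugation: since ${\mathbf{u}},{\mathbf{v}}$ are ${\mathbf{e}},{\mathbf{f}}$ rotated by $\alpha$, one expects $[F_\alpha]_{{\mathbf{e}},{\mathbf{f}}}=R_\alpha\,\mathrm{diag}(1,-1)\,R_\alpha^{-1}$, anticipating Corollary 2.15. A numerical check of this claim (an illustrative sketch, not part of the notes; helper names are ad hoc):

```python
import math

def matmul(A, B):
    """Product of two 2x2 matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

alpha = 0.5
c, s = math.cos(alpha), math.sin(alpha)
R = [[c, -s], [s, c]]        # rotation by alpha: sends e, f to u, v
R_inv = [[c, s], [-s, c]]    # rotation by -alpha
D = [[1, 0], [0, -1]]        # [F_alpha] in the basis u, v

conj = matmul(matmul(R, D), R_inv)   # R_alpha D R_alpha^{-1}

c2, s2 = math.cos(2 * alpha), math.sin(2 * alpha)
F = [[c2, s2], [s2, -c2]]    # [F_alpha] in the basis e, f

assert all(math.isclose(conj[i][j], F[i][j], abs_tol=1e-12)
           for i in range(2) for j in range(2))
```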

Definition 2.5   Given a square matrix $ A$, define $ \mathop{\rm tr}\nolimits (A)$, the trace of $ A$, to be the sum of the diagonal entries of $ A$.

Remark 2.6   The preceding examples, in which the same reflection is represented by different matrices in different bases, are special cases of a general principle: the matrix representing a linear map depends on the choice of basis. The following definitions make this precise.

Definition 2.7   Let $ \{{\mathbf{e_1}},\dots,{\mathbf{e_n}}\}$, and $ \{{\mathbf{f_1}},\dots,{\mathbf{f_n}}\}$ be two bases for a vector space $ V$. The basis change transformation $ S:V\to V$ is the unique (invertible) linear map defined by $ S:{\mathbf{e}}_i\mapsto {\mathbf{f}}_i$.

Example 2.8   Let $ {\mathbf{x}}$ be a vector in $ V$. Then $ {\mathbf{x}}=\sum \alpha_i {\mathbf{e}}_i=\sum \beta_i{\mathbf{f}}_i$. Notice that

$\displaystyle S{\mathbf{x}}=S\sum \alpha_i {\mathbf{e}}_i=\sum \alpha_i S{\mathbf{e}}_i=\sum \alpha_i {\mathbf{f}}_i.
$

Therefore if

\begin{displaymath}
[{\mathbf{x}}]_{{\mathbf{e}}}=
\left[\begin{array}{c}
\alpha_1 \\
\vdots \\
\alpha_n
\end{array}\right]
\end{displaymath}

then $ [S{\mathbf{x}}]_{{\mathbf{f}}}=[{\mathbf{x}}]_{\mathbf{e}}$. Also, $ [S{\mathbf{x}}]_{{\mathbf{f}}}=[S]_{{\mathbf{f}}}[{\mathbf{x}}]_{{\mathbf{f}}}$ by the following exercise. Thus $ [{\mathbf{x}}]_{{\mathbf{f}}}=[S^{-1}]_{{\mathbf{f}}}[{\mathbf{x}}]_{{\mathbf{e}}}$.

Exercise 2.9   Let $ \{{\mathbf{e_1}},\dots,{\mathbf{e_n}}\}$ be a basis for $ V$, and let $ \{{\mathbf{f_1}},\dots,{\mathbf{f_n}}\}$ be a basis for $ W$. Let $ {\mathbf{v}}\in V$, and let $ A:V\to W$. Show that $ [A]_{{\mathbf{e}},{\mathbf{f}}}[{\mathbf{v}}]_{\mathbf{e}}=[A{\mathbf{v}}]_{{\mathbf{f}}}$.

Exercise 2.10   Let $ V,W,Z$ be vector spaces over $ F$. Let $ \{{\mathbf{e_1}},\dots,{\mathbf{e_n}}\}$ be a basis for $ V$, $ \{{\mathbf{f_1}},\dots,{\mathbf{f_n}}\}$ be a basis for $ W$, $ \{{\mathbf{g_1}},\dots,{\mathbf{g_n}}\}$ be a basis for $ Z$. Let $ A:V\to W$, $ B:W\to Z$. Show that $ [BA]_{{\mathbf{e}},{\mathbf{g}}}=[B]_{{\mathbf{f}},{\mathbf{g}}}[A]_{{\mathbf{e}},{\mathbf{f}}}$.
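For the standard bases of $F^n$, a linear map is given by its matrix, and Exercise 2.10 reduces to the statement that applying $A$ and then $B$ to a vector agrees with multiplying by the single product matrix $BA$. A concrete numerical instance (a sketch, not part of the notes; `matmul` and `matvec` are ad-hoc helpers):

```python
from fractions import Fraction

def matmul(A, B):
    """Product of two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matvec(A, v):
    """Matrix-vector product."""
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

A = [[Fraction(1), Fraction(2)], [Fraction(3), Fraction(4)]]  # a map V -> W
B = [[Fraction(0), Fraction(1)], [Fraction(1), Fraction(1)]]  # a map W -> Z
x = [Fraction(5), Fraction(-2)]

# Applying A then B agrees with the single matrix [B][A].
assert matvec(B, matvec(A, x)) == matvec(matmul(B, A), x)
```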

Exercise 2.11   If $ B$, $ C$ are $ n\times k$ matrices such that $ B{\mathbf{x}}=C{\mathbf{x}}$ for all $ {\mathbf{x}}\in F^k$, then $ B=C$.

Remark 2.12   Let $ {\mathbf{e}}$, $ {\mathbf{e'}}$ be two bases for $ V$ with change of basis map $ S: {\mathbf{e}}_i\mapsto {\mathbf{e}}_i'$. Let $ {\mathbf{f}}$, $ {\mathbf{f'}}$ be two bases for $ W$ with change of basis map $ T: {\mathbf{f}}_i\mapsto {\mathbf{f}}_i'$. Let $ A:V\to W$. We want to compare $ [A]_{{\mathbf{e}}{\mathbf{f}}}$ with $ [A]_{{\mathbf{e'}}{\mathbf{f'}}}$. Let $ {\mathbf{x}}\in V$. We have the following identities: $ [A{\mathbf{x}}]_{{\mathbf{f'}}}=[T^{-1}]_{{\mathbf{f'}}{\mathbf{f'}}}[A{\mathbf{x}}]_{{\mathbf{f}}}=[T^{-1}]_{{\mathbf{f'}}{\mathbf{f'}}}[A]_{{\mathbf{e}}{\mathbf{f}}}[{\mathbf{x}}]_{{\mathbf{e}}}$, and $ [A{\mathbf{x}}]_{{\mathbf{f'}}}=[A]_{{\mathbf{e'}}{\mathbf{f'}}}[{\mathbf{x}}]_{{\mathbf{e'}}}=[A]_{{\mathbf{e'}}{\mathbf{f'}}}[S^{-1}]_{{\mathbf{e'}}{\mathbf{e'}}}[{\mathbf{x}}]_{{\mathbf{e}}}$. Therefore for all $ {\mathbf{x}}\in V$,

$\displaystyle ([T^{-1}]_{{\mathbf{f'}}{\mathbf{f'}}}[A]_{{\mathbf{e}}{\mathbf{f}}})[{\mathbf{x}}]_{{\mathbf{e}}}=([A]_{{\mathbf{e'}}{\mathbf{f'}}}[S^{-1}]_{{\mathbf{e'}}{\mathbf{e'}}})[{\mathbf{x}}]_{{\mathbf{e}}}.
$

However, by the previous exercise, this is only possible if

$\displaystyle [T^{-1}]_{{\mathbf{f'}}{\mathbf{f'}}}[A]_{{\mathbf{e}}{\mathbf{f}}}=[A]_{{\mathbf{e'}}{\mathbf{f'}}}[S^{-1}]_{{\mathbf{e'}}{\mathbf{e'}}}.
$

Remark 2.13   From the previous remark, we deduce that

$\displaystyle [A]_{{\mathbf{e}}{\mathbf{f}}}=[T]_{{\mathbf{f'}}{\mathbf{f'}}}[A]_{{\mathbf{e'}}{\mathbf{f'}}}[S^{-1}]_{{\mathbf{e'}}{\mathbf{e'}}}=[TAS^{-1}]_{{\mathbf{e'}}{\mathbf{f'}}}.
$

Corollary 2.14   Let $ S=A=T$ (so $ V=W$; take both old bases equal to $ {\mathbf{e}}$ and both new bases equal to $ {\mathbf{f}}$, where $ {\mathbf{f}}_i=S{\mathbf{e}}_i$). Then

$\displaystyle [S]_{{\mathbf{e}}{\mathbf{e}}}=[TAS^{-1}]_{{\mathbf{f}}{\mathbf{f}}}=[S]_{{\mathbf{f}}{\mathbf{f}}}.
$

That is, since $ TAS^{-1}=SSS^{-1}=S$, the change of basis map has the same matrix with respect to either basis.

Corollary 2.15   If $ V=W$ and $ {\mathbf{e}}={\mathbf{f}}$, $ {\mathbf{e'}}={\mathbf{f'}}$ (so that $ S=T$), then $ A_{\text{new}}=S^{-1}A_{\text{old}}S$.

Definition 2.16   Two $ n\times n$ matrices $ A$, $ B$ are similar if there exists some invertible $ S$ such that $ B=S^{-1}AS$. We write $ A\sim B$.

Definition 2.17   The characteristic polynomial $ f_A$ of an $ n\times n$ matrix $ A$ is the determinant $ f_A(x)=\det(xI-A)$, where $ I$ is the $ n\times n$ identity matrix.

Exercise 2.18   If $ A$ and $ B$ are $ n\times n$ matrices, then $ \det(AB)=\det(A)\det(B)$.

Exercise 2.19   If $ A\sim B$ then $ \det(A)=\det(B)$.

Exercise 2.20   If $ A\sim B$ then $ f_A=f_B$.
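Exercises 2.19 and 2.20 can be checked on a concrete conjugate pair. For a $2\times 2$ matrix the characteristic polynomial is $x^2-\mathop{\rm tr}\nolimits(A)x+\det(A)$, so it suffices to compare trace and determinant. A sketch with exact arithmetic (not part of the notes; helper names are ad hoc):

```python
from fractions import Fraction

def matmul(A, B):
    """Product of two 2x2 matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def det2(A):
    """Determinant of a 2x2 matrix."""
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def charpoly2(A):
    """Coefficients (1, -tr A, det A) of f_A(x) = x^2 - tr(A) x + det(A)."""
    return (1, -(A[0][0] + A[1][1]), det2(A))

A = [[Fraction(2), Fraction(1)], [Fraction(0), Fraction(3)]]
S = [[Fraction(1), Fraction(1)], [Fraction(1), Fraction(2)]]    # det S = 1
S_inv = [[Fraction(2), Fraction(-1)], [Fraction(-1), Fraction(1)]]

B = matmul(matmul(S_inv, A), S)   # B = S^{-1} A S, so A ~ B
assert det2(A) == det2(B)         # Exercise 2.19
assert charpoly2(A) == charpoly2(B)   # Exercise 2.20
```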

Definition 2.21   If $ A$ is a matrix, the roots of $ f_A$ are called the eigenvalues of $ A$.

Definition 2.22   If $ T$ is a linear transformation, then the characteristic polynomial of $ T$ is the characteristic polynomial of a matrix for $ T$ in some basis. The last exercise shows that this is well-defined.

Definition 2.23   If $ T$ is a linear transformation, then the eigenvalues of $ T$ are the roots of the characteristic polynomial.

Remark 2.24   If $ A$ is an upper triangular matrix, then the eigenvalues of $ A$ are the diagonal entries of $ A$, taken with multiplicity.
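Remark 2.24 holds because $xI-A$ is again upper triangular, so $\det(xI-A)$ is the product of its diagonal entries $\prod_i (x-a_{ii})$. A quick check on a $3\times 3$ example, evaluating $f_A$ at sample points with exact arithmetic (a sketch, not part of the notes; `det` and `f_A` are ad-hoc helpers):

```python
from fractions import Fraction

def det(M):
    """Determinant by Laplace expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = Fraction(0)
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

A = [[Fraction(2), Fraction(5), Fraction(-1)],
     [Fraction(0), Fraction(3), Fraction(7)],
     [Fraction(0), Fraction(0), Fraction(2)]]  # upper triangular; diagonal 2, 3, 2

def f_A(x):
    """Evaluate the characteristic polynomial det(xI - A) at x."""
    n = len(A)
    xIA = [[(x if i == j else Fraction(0)) - A[i][j] for j in range(n)]
           for i in range(n)]
    return det(xIA)

# f_A(x) = (x - 2)(x - 3)(x - 2): eigenvalues 2 (with multiplicity 2) and 3,
# independent of the entries above the diagonal.
for x in map(Fraction, range(-3, 4)):
    assert f_A(x) == (x - 2) * (x - 3) * (x - 2)
```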

Exercise 2.25   Find two $ 2\times 2$ matrices, $ A$ and $ B$ such that $ f_A=f_B$ but $ A$ and $ B$ are not similar.

Exercise 2.26   If $ A$ is an $ n\times n$ matrix over a field $ F$, and $ f_A$ has $ n$ distinct roots in $ F$, then $ A$ is ``diagonalizable,'' i.e., $ A$ is similar to a diagonal matrix. (What are the entries of this diagonal matrix?)
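A worked instance of Exercise 2.26 (a sketch, not part of the notes): for $A=\left[\begin{smallmatrix}2&1\\1&2\end{smallmatrix}\right]$ we get $f_A(x)=x^2-4x+3=(x-1)(x-3)$, with distinct roots $1$ and $3$. Taking $S$ whose columns are the corresponding eigenvectors $(1,-1)$ and $(1,1)$, the conjugate $S^{-1}AS$ is diagonal, with the eigenvalues as its diagonal entries.

```python
from fractions import Fraction

def matmul(A, B):
    """Product of two 2x2 matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[Fraction(2), Fraction(1)], [Fraction(1), Fraction(2)]]
# f_A(x) = x^2 - 4x + 3 = (x - 1)(x - 3): distinct roots 1 and 3.
# Eigenvectors (1, -1) for eigenvalue 1 and (1, 1) for eigenvalue 3
# form the columns of S; det S = 2, so S is invertible.
S = [[Fraction(1), Fraction(1)], [Fraction(-1), Fraction(1)]]
S_inv = [[Fraction(1, 2), Fraction(-1, 2)], [Fraction(1, 2), Fraction(1, 2)]]

D = matmul(matmul(S_inv, A), S)   # S^{-1} A S
assert D == [[Fraction(1), Fraction(0)], [Fraction(0), Fraction(3)]]
```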


Varsha Dani 2003-07-25