Introduction
Here's a selection of problems in linear algebra. In a few cases you might be able to use the result of one problem to help with another. The questions are not always worded in the most helpful fashion, and the ordering within each section is fairly random, although the sections themselves are roughly in order of difficulty. For the questions in the "serious" group, I recommend that you cheat. Collaboration is encouraged.
None of the problems require tensor products. You may need to develop your understanding of companion matrices in at least one case. If the field is not specified, it can be any field: finite, $p$-adic, rational functions,...
No hints, no solutions.
Basics
- If $A$ and $B$ are matrices of the same order, what algorithm can we use to decide whether they have the same row space?
- Show that either the system of equations $Ax=b$ has a solution over $\fld$, or there is a vector $y$ such that $y^TA=0$ and $y^Tb\ne0$.
- Let $M$ be a square matrix with minimal polynomial $\psi$. Assume $\psi(t)$ splits over the underlying field. Prove that $M$ is diagonalizable if and only if $\psi$ and its derivative $\psi'$ are coprime.
- Prove Cauchy-Schwarz for complex vectors.
- Prove that a complex matrix $A$ is normal if and only if \[ \ip{Az}{Az}=\ip{A^*z}{A^*z} \] for all vectors $z$.
- Without using the "fundamental theorem of algebra", prove that any real symmetric matrix has an eigenvalue.
- Suppose $M$ is a real matrix with linearly independent columns and \[ A = \pmat{M^TM & M^T}. \] Since $M^TM$ is positive definite, we can use elementary row operations (and no pivoting) to bring $M^TM$ to triangular form. If the matrix representing this product of row operations is $H$, prove that the columns of $MH^T$ are an orthonormal basis for the column space of $M$. [What is the name of this algorithm? Warning: your first guess will probably be very close.] (A numerical sketch of the construction appears after this list.)
- If $A$ is square, prove that $A$ and $A^T$ are similar.
- Let $\cM$ denote the vector space of $m\times n$ matrices. If $L$ is a linear map from $\cM$ to itself, prove that there are matrices $A_i,B_i$ for $i=1,\ldots,r$ such that \[ L(X) = \sum_{i=1}^r A_iXB_i^T. \] What is the minimum value of $r$ (that works for all matrices $X$ in $\cM$)?
- If $A$ is a symmetric matrix of rank $k$, prove that $A$ has an invertible $k\times k$ principal submatrix.
- Show that a matrix has rank at most $r$ if and only if it can be written as the sum of $r$ rank-one matrices.
- Show that a symmetric matrix $M$ can be written as the sum of symmetric matrices with rank one and symmetric matrices of rank two with zero diagonal, such that the ranks of the summands add up to $\rk(M)$.
- Let $V$ be a vector space of dimension $2d$ over some finite field $\fld$. Let $U_0$ and $U_\infty$ be subspaces of $V$ with dimension $d$ such that $U_0\cap U_\infty=0$. Prove that the number of $d$-dimensional subspaces of $V$ having trivial intersection with $U_0$ and $U_\infty$ is equal to the number of invertible $d\times d$ matrices over $\fld$.
- Let $S$ be a set of $d\times d$ matrices over the field $\fld$ of order $q$, such that the difference of any two distinct matrices from $S$ is invertible. Prove that $|S|\le q^d$.
- If $A$ and $B$ are matrices such that both products $AB$ and $BA$ are defined, prove that $\det(I-AB)=\det(I-BA)$. Deduce from this that if $A$ and $B$ are real, then $AB$ and $BA$ have the same non-zero eigenvalues, with the same multiplicities.
- Prove that the eigenvalues of an indecomposable symmetric tridiagonal matrix $M$ over $\re$ are simple. (A tridiagonal matrix $M$ is indecomposable if $M_{i+1,i}M_{i,i+1}\ne0$ for all $i$.)
- We work over $\rats$. Show that either the equation $Ax=b$ has an integer solution, or there is a vector $y$ such that the entries of $y^TA$ are integers, but $y^Tb$ is not an integer.
- If $A$ and $B$ are positive semidefinite, so is their Schur product $A\circ B$. One proof of this uses the facts that $A\circ B$ is a principal submatrix of $A\otimes B$, and that $A\otimes B$ is itself positive semidefinite. Find a second proof.
- If $A$ and $B$ are $n\times n$ real matrices that are similar in $GL(n,\cx)$, prove that they are similar in $GL(n,\re)$.
- Prove that a matrix $A$ is normal if and only if $A^*$ is a polynomial in $A$.
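The construction in the $M^TM$ problem above can be tried out numerically. Here is a minimal numpy sketch, not a proof; it assumes the row operations scale each pivot row by the reciprocal square root of the pivot before eliminating below it, and the matrix $M$ is an arbitrary example, not from the problem.

```python
# Minimal numerical check of the M^T M construction above (numpy only).
# Assumption: the row operations scale each pivot row by 1/sqrt(pivot)
# before eliminating the entries below it (no pivoting).
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((6, 3))           # arbitrary M, independent columns
T, H = M.T @ M, np.eye(3)                 # T starts as the Gram matrix M^T M

for j in range(3):
    s = 1 / np.sqrt(T[j, j])              # scale the pivot row
    T[j], H[j] = s * T[j], s * H[j]
    for i in range(j + 1, 3):
        m = T[i, j] / T[j, j]             # eliminate below the pivot
        T[i], H[i] = T[i] - m * T[j], H[i] - m * H[j]

# T = H (M^T M) is now upper triangular, and H records the row operations.
Q = M @ H.T
print(np.allclose(Q.T @ Q, np.eye(3)))    # True: columns of M H^T orthonormal
```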
Fun
- If $M$ and $N$ are square matrices of the same order, $[M,N]:=MN-NM$. (This is the Lie bracket.) If $A$, $B$, $C$ are real $2\times2$ matrices, prove that $[[A,B]^2,C]=0$. (A numerical check appears after this list.)
- Let $A$ be a symmetric matrix over $\ints_2$, let $D$ be the diagonal matrix $I\circ A$ and let $d=D\one$. Show that $d$ lies in the column space of $A$.
- Let $A$ and $B$ be symmetric real idempotent matrices. Show that if $A-B$ is positive semidefinite then $BA=AB=B$.
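The bracket identity in the first problem above is easy to test numerically before trying to prove it. A quick numpy check on random real $2\times2$ matrices; an illustration, not a proof:

```python
# Numerical check of [[A,B]^2, C] = 0 for random real 2x2 matrices.
import numpy as np

def bracket(M, N):
    """Lie bracket [M, N] = MN - NM."""
    return M @ N - N @ M

rng = np.random.default_rng(1)
for _ in range(1000):
    A, B, C = (rng.standard_normal((2, 2)) for _ in range(3))
    K = bracket(A, B)
    assert np.allclose(bracket(K @ K, C), 0, atol=1e-8)
print("[[A,B]^2, C] = 0 held on all 1000 samples")
```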
Graph Theory
- Let $A$ be the adjacency matrix for the graph $X$. Starting from the identity \[ tI+A-J = (tI+A)\left(I-(tI+A)^{-1}J\right), \] derive an expression for the characteristic polynomial of $J-A$ in terms of the characteristic polynomial of $A$ and the generating function for all walks in $X$. (A numerical check of the identity appears after this list.)
- If $X$ and $Y$ are cospectral graphs with cospectral complements, prove that there is an orthogonal matrix $L$ such that $LJ=JL$ and \[ L^TA(X)L=A(Y), \qquad L^TA(\comp{X})L =A(\comp{Y}). \]
- If $X$ is a graph on $n$ vertices and $u\in V(X)$, the walk matrix $W_u$ relative to $u$ is the square matrix with columns \[ e_u, Ae_u,\ldots,A^{n-1}e_u. \] Prove that vertices $a$ and $b$ are cospectral if and only if $W_a^TW_a=W_b^TW_b$. (A small computational sketch appears after this list.)
- A vertex $u$ in $X$ is controllable if $W_u$ is invertible. Suppose $a$ and $b$ are vertices in $X$ and $a$ is controllable. Show that $a$ and $b$ are cospectral if and only if $W_bW_a^{-1}$ is orthogonal, commutes with $A$ and maps $e_a$ to $e_b$.
- If $a$ and $b$ are cospectral vertices, $a$ is controllable, and $Q=W_bW_a^{-1}$, prove that $Q$ is a polynomial in $A$ and that $Q^2=I$.
- If $C_1$ and $C_2$ are cliques of the same size in the strongly regular graph $X$, prove that the graphs $X\diff C_1$ and $X\diff C_2$ are cospectral with cospectral complements. [We're deleting vertices here, not edges.]
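The identity in the first problem of this section, and the rank-one determinant step that follows from it, can be checked numerically. A small numpy sketch on the 5-cycle; the graph and the value of $t$ are arbitrary choices, not from the problem:

```python
# Numerical check of tI + A - J = (tI + A)(I - (tI + A)^{-1} J), on C_5.
import numpy as np

n, t = 5, 3.7
A = np.zeros((n, n))
for i in range(n):                        # adjacency matrix of the cycle C_5
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
J, I = np.ones((n, n)), np.eye(n)

R = np.linalg.inv(t * I + A)
print(np.allclose(t * I + A - J, (t * I + A) @ (I - R @ J)))    # True

# J = 11^T has rank one, so det(I - RJ) = 1 - 1^T R 1; this is where the
# generating function for walks can enter the derivation.
one = np.ones(n)
print(np.isclose(np.linalg.det(I - R @ J), 1 - one @ R @ one))  # True
```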
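The walk matrix and the cospectrality criterion can likewise be tried on a small example. A sketch using the path $P_4$, whose two end-vertices are cospectral since an automorphism exchanges them; again an illustration, not a proof:

```python
# Walk matrices on the path P_4: 0 - 1 - 2 - 3 (numpy only).
import numpy as np

def walk_matrix(A, u):
    """Matrix with columns e_u, A e_u, ..., A^{n-1} e_u."""
    n = A.shape[0]
    cols, v = [], np.eye(n)[:, u]
    for _ in range(n):
        cols.append(v)
        v = A @ v
    return np.column_stack(cols)

A = np.zeros((4, 4))
for i in range(3):                         # adjacency matrix of P_4
    A[i, i + 1] = A[i + 1, i] = 1

W0, W3 = walk_matrix(A, 0), walk_matrix(A, 3)
print(np.allclose(W0.T @ W0, W3.T @ W3))   # True: 0 and 3 are cospectral
```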
Serious
- Prove that any square real matrix can be written as the product of two symmetric matrices.
- If $A$ and $B$ are $n\times n$ matrices over a field, how do we decide if $A$ and $B$ are similar?
- If $A$ is a square matrix over a field $\fld$ that contains all eigenvalues of $A$, prove that $A=S+N$ where $S$ is diagonalizable, $N$ is nilpotent, and $S$ and $N$ are polynomials in $A$.
- Prove that $B$ commutes with each matrix that commutes with $A$ if and only if $B$ is a polynomial in $A$.
- Let $\seq{P}1k$ be $n\times n$ positive semidefinite matrices such that $\rk(P_i)=r_i$ and $\sum_i r_i =n$. If $P_1+\cdots+P_k=I$, prove that $\seq{P}1k$ are pairwise orthogonal idempotents.