
2.2 Exponential of matrices

The IVP in one-dimensional space

\begin{displaymath}x' = ax, \quad x(0) = x_{0}, \quad x_{0} \in \mathbb{R},
\end{displaymath}

has a unique solution $x(t) = x_{0}e^{at}$, $t \geq 0$. It is natural to ask whether we can define the ``exponential of a matrix'' so that the IVP in $\mathbb{R}^{n}$

\begin{displaymath}x' = Ax, \quad x(0) = x_{0}, \quad x_{0} \in \mathbb{R}^{n}
\end{displaymath} (2.3)

has a unique solution $x(t) = e^{At}x_{0}$, $t \geq 0$. Before answering this question, we need to define the norm of a linear operator and the exponential of matrices. Let $T : \mathbb{R}^{n} \to \mathbb{R}^{n}$ be a linear operator. We define the norm of $T$ by

\begin{displaymath}\| T \| = \max_{|x| \leq 1} |T(x)| \, ,
\end{displaymath}

where $| \cdot |$ denotes the Euclidean norm. Let the linear operator $T$ on $\mathbb{R}^{n}$ be represented by the $n \times n$ matrix $A$ with respect to a basis for $\mathbb{R}^{n}$. Then the norm of $A$ is defined by

\begin{displaymath}\| A \| = \| T \| \, .
\end{displaymath}
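For a concrete example, take $A = \left[ \begin{array}{rr} 2 & 0 \\ 0 & -3 \end{array} \right]$. Then for $|x| \leq 1$,

\begin{displaymath}|Ax|^{2} = 4x_{1}^{2} + 9x_{2}^{2} \leq 9(x_{1}^{2} + x_{2}^{2}) \leq 9 \, ,
\end{displaymath}

with equality at $x = (0,1)$, so $\| A \| = 3$. In general, the norm of a diagonal matrix is the largest absolute value of its diagonal entries.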

Let $L(\mathbb{R}^{n})$ be the space of all linear operators on $\mathbb{R}^{n}$. Then for $S, T \in L(\mathbb{R}^{n})$ and $x \in \mathbb{R}^{n}$:

(1)
$\| T \| \geq 0$, and $\| T \| = 0$ iff $T = 0$;

(2)
$\| kT \| = |k| \, \| T \|$ for all $k \in \mathbb{R}$;

(3)
$\| S + T \| \leq \| S \| + \| T \|$;

(4)
$|T(x)| \leq \| T \| \, |x|$ for all $x \in \mathbb{R}^{n}$;

(5)
$\| TS \| \leq \| T \| \, \| S \|$;

(6)
$\| T^{k} \| \leq \| T \|^{k}$, $k = 0, 1, 2, \ldots$ .


Proof: (omitted)


Lemma 2.2.1:     Let A be an $n \times n$ matrix and $r \in \mathbb{R}_{+}$. Then the series

\begin{displaymath}\sum_{k=0}^{\infty} \frac{A^{k}t^{k}}{k!}
\end{displaymath}

is absolutely and uniformly convergent for $|t| \leq r$.


Proof: (omitted)
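The key estimate behind the lemma comes from property (6) above: for $|t| \leq r$,

\begin{displaymath}\left\| \frac{A^{k}t^{k}}{k!} \right\| \leq \frac{\| A \|^{k} r^{k}}{k!} \, ,
\end{displaymath}

and $\sum_{k=0}^{\infty} \| A \|^{k} r^{k}/k! = e^{\| A \| r} < \infty$, so the series converges absolutely and uniformly by the Weierstrass M-test.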


Definition 2.2.1:     Let A be an $n \times n$ matrix. Then for all $t \in \mathbb{R}$, we define

\begin{displaymath}e^{At} = \sum_{k=0}^{\infty} \frac{A^{k}t^{k}}{k!} \, .
\end{displaymath}
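As a quick illustration of the definition, for the nilpotent matrix $N = \left[ \begin{array}{ll} 0 & 1 \\ 0 & 0 \end{array} \right]$ we have $N^{2} = 0$, so the series terminates after two terms:

\begin{displaymath}e^{Nt} = I + Nt = \left[ \begin{array}{ll} 1 & t \\ 0 & 1 \end{array} \right]\, .
\end{displaymath}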

Let $A$, $B$, and $P$ be $n \times n$ matrices, where $P$ is nonsingular. Then

(1)
$e^{0} = I$, $0$ being the $n \times n$ zero matrix;

(2)
$e^{A+B} = e^{A}e^{B}$, provided $AB = BA$;

(3)
$e^{A}$ is invertible and $(e^{A})^{-1} = e^{-A}$;

(4)
$P^{-1}AP = B$ implies $e^{B} = P^{-1}e^{A}P$.
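Note that (3) is a consequence of (1) and (2): $A$ and $-A$ commute, so

\begin{displaymath}e^{A}e^{-A} = e^{A-A} = e^{0} = I \, .
\end{displaymath}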


Remark:     $e^{At}$ is an $n \times n$ matrix whose entries are the limits of the corresponding $n^{2}$ infinite (scalar) series.


Theorem 2.2.1:     Let A be an $n \times n$ matrix. Then the IVP (2.3) has a unique solution, defined for all $t \in \mathbb{R}$, given by

\begin{displaymath}x(t) = e^{At}x_{0} \, .
\end{displaymath}


Proof: (omitted)
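One can at least verify the solution directly: by the uniform convergence of Lemma 2.2.1, the series for $e^{At}$ may be differentiated term by term,

\begin{displaymath}\frac{d}{dt}\, e^{At} = \sum_{k=1}^{\infty} \frac{A^{k}t^{k-1}}{(k-1)!} = A\, e^{At} \, ,
\end{displaymath}

so $x(t) = e^{At}x_{0}$ satisfies $x' = Ax$, and $x(0) = e^{0}x_{0} = x_{0}$ by property (1).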


Corollary 2.2.1     By Theorem 2.2.1, we have

(1)
$e^{\left[ \begin{array}{ll} \lambda_{1} & 0 \\ 0 & \lambda_{2} \end{array} \right]t} = \left[ \begin{array}{ll} e^{\lambda_{1}t} & 0 \\ 0 & e^{\lambda_{2}t} \end{array} \right]\, ;$

(2)
$e^{\left[ \begin{array}{ll} \lambda & 1 \\ 0 & \lambda \end{array} \right]t} = e^{\lambda t} \left[ \begin{array}{ll} 1 & t \\ 0 & 1 \end{array} \right]\, ;$

(3)
$e^{\left[ \begin{array}{rr} \alpha & \beta \\ -\beta & \alpha \end{array} \right]t} = e^{\alpha t} \left[ \begin{array}{rr} \cos \beta t & \sin \beta t \\ -\sin \beta t & \cos \beta t \end{array} \right]\, .$
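Formula (2), for instance, can be checked by combining Definition 2.2.1 with property (2) of the exponential: write $\left[ \begin{array}{ll} \lambda & 1 \\ 0 & \lambda \end{array} \right] = \lambda I + N$ with $N$ the nilpotent matrix above; since $\lambda I$ commutes with $N$,

\begin{displaymath}e^{(\lambda I + N)t} = e^{\lambda I t}\, e^{Nt} = e^{\lambda t} \left[ \begin{array}{ll} 1 & t \\ 0 & 1 \end{array} \right]\, .
\end{displaymath}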


Corollary 2.2.2     Let $P$ be an $n \times n$ nonsingular matrix. If $P^{-1}AP = B$, then $x(t) = Pe^{Bt}P^{-1}x_{0}$ is a solution of the IVP (2.3).
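This is immediate from property (4) of the exponential: $P^{-1}(At)P = Bt$ gives $e^{At} = Pe^{Bt}P^{-1}$, hence

\begin{displaymath}x(t) = e^{At}x_{0} = Pe^{Bt}P^{-1}x_{0} \, .
\end{displaymath}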

Exercises:     Perko, pp. 19-20; Problems 4, 6, 7, 8.

