# Matrix arithmetic

## Definitions
Definition: let $A$ be an $m \times n$ matrix given by

$$A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix}$$

with $a_{ij}$ referred to as the entries of $A$, or scalars in general, with $(i,j) \in \{1, \dots, m\} \times \{1, \dots, n\}$. For real entries in $A$ we may denote $A \in \mathbb{R}^{m \times n}$.

This matrix may be denoted in a shorter way by $A = (a_{ij})$.
Definition: let $\mathbf{x}$ be a $1 \times n$ matrix, referred to as a row vector, given by

$$\mathbf{x} = (x_1, x_2, \dots, x_n)$$

with $x_i$ referred to as the entries of $\mathbf{x}$, with $i \in \{1, \dots, n\}$. For real entries we may denote $\mathbf{x} \in \mathbb{R}^n$.

Definition: let $\mathbf{x}$ be an $n \times 1$ matrix, referred to as a column vector, given by

$$\mathbf{x} = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}$$

with $x_i$ referred to as the entries of $\mathbf{x}$, with $i \in \{1, \dots, n\}$. Also for the column vector with real entries we have $\mathbf{x} \in \mathbb{R}^n$.
From these two definitions it may be observed that row and column vectors may be used interchangeably; however, when both are used it is important to state which is meant. Best practice is to always work with row vectors and take the transpose if necessary.
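For example, with $n = 3$ the same entries $1, 2, 3$ may be written as the row vector $\mathbf{x} = (1, 2, 3)$ or as the column vector

$$\mathbf{x} = \begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix}.$$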
## Matrix operations
Definition: two $m \times n$ matrices $A$ and $B$ are said to be equal if $a_{ij} = b_{ij}$ for each $(i,j) \in \{1, \dots, m\} \times \{1, \dots, n\}$.
Definition: if $A$ is an $m \times n$ matrix and $\alpha$ is a scalar, then $\alpha A$ is the $m \times n$ matrix whose $(i,j)$ entry is $\alpha a_{ij}$ for each $(i,j) \in \{1, \dots, m\} \times \{1, \dots, n\}$.
Definition: if $A = (a_{ij})$ and $B = (b_{ij})$ are both $m \times n$ matrices, then the sum $A + B$ is the $m \times n$ matrix whose $(i,j)$ entry is $a_{ij} + b_{ij}$ for each ordered pair $(i,j) \in \{1, \dots, m\} \times \{1, \dots, n\}$.
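As a small illustration with arbitrarily chosen entries,

$$2 \begin{pmatrix} 1 & 0 \\ 2 & 3 \end{pmatrix} = \begin{pmatrix} 2 & 0 \\ 4 & 6 \end{pmatrix}, \qquad \begin{pmatrix} 1 & 0 \\ 2 & 3 \end{pmatrix} + \begin{pmatrix} 4 & 1 \\ 0 & 5 \end{pmatrix} = \begin{pmatrix} 5 & 1 \\ 2 & 8 \end{pmatrix}.$$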
If $A$ is an $m \times n$ matrix and $\mathbf{x}$ is a vector in $\mathbb{R}^n$, then

$$A \mathbf{x} = x_1 \mathbf{a}_1 + x_2 \mathbf{a}_2 + \dots + x_n \mathbf{a}_n$$

with $A = (\mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_n)$, where $\mathbf{a}_j$ denotes the $j$-th column vector of $A$.
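For example, with arbitrarily chosen entries, the product of a $3 \times 2$ matrix and a vector in $\mathbb{R}^2$ is

$$\begin{pmatrix} 1 & 2 \\ 0 & 3 \\ 4 & 1 \end{pmatrix} \begin{pmatrix} 2 \\ -1 \end{pmatrix} = 2 \begin{pmatrix} 1 \\ 0 \\ 4 \end{pmatrix} - 1 \begin{pmatrix} 2 \\ 3 \\ 1 \end{pmatrix} = \begin{pmatrix} 0 \\ -3 \\ 7 \end{pmatrix}.$$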
Definition: if $\mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_n$ are vectors in $\mathbb{R}^m$ and $x_1, x_2, \dots, x_n$ are scalars, then a sum of the form

$$x_1 \mathbf{a}_1 + x_2 \mathbf{a}_2 + \dots + x_n \mathbf{a}_n$$

is said to be a linear combination of the vectors $\mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_n$.
Theorem: a linear system $A \mathbf{x} = \mathbf{b}$ is consistent if and only if $\mathbf{b}$ can be written as a linear combination of the column vectors of $A$.
??? note "Proof:"
    Will be added later.
## Transpose matrix
Definition: the transpose of an $m \times n$ matrix $A$ is the $n \times m$ matrix $B$ defined by

$$b_{ji} = a_{ij},$$

for $j \in \{1, \dots, n\}$ and $i \in \{1, \dots, m\}$. The transpose of $A$ is denoted by $A^T$.
Definition: an $n \times n$ matrix $A$ is said to be symmetric if $A^T = A$.
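For example, with arbitrarily chosen entries,

$$\begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix}^T = \begin{pmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{pmatrix}, \qquad \text{while } \begin{pmatrix} 1 & 7 \\ 7 & 2 \end{pmatrix} \text{ is symmetric.}$$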
## Hermitian matrix
Definition: the conjugate transpose of an $m \times n$ matrix $A$ is the $n \times m$ matrix $B$ defined by

$$b_{ji} = \bar{a}_{ij},$$

for $j \in \{1, \dots, n\}$ and $i \in \{1, \dots, m\}$. The conjugate transpose of $A$ is denoted by $A^H$.
Definition: an $n \times n$ matrix $A$ is said to be Hermitian if $A^H = A$.
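For example, the $2 \times 2$ matrix below (entries chosen arbitrarily) is Hermitian; note that the diagonal entries of a Hermitian matrix are necessarily real, since $a_{ii} = \bar{a}_{ii}$:

$$A = \begin{pmatrix} 2 & 1 - i \\ 1 + i & 3 \end{pmatrix}, \qquad A^H = \begin{pmatrix} 2 & \overline{1 + i} \\ \overline{1 - i} & 3 \end{pmatrix} = \begin{pmatrix} 2 & 1 - i \\ 1 + i & 3 \end{pmatrix} = A.$$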
## Matrix multiplication
Definition: if $A = (a_{ij})$ is an $m \times n$ matrix and $B = (b_{ij})$ is an $n \times r$ matrix, then the product $AB = C = (c_{ij})$ is the $m \times r$ matrix whose entries are defined by

$$c_{ij} = \mathbf{a}_i \mathbf{b}_j = \sum_{k=1}^n a_{ik} b_{kj},$$

where $\mathbf{a}_i$ is the $i$-th row vector of $A$ and $\mathbf{b}_j$ is the $j$-th column vector of $B$.
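For example, for two $2 \times 2$ matrices with arbitrarily chosen entries,

$$\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 0 & 1 \\ 5 & 2 \end{pmatrix} = \begin{pmatrix} 1 \cdot 0 + 2 \cdot 5 & 1 \cdot 1 + 2 \cdot 2 \\ 3 \cdot 0 + 4 \cdot 5 & 3 \cdot 1 + 4 \cdot 2 \end{pmatrix} = \begin{pmatrix} 10 & 5 \\ 20 & 11 \end{pmatrix}.$$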