Definitions
Matrix
A matrix is an ordered set of numbers arranged in rectangular form.
Example. Let A denote the matrix
[2 5 7 8]
[5 6 8 9]
[3 9 0 1]
This matrix A has three rows and four columns. We say it is a 3 x 4 matrix.
We denote the element in the second row and fourth column by a_{2,4}.
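As an illustration, a matrix can be stored as a list of rows; the following is a minimal Python sketch (the variable name A mirrors the example above; note that Python's zero-based indexing shifts each subscript down by one).

```python
# The 3 x 4 example matrix A, stored as a list of rows.
A = [[2, 5, 7, 8],
     [5, 6, 8, 9],
     [3, 9, 0, 1]]

# Python indexing starts at 0, so a_{2,4} (second row, fourth column)
# is A[1][3].
print(A[1][3])  # prints 9
```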
Square matrix
If a matrix A has n rows and n columns, we say it is a square matrix.
In a square matrix, the elements a_{i,i}, with i = 1, 2, ..., n, are called the diagonal elements.
Remark. There is no difference between a 1 x 1 matrix and an ordinary number.
Diagonal matrix
A diagonal matrix is a square matrix with all non-diagonal elements 0.
A diagonal matrix is completely determined by its diagonal elements.
Example.
[7 0 0]
[0 5 0]
[0 0 6]
This matrix is denoted diag(7, 5, 6).
Row matrix
A matrix with one row is called a row matrix.
Column matrix
A matrix with one column is called a column matrix.
Matrices of the same kind
Matrices A and B are of the same kind if and only if
A has as many rows as B and as many columns as B.
The transpose of a matrix
The n x m matrix A' is the transpose of the m x n matrix A if and only if
the ith row of A is the ith column of A', for i = 1, 2, ..., m.
So a_{i,j} = a'_{j,i}.
The transpose of A is denoted T(A) or A^T.
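A minimal Python sketch of the transpose, turning each row of A into a column of the result (the helper name transpose is ours, chosen for illustration):

```python
def transpose(A):
    """Return A^T: the ith row of A becomes the ith column of the result."""
    rows, cols = len(A), len(A[0])
    return [[A[i][j] for i in range(rows)] for j in range(cols)]

A = [[1, 2, 3],
     [4, 5, 6]]          # a 2 x 3 matrix
print(transpose(A))      # [[1, 4], [2, 5], [3, 6]], a 3 x 2 matrix
```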
0-matrix
When all the elements of a matrix A are 0, we call A a 0-matrix.
For short, we write 0 for a 0-matrix.
An identity matrix I
An identity matrix I is a diagonal matrix with all diagonal elements = 1.
A scalar matrix S
A scalar matrix S is a diagonal matrix in which all diagonal elements are equal:
a_{1,1} = a_{i,i} for i = 1, 2, ..., n
The opposite matrix of a matrix
If we change the sign of all the elements of a matrix A, we have the opposite matrix -A.
If A' is the opposite of A, then a'_{i,j} = -a_{i,j}, for all i and j.
A symmetric matrix
A square matrix is called symmetric if it is equal to its transpose.
Then a_{i,j} = a_{j,i}, for all i and j.
A skew-symmetric matrix
A square matrix is called skew-symmetric if it is equal to the opposite of its transpose.
Then a_{i,j} = -a_{j,i}, for all i and j.
In particular, the diagonal elements are 0, since a_{i,i} = -a_{i,i}.
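Both conditions are easy to test element by element; here is a small Python sketch (the helper names are ours):

```python
def is_symmetric(A):
    """True if A equals its transpose: a_{i,j} = a_{j,i}."""
    n = len(A)
    return all(A[i][j] == A[j][i] for i in range(n) for j in range(n))

def is_skew_symmetric(A):
    """True if A equals the opposite of its transpose: a_{i,j} = -a_{j,i}."""
    n = len(A)
    return all(A[i][j] == -A[j][i] for i in range(n) for j in range(n))

print(is_symmetric([[1, 2], [2, 3]]))        # True
print(is_skew_symmetric([[0, 2], [-2, 0]]))  # True: note the 0 diagonal
```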
The sum of matrices of the same kind
Sum of matrices
To add two matrices of the same kind, we simply add the corresponding elements.
Sum properties
The basic properties of matrix addition:
- A + B = B + A
- (A + B) + C = A + (B + C)
- A + 0 = 0 + A = A
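In code, adding two matrices of the same kind is a single elementwise loop; a minimal Python sketch (the helper name add is ours):

```python
def add(A, B):
    """Elementwise sum of two matrices of the same kind."""
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))]
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(add(A, B))               # [[6, 8], [10, 12]]
print(add(A, B) == add(B, A))  # True: A + B = B + A
```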
Scalar multiplication
Definition
To multiply a matrix by a real number, we multiply each element by this number.
Properties
The basic properties of scalar multiplication (r and s real numbers; A and B matrices of the same kind):
- r(A + B) = rA + rB
- (r + s)A = rA + sA
- r(sA) = (rs)A
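A minimal Python sketch of scalar multiplication (the helper name scale is ours):

```python
def scale(r, A):
    """Multiply each element of A by the real number r."""
    return [[r * x for x in row] for row in A]

A = [[1, 2], [3, 4]]
print(scale(3, A))   # [[3, 6], [9, 12]]
```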
Multiplication of a row matrix by a column matrix
This multiplication is only possible if the row matrix and the column matrix have the same number of elements. The result is an ordinary number (a 1 x 1 matrix).
To multiply the row by the column, one multiplies corresponding elements, then adds the results.
Example.
          [1]
[2 1 3] . [2] = [19]
          [5]
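The example above in Python: multiply corresponding elements, then add the results (a sketch; row_times_column is our name for the helper):

```python
def row_times_column(row, col):
    """Multiply corresponding elements and add the results."""
    return sum(r * c for r, c in zip(row, col))

print(row_times_column([2, 1, 3], [1, 2, 5]))  # 2*1 + 1*2 + 3*5 = 19
```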
Multiplication of two matrices A.B
This product is defined only if A is an (l x m) matrix and B is an (m x n) matrix.
So the number of columns of A has to be equal to the number of rows of B.
The product C = A.B is then an (l x n) matrix.
The element of the ith row and the jth column of the product is found by multiplying the ith row of A by the jth column of B.
c_{i,j} = sum_k (a_{i,k} . b_{k,j})
Example.
[1 2] [1 3]   [5 7]
[2 1] [2 2] = [4 8]

[1 3] [1 2]   [7 5]
[2 2] [2 1] = [6 6]

[1 1] [ 2  2]   [0 0]
[1 1] [-2 -2] = [0 0]
From these examples we see that matrix multiplication is not commutative and that there are zero divisors: non-zero matrices whose product is the 0-matrix.
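A minimal Python sketch of the full product, applying the row-by-column rule to every element of C = A.B; it reproduces the first two examples above (the helper name multiply is ours):

```python
def multiply(A, B):
    """Product of an (l x m) matrix A and an (m x n) matrix B."""
    l, m, n = len(A), len(B), len(B[0])
    assert len(A[0]) == m, "columns of A must equal rows of B"
    # c_{i,j} = sum_k a_{i,k} . b_{k,j}
    return [[sum(A[i][k] * B[k][j] for k in range(m))
             for j in range(n)] for i in range(l)]

A = [[1, 2], [2, 1]]
B = [[1, 3], [2, 2]]
print(multiply(A, B))  # [[5, 7], [4, 8]]
print(multiply(B, A))  # [[7, 5], [6, 6]] -- not commutative
```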
Properties of multiplication of matrices
Associative
A(B.C) = (A.B)C for all matrices A, B and C for which the products are defined.
Proof:
We'll show that each element of A(B.C) is equal to the corresponding element of (A.B)C.
First we calculate the element of the ith row and jth column of A(B.C).
Let D denote B.C, then
d_{k,j} = sum_p b_{k,p} . c_{p,j}    (1)
Let E denote A.D then
e_{i,j} = sum_k a_{i,k} . d_{k,j}    (2)
(1) and (2) give
e_{i,j} = sum_k a_{i,k} . (sum_p b_{k,p} . c_{p,j})
<=> e_{i,j} = sum_{k,p} a_{i,k} . b_{k,p} . c_{p,j}
So the element of the ith row and jth column of A(B.C) is
sum_{k,p} a_{i,k} . b_{k,p} . c_{p,j}    (3)
Now we calculate the element of the ith row and jth column of (A.B)C.
Let D' denote A.B, then
d'_{i,p} = sum_k a_{i,k} . b_{k,p}    (4)
Let E' denote D'C then
e'_{i,j} = sum_p d'_{i,p} . c_{p,j}    (5)
(4) and (5) give
e'_{i,j} = sum_p (sum_k a_{i,k} . b_{k,p}) . c_{p,j}
<=> e'_{i,j} = sum_{k,p} a_{i,k} . b_{k,p} . c_{p,j}
So the element of the ith row and jth column of (A.B)C is
sum_{k,p} a_{i,k} . b_{k,p} . c_{p,j}    (6)
From (3) and (6), it follows that A(B.C) = (A.B)C.
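As a quick numeric sanity check of the associativity just proved, here is a small self-contained Python snippet (multiply is the same sketch as above, repeated so the snippet runs on its own):

```python
def multiply(A, B):
    m, n = len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m))
             for j in range(n)] for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
C = [[2, 0], [1, 3]]
# A(B.C) and (A.B)C give the same matrix.
print(multiply(A, multiply(B, C)) == multiply(multiply(A, B), C))  # True
```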
Distributive
A(B+C) = A.B + A.C and (A+B).C = A.C + B.C for all matrices A, B and C for which the sums and products are defined. These identities can be proved in the same way as above.
Theorem 1
For each matrix A, there is an identity matrix E and an identity matrix E' such that A.E = A and E'.A = A. (If A is an m x n matrix, E is the n x n identity and E' is the m x m identity.) If A is a square matrix, then E = E'.
Theorem 2
(A.B)^T = B^T . A^T
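A numeric check of Theorem 2, with minimal multiply and transpose helpers (a sketch; the helper names are ours):

```python
def multiply(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(col) for col in zip(*A)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
# (A.B)^T equals B^T . A^T
print(transpose(multiply(A, B)) == multiply(transpose(B), transpose(A)))  # True
```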
Theorem 3
A.0 = 0 = 0.A
Theorem 4
If r and s are real numbers and A, B are matrices, then (rA).(sB) = (rs)(A.B).
Theorem 5
If D = diag(a, b, c), then D.D = diag(a^2, b^2, c^2)
and D.D.D = diag(a^3, b^3, c^3).
This property generalizes to D = diag(a, b, c, d, e, ..., l).
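A short Python check of Theorem 5: repeated products of a diagonal matrix just raise the diagonal elements to a power (a sketch; the helper names are ours):

```python
def diag(*elems):
    """Build a diagonal matrix from the given diagonal elements."""
    n = len(elems)
    return [[elems[i] if i == j else 0 for j in range(n)] for i in range(n)]

def multiply(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

D = diag(2, 3, 4)
print(multiply(D, D) == diag(4, 9, 16))                # True: diag(a^2, b^2, c^2)
print(multiply(multiply(D, D), D) == diag(8, 27, 64))  # True: diag(a^3, b^3, c^3)
```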