VECTORS AND MATRICES

DEFINITIONS: An m × n MATRIX is a rectangular array of numbers arranged in m rows and n columns. A = ((a_ij)) denotes a matrix A with a_ij as its ith row, jth column ((i,j)th) element. To indicate the size, one may write A_{m×n}.

A 1 × n matrix is called a ROW MATRIX. An m × 1 matrix is called a COLUMN MATRIX. If the numbers of rows and columns are equal, we have a SQUARE MATRIX. Row and column matrices are also called VECTORS.

The TRANSPOSE of a matrix A is obtained by interchanging rows and columns; it is denoted by A^T or A′.

A ZERO MATRIX is a matrix with all its entries equal to zero. We denote it by 0.

A DIAGONAL MATRIX is a square matrix with all its off-diagonal elements zero.

A TRIANGULAR MATRIX is a square matrix in which either all elements above the diagonal or all elements below the diagonal are zero. In the first case it is called LOWER TRIANGULAR and in the second case UPPER TRIANGULAR.

A SYMMETRIC MATRIX is a matrix A such that A = A^T.

An IDENTITY MATRIX is a diagonal matrix with all its diagonal entries equal to 1; it is denoted by I.

Two matrices are EQUAL if all corresponding elements are equal (their orders must be the same).

MATRIX OPERATIONS

(1) If A = ((a_ij))_{m×n} and B = ((b_ij))_{m×n}, then A + B = ((a_ij + b_ij))_{m×n}.

(2) If A = ((a_ij))_{m×n} and k is a constant, then kA = ((k a_ij))_{m×n}.

(3) -A = (-1)A

(4) A - B = A + (-B)

(5) Two matrices A and B are conformable for multiplication (i.e., AB is defined) if the number of columns of A equals the number of rows of B. For an m × n matrix A and an n × k matrix B, AB is defined to be the m × k matrix whose (i,j)th element is

    (AB)_ij = Σ_{t=1}^{n} a_it b_tj.

Note: Even if AB makes sense, BA may not.

Example

If A is a 2 × 3 matrix and B is a 3 × 4 matrix, then AB is defined and is a 2 × 4 matrix, but BA is not defined (the number of columns of B is 4, while the number of rows of A is 2).
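
As a quick numerical check (a minimal sketch in Python with numpy, which the notes do not assume; the matrices are illustrative):

    import numpy as np

    A = np.array([[1, 2, 3],
                  [4, 5, 6]])        # 2 x 3
    B = np.array([[1, 0, 2, 1],
                  [0, 1, 1, 0],
                  [1, 1, 0, 2]])     # 3 x 4

    AB = A @ B                       # defined: A has 3 columns and B has 3 rows
    print(AB.shape)                  # (2, 4)

    # BA is not defined: B has 4 columns but A has 2 rows,
    # so B @ A raises a ValueError.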

PROPERTIES

(1) Matrix addition is commutative and associative

A + B = B + A;  (A + B) + C = A + (B + C)

(2) Matrix multiplication is associative

(AB)C = A(BC)

Matrix multiplication is not commutative

(i) Even if AB is defined, BA may not be.

(ii) Even if AB and BA are defined, they may not have the same order.

(iii) Even if AB and BA are both defined and have the same order (this happens if and only if A and B are square matrices of the same order), AB need not equal BA.

(3) (A^T)^T = A

(4) (A + B)^T = A^T + B^T

(5) (AB)^T = B^T A^T
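
These three rules are easy to spot-check numerically; a minimal sketch with numpy (the random matrices are illustrative assumptions, not from the notes):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((2, 3))
    B = rng.standard_normal((2, 3))      # same order as A, for rule (4)
    C = rng.standard_normal((3, 4))      # conformable with A, for rule (5)

    print(np.allclose((A.T).T, A))               # (3) (A^T)^T = A
    print(np.allclose((A + B).T, A.T + B.T))     # (4) (A + B)^T = A^T + B^T
    print(np.allclose((A @ C).T, C.T @ A.T))     # (5) (AC)^T = C^T A^T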

FURTHER DEFINITIONS

The TRACE of a square matrix A of order n is defined by tr(A) = Σ_{i=1}^{n} a_ii, the sum of the diagonal elements.

Note:

tr(αA) = α tr(A)

tr(CD) = tr(DC)

tr(A^T) = tr(A)
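
The trace identities can be verified the same way (again a numpy sketch with arbitrary matrices):

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((4, 4))
    C = rng.standard_normal((3, 5))
    D = rng.standard_normal((5, 3))

    print(np.isclose(np.trace(2.5 * A), 2.5 * np.trace(A)))   # tr(aA) = a tr(A)
    print(np.isclose(np.trace(C @ D), np.trace(D @ C)))       # tr(CD) = tr(DC), though CD is 3x3 and DC is 5x5
    print(np.isclose(np.trace(A.T), np.trace(A)))             # tr(A^T) = tr(A)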

The DETERMINANT of a 2 × 2 matrix

    A = | a  b |
        | c  d |

is ad - bc; it is denoted by |A|.

The determinant of a 3 × 3 matrix A is defined by

    |A| = Σ_{j=1}^{3} a_ij A_ij   (for any fixed row i),

where A_ij = (-1)^{i+j} minor(a_ij) and minor(a_ij) = the determinant of the submatrix obtained by deleting the ith row and jth column of A. A_ij is also called the COFACTOR of a_ij.

Note: You may take any i for the purpose of computing |A|.

In general, for an n × n matrix A,

    |A| = Σ_{j=1}^{n} a_ij A_ij   (expansion along any fixed row i)
        = Σ_{i=1}^{n} a_ij A_ij   (expansion along any fixed column j).
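
The cofactor expansion translates directly into a deliberately naive recursive routine; a sketch in Python, expanding along the first row (np.linalg.det is used only as a cross-check):

    import numpy as np

    def det_cofactor(A):
        """Determinant by cofactor expansion along row 0 (O(n!); for illustration only)."""
        n = A.shape[0]
        if n == 1:
            return A[0, 0]
        total = 0.0
        for j in range(n):
            minor = np.delete(np.delete(A, 0, axis=0), j, axis=1)  # drop row 0 and column j
            total += (-1) ** j * A[0, j] * det_cofactor(minor)     # (-1)^(i+j) with i = 0
        return total

    A = np.array([[2., 1., 0.],
                  [1., 3., 1.],
                  [0., 1., 2.]])
    print(det_cofactor(A), np.linalg.det(A))   # both give 8.0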

Properties of Determinants

(1) |A^T| = |A|

(2) |αA| = α^n |A| for an n × n matrix A

(3) |AB| = |A| |B| if A and B are both square matrices of the same order

(4) If A is triangular or diagonal, then |A| = the product of its diagonal elements.

A square matrix A is SINGULAR if |A| = 0; otherwise it is called NON-SINGULAR.

The INVERSE of a square matrix A is another matrix, denoted by A^-1, such that

    A A^-1 = A^-1 A = I.

THEOREM

The inverse of a square matrix A exists if and only if A is non-singular.

Properties

(1) If A is non-singular, then A^-1 is unique.

(2) (A^-1)^-1 = A

(3) (A^T)^-1 = (A^-1)^T

(4) (AB)^-1 = B^-1 A^-1 when A and B are non-singular matrices of the same order.

Theorem

For a 2 × 2 matrix

    A = | a  b |
        | c  d |

with |A| = ad - bc ≠ 0,

    A^-1 = (1 / (ad - bc)) |  d  -b |
                           | -c   a |.

Example:

Let A = | 2  1 |. Then |A| = 2·1 - 1·1 = 1, so A^-1 = |  1  -1 |.
        | 1  1 |                                      | -1   2 |

Verification:

    A A^-1 = | 2·1 + 1·(-1)   2·(-1) + 1·2 |  =  | 1  0 |  =  I.
             | 1·1 + 1·(-1)   1·(-1) + 1·2 |     | 0  1 |
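
Numerically, the same computation and verification look like this (a numpy sketch; np.linalg.inv raises an error for a singular matrix):

    import numpy as np

    A = np.array([[2., 1.],
                  [1., 1.]])
    A_inv = np.linalg.inv(A)

    print(A_inv)                                 # [[ 1. -1.] [-1.  2.]]
    print(np.allclose(A @ A_inv, np.eye(2)))     # A A^-1 = I
    print(np.allclose(A_inv @ A, np.eye(2)))     # A^-1 A = I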

Partitioned matrices

Let A be an m × n matrix, and let 1 ≤ k ≤ m and 1 ≤ l ≤ n. Then one may partition A into submatrices as follows:

    A = | A_11  A_12 |
        | A_21  A_22 |

where A_11 is a k × l matrix, A_12 is a k × (n-l) matrix, A_21 is an (m-k) × l matrix, and A_22 is an (m-k) × (n-l) matrix.

Example: If A is 3 × 3 and k = l = 2, then A_11 is the top-left 2 × 2 block, A_12 is the 2 × 1 block to its right, A_21 is the 1 × 2 block below it, and A_22 is the single (3,3) entry.

Results:

(i) If B is partitioned in the same way as A, then

    A + B = | A_11 + B_11   A_12 + B_12 |
            | A_21 + B_21   A_22 + B_22 |

(ii) A^T = | A_11^T  A_21^T |
           | A_12^T  A_22^T |

Homework: Prove the above two results. Also see if you can construct another form of (ii).

(iii) AB = | A_11 B_11 + A_12 B_21   A_11 B_12 + A_12 B_22 |
           | A_21 B_11 + A_22 B_21   A_21 B_12 + A_22 B_22 |

where B is partitioned so that the row partition of B matches the column partition of A (so that every block product is defined).
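
Result (iii) can be checked with np.block, which assembles a matrix from conformable blocks (an illustrative sketch; the block sizes are arbitrary):

    import numpy as np

    rng = np.random.default_rng(2)
    A11 = rng.standard_normal((2, 2))
    A12 = rng.standard_normal((2, 3))
    A21 = rng.standard_normal((1, 2))
    A22 = rng.standard_normal((1, 3))
    B11 = rng.standard_normal((2, 4))
    B12 = rng.standard_normal((2, 1))
    B21 = rng.standard_normal((3, 4))
    B22 = rng.standard_normal((3, 1))

    A = np.block([[A11, A12], [A21, A22]])   # 3 x 5
    B = np.block([[B11, B12], [B21, B22]])   # 5 x 5

    AB_blocks = np.block([[A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
                          [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22]])
    print(np.allclose(A @ B, AB_blocks))     # blockwise product equals the ordinary product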

DEF: A collection of vectors {a_1, a_2, ..., a_k} is said to be LINEARLY INDEPENDENT if there is no non-zero k-vector α = (α_1, ..., α_k)^T such that

    α_1 a_1 + α_2 a_2 + ... + α_k a_k = 0.

(That is, there are no constants α_1, ..., α_k, not all zero, such that α_1 a_1 + ... + α_k a_k = 0.)

DEF: The RANK of a matrix is the maximum number of linearly independent rows.

Results:

(i) The maximum number of linearly independent columns is the same as the rank.

(ii) Rank (A) ≤ min(m, n)

(iii) Rank (A) = Rank (A^T)

(iv) Rank (AA^T) = Rank (A^T A) = Rank (A).

(v) If B is a non-singular square matrix, then Rank (AB) = Rank (BA) = Rank (A) (whenever the products make sense).

(vi) Rank (A + B) ≤ Rank (A) + Rank (B).
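
Several of these facts can be observed with np.linalg.matrix_rank (a numerical sketch; matrix_rank estimates the rank from an SVD with a tolerance):

    import numpy as np

    A = np.array([[1., 2., 3.],
                  [2., 4., 6.],     # row 2 = 2 * row 1, so the rows are dependent
                  [0., 1., 1.]])

    r = np.linalg.matrix_rank(A)
    print(r)                                      # 2
    print(np.linalg.matrix_rank(A.T) == r)        # Rank(A^T) = Rank(A)
    print(np.linalg.matrix_rank(A @ A.T) == r)    # Rank(AA^T) = Rank(A)
    print(np.linalg.matrix_rank(A.T @ A) == r)    # Rank(A^T A) = Rank(A)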

DEF: A square matrix A is said to be ORTHOGONAL if A^T A = A A^T = I.

Results:

(i) A is orthogonal => |A| = ±1.

(ii) A and B are orthogonal (of the same order) => AB is orthogonal.

(iii) A is orthogonal => A^-1 = A^T.

DEF: Two vectors x and y of the same order are said to be ORTHOGONAL if x^T y = 0, and ORTHONORMAL if in addition x^T x = y^T y = 1.
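
A plane rotation gives a concrete orthogonal matrix against which to check the three results (a numpy sketch; the angle is arbitrary):

    import numpy as np

    t = 0.7
    A = np.array([[np.cos(t), -np.sin(t)],
                  [np.sin(t),  np.cos(t)]])        # 2-D rotation matrix

    print(np.allclose(A.T @ A, np.eye(2)))         # A^T A = I, so A is orthogonal
    print(np.isclose(abs(np.linalg.det(A)), 1.0))  # |A| = +1 or -1 (here +1)
    print(np.allclose(np.linalg.inv(A), A.T))      # A^-1 = A^T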

EIGENVALUES AND EIGENVECTORS

Let A be an n × n matrix. |A - λI|, where λ is a variable, is a polynomial of degree n in λ. It is called the CHARACTERISTIC POLYNOMIAL of A. The roots of the characteristic polynomial are called the EIGENVALUES (or characteristic values or characteristic roots) of A.

Associated with each eigenvalue λ, there is a vector x ≠ 0 such that Ax = λx. This x is called an EIGENVECTOR (or characteristic vector) corresponding to the eigenvalue λ.

Results:

(i) tr(A) = the sum of the eigenvalues of A.

(ii) |A| = the product of the eigenvalues of A.

(iii) If A is symmetric, the eigenvalues of A are all real.

(iv) If λ_1, ..., λ_n are distinct eigenvalues of A, their corresponding eigenvectors will be linearly independent. If in addition A is symmetric, they will be orthogonal.

(v) For every symmetric matrix A, ∃ an orthogonal matrix T such that

    T^T A T = D,

where D is a diagonal matrix of the eigenvalues of A.

(vi) The eigenvalues of a triangular matrix are the diagonal elements.
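
For a symmetric matrix, np.linalg.eigh returns the eigenvalues together with an orthogonal matrix of eigenvectors, which illustrates result (v) (a sketch; the matrix is illustrative):

    import numpy as np

    A = np.array([[2., 1.],
                  [1., 2.]])                    # symmetric

    eigvals, T = np.linalg.eigh(A)              # columns of T are orthonormal eigenvectors
    print(eigvals)                              # [1. 3.]
    print(np.allclose(T.T @ A @ T, np.diag(eigvals)))       # T^T A T = D
    print(np.allclose(A @ T[:, 0], eigvals[0] * T[:, 0]))   # Ax = lambda x for the first pair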

QUADRATIC FORMS

DEF: A QUADRATIC FORM in n variables x_1, x_2, ..., x_n is a homogeneous function consisting of all possible second-order terms, namely

    Q(x) = Σ_{i=1}^{n} Σ_{j=1}^{n} a_ij x_i x_j = x^T A x,

where A = ((a_ij)) and x = (x_1, ..., x_n)^T.

DEF: A square matrix A (and its associated quadratic form) is called POSITIVE DEFINITE if x^T A x > 0 for all x ≠ 0, and NON-NEGATIVE DEFINITE if x^T A x ≥ 0 for all x. It is called POSITIVE SEMI-DEFINITE if it is non-negative definite but not positive definite.

Results:

(i) A positive definite (PD) matrix is invertible; a PSD matrix is not.

(ii) If A is a PSD matrix of rank r, then it has exactly r positive eigenvalues and n - r zero eigenvalues.

(iii) The eigenvalues of an NND matrix are all non-negative; if it is a PD matrix, then they are all positive.

(iv) If A is NND of rank r, then ∃ a matrix Q of rank r such that A = QQ^T. Also, the transformation y = Q^T x transforms x^T A x into y^T y, a sum of squares; Q is square and non-singular if A is of full rank.

(v) Let A be a symmetric matrix with λ_1, ..., λ_n being its eigenvalues and x_1, ..., x_n the corresponding normalized eigenvectors. Let T be the matrix with x_i as its ith column. Then

    T^T A T = Λ = diag(λ_1, ..., λ_n).

Moreover,

    A = T Λ T^T = λ_1 x_1 x_1^T + ... + λ_n x_n x_n^T.

(This is called the SPECTRAL DECOMPOSITION of A.)

Further, if x = T y, then x^T A x = y^T Λ y = λ_1 y_1^2 + ... + λ_n y_n^2.
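
Both the spectral decomposition and the diagonalized quadratic form can be verified numerically (a numpy sketch; the matrix A and the vector y are illustrative):

    import numpy as np

    A = np.array([[4., 1., 0.],
                  [1., 3., 1.],
                  [0., 1., 2.]])                      # symmetric

    lam, T = np.linalg.eigh(A)                        # eigenvalues lam; columns of T are orthonormal eigenvectors

    # A = T Lambda T^T = sum_i lam_i x_i x_i^T
    A_rebuilt = sum(lam[i] * np.outer(T[:, i], T[:, i]) for i in range(3))
    print(np.allclose(A, A_rebuilt))

    # With x = T y, the quadratic form becomes a weighted sum of squares:
    y = np.array([1., -2., 0.5])
    x = T @ y
    print(np.isclose(x @ A @ x, np.sum(lam * y**2)))  # x^T A x = sum_i lam_i y_i^2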