Matrix Form of a System of Linear Equations
The system of linear equations
a_11 x_1 + a_12 x_2 + a_13 x_3 + ... + a_1n x_n = b_1
a_21 x_1 + a_22 x_2 + a_23 x_3 + ... + a_2n x_n = b_2
...
a_m1 x_1 + a_m2 x_2 + a_m3 x_3 + ... + a_mn x_n = b_m
can be written in matrix form AX = B, where

A = [ a_11  a_12  ...  a_1n ]    X = [ x_1 ]    B = [ b_1 ]
    [ a_21  a_22  ...  a_2n ]        [ x_2 ]        [ b_2 ]
    [  ...   ...  ...   ... ]        [ ... ]        [ ... ]
    [ a_m1  a_m2  ...  a_mn ]        [ x_n ]        [ b_m ]
Example:
x + y – z = 5
2x + 3y – 8z = 24
-x – 4y + 2z = 17
has matrix form:

[  1   1  -1 ] [ x ]   [  5 ]
[  2   3  -8 ] [ y ] = [ 24 ]
[ -1  -4   2 ] [ z ]   [ 17 ]
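As a numerical cross-check, a system like this can be solved directly. A NumPy sketch (the solver call is an addition, not part of the notes):

```python
import numpy as np

# Coefficient matrix, right-hand side, and unknowns for the example system
A = np.array([[ 1.0,  1.0, -1.0],
              [ 2.0,  3.0, -8.0],
              [-1.0, -4.0,  2.0]])
B = np.array([5.0, 24.0, 17.0])

X = np.linalg.solve(A, B)     # solves AX = B for X
print(np.allclose(A @ X, B))  # True: the solution satisfies every equation
```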
Definitions:
An mxn matrix A is a rectangular array of real numbers with m rows and n columns. (Rows are horizontal and columns are vertical.)
A column vector is an mx1 matrix; a row vector is a 1xn matrix.

C = [ c_1 ]        R = [ r_1  r_2  ...  r_n ]
    [ c_2 ]
    [ ... ]
    [ c_m ]
A scalar, d, is a matrix with a single element.
d = [3]
The numbers m and n are the dimensions of A.
The real numbers in the matrix are called its entries or elements.
The element in row i and column j is denoted a_ij or A_ij.
A = is a 4x3 matrix; element a_42 = -6
A square matrix has dimensions nxn
S = [ 2    -1    5 ]
    [ 8     5    7 ]
    [ 0.9   2    6 ]
The trace of a square matrix is the sum of the entries on the main diagonal
tr(S) = 2 + 5 + 6 = 13
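The trace can be checked numerically. A NumPy sketch (the off-diagonal entries of S here are the ones used in the determinant example later in these notes):

```python
import numpy as np

# The square matrix S with main diagonal 2, 5, 6
S = np.array([[2.0, -1.0, 5.0],
              [8.0,  5.0, 7.0],
              [0.9,  2.0, 6.0]])

print(np.trace(S))  # 13.0: the sum of the main-diagonal entries
```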
The transpose, A^T or A', of a matrix A is obtained by writing the rows as columns.
If A is an mxn matrix, then A^T is an nxm matrix with (A^T)_ij = a_ji.
A = and AT =
A matrix is symmetric if AT = A
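A quick symmetry check with hypothetical matrices (since the transpose swaps indices, a symmetric matrix must satisfy a_ij = a_ji):

```python
import numpy as np

# M is symmetric (hypothetical entries); N is not
M = np.array([[1.0, 4.0, 2.0],
              [4.0, 3.0, 0.0],
              [2.0, 0.0, 5.0]])
N = np.array([[1.0, 2.0],
              [7.0, 3.0]])

print(np.array_equal(M.T, M))  # True: M equals its transpose
print(np.array_equal(N.T, N))  # False
```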
A diagonal matrix is a square matrix that has 0's everywhere except on the main diagonal; it is automatically symmetric.
D = [ d_1   0    0  ]
    [  0   d_2   0  ]
    [  0    0   d_3 ]
The identity matrix, I, is a special case of a diagonal matrix; it has 1’s on the main diagonal and 0’s elsewhere
Iij = 1 if i = j, 0 for i ≠ j
I = [ 1  0  0 ]
    [ 0  1  0 ]
    [ 0  0  1 ]
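Diagonal and identity matrices in one sketch (np.diag and np.eye; the diagonal entries are hypothetical):

```python
import numpy as np

D = np.diag([3.0, 7.0, 2.0])  # zeros everywhere except the main diagonal
I = np.eye(3)                 # 1's on the diagonal, 0's elsewhere

print(np.array_equal(D, D.T))                  # True: diagonal matrices are symmetric
print(np.array_equal(I, np.diag(np.ones(3))))  # True: I is a special diagonal matrix
```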
A determinant (det(A) or |A|) is a function that associates a scalar with every square (nxn) matrix; its form depends on n.
For a 2x2 matrix, the determinant is defined to be

det [ a  b ] = ad - bc
    [ c  d ]

For a 3x3 matrix, the determinant is defined to be

det [ a  b  c ]
    [ d  e  f ] = a(ei - fh) - b(di - fg) + c(dh - eg)
    [ g  h  i ]
Example: if S = [ 2    -1    5 ]
                [ 8     5    7 ]
                [ 0.9   2    6 ]
then
det(S) = |S| = 2(5*6 - 7*2) - (-1)(8*6 - 7*0.9) + 5(8*2 - 5*0.9)
       = 2*5*6 - 2*7*2 + 1*8*6 - 1*7*0.9 + 5*8*2 - 5*5*0.9 = 131.2
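The hand expansion can be verified with np.linalg.det:

```python
import numpy as np

S = np.array([[2.0, -1.0, 5.0],
              [8.0,  5.0, 7.0],
              [0.9,  2.0, 6.0]])

print(round(np.linalg.det(S), 1))  # 131.2, matching the cofactor expansion
```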
A general determinant for a matrix A (cofactor expansion along any row i):
|A| = sum over j of (-1)^(i+j) a_ij |M_ij|, where M_ij is the minor of matrix A formed by eliminating row i and column j from A.
** If the determinant of a matrix is 0, the matrix is said to be singular, and thus not invertible.
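A minimal illustration of a singular matrix (hypothetical entries; the second row is twice the first, so the determinant is 0):

```python
import numpy as np

P = np.array([[1.0, 2.0],
              [2.0, 4.0]])  # row 2 = 2 * row 1

print(np.linalg.det(P) == 0.0)  # True: P is exactly singular
try:
    np.linalg.inv(P)
except np.linalg.LinAlgError:
    print("P is singular and not invertible")
```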
Matrix Operations:
Matrices can be added, subtracted, and multiplied, but there is no matrix division.
Addition:
If A and B have the same dimensions, then their sum, A + B, is obtained by adding the corresponding entries.
(A + B)ij = aij + bij
Example: Let A = and B = then
A + B =
A – B =
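A sketch of entrywise addition and subtraction, with hypothetical matrices (the ones in the notes were lost in transcription):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, -1.0],
              [0.0,  2.0]])

print(A + B)  # each entry is a_ij + b_ij
print(A - B)  # each entry is a_ij - b_ij
```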
Scalar Multiplication:
If A is a matrix and c is a scalar, then the scalar multiple, cA, is obtained by multiplying every entry in A by c.
(cA)ij = c(Aij).
Example: Let A = , then 2A =
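Scalar multiplication in the same style, with a hypothetical A:

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [0.0,  3.0]])

print(2 * A)  # every entry of A multiplied by the scalar 2
```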
Product:
If A has dimensions m×n and B has dimensions n×p, then the product AB is defined, and has dimensions m×p. The entry (AB)ij is obtained by multiplying row i of A by column j of B, which is done by multiplying corresponding entries together and then adding the results.
Example: Let A = and B = then
(AB)_11 = 1*2 + (-1)*0.5 = 1.5
(AB)_12 = 1*5 + (-1)*(-3) = 8, etc.
AB =
If we multiply a row vector by a column vector, we obtain a scalar:
(1 x n)(n x 1) = (1 x 1)
In general AB does not equal BA: matrix multiplication is not commutative.
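A product sketch. The first row of A and both columns of B are chosen to reproduce the (AB)_11 and (AB)_12 computations above; the remaining entries are hypothetical:

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [2.0,  0.0]])
B = np.array([[2.0,  5.0],
              [0.5, -3.0]])

AB = A @ B
print(AB[0, 0])                   # 1*2 + (-1)*0.5 = 1.5
print(AB[0, 1])                   # 1*5 + (-1)*(-3) = 8
print(np.array_equal(AB, B @ A))  # False: the product is not commutative
```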
The inverse of a matrix A, written A^-1, is the analog of division for real numbers.
§ only square matrices can have an inverse
§ for real numbers x * x^-1 = 1; for matrices A A^-1 = A^-1 A = I
This leads to:
if AX = B, then A^-1 A X = A^-1 B, so X = A^-1 B
To find A^-1, row reduce A (using matrix row operations):
first, write a double-wide augmented matrix with the identity matrix on the right half:
[A | I] =
new row 2 = -R1 + R2:
new row 3 = -R1+R3:
new row 1 = -3R2+R1:
new row 1 = -3R3+R1:
Thus A^-1 =
Homework: Check – what does A * A^-1 equal?
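The defining property can be checked numerically, using NumPy's inv in place of hand row reduction (the matrix below is hypothetical, since the original A was lost in transcription):

```python
import numpy as np

A = np.array([[1.0, 3.0, 0.0],
              [1.0, 1.0, 2.0],
              [0.0, 2.0, 1.0]])  # hypothetical invertible matrix

A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(3)))  # True: A A^-1 = I
print(np.allclose(A_inv @ A, np.eye(3)))  # True: A^-1 A = I
```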
What happens if a row (or column) is the sum of prior rows?
P = ; here column 3 is the sum of columns 1 and 2
Homework: What is P^-1?
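A sketch of what happens in this situation, with hypothetical entries chosen so that column 3 = column 1 + column 2:

```python
import numpy as np

P = np.array([[1.0, 2.0, 3.0],
              [4.0, 0.0, 4.0],
              [2.0, 5.0, 7.0]])  # column 3 is the sum of columns 1 and 2

print(np.linalg.matrix_rank(P))      # 2: the columns are linearly dependent
print(abs(np.linalg.det(P)) < 1e-9)  # True: det is 0, so P^-1 does not exist
```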
SSCP
Let X =
then X'X =
The diagonal values are sums of squares and the off-diagonal values are sums of cross products. The matrix X'X is an SSCP (Sums of Squares and Cross Products) matrix.
In general, (X'X)_jk = sum over i of X_ij X_ik: the (j, k) entry is the sum over observations of the products of columns j and k.
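The SSCP computation in NumPy, with a hypothetical 4 x 2 data matrix (4 observations on 2 variables):

```python
import numpy as np

X = np.array([[1.0, 2.0],
              [3.0, 1.0],
              [2.0, 4.0],
              [0.0, 3.0]])

sscp = X.T @ X
print(sscp[0, 0] == np.sum(X[:, 0] ** 2))       # True: diagonal = sums of squares
print(sscp[0, 1] == np.sum(X[:, 0] * X[:, 1]))  # True: off-diagonal = cross products
print(np.array_equal(sscp, sscp.T))             # True: an SSCP matrix is symmetric
```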
The Six Basic Matrices
Raw Score Matrix (Data Matrix) [vertical]
Raw Score SSCP [square]
Deviation Score Matrix [vertical]
Deviation SSCP [square]
Covariance Matrix [square]
Correlation Matrix [square]
Example:
Raw Score Matrix (Data Matrix)
X = vertical matrix
Raw Score SSCP
X'X = Square Symmetric Matrix
Deviation Score Matrix
D = Vertical Matrix (what are the means of the columns now?)
Deviation Score SSCP
The deviation score matrix is calculated by subtracting the column mean for the variable from each score in the data set: let x_i = X_i – m, where m is that column's mean.
S = D'D = Square Symmetric Matrix
Covariance Matrix
The SSCP matrix divided by N (or N-1) is called the variance-covariance matrix. In it, we have variances on the diagonal and covariances off the main diagonal.
C = Square Symmetric Matrix
Correlation Matrix
If we further divide each entry by the standard deviations of its row variable and its column variable, we have a correlation matrix:
R = Square Symmetric Matrix
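The whole chain from data matrix to correlation matrix can be sketched in NumPy (hypothetical data; the covariance uses N - 1, matching np.cov's default):

```python
import numpy as np

# Hypothetical data matrix: 4 observations on 2 variables
X = np.array([[1.0, 2.0],
              [3.0, 1.0],
              [2.0, 4.0],
              [0.0, 3.0]])
n = X.shape[0]

D = X - X.mean(axis=0)     # deviation score matrix; column means are now 0
S = D.T @ D                # deviation SSCP
C = S / (n - 1)            # covariance matrix: variances on the diagonal
sd = np.sqrt(np.diag(C))   # standard deviations of the variables
R = C / np.outer(sd, sd)   # correlation matrix

print(np.allclose(D.mean(axis=0), 0))    # True: deviation scores have mean 0
print(np.allclose(C, np.cov(X.T)))       # True: matches NumPy's covariance
print(np.allclose(R, np.corrcoef(X.T)))  # True: matches NumPy's correlation
```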
Matrix Formulae for Regression Coefficients
In theoretical form, the regression equation is written:
Y = β0 + β1X1 + ... + βkXk + ε
In raw score form, this becomes:
Yi = b0 + b1X1i + ... + bkXki + ei
In matrix form: Y = Xb + e
where Y, X, b and e are matrices
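The notes end before stating the coefficient formula itself; the standard least-squares result is b = (X'X)^-1 X'Y, which can be sketched as follows (simulated data; the design matrix gets a column of 1's for the intercept b0):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 2
predictors = rng.normal(size=(n, k))            # hypothetical predictor data
X = np.column_stack([np.ones(n), predictors])   # prepend 1's for the intercept
beta_true = np.array([1.0, 2.0, -0.5])
Y = X @ beta_true + rng.normal(scale=0.1, size=n)

# Solve the normal equations (X'X) b = X'Y rather than inverting explicitly
b = np.linalg.solve(X.T @ X, X.T @ Y)
print(b)  # close to beta_true

# Cross-check against NumPy's least-squares solver
b_lstsq = np.linalg.lstsq(X, Y, rcond=None)[0]
print(np.allclose(b, b_lstsq))  # True
```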