
Multivariate Statistics, Thoman, Spring 2009 1

Matrix Algebra Intro


All the computations you need to know

Why do I need to know matrix algebra?


Understanding the matrix algebra gives you a better conceptual understanding of what's going on
Multiple predictor regression (with p-1 predictors) in scalar equations gets complicated. The matrix expression is simple, and does not change. Computer programs (like SPSS) use the matrix form.

Matrix algebra is the link between regression and many multivariate techniques. Much of the advanced stats literature is matrix-based. Many advanced stats programs (e.g., SEM programs) are matrix-based.

A matrix is a rectangular arrangement of numbers in rows and columns


Here are some you should already be familiar with


The general form of a matrix


We say that the order of this matrix is m rows by n columns (an m x n matrix)

Important terms: element

An element aij is the number in the ith row and jth column. For example, element (2, 3) is a23.
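As an illustration (using NumPy, which is not part of these notes), elements are addressed by their row and column indices:

```python
import numpy as np

# A 3 x 3 matrix: rows are the first index, columns the second
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# Element a23 (2nd row, 3rd column); note NumPy counts from 0
a23 = A[1, 2]
print(a23)  # 6
```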

Important terms: Square


A square matrix is one in which the number of rows (m) equals the number of columns (n)


Important terms: Main diagonal

Important terms: Symmetrical

Important terms: Triangular


Important terms: Identity Matrix

Important terms: Scalar


Any number not in a matrix is called a scalar, and algebra among scalars is called scalar algebra. Scalars and matrices can interact! Very important: a 1 x 1 matrix is not a scalar! To help distinguish between scalars and matrices, I will use the following notation: a letter representing a matrix will always be capital (e.g., A), and a matrix that displays all its elements will be surrounded by brackets.

Important terms: Vector


When are two matrices equal?


Two matrices are equal if and only if they have the same order and all corresponding elements are equal


Operations in Matrix Algebra


There are several operations with which you will need to be familiar. Some of these have an equivalent in scalar algebra, others do not.
Transpose: reflects the matrix across its main diagonal
We use the prime symbol (′) to indicate a transpose: A′ is the matrix in which element aij of A becomes element aji
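A quick sketch in NumPy (an illustration, not part of the original notes): transposing swaps the row and column indices, so a 3 x 2 matrix becomes 2 x 3.

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])   # order 3 x 2

At = A.T                 # the transpose A', order 2 x 3

# Element aij of A equals element aji of A'
print(A[2, 0] == At[0, 2])  # True
```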


Operation: Trace
Trace is the sum of the diagonal elements (main diagonal)

For a correlation matrix, the trace is the number of variables. For a covariance matrix, the trace is the sum of the variances of all variables.

Arithmetic Operations: Addition and Subtraction


To add or subtract two matrices, they must be of the same order (same number of rows and columns)
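When the orders match, addition and subtraction are element-wise, as in this NumPy illustration (example values are my own):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Same order (2 x 2), so the operations apply element by element
print(A + B)  # [[ 6  8] [10 12]]
print(A - B)  # [[-4 -4] [-4 -4]]
```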

Multiplication
To multiply a scalar by a matrix, merely multiply each element in the matrix by the scalar
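In code (a NumPy sketch, not from the notes), scalar multiplication simply scales every element:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])

# Multiply the scalar 3 by each element of A
print(3 * A)  # [[ 3  6] [ 9 12]]
```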


Multiplication of Matrices
To multiply matrices (this applies to vectors, too), the number of columns of the first matrix must equal the number of rows of the second matrix; otherwise the matrices are nonconformable. Examples:
A(3x2) * B(2x6) is conformable (has a solution). A(3x2) * A(3x2) is nonconformable (no solution). A * A′ is conformable, because transposing A (giving A′) reverses the rows and columns (A(3x2) * A′(2x3)).

The result will be a matrix of order equal to the number of rows in the first matrix and the number of columns from the second matrix
A(3x2) * B(2x6) = a matrix with 3 rows and 6 columns
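The conformability rule and the order of the result can both be checked directly (a NumPy illustration using randomly filled matrices of the sizes discussed above):

```python
import numpy as np

A = np.random.rand(3, 2)  # order 3 x 2
B = np.random.rand(2, 6)  # order 2 x 6

# Conformable: columns of A (2) = rows of B (2)
C = A @ B
print(C.shape)            # (3, 6): rows of A by columns of B

# A @ A would raise an error (nonconformable), but A @ A' works:
print((A @ A.T).shape)    # (3, 3)
```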

Multiplication Examples

Another Multiplication Example


Try two equations

AB is not generally equal to BA
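A small NumPy demonstration of noncommutativity (my own example matrices):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])

# Pre-multiplying and post-multiplying give different results
print(A @ B)  # [[2 1] [4 3]]
print(B @ A)  # [[3 4] [1 2]]
print(np.array_equal(A @ B, B @ A))  # False
```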

Matrix Inversion
There is no direct division in matrix algebra. However, we can carry out division by multiplying by a reciprocal. In scalar algebra, we know a-1 = 1/a, and a/a = 1, so a-1a = 1. In matrix algebra, we draw on this last fact to carry out division. That is:
A-1A = I, where I is an identity matrix. Also, AA-1 = I (this is one case where pre- or post-multiplying does not matter). Matrix division merely requires us to find the matrix that, when multiplied by A, equals the identity matrix.
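Both orderings can be verified numerically (a NumPy sketch with an invertible example matrix of my choosing):

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])

A_inv = np.linalg.inv(A)

# Both A^-1 A and A A^-1 give the identity matrix I
print(np.allclose(A_inv @ A, np.eye(2)))  # True
print(np.allclose(A @ A_inv, np.eye(2)))  # True
```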

Finding the determinant


The first step toward calculating the inverse matrix A-1 is to calculate the determinant. It is easy to calculate for a 2 x 2 matrix: the determinant is a11*a22 - a12*a21.
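The 2 x 2 formula checked by hand and against NumPy (an illustration with a made-up matrix):

```python
import numpy as np

# For a 2 x 2 matrix, the determinant is a11*a22 - a12*a21
A = np.array([[4.0, 7.0],
              [2.0, 6.0]])

det_by_hand = 4.0 * 6.0 - 7.0 * 2.0  # = 10
print(det_by_hand)                   # 10.0
print(np.linalg.det(A))              # 10.0 (within rounding)
```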


Properties of the determinant


As we move beyond a 2 x 2 matrix, the calculation rapidly gets more complicated. The determinant of a matrix is sometimes said to represent the generalized variance of the matrix (a volume). It is possible to have a negative determinant. The case we are most concerned with is when the determinant equals zero.

Determinant as generalized variance


Let's get a sense of why this is true. Imagine our matrix described the qualities of a square:
On the main diagonal are the length and width of the square. On the off diagonal is the correlation of the two dimensions (a right angle, so zero).

The determinant would be length * width minus zero squared.

In other words, the area.

So, if we think of this applied to a covariance matrix, where the main diagonal is variance (like length) and the off diagonal is covariance (like correlation):
The end result is the volume of the geometric space for the matrix.

Now, imagine we had a cube (three dimensions) where the length of one side is zero:
The volume is zero, and so the determinant is zero.
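The square example can be written out numerically (a NumPy sketch; the lengths 3 and 4 are my own choices):

```python
import numpy as np

# A "square" as a matrix: length and width on the main diagonal,
# zero on the off diagonal (right angle, no correlation)
S = np.array([[3.0, 0.0],   # length
              [0.0, 4.0]])  # width
print(np.linalg.det(S))     # 12.0 = length * width - 0^2, i.e., the area

# Collapse one dimension to zero: the "volume" vanishes
T = np.array([[3.0, 0.0],
              [0.0, 0.0]])
print(np.linalg.det(T))     # 0.0
```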

Calculating the Inverse

If the determinant equals 0, there is no inverse! This is what happens when you have multicollinearity. Terms used for this are: the matrix is "not positive definite," the matrix is "singular," or the matrix is "not invertible."
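A minimal sketch of how multicollinearity produces a singular matrix (NumPy, with made-up data; not from the notes):

```python
import numpy as np

# Perfect multicollinearity: the second variable is exactly 2x the first
x = np.array([1.0, 2.0, 3.0, 4.0])
X = np.vstack([x, 2 * x])

S = np.cov(X)  # 2 x 2 covariance matrix of the two variables

# Determinant is 0 (up to floating-point rounding): no inverse exists
print(np.linalg.det(S))
# np.linalg.inv(S) would fail here ("Singular matrix")
```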


Eigenvalues and Eigenvectors


Eigenvalues (characteristic roots) and eigenvectors (characteristic vectors) come from a technique for decomposing matrices used in principal components analysis and factor analysis. We will discuss them later in the course. For now, know that the determinant is intrinsic to this calculation, too (i.e., if the determinant is zero, then one or more eigenvalues are 0).
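The link between the determinant and the eigenvalues (the determinant equals their product, a standard result) can be seen numerically in this NumPy sketch with a small symmetric example:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals = np.linalg.eigvalsh(A)  # eigenvalues of a symmetric matrix

print(np.round(np.sort(eigvals), 6))  # [1. 3.]

# det(A) equals the product of the eigenvalues, so det = 0
# exactly when some eigenvalue is 0
print(np.isclose(np.prod(eigvals), np.linalg.det(A)))  # True
```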

Kronecker (Direct) Product (not to be confused with the dot product)


You will sometimes see the Kronecker product in the literature
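For reference, a NumPy illustration (example matrices my own): the Kronecker product multiplies every element of the first matrix by the entire second matrix, so a (2x2) with a (2x2) gives a 4 x 4 result.

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])

# Each element of A scales a full copy of B
K = np.kron(A, B)
print(K.shape)  # (4, 4)
print(K)
```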

Summary of Matrix Algebra


Matrix algebra gives simple expressions for combining matrices of any size, and computers are quick at it! As we shall see when looking at regression, a single matrix equation can describe an entire set of scalar equations. Matrix algebra can be done in SPSS and SAS.
