Matrix Math

Systems of linear equations can be solved using "matrix math".  This section explains some of these techniques.

Contents of this section:


Matrix Definitions defines order and rank, scalar and matrix multiplication, properties (such as associativity and commutativity), etc.

The "Determinant" is a scalar value obtained by algebraically summing products of cells in different rows and columns.  This page explains it, gives an algorithm to find it, and tells you why it is useful.

RREF is "Reduced Row Echelon Form", a.k.a. Gauss-Jordan elimination: a way of solving a system of linear equations.

Cramer's Rule is another way of solving a system of linear equations, especially useful if you have a quick way of getting the determinant of a matrix.
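
As a rough sketch of the last two items (illustrative Python, not code taken from the Determinant or Cramer's Rule pages): the determinant computed by cofactor expansion along the first row, and Cramer's Rule built on top of it.  The matrix and right-hand side at the bottom come from the example system shown below.

def det(m):
    """Determinant of a square matrix (list of rows) by cofactor expansion."""
    if len(m) == 1:
        return m[0][0]
    total = 0
    for col in range(len(m)):
        # Minor: delete row 0 and the current column.
        minor = [row[:col] + row[col + 1:] for row in m[1:]]
        # Signs alternate +, -, +, ... along the first row.
        total += (-1) ** col * m[0][col] * det(minor)
    return total

def cramer(a, b):
    """Solve a x = b by Cramer's Rule: x_i = det(A_i) / det(A),
    where A_i is A with column i replaced by b."""
    d = det(a)
    return [det([row[:i] + [b[r]] + row[i + 1:]
                 for r, row in enumerate(a)]) / d
            for i in range(len(a))]

a = [[3, 2, 2], [1, 2, -1], [2, -4, 1]]   # coefficients of the example system below
b = [3, 5, 0]                             # right-hand side
print(det(a))        # -28
print(cramer(a, b))  # [2.0, 0.5, -2.0]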

An example of a system of linear equations is

3x + 2y + 2z = 3
x + 2y - z = 5
2x - 4y + z = 0

A traditional way of solving this is to multiply pairs of equations through by different numbers so they can be added together to eliminate variables.  This method is tedious, but it can be automated by observing that elementary row operations on the augmented matrix of coefficients give the same results.
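
For instance, here is a minimal sketch of that automation (illustrative Python, not code from the RREF page): build the augmented matrix [A | b] and apply row operations until the left-hand block becomes the identity; the last column is then the solution.

def gauss_jordan(a, b):
    """Solve a x = b by reducing the augmented matrix [a | b] to reduced row echelon form."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]   # augmented matrix [A | b]
    for col in range(n):
        # Swap up the row with the largest entry in this column (partial pivoting).
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        # Scale the pivot row so the pivot entry becomes 1.
        p = m[col][col]
        m[col] = [v / p for v in m[col]]
        # Subtract multiples of the pivot row to clear this column in every other row.
        for r in range(n):
            if r != col:
                factor = m[r][col]
                m[r] = [rv - factor * pv for rv, pv in zip(m[r], m[col])]
    return [row[n] for row in m]

a = [[3, 2, 2], [1, 2, -1], [2, -4, 1]]
b = [3, 5, 0]
print(gauss_jordan(a, b))   # [2.0, 0.5, -2.0]

This agrees with the Cramer's Rule sketch above: x = 2, y = 1/2, z = -2, which you can check by substituting back into the three equations.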

Internet references

A Brief History of Linear Algebra and Matrix Theory, by Marie A. Vitulli

Related pages in this website

Linear Patterns -- a very basic introduction to the concept of linear relationships
Vectors -- the "dot" product and the "cross" product, explained.