Linear Algebra
John Niclasen

Contents:
1. Week 1
 1.1 Systems of Linear Equations
 1.2 Row Echelon Form
 1.3 Matrix Algebra
2. Week 2
 2.1 Matrix Algebra
 2.2 Elementary Matrices
 2.3 The Determinant of a Matrix
1. Week 1

1.1 Systems of Linear Equations
A linear system of m equations in n unknowns is a system of the form \displaylines{a_{11}x_1+a_{12}x_2+\ldots +a_{1n}x_n=b_1\cr
a_{21}x_1+a_{22}x_2+\ldots +a_{2n}x_n=b_2\cr
\vdots \cr
a_{m1}x_1+a_{m2}x_2+\ldots +a_{mn}x_n=b_m\cr
}

Consistency
If a system has at least one solution, we say that it is consistent.
If a system has no solution, we say that it is inconsistent. 
Equivalent Systems
Two systems of equations involving the same variables are said to be equivalent if they have the same solution set. 
Strict Triangular Form
A system is said to be in strict triangular form if in the kth equation the coefficients of the first k-1 variables are all zero and the coefficient of x_k is nonzero (k=1,\ldots ,n). 
Back Substitution
First, the nth equation is solved for the value of x_n. This value is used in the (n-1)st equation to solve for x_{n-1}. The values x_n and x_{n-1} are used in the (n-2)nd equation to solve for x_{n-2}, and so on. We will refer to this method of solving a strictly triangular system as back substitution. 
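The back-substitution procedure above can be sketched in code. This is a minimal illustration, not part of the notes; the function name and the use of plain Python lists are my own choices.

```python
def back_substitute(a, b):
    """Solve a strictly (upper) triangular system a x = b.

    a: n x n list of lists with a[k][k] != 0 and a[k][j] == 0 for j < k.
    b: right-hand side, a list of length n.
    """
    n = len(b)
    x = [0.0] * n
    # Work upward from the nth equation, as in the text.
    for k in range(n - 1, -1, -1):
        # Sum of already-known terms a_{k,k+1} x_{k+1} + ... + a_{k,n} x_n.
        s = sum(a[k][j] * x[j] for j in range(k + 1, n))
        x[k] = (b[k] - s) / a[k][k]
    return x
```

For the system 2x_1 + x_2 = 5, 3x_2 = 6, this returns x_2 = 2 and then x_1 = (5 - 2)/2 = 1.5.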
1.1.1 Matrix
The term matrix means simply a rectangular array of numbers. A matrix having m rows and n columns is said to be m\times n. \displaylines{\left[\matrix{a_{11}&a_{12}&\ldots &a_{1n}\cr
a_{21}&a_{22}&\ldots &a_{2n}\cr
\vdots \cr
a_{m1}&a_{m2}&\ldots &a_{mn}\cr
}\right]\cr
}

1.1.2 Augmented Matrix

\displaylines{A=\left[\matrix{a_{11}&a_{12}&\ldots &a_{1n}\cr
a_{21}&a_{22}&\ldots &a_{2n}\cr
\vdots \cr
a_{m1}&a_{m2}&\ldots &a_{mn}\cr
}\right],\quad B=\left[\matrix{b_{11}&b_{12}&\ldots &b_{1r}\cr
b_{21}&b_{22}&\ldots &b_{2r}\cr
\vdots \cr
b_{m1}&b_{m2}&\ldots &b_{mr}\cr
}\right]\cr
(A\,\vert\, B)=\left [ \matrix{a_{11}&a_{12}&\ldots &a_{1n}\cr
\vdots \cr
a_{m1}&a_{m2}&\ldots &a_{mn}\cr
}\right\vert\left.\matrix{b_{11}&b_{12}&\ldots &b_{1r}\cr
\vdots \cr
b_{m1}&b_{m2}&\ldots &b_{mr}\cr
}\right ] \cr
}
In an augmented matrix the left-hand side is called the coefficient matrix.

Elementary Row Operations
 I.  Interchange two rows.
 II.  Multiply a row by a nonzero real number.
 III.  Replace a row by its sum with a multiple of another row.
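The three elementary row operations can be written as small helper functions. This is an illustrative sketch with matrices as Python lists of rows; the function names are my own.

```python
def swap_rows(A, i, j):
    """Operation I: interchange rows i and j."""
    A[i], A[j] = A[j], A[i]

def scale_row(A, i, c):
    """Operation II: multiply row i by a nonzero number c."""
    A[i] = [c * v for v in A[i]]

def add_multiple(A, i, j, c):
    """Operation III: replace row i by its sum with c times row j."""
    A[i] = [u + c * v for u, v in zip(A[i], A[j])]
```

Each operation is reversible (swap again, scale by 1/c, add -c times row j), which is why they preserve the solution set.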

1.2 Row Echelon Form

Row Echelon Form
A matrix is said to be in row echelon form if:
 (i)  The first nonzero entry in each nonzero row is 1.
 (ii)  If row k does not consist entirely of zeros, the number of leading zero entries in row k+1 is greater than the number of leading zero entries in row k.
 (iii)  If there are rows whose entries are all zero, they are below the rows having nonzero entries. 

Gaussian Elimination
The process of using row operations I, II, and III to transform a linear system into one whose augmented matrix is in row echelon form is called Gaussian elimination. 
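Gaussian elimination can be sketched as follows: this is a minimal illustration using plain Python lists (no pivot scaling to leading 1, no partial pivoting for numerical stability), and the function name is my own.

```python
def gaussian_eliminate(M):
    """Reduce the augmented matrix M (list of rows) toward row echelon
    form in place, using row operations I and III; returns M."""
    rows, cols = len(M), len(M[0])
    r = 0                                   # next pivot row
    for c in range(cols - 1):               # last column is the RHS
        # Find a row at or below r with a nonzero entry in column c.
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue                        # no pivot in this column
        M[r], M[pivot] = M[pivot], M[r]     # operation I
        # Eliminate entries below the pivot (operation III).
        for i in range(r + 1, rows):
            factor = M[i][c] / M[r][c]
            M[i] = [u - factor * v for u, v in zip(M[i], M[r])]
        r += 1
        if r == rows:
            break
    return M
```

Applied to the augmented matrix of x_1 + x_2 = 3, x_1 - x_2 = 1, the second row becomes (0, -2 | -2), from which back substitution gives x_2 = 1 and x_1 = 2.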
If, during Gaussian elimination, one or more rows end up with all zeros in the coefficient matrix but a nonzero entry on the right-hand side, the linear system is inconsistent.

1.2.1 Over/Underdetermined Systems
A linear system is said to be overdetermined if there are more equations than unknowns. Overdetermined systems are usually (but not always) inconsistent.
A system of m linear equations in n unknowns is said to be underdetermined if there are fewer equations than unknowns (m<n). Although it is possible for underdetermined systems to be inconsistent, they are usually consistent with infinitely many solutions. It is not possible for an underdetermined system to have only one solution.

Reduced Row Echelon Form
A matrix is said to be in reduced row echelon form if:
 (i)  The matrix is in row echelon form.
 (ii)  The first nonzero entry in each row is the only nonzero entry in its column. 

1.2.2 Homogeneous Systems
A system of linear equations is said to be homogeneous if the constants on the right-hand side are all zero.
1.3 Matrix Algebra

1.3.1 Matrix Notation

\displaylines{A=\left[\matrix{a_{11}&a_{12}&\ldots &a_{1n}\cr
a_{21}&a_{22}&\ldots &a_{2n}\cr
\vdots \cr
a_{m1}&a_{m2}&\ldots &a_{mn}\cr
}\right]\cr
}

1.3.2 Multiply by a scalar

Multiply each element by the scalar.

1.3.3 Sum

Add two m\times n matrices element by element.

Linear Combination
If \bar a_1,\bar a_2,\ldots ,\bar a_n are vectors in \mathbb{R} ^m and c_1,c_2,\ldots ,c_n are scalars, then a sum of the form \displaylines{c_1\bar a_1+c_2\bar a_2+\ldots +c_n\bar a_n}
is said to be a linear combination of the vectors \bar a_1,\bar a_2,\ldots ,\bar a_n. 
Consistency Theorem for Linear Systems
A linear system A\bar x=\bar b is consistent if and only if \bar b can be written as a linear combination of the column vectors of A. 
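The consistency theorem can be checked numerically: \bar b is a linear combination of the columns of A exactly when appending \bar b to A does not increase the rank. This sketch assumes NumPy is available; the function name and example matrices are my own.

```python
import numpy as np

def is_consistent(A, b):
    """A x = b is consistent iff rank(A) == rank(A | b)."""
    aug = np.column_stack([A, b])
    return np.linalg.matrix_rank(A) == np.linalg.matrix_rank(aug)

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])          # second column = 2 * first column
b_good = np.array([3.0, 6.0])       # 3 times the first column: consistent
b_bad = np.array([3.0, 7.0])        # outside the column space: inconsistent
```

Here rank(A) = 1; appending b_good keeps the rank at 1, while appending b_bad raises it to 2, so only the first system is consistent.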
2. Week 2

2.1 Matrix Algebra

2.1.1 Product
The (i,j) entry of the product is obtained by multiplying the entries of the ith row of the first matrix with the corresponding entries of the jth column of the second and summing. The product of an m\times n matrix and an n\times r matrix is an m\times r matrix.

2.1.2 Algebraic Rules

Theorem 1.3.2
Each of the following statements is valid for any scalars \alpha and \beta and for any matrices A, B and C for which the indicated operations are defined.  A+B=B+A
 (A+B)+C=A+(B+C)
 (AB)C=A(BC)
 A(B+C)=AB+AC
 (A+B)C=AC+BC
 (\alpha \beta )A=\alpha (\beta A)
 \alpha (AB)=(\alpha A)B=A(\alpha B)
 (\alpha +\beta )A=\alpha A+\beta A
 \alpha (A+B)=\alpha A+\alpha B

Warning:
In general, AB\not =BA. Matrix multiplication is not commutative. 
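A concrete pair of matrices shows the warning in action. This example assumes NumPy; any 2x2 pair that fails to commute would do.

```python
import numpy as np

A = np.array([[1, 1],
              [0, 1]])
B = np.array([[1, 0],
              [1, 1]])

AB = A @ B      # [[2, 1], [1, 1]]
BA = B @ A      # [[1, 1], [1, 2]]
```

The two products differ in every position except (1,2) and (2,1)'s symmetric partners, so AB = BA fails here.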
2.1.3 The Identity Matrix
The n\times n identity matrix is the matrix I=(\delta _{ij}), where \displaylines{\delta _{ij}=\cases{1&i = j\cr
0&i \not = j\cr
}\cr
} \displaylines{I=\left[\matrix{1&0&0&\cdots &0\cr
0&1&0&\cdots &0\cr
0&0&1&\ddots &\vdots \cr
\vdots &\vdots &\ddots &\ddots &0\cr
0&0&\cdots &0&1\cr
}\right]\cr
} Matrix Inversion
An n\times n matrix A is said to be nonsingular or invertible if there exists a matrix B such that AB=BA=I. The matrix B is said to be a multiplicative inverse of A. 
Singular
An n\times n matrix is said to be singular if it does not have a multiplicative inverse. 
Theorem 1.3.3
If A and B are nonsingular n\times n matrices, then AB is also nonsingular and (AB)^{-1}=B^{-1}A^{-1}. 
2.1.4 The Transpose of a MatrixTranspose
The transpose of an m\times n matrix A is the n\times m matrix B defined by \displaylines{b_{ji}=a_{ij}}
for j=1,\ldots ,n and i=1,\ldots ,m. The transpose of A is denoted by A^T. 
2.1.5 Algebraic Rules for Transposes

 (A^T)^T=A
 (\alpha A)^T=\alpha A^T
 (A+B)^T=A^T+B^T
 (AB)^T=B^TA^T
Symmetric
An n\times n matrix is said to be symmetric if A^T=A. 
2.2 Elementary Matrices

2.2.1 Equivalent Systems
Given an m\times n linear system A\bar x=\bar b, we can obtain an equivalent system by multiplying both sides of the equation by a nonsingular m\times m matrix M. \displaylines{A\bar x=\bar b\cr
\Leftrightarrow \quad MA\bar x=M\bar b\cr
} Elementary Matrices
Type I. Interchange two rows of I.
Type II. Multiply a row of I by a nonzero constant.
Type III. Add a multiple of one row of I to another row. 
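The three types can be built by applying the corresponding row operation to I, and left-multiplying by the result performs that same operation on any matrix A. A small NumPy sketch (the specific matrices chosen are my own):

```python
import numpy as np

I = np.eye(3)

E1 = I[[1, 0, 2]]                  # type I: rows 0 and 1 of I interchanged
E2 = np.diag([1.0, 1.0, 5.0])      # type II: row 2 of I multiplied by 5
E3 = I.copy()
E3[1, 0] = -2.0                    # type III: -2 * row 0 added to row 1

A = np.arange(9.0).reshape(3, 3)
# E1 @ A swaps rows 0 and 1 of A; E2 @ A scales row 2 of A by 5;
# E3 @ A replaces row 1 of A with row 1 minus 2 * row 0.
```

Note the side of the multiplication matters: E acts on the rows of A only when it multiplies from the left.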
Theorem 1.4.1
If E is an elementary matrix, then E is nonsingular and E^{-1} is an elementary matrix of the same type. 
Theorem 1.4.2
Equivalent Conditions for Nonsingularity
Let A be an n\times n matrix. The following are equivalent:  (a)  A is nonsingular.   (b)  A\bar x=\bar 0 has only the trivial solution \bar 0.   (c)  A is row equivalent to I. 

Corollary 1.4.3
The system of n linear equations in n unknowns A\bar x=\bar b has a unique solution if and only if A is nonsingular. 
2.3 The Determinant of a Matrix

Definition
Let A=(a_{ij}) be an n\times n matrix and let M_{ij} denote the (n-1)\times (n-1) matrix obtained from A by deleting the row and column containing a_{ij}. The determinant of M_{ij} is called the minor of a_{ij}. We define the cofactor A_{ij} of a_{ij} by \displaylines{A_{ij}=(-1)^{i+j}\det (M_{ij})} 
Determinant
The determinant of an n\times n matrix A, denoted \det (A), is a scalar associated with the matrix A that is defined inductively as follows: \displaylines{\det (A)=\cases{a_{11}&n=1\cr
a_{11}A_{11}+a_{12}A_{12}+\ldots +a_{1n}A_{1n}&n>1\cr
}\cr
}
where \displaylines{A_{1j}=(-1)^{1+j}\det (M_{1j})\quad j=1,\ldots ,n}
are the cofactors associated with the entries in the first row of A. 
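The inductive definition translates directly into a recursive function. This is an illustrative sketch (cofactor expansion costs O(n!) operations, so it is only practical for small n); the function name is my own.

```python
def det(A):
    """Determinant by cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]                                   # base case: det = a_11
    total = 0
    for j in range(n):
        # M_{1j}: delete row 0 and column j.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        # (-1) ** j matches (-1)^{1+j} because j is 0-indexed here.
        total += (-1) ** j * A[0][j] * det(minor)
    return total
```

For example, det of [[1, 2], [3, 4]] expands to 1*4 - 2*3 = -2.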
