
Linear Algebra (I)

The core of linear algebra is the equation $Ax=b$. In the column and matrix pictures, the right-hand side of the equation is a vector $b$. Given a matrix $A$, do the linear combinations of its column vectors fill the $xy$-plane (or all of space, in the three-dimensional case)? If the answer is "no", we say that $A$ is a singular matrix; in this singular case its column vectors are linearly dependent.

From vectors to matrices to subspaces

Vectors

A vector is just an ordered collection of scalars (its components).
An example in $R^3$ would be:
$v = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$

Matrices

A matrix is just a collection of column vectors.
An example would be:
$A = \begin{bmatrix} 1 & 0 \\ 2 & 1 \\ 3 & 4 \end{bmatrix}$

Vector space/Subspace

A vector space is a collection of vectors that is closed under linear combinations. Given a matrix $A$ with columns in $R^3$, these columns and all their linear combinations form a subspace of $R^3$. This is the column space $C(A)$. A subspace is a vector space inside another vector space.
The subspaces of $R^2$ are:

  • all of $R^2$;
  • any line through the origin;
  • the zero vector alone.

The subspaces of $R^3$ are:

  • all of $R^3$;
  • any plane through the origin;
  • any line through the origin;
  • the zero vector alone.

Elimination with matrices

Gauss elimination

So the question arises: how do we judge whether a matrix is singular? In other words, do the linear combinations of its column vectors (its column space) fill all of $R^n$?
We repeat multiplications and subtractions (Gauss's elimination method) on the rows of the matrix to get an upper triangular matrix. An example would be:
$\begin{bmatrix} 1 & 2 & 1 \\ 3 & 8 & 1 \\ 0 & 4 & 1 \end{bmatrix} \rightarrow \begin{bmatrix} 1 & 2 & 1 \\ 0 & 2 & -2 \\ 0 & 4 & 1 \end{bmatrix} \rightarrow \begin{bmatrix} 1 & 2 & 1 \\ 0 & 2 & -2 \\ 0 & 0 & 5 \end{bmatrix}$
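Here is a minimal sketch of these row operations in Python with NumPy (assuming every pivot turns out nonzero, so no row exchanges are needed; the function name `forward_eliminate` is illustrative):

```python
import numpy as np

def forward_eliminate(A):
    # Reduce a square matrix to upper triangular form U by row operations.
    # Assumes every pivot is nonzero, so no row exchanges are handled.
    U = A.astype(float)
    n = U.shape[0]
    for col in range(n - 1):
        pivot = U[col, col]
        for row in range(col + 1, n):
            multiplier = U[row, col] / pivot
            U[row, col:] -= multiplier * U[col, col:]  # subtract a multiple of the pivot row
    return U

A = np.array([[1, 2, 1], [3, 8, 1], [0, 4, 1]])
print(forward_eliminate(A))  # upper triangular, with pivots 1, 2, 5 on the diagonal
```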

We define the first nonzero number in each row as a pivot. The number of pivots in the upper triangular matrix is the rank of the matrix. If the rank is smaller than the number of columns, the columns are linearly dependent and the matrix is singular, so the equation $Ax=b$ may have no solution. If the number of rows equals the number of columns (a square matrix), and the matrix can be transformed into the identity matrix $I$, we say the matrix is invertible; the inverse matrix is written as $A^{-1}$. In other words, $A^{-1}A=I$.

Once we have used Gauss's elimination method to convert the original matrix to upper triangular form, we go on to use Gauss-Jordan elimination, eliminating the entries above the pivots as well. Applied to the augmented matrix $[A \mid I]$, this gives a method for computing the inverse matrix: when the left half reaches $I$, the right half is $A^{-1}$.
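A minimal sketch of this in Python (assuming the matrix is invertible and no row exchanges are needed; `inverse_gauss_jordan` is an illustrative name):

```python
import numpy as np

def inverse_gauss_jordan(A):
    # Run Gauss-Jordan elimination on the augmented matrix [A | I].
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for col in range(n):
        M[col] /= M[col, col]  # scale the pivot row so the pivot equals 1
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]  # clear entries above and below the pivot
    return M[:, n:]  # the left half is now I, so the right half is the inverse

A = np.array([[2.0, 1.0], [1.0, 1.0]])
print(inverse_gauss_jordan(A))      # [[ 1. -1.] [-1.  2.]]
print(inverse_gauss_jordan(A) @ A)  # the identity, since A^{-1} A = I
```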

Factorization into $A=LU$
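The multipliers used during elimination fit into a lower triangular matrix $L$, so elimination records the factorization $A=LU$. A quick check of this factorization, assuming SciPy is available (`scipy.linalg.lu` also returns a permutation matrix $P$ accounting for any row exchanges):

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[1.0, 2.0, 1.0], [3.0, 8.0, 1.0], [0.0, 4.0, 1.0]])
P, L, U = lu(A)                   # P: row exchanges, L: lower, U: upper triangular
print(np.allclose(P @ L @ U, A))  # True: the product recovers A
```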

The Four Spaces

Nullspace and Column Space

The nullspace of a matrix $A$ is the collection of all solutions $x$ of $Ax=0$. The nullspace is a subspace.

How to compute the nullspace?

Elimination steps produce an echelon form (upper triangular matrix) $U$. If the rank of the matrix is smaller than the number of columns ($r < n$, for a matrix with $m$ rows and $n$ columns), there exist $n-r$ free columns. Setting each free variable equal to one (the others zero) in turn, we get the special solutions, and the whole nullspace is the set of all linear combinations of these. On the other hand, by continuing to use the method of elimination we can convert $U$ to a matrix $R$ in reduced row echelon form (rref form), with pivots equal to 1 and zeros above and below the pivots.
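A sketch of both computations using SymPy's exact-arithmetic matrices (the matrix below is an illustrative example of rank 2, so it has two free columns):

```python
from sympy import Matrix

A = Matrix([[1, 2, 2, 2],
            [2, 4, 6, 8],
            [3, 6, 8, 10]])

R, pivot_cols = A.rref()  # reduced row echelon form and the pivot column indices
print(R)                  # pivots are 1, with zeros above and below
print(pivot_cols)         # (0, 2): pivots in columns 1 and 3, so columns 2 and 4 are free

for x in A.nullspace():   # one special solution per free column
    print(x.T)
```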

The application of the nullspace

We can use the nullspace to compute the complete solution of any solvable $Ax=b$ by combining a particular solution (set all free variables to zero and solve for the pivot variables) with the whole nullspace.
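A sketch with SymPy's `linsolve`, which returns exactly this combination: a particular solution plus the special solutions scaled by the free variables (the right-hand side $b$ below is illustrative and chosen to be solvable):

```python
from sympy import Matrix, linsolve, symbols

x1, x2, x3, x4 = symbols('x1 x2 x3 x4')
A = Matrix([[1, 2, 2, 2],
            [2, 4, 6, 8],
            [3, 6, 8, 10]])
b = Matrix([1, 5, 6])

# The solution set is a particular solution plus multiples of the
# special solutions, parametrized by the free variables x2 and x4.
print(linsolve((A, b), [x1, x2, x3, x4]))
```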

Row space and left nullspace

The row space and left nullspace are the column space and nullspace, respectively, of the transpose $A^T$.

Orthogonality

Equation: Suppose $A$ is an $m \times n$ matrix of rank $r$. We get:
$\mathrm{rank}(A) = \text{number of pivot columns of } A = \dim C(A) = r$
$\dim N(A) = n - r = \text{number of free variables}$

$\mathrm{rank}(A^T) = \text{number of pivot columns of } A^T = \dim R(A) = r$
$\dim N(A^T) = m - r$

We say that subspace $S$ is orthogonal to subspace $T$ when every vector in $S$ is orthogonal to every vector in $T$.
The row space of a matrix is orthogonal to the nullspace, because $Ax=0$ means the dot product of $x$ with each row of $A$ is 0; but then the dot product of $x$ with any combination of the rows of $A$ must also be 0. The column space is orthogonal to the left nullspace of $A$, because the row space of $A^T$ is perpendicular to the nullspace of $A^T$.
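A quick check of this orthogonality, reusing the SymPy example from above: every vector in the nullspace multiplies $A$ to zero, i.e. it is orthogonal to every row:

```python
from sympy import Matrix

A = Matrix([[1, 2, 2, 2],
            [2, 4, 6, 8],
            [3, 6, 8, 10]])

for x in A.nullspace():
    print((A * x).T)  # zero vector: x is orthogonal to every row of A
```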

Projection

We’d like to write the projection of a vector $b$ onto the line through $a$ in terms of a projection matrix $P$: $p=Pb$. Since $p = a\,\frac{a^Tb}{a^Ta}$, the matrix is:
$P=\frac{aa^T}{a^Ta}$
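A small NumPy sketch of this formula (the vectors $a$ and $b$ are illustrative):

```python
import numpy as np

a = np.array([[1.0], [2.0], [2.0]])  # column vector spanning the line
b = np.array([[1.0], [1.0], [1.0]])

P = (a @ a.T) / (a.T @ a)            # projection matrix onto the line through a
p = P @ b                            # projection of b
print(p.ravel())                     # [5/9, 10/9, 10/9]
```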

The application of projection