Linear Algebra

In my first year at SYSU, I took a course on Linear Algebra taught by Dr. Wang CD. It is a foundation for the study of computer science, and I will use this post to summarize some of its basic concepts.

Linear Equations


Linear Systems

  • Linear equation: For variables \(x_1, …, x_n\), a linear equation can be written as $$a_1x_1 + a_2x_2 + … + a_nx_n = b$$
  • Linear system: A collection of one or more linear equations involving the same set of variables. The set of all possible solutions of the system is called the solution set. Two linear systems are equivalent if they have the same solution set.
  • Consistency: A consistent linear system has either exactly one solution or infinitely many solutions. An inconsistent linear system has no solution.
  • Matrix notation: The coefficient matrix contains only the coefficients of the variables. The augmented matrix also includes the right-hand-side constants.
  • Solving a linear system: Use the three elementary row operations – replacement (replace one row by the sum of itself and a multiple of another row), interchange (switch two rows), and scaling (multiply a whole row by a nonzero constant). These row operations are reversible. Two matrices are row equivalent if one can be changed to the other by a sequence of elementary row operations.
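The three row operations can be sketched in a few lines of Python (a minimal illustration, not from the original notes; the matrix is a plain list of lists and the example system is made up):

```python
from fractions import Fraction

def replacement(M, target, source, factor):
    """Replace row `target` by itself plus `factor` times row `source`."""
    M[target] = [a + factor * b for a, b in zip(M[target], M[source])]

def interchange(M, i, j):
    """Switch rows i and j."""
    M[i], M[j] = M[j], M[i]

def scaling(M, i, c):
    """Multiply row i by a nonzero constant c."""
    assert c != 0
    M[i] = [c * a for a in M[i]]

# Augmented matrix of the made-up system: x + 2y = 5, 3x + 4y = 6
M = [[Fraction(1), Fraction(2), Fraction(5)],
     [Fraction(3), Fraction(4), Fraction(6)]]
replacement(M, 1, 0, -3)        # R2 <- R2 - 3*R1
scaling(M, 1, Fraction(-1, 2))  # R2 <- -1/2 * R2
replacement(M, 0, 1, -2)        # R1 <- R1 - 2*R2
print(M)   # [[1, 0, -4], [0, 1, 9/2]], i.e. x = -4, y = 9/2
```

Each operation is reversible: `replacement(M, t, s, -f)` undoes `replacement(M, t, s, f)`, `interchange` undoes itself, and `scaling(M, i, 1/c)` undoes `scaling(M, i, c)`.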

Echelon form

  • Leading entry: the leftmost nonzero entry of a row.
  • Echelon form:
    1. Nonzero rows are all above the all-zero rows;
    2. Each leading entry in a row is to the right of the leading entry in the row above;
    3. All entries beneath a leading entry in a column are zeros.
  • Reduced echelon form:
    1. The leading entry is 1 for all nonzero rows;
    2. For each column containing a leading entry, the other entries are 0.
      Each matrix is row equivalent to exactly one reduced echelon form matrix.
  • Pivot: A pivot position in a matrix is a position that corresponds to a leading entry in its reduced echelon form; a pivot is a nonzero number in a pivot position.
  • Row reduction algorithm:
    1. Begin with the leftmost nonzero column (a pivot column); use row interchange, if necessary, to bring a row with a nonzero entry in that column to the top.
    2. Use row replacement to make the entries below the pivot become 0 (add a suitable multiple of the pivot row to each row below it).
    3. Repeat steps 1–2 on the submatrix below the pivot row until there are no nonzero rows left to process.
    4. Starting with the rightmost pivot, use row replacement to make the entries above each pivot become 0.
    5. Scale each nonzero row to make its pivot become 1.
  • Existence of solution: For an augmented matrix, the linear system is consistent if and only if the rightmost column is not a pivot column.
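The steps above can be sketched as a short reduction routine (a teaching sketch with exact fractions, not a production solver; it clears above and below each pivot in one pass rather than in a separate backward phase, which yields the same reduced echelon form; the example matrix is invented):

```python
from fractions import Fraction

def rref(rows):
    """Reduce a matrix (list of lists) to reduced echelon form."""
    M = [[Fraction(x) for x in row] for row in rows]
    nrows, ncols = len(M), len(M[0])
    pivot_row = 0
    for col in range(ncols):
        # Find a row at or below pivot_row with a nonzero entry in this column.
        sel = next((r for r in range(pivot_row, nrows) if M[r][col] != 0), None)
        if sel is None:
            continue                                    # no pivot in this column
        M[pivot_row], M[sel] = M[sel], M[pivot_row]     # interchange
        piv = M[pivot_row][col]
        M[pivot_row] = [x / piv for x in M[pivot_row]]  # scaling: pivot -> 1
        for r in range(nrows):                          # replacement: clear the column
            if r != pivot_row and M[r][col] != 0:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[pivot_row])]
        pivot_row += 1
        if pivot_row == nrows:
            break
    return M

# Augmented matrix of the made-up system: x + 2y = 5, 3x + 4y = 6
R = rref([[1, 2, 5], [3, 4, 6]])
print(R)   # [[1, 0, -4], [0, 1, 9/2]]
# The rightmost column is not a pivot column, so the system is consistent.
```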

Vector equations

  • A vector equation \(x_1{\bf{a_1}} + x_2{\bf{a_2}} + … + x_n{\bf{a_n}} = \bf{b}\) has the same solution set as the linear system whose augmented matrix is \(\begin{bmatrix}\bf{a_1} & \bf{a_2} & … & \bf{a_n} & \bf{b}\end{bmatrix}\).
  • SPAN {\({\bf{v_1}}, {\bf{v_2}}, …, {\bf{v_n}}\)}: The set of all possible linear combinations of the vectors \({\bf{v_1}}, {\bf{v_2}}, …, {\bf{v_n}}\). Geometrically, the span of a single nonzero vector is a line through the origin, and the span of two noncollinear (non-parallel) vectors is a plane through the origin.

Matrix equations

  • Existence of solution: For the matrix equation \(A{\bf{x}} = {\bf{b}}\), a solution exists if and only if \({\bf{b}}\) is a linear combination of the columns of \(A\), which means \({\bf{b}}\) is in SPAN {\({\bf{a_1}}, {\bf{a_2}}, …, {\bf{a_n}}\)}.
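To see the column view of \(A{\bf{x}} = {\bf{b}}\) concretely, the sketch below (made-up numbers) solves a 2×2 system and checks that b is the corresponding combination of A's columns:

```python
# Solve A x = b for a 2x2 A via Cramer's rule, then check that
# b = x1*a1 + x2*a2, i.e. b lies in SPAN {a1, a2}.
from fractions import Fraction

A = [[Fraction(1), Fraction(2)],
     [Fraction(3), Fraction(4)]]
b = [Fraction(5), Fraction(6)]

det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
x1 = (A[1][1] * b[0] - A[0][1] * b[1]) / det   # Cramer's rule
x2 = (A[0][0] * b[1] - A[1][0] * b[0]) / det

a1 = [A[0][0], A[1][0]]   # first column of A
a2 = [A[0][1], A[1][1]]   # second column of A
combo = [x1 * a1[i] + x2 * a2[i] for i in range(2)]
print(combo == b)   # True: b is a combination of A's columns
```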

Solution sets of linear systems

  • Homogeneous linear systems: \(A{\bf{x}} = {\bf{0}}\). It always has the trivial solution \({\bf{x}} = {\bf{0}}\). It has a nontrivial solution if and only if it has at least one free variable (a variable corresponding to a column without a leading entry).
  • Solution of Nonhomogeneous systems: \(A{\bf{x}} = {\bf{b}}\)’s solutions can be obtained by adding a vector \({\bf{p}}\) to the solutions of \(A{\bf{x}} = {\bf{0}}\), where \({\bf{p}}\) is one particular solution of \(A{\bf{x}} = {\bf{b}}\).
  • If \(A{\bf{x}} = {\bf{0}}\) has
    1. no free variable: \({\bf{0}}\) -> SPAN {\({\bf{0}}\)};
    2. 1 free variable: A line through origin -> SPAN {\({\bf{v}}\)};
    3. 2 free variables: A plane through origin -> SPAN {\({\bf{u}}, {\bf{v}}\)}.
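The two-free-variable case can be illustrated with a single made-up equation in three unknowns (not an example from the original notes):

```python
from fractions import Fraction

# One equation in three unknowns: x1 + 2*x2 - 3*x3 = 0.
# x1 is basic; x2 and x3 are free, so the solution set is a plane
# through the origin: SPAN {u, v}.
a = [Fraction(1), Fraction(2), Fraction(-3)]

# Set one free variable to 1 and the other to 0, then back-solve for x1.
u = [Fraction(-2), Fraction(1), Fraction(0)]   # x2 = 1, x3 = 0
v = [Fraction(3),  Fraction(0), Fraction(1)]   # x2 = 0, x3 = 1

def dot(r, x):
    return sum(ri * xi for ri, xi in zip(r, x))

# Any combination s*u + t*v solves the equation.
s, t = Fraction(7), Fraction(-5)
x = [s * ui + t * vi for ui, vi in zip(u, v)]
print(dot(a, x))   # 0
```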

Linear Independence

  • Linearly independent: A set of vectors {\({\bf{v_1}}, {\bf{v_2}}, …, {\bf{v_p}}\)} is linearly independent if \(x_1{\bf{v_1}} + x_2{\bf{v_2}} + … + x_p{\bf{v_p}} = {\bf{0}}\) has only the trivial solution \({\bf{x}} = {\bf{0}}\). If there exist weights \(c_1, …, c_p\), not all zero, such that \(c_1{\bf{v_1}} + c_2{\bf{v_2}} + … + c_p{\bf{v_p}} = {\bf{0}}\), the vectors are linearly dependent. In other words, if there is at least one free variable, there is a nontrivial solution and the vectors are linearly dependent.
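A set of vectors is independent exactly when the homogeneous system has no free variable, i.e. when a matrix built from the vectors has rank equal to the number of vectors. A minimal exact-arithmetic sketch (the vectors are made up; here they are stored as rows):

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix via forward elimination with exact arithmetic."""
    M = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(M[0])):
        sel = next((i for i in range(r, len(M)) if M[i][col] != 0), None)
        if sel is None:
            continue
        M[r], M[sel] = M[sel], M[r]
        for i in range(r + 1, len(M)):
            f = M[i][col] / M[r][col]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

independent = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # the standard basis
dependent   = [[1, 2, 3], [2, 4, 6], [0, 1, 1]]   # second vector = 2 * first

print(rank(independent))  # 3: independent (rank == number of vectors)
print(rank(dependent))    # 2: dependent (a free variable exists)
```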

Linear Transformation

  • Linear transformation of a vector x, T\(({\bf{x}})\): For \(T({\bf{x}}) = A{\bf{x}}\) with \(A\) an \(m \times n\) matrix, \({\bf{R}}^n\) is the domain of T (n is the number of columns of A and of entries of \({\bf{x}}\)) and \({\bf{R}}^m\) is the codomain of T (m is the number of rows of A and of entries of \(A{\bf{x}}\)).
  • Criteria for a linear transformation:
    1. T\(({\bf{u}} + {\bf{v}})\) = T\(({\bf{u}})\) + T\(({\bf{v}})\);
    2. T\((c{\bf{u}})\) = cT\(({\bf{u}})\).
      Every matrix transformation is a linear transformation, but linear transformations are not restricted to matrix transformations (for maps from \({\bf{R}}^n\) to \({\bf{R}}^m\), however, the two notions coincide).
  • Image: For the \({\bf{x}}\) in \({\bf{R}}^n\), T\(({\bf{x}})\) is the image of \({\bf{x}}\) in \({\bf{R}}^m\).
  • Matrix in linear transformation: There is a unique matrix A such that T\(({\bf{x}}) = A{\bf{x}}\) for \({\bf{x}}\) in \({\bf{R}}^n\), where A = \(\begin{bmatrix}T(\bf{e_1}) & T(\bf{e_2}) & … & T(\bf{e_n}) \end{bmatrix}\) and \({\bf{e}}_j\) is the jth column of the identity matrix in \({\bf{R}}^n\).
  • Onto: A mapping T from \({\bf{R}}^n\) to \({\bf{R}}^m\) is said to be onto \({\bf{R}}^m\) if each \({\bf{b}}\) in \({\bf{R}}^m\) is the image of at least one \({\bf{x}}\) in \({\bf{R}}^n\).
  • One-to-one: A mapping T from \({\bf{R}}^n\) to \({\bf{R}}^m\) is said to be one-to-one if each \({\bf{b}}\) in \({\bf{R}}^m\) is the image of at most one \({\bf{x}}\) in \({\bf{R}}^n\).
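The construction A = \(\begin{bmatrix}T(\bf{e_1}) & … & T(\bf{e_n})\end{bmatrix}\) can be illustrated with a standard example not from the original notes, rotation of the plane by 90°:

```python
def T(x):
    """Rotate a 2-D point 90 degrees counterclockwise."""
    return [-x[1], x[0]]

# Columns of the standard matrix are the images of the identity's columns.
e1, e2 = [1, 0], [0, 1]
A_cols = [T(e1), T(e2)]          # columns: [0, 1] and [-1, 0]

def matvec(cols, x):
    """Compute A x when A is stored column-by-column: x1*a1 + x2*a2 + ..."""
    return [sum(cols[j][i] * x[j] for j in range(len(cols)))
            for i in range(len(cols[0]))]

x = [3, 4]
print(T(x), matvec(A_cols, x))   # both [-4, 3]
```

This rotation is both onto and one-to-one: every point of \({\bf{R}}^2\) is the image of exactly one point.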

Matrix Algebra


Matrix operations

  • Matrix multiplication: For \(m \times n\) matrix A and \(n \times p\) matrix B, A·B = A·\(\begin{bmatrix}\bf{b_1} & \bf{b_2} & … & \bf{b_p}\end{bmatrix}\) = \(\begin{bmatrix}A·\bf{b_1} & A·\bf{b_2} & … & A·\bf{b_p}\end{bmatrix}\).
  • Row-Column rule:
    1. \(row_i(AB) = row_i(A)·B\);
    2. \(col_j(AB) = A·col_j(B)\);
  • Transpose of a matrix: The rows and columns of \(A\) are the columns and rows of \(A^T\), with \(A_{ij} = {(A^T)}_{ji}\). The transpose of a product follows a reverse-order rule: \({(AB)}^T = B^TA^T\).
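These identities are easy to spot-check numerically (a plain-Python multiply over small made-up matrices, just for illustration):

```python
def matmul(A, B):
    """Row-column rule: (AB)[i][j] = row_i(A) . col_j(B)."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

A = [[1, 2], [3, 4], [5, 6]]    # 3x2
B = [[7, 8, 9], [0, 1, 2]]      # 2x3

AB = matmul(A, B)               # 3x3
# Reverse-order rule for transposes: (AB)^T = B^T A^T
print(transpose(AB) == matmul(transpose(B), transpose(A)))   # True
```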

Inverse

  • Inverse of a matrix: Only square matrices (with equal numbers of rows and columns) can have an inverse, which satisfies \(A·A^{-1} = A^{-1}·A = I\). If the inverse exists, the matrix is called nonsingular (invertible); otherwise it is singular.
  • Existence of inverse of \(2 \times 2\) matrix:
    For matrix \(A = \begin{bmatrix}a & b\\ c & d\end{bmatrix}\), its determinant is \(det(A) = ad - bc\). If \(det(A) = 0\), A is not invertible. Otherwise its inverse is \(A^{-1} = \frac{1}{ad - bc}·\begin{bmatrix}d & -b\\ -c & a\end{bmatrix}\).
  • Elementary matrix \(E\): Obtained by performing one elementary row operation on the identity matrix. If the same row operation is performed on a matrix \(A\), the result is \(A' = E·A\). Note that E must be on the left of A.
  • Computation of \(A^{-1}\): Do row transformation on augmented matrix \(\begin{bmatrix}A & I\end{bmatrix}\). If A is invertible, the result is \(\begin{bmatrix}I & A^{-1}\end{bmatrix}\).
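The \(\begin{bmatrix}A & I\end{bmatrix} \to \begin{bmatrix}I & A^{-1}\end{bmatrix}\) procedure can be sketched with exact fractions (a teaching sketch; the example matrix is made up, and its inverse matches the 2×2 formula above):

```python
from fractions import Fraction

def inverse(A):
    """Invert a square matrix by row-reducing [A | I] to [I | A^-1]."""
    n = len(A)
    # Build the augmented matrix [A | I] with exact arithmetic.
    M = [[Fraction(A[i][j]) for j in range(n)] +
         [Fraction(1 if i == j else 0) for j in range(n)]
         for i in range(n)]
    for col in range(n):
        sel = next((r for r in range(col, n) if M[r][col] != 0), None)
        if sel is None:
            raise ValueError("matrix is singular")
        M[col], M[sel] = M[sel], M[col]        # interchange
        piv = M[col][col]
        M[col] = [x / piv for x in M[col]]     # scaling: pivot -> 1
        for r in range(n):                     # replacement: clear the column
            if r != col:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    return [row[n:] for row in M]

A = [[1, 2], [3, 4]]           # det = -2, so A is invertible
Ainv = inverse(A)
print(Ainv)   # [[-2, 1], [3/2, -1/2]], matching (1/(ad-bc))*[[d,-b],[-c,a]]
```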

I overestimated my patience. I originally planned to write up every course, but blogging really takes a lot of time, so before the JIE interview I will try to go over all my courses once rather than spend too much time organizing blog posts. I'll finish the notes for these foundational courses when I have time; at least having a blog will push me to post some solid content every now and then [hehe].


Well, I'm giving up on this series. For now I need to get more exposure to industry and do more targeted, application-oriented learning. General material like this may not be of much help in practice anyway.