
Introduction to Linear Algebra

Gilbert Strang
Publisher: Wellesley Cambridge Press
Publication Date: 2016
Number of Pages: x + 574
Format: Hardcover
Edition: 5
Price: $95.00
ISBN: 9780980232776
Category: Textbook
[Reviewed by Allen Stenger, on 09/28/2016]

Please see our review of the fourth edition. This new fifth edition is not drastically different. It has a new chapter on Linear Algebra in Probability and Statistics, and the Singular Value Decomposition has been expanded into its own chapter. Some of the material is rearranged, and there are hundreds of small changes. The book is the same length as before because the solutions to exercises have been moved to the web (the web version has solutions for all problems, not just selected ones as before).

The new probability chapter gives a fast-paced introduction to probability and statistics, then uses linear algebra to analyze covariance matrices. There’s also a short introduction to Kalman filters, which are a form of recursive least squares used in control theory. The SVD chapter is driven by applications, particularly image processing.
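To give a flavor of the kind of image-processing application that motivates the SVD chapter, here is a minimal sketch (not code from the book) of low-rank approximation with NumPy. The random matrix stands in for grayscale image data, and the choice k = 10 is arbitrary.

import numpy as np

# Approximate a matrix (a stand-in for a grayscale image) by keeping
# only its k largest singular values.
rng = np.random.default_rng(0)
A = rng.standard_normal((64, 64))            # pretend 64x64 image

U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 10                                       # number of singular values kept
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]  # best rank-k approximation

# Relative error of the compressed image; by the Eckart-Young theorem this
# is optimal among all rank-k matrices in the Frobenius norm.
error = np.linalg.norm(A - A_k) / np.linalg.norm(A)
print(f"rank-{k} approximation, relative error {error:.3f}")

Storing U[:, :k], s[:k], and Vt[:k, :] takes far fewer numbers than the full matrix, which is the point of this kind of compression.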

Bottom line: still an excellent introductory textbook.


Allen Stenger is a math hobbyist and retired software developer. He is an editor of the Missouri Journal of Mathematical Sciences. His personal web page is allenstenger.com. His mathematical interests are number theory and classical analysis.

  • 1 Introduction to Vectors
    • 1.1 Vectors and Linear Combinations
    • 1.2 Lengths and Dot Products
    • 1.3 Matrices
  • 2 Solving Linear Equations
    • 2.1 Vectors and Linear Equations
    • 2.2 The Idea of Elimination
    • 2.3 Elimination Using Matrices
    • 2.4 Rules for Matrix Operations
    • 2.5 Inverse Matrices
    • 2.6 Elimination = Factorization: A = LU
    • 2.7 Transposes and Permutations
  • 3 Vector Spaces and Subspaces
    • 3.1 Spaces of Vectors
    • 3.2 The Nullspace of A: Solving Ax = 0 and Rx = 0
    • 3.3 The Complete Solution to Ax = b
    • 3.4 Independence, Basis and Dimension
    • 3.5 Dimensions of the Four Subspaces
  • 4 Orthogonality
    • 4.1 Orthogonality of the Four Subspaces
    • 4.2 Projections
    • 4.3 Least Squares Approximations
    • 4.4 Orthonormal Bases and Gram-Schmidt
  • 5 Determinants
    • 5.1 The Properties of Determinants
    • 5.2 Permutations and Cofactors
    • 5.3 Cramer’s Rule, Inverses, and Volumes
  • 6 Eigenvalues and Eigenvectors
    • 6.1 Introduction to Eigenvalues
    • 6.2 Diagonalizing a Matrix
    • 6.3 Systems of Differential Equations
    • 6.4 Symmetric Matrices
    • 6.5 Positive Definite Matrices
  • 7 The Singular Value Decomposition (SVD)
    • 7.1 Image Processing by Linear Algebra
    • 7.2 Bases and Matrices in the SVD
    • 7.3 Principal Component Analysis (PCA by the SVD)
    • 7.4 The Geometry of the SVD
  • 8 Linear Transformations
    • 8.1 The Idea of a Linear Transformation
    • 8.2 The Matrix of a Linear Transformation
    • 8.3 The Search for a Good Basis
  • 9 Complex Vectors and Matrices
    • 9.1 Complex Numbers
    • 9.2 Hermitian and Unitary Matrices
    • 9.3 The Fast Fourier Transform
  • 10 Applications
    • 10.1 Graphs and Networks
    • 10.2 Matrices in Engineering
    • 10.3 Markov Matrices, Population, and Economics
    • 10.4 Linear Programming
    • 10.5 Fourier Series: Linear Algebra for Functions
    • 10.6 Computer Graphics
    • 10.7 Linear Algebra for Cryptography
  • 11 Numerical Linear Algebra
    • 11.1 Gaussian Elimination in Practice
    • 11.2 Norms and Condition Numbers
    • 11.3 Iterative Methods and Preconditioners
  • 12 Linear Algebra in Probability & Statistics
    • 12.1 Mean, Variance, and Probability
    • 12.2 Covariance Matrices and Joint Probabilities
    • 12.3 Multivariate Gaussian and Weighted Least Squares
  • Matrix Factorizations
  • Index
  • Six Great Theorems / Linear Algebra in a Nutshell