
Introduction to Applied Linear Algebra: Vectors, Matrices, and Least Squares

Stephen Boyd and Lieven Vandenberghe
Cambridge University Press

The Basic Library List Committee suggests that undergraduate mathematics libraries consider this book for acquisition.

Reviewed by Brian Borchers.
Stephen Boyd and Lieven Vandenberghe are well known for their graduate-level textbook, Convex Optimization. Their new textbook, Introduction to Applied Linear Algebra: Vectors, Matrices, and Least Squares, is an introduction to applied linear algebra that emphasizes least squares problems and their applications in data science.
You’ll find many topics in this book that are commonly covered in a first course in linear algebra, including basic operations on matrices and vectors, systems of linear equations, linear independence and bases, orthogonality, the Gram-Schmidt process, least squares problems, and the normal equations.  However, you might be surprised by the standard topics that are excluded from this book, including complex vectors and matrices, vector spaces, the null space and range of a matrix, eigenvalues and eigenvectors, diagonalization, and systems of linear differential equations.  The authors have clearly chosen to include only those topics from a conventional course that are necessary to support the data science applications in the book.  
The book features numerous applications, including document and image clustering, optimization of advertising purchases, time series prediction, and tomography. In most cases, the application problem is reduced to a linear least squares problem. The use of the QR factorization to solve least squares problems is emphasized. The authors discuss least squares problems with linear equality constraints and apply this to portfolio optimization and linear quadratic control. The last two chapters of the book introduce nonlinear least squares problems, Newton's method, the Gauss-Newton and Levenberg-Marquardt methods, and constrained nonlinear least squares problems.
The material is presented with some attention to issues of computational complexity and numerical accuracy but without reference to any particular computer language.  Algorithms are presented in pseudocode.  A course using this textbook should include a substantial programming component.  The authors have prepared a Julia Language Companion for the book, but MATLAB or Python could also be easily used.  
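The QR-based approach the authors emphasize is easy to demonstrate in any of these languages. The following is a minimal NumPy sketch of the general technique, not code from the book or its Julia companion: factor \( A = QR \), then solve the triangular system \( Rx = Q^T b \).

```python
import numpy as np

# Solve the full-column-rank least squares problem min ||Ax - b||^2
# via the QR factorization, the approach emphasized in the book.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))                 # tall matrix, full column rank
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 0.01 * rng.standard_normal(20)  # noisy observations

Q, R = np.linalg.qr(A)                 # reduced QR: Q is 20x3, R is 3x3
x_hat = np.linalg.solve(R, Q.T @ b)    # back-substitute R x = Q^T b

# Cross-check against NumPy's built-in least squares solver.
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
```

The two solutions agree to machine precision, since both minimize the same residual norm.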
I was somewhat disappointed that the authors restricted themselves to dealing only with least squares problems where the \( A \) matrix has full column or row rank.  This allows for the solution of the least squares problem by the QR factorization, but excludes problems in which \( A \) is rank deficient.  I would have liked to see a discussion of rank deficient problems and the SVD.  This would also have opened up the possibility of discussing applications involving low rank and nonnegative matrix factorizations, principal components, tensor decomposition, etc.     
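To illustrate the point about rank deficiency (a sketch under illustrative assumptions, not material from the book): when the columns of \( A \) are linearly dependent, the reduced QR factorization no longer gives a usable triangular solve, but the SVD still produces the minimum-norm least squares solution by inverting only the nonzero singular values.

```python
import numpy as np

# A rank-deficient least squares problem: the third column of A is the
# sum of the first two, so A is 10x3 with numerical rank 2.
rng = np.random.default_rng(1)
B = rng.standard_normal((10, 2))
A = np.hstack([B, B[:, :1] + B[:, 1:]])
b = rng.standard_normal(10)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
tol = max(A.shape) * np.finfo(float).eps * s[0]  # common rank tolerance
r = int(np.sum(s > tol))                         # numerical rank (here 2)

# Minimum-norm solution: invert only the r nonzero singular values.
x = Vt[:r].T @ ((U[:, :r].T @ b) / s[:r])

# np.linalg.lstsq also returns the minimum-norm solution in this case.
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
```

This is exactly the machinery whose omission the review laments: with the SVD in hand, rank-deficient problems, low rank approximation, and principal components all become accessible.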
Although this book is clearly not suitable for a mainstream introduction to linear algebra course, it could be used either as the textbook for a first course in applied linear algebra for data science or (using the first half of the book to review linear algebra basics) as the textbook for a course in linear algebra for data science that builds on a prior introduction to linear algebra. Compare this book with Gilbert Strang's Linear Algebra and Learning from Data, which definitely requires a prior introductory course in linear algebra and tries to cover many more advanced topics. In comparison, Introduction to Applied Linear Algebra: Vectors, Matrices, and Least Squares provides excellent coverage of a smaller selection of topics.
This is a very well written textbook that features significant mathematics, algorithms, and applications.  I recommend it highly.  
Brian Borchers is a professor of mathematics at New Mexico Tech and the editor of MAA Reviews.   

See the publisher's web page.