9.1. Introduction

This chapter is an introductory exploration of fundamental concepts in linear algebra, with a focus on computational aspects. It also provides the foundational background for the subsequent chapter, “Linear Algebra and Optimization for Data Analysis”.

Each section begins with a brief overview of its objectives, followed by several example exercises to enhance understanding and facilitate learning. Moreover, we strive to include numerical notes wherever appropriate. These notes explain and compare different computational approaches to problem-solving. The goal is for readers to write their own code (whenever possible) to solve problems. However, for matrices of higher dimensions, we take advantage of the NumPy linear algebra module, which provides efficient implementations of standard linear algebra algorithms.
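As a minimal sketch of the kind of use the NumPy linear algebra module gets in this chapter, the snippet below solves a small linear system \(A\mathbf{x} = \mathbf{b}\) with `numpy.linalg.solve`. The matrix and right-hand side are illustrative choices, not examples taken from the chapter.

```python
import numpy as np

# A small illustrative linear system A x = b.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

# np.linalg.solve computes the exact solution of a square,
# nonsingular system without forming the inverse explicitly.
x = np.linalg.solve(A, b)

# Verify the solution satisfies the original system.
assert np.allclose(A @ x, b)
```

For larger or ill-conditioned systems, `solve` is preferred over computing `np.linalg.inv(A) @ b`, since it is both faster and numerically more stable.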

The chapter covers the following topics:

  1. Linear Systems

  2. Matrices and Determinants

  3. Linear Transformations

  4. Eigenvalues and Eigenvectors

  5. Orthogonality

We will begin by exploring systems of linear equations, also known as linear systems, which play a central role in linear algebra. In fact, many problems in linear algebra can be converted into solving a linear system.

Linear systems and their solutions can be described using matrices. We will study the algebraic properties of matrices in Section 2. These properties can be used to solve linear systems and lay the groundwork for the topics covered in the following sections.

In Section 3, we will discuss linear transformations between subspaces of \(\mathbb{R}^n\) and explore how changes of basis affect the representation of these transformations. We will not delve into vector spaces in their general form; this restriction loses no generality from a computational point of view, since any \(n\)-dimensional vector space is isomorphic to \(\mathbb{R}^n\).

Section 4 is devoted to dissecting the action of linear maps into elements that are easy to visualize, using eigenvalues and eigenvectors. We will delve into how a basis formed by eigenvectors leads to diagonal matrices (diagonalization). We will also examine the application of diagonalization in dealing with large matrix products and discrete dynamical systems.
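The use of diagonalization for large matrix products mentioned above can be sketched as follows: if \(A = PDP^{-1}\) with \(D\) diagonal, then \(A^k = PD^kP^{-1}\), and powering a diagonal matrix is cheap. The matrix below is an arbitrary diagonalizable example chosen for illustration.

```python
import numpy as np

# An arbitrary diagonalizable matrix (eigenvalues 5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
k = 10

# Eigendecomposition: columns of P are eigenvectors, so A P = P diag(eigvals).
eigvals, P = np.linalg.eig(A)

# Powering the diagonal factor amounts to powering each eigenvalue.
Dk = np.diag(eigvals ** k)
A_k = P @ Dk @ np.linalg.inv(P)

# Cross-check against repeated matrix multiplication.
assert np.allclose(A_k, np.linalg.matrix_power(A, k))
```

This pattern also underlies the analysis of discrete dynamical systems \(\mathbf{x}_{k+1} = A\mathbf{x}_k\), whose long-term behavior is governed by the eigenvalues of \(A\).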

Section 5 explores some ideas from analytic geometry, which allow us to define intuitive geometric concepts such as length, distance, and perpendicularity in \(\mathbb{R}^n\). It also discusses orthogonal projection, the Gram-Schmidt algorithm, and the QR factorization, which are key tools in many calculations involving orthogonality. Finally, we will discuss the least-squares problem, a central concept in regression analysis, which approximates the solution of linear systems with more equations than unknowns.
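As a hedged preview of the least-squares idea, the sketch below fits an overdetermined system \(A\mathbf{x} \approx \mathbf{b}\) (three equations, two unknowns) with `numpy.linalg.lstsq`; the data values are made up for illustration.

```python
import numpy as np

# An overdetermined system: more equations (rows) than unknowns (columns).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# lstsq returns the x minimizing the residual norm ||A x - b||.
x, residuals, rank, _ = np.linalg.lstsq(A, b, rcond=None)

# Equivalently, x solves the normal equations (A^T A) x = A^T b.
assert np.allclose(A.T @ A @ x, A.T @ b)
```

No exact solution exists here, but the least-squares solution makes the residual \(A\mathbf{x} - \mathbf{b}\) orthogonal to the column space of \(A\), which is exactly what the normal equations express.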

The content of this chapter is based on:

  • Lay, David, et al. Linear Algebra and Its Applications. 5th ed., Pearson, 2016.

  • Deisenroth, Marc Peter, A. Aldo Faisal, and Cheng Soon Ong. Mathematics for Machine Learning. Cambridge University Press, 2020.