Van de Geijn R. Advanced Linear Algebra: Foundations to Frontiers, 2023
Textbook in PDF format
Acknowledgements
Preface
Getting Started
Opening Remarks
Welcome
Outline Week 0
What you will learn
Setting Up For ALAFF
Accessing these notes
Cloning the ALAFF repository
MATLAB
Setting up to implement in C (optional)
Enrichments
Ten surprises from numerical linear algebra
Best algorithms of the 20th century
Wrap Up
Additional Homework
Summary
Orthogonality
Norms
Opening Remarks
Why norms?
Overview
What you will learn
Vector Norms
Absolute value
What is a vector norm?
The vector 2-norm (Euclidean length)
The vector p-norms
Unit ball
Equivalence of vector norms
Matrix Norms
Of linear transformations and matrices
What is a matrix norm?
The Frobenius norm
Induced matrix norms
The matrix 2-norm
Computing the matrix 1-norm and ∞-norm
Equivalence of matrix norms
Submultiplicative norms
Condition Number of a Matrix
Conditioning of a linear system
Loss of digits of accuracy
The conditioning of an upper triangular matrix
Enrichments
Condition number estimation
Practical computation of the vector 2-norm
Wrap Up
Additional homework
Summary
The Singular Value Decomposition
Opening Remarks
Low rank approximation
Overview
What you will learn
Orthogonal Vectors and Matrices
Orthogonal vectors
Component in the direction of a vector
Orthonormal vectors and matrices
Unitary matrices
Examples of unitary matrices
Change of orthonormal basis
Why we love unitary matrices
The Singular Value Decomposition
The Singular Value Decomposition Theorem
Geometric interpretation
An "algorithm" for computing the SVD
The Reduced Singular Value Decomposition
SVD of nonsingular matrices
Best rank-k approximation
Enrichments
Principal Component Analysis (PCA)
Wrap Up
Additional homework
Summary
The QR Decomposition
Opening Remarks
Choosing the right basis
Overview Week 3
What you will learn
Gram-Schmidt Orthogonalization
Classical Gram-Schmidt (CGS)
Gram-Schmidt and the QR factorization
Classical Gram-Schmidt algorithm
Modified Gram-Schmidt (MGS)
In practice, MGS is more accurate
Cost of Gram-Schmidt algorithms
Householder QR Factorization
Using unitary matrices
Householder transformation
Practical computation of the Householder vector
Householder QR factorization algorithm
Forming Q
Applying Q^H
Orthogonality of resulting Q
Enrichments
Blocked Householder QR factorization
Systematic derivation of algorithms
Available software
Wrap Up
Additional homework
Summary
Linear Least Squares
Opening Remarks
Fitting the best line
Overview
What you will learn
Solution via the Method of Normal Equations
The four fundamental spaces of a matrix
The Method of Normal Equations
Solving the normal equations
Conditioning of the linear least squares problem
Why using the Method of Normal Equations could be bad
Solution via the SVD
The SVD and the four fundamental spaces
Case 1: A has linearly independent columns
Case 2: General case
Solution via the QR factorization
A has linearly independent columns
Via Gram-Schmidt QR factorization
Via the Householder QR factorization
A has linearly dependent columns
Enrichments
Rank-Revealing QR (RRQR) via MGS
Rank-Revealing Householder QR factorization
Wrap Up
Additional homework
Summary
Solving Linear Systems
The LU and Cholesky Factorizations
Opening Remarks
Of Gaussian elimination and LU factorization
Overview
What you will learn
From Gaussian elimination to LU factorization
Gaussian elimination
LU factorization: The right-looking algorithm
Existence of the LU factorization
Gaussian elimination via Gauss transforms
LU factorization with (row) pivoting
Gaussian elimination with row exchanges
Permutation matrices
LU factorization with partial pivoting
Solving Ax = y via LU factorization with pivoting
Solving with a triangular matrix
LU factorization with complete pivoting
Improving accuracy via iterative refinement
Cholesky factorization
Hermitian Positive Definite matrices
The Cholesky Factorization Theorem
Cholesky factorization algorithm (right-looking variant)
Proof of the Cholesky Factorization Theorem
Cholesky factorization and solving LLS
Implementation with the classical BLAS
Enrichments
Other LU factorization algorithms
Blocked LU factorization
Formal derivation of factorization algorithms
Wrap Up
Additional homework
Summary
Numerical Stability
Opening Remarks
Whose problem is it anyway?
Overview
What you will learn
Floating Point Arithmetic
Storing real numbers as floating point numbers
Error in storing a real number as a floating point number
Models of floating point computation
Stability of a numerical algorithm
Conditioning versus stability
Absolute value of vectors and matrices
Error Analysis for Basic Linear Algebra Algorithms
Initial insights
Backward error analysis of dot product: general case
Dot product: error results
Matrix-vector multiplication
Matrix-matrix multiplication
Error Analysis for Solving Linear Systems
Numerical stability of triangular solve
Numerical stability of LU factorization
Numerical stability of linear solve via LU factorization
Numerical stability of linear solve via LU factorization with partial pivoting
Is LU with Partial Pivoting Stable?
Enrichments
Systematic derivation of backward error analyses
LU factorization with pivoting can fail in practice
The IEEE floating point standard
Wrap Up
Additional homework
Summary
Solving Sparse Linear Systems
Opening Remarks
Where do sparse linear systems come from?
Overview
What you will learn
Direct Solution
Banded matrices
Nested dissection
Observations
Iterative Solution
Jacobi iteration
Gauss-Seidel iteration
Convergence of splitting methods
Successive Over-Relaxation (SOR)
Enrichments
Details!
Parallelism in splitting methods
Dr. SOR
Wrap Up
Additional homework
Summary
Descent Methods
Opening Remarks
Solving linear systems by solving a minimization problem
Overview
What you will learn
Search directions
Basics of descent methods
Toward practical descent methods
Relation to Splitting Methods
Method of Steepest Descent
Preconditioning
The Conjugate Gradient Method
A-conjugate directions
Existence of A-conjugate search directions
Conjugate Gradient Method Basics
Technical details
Practical Conjugate Gradient Method algorithm
Final touches for the Conjugate Gradient Method
Enrichments
Conjugate Gradient Method: Variations on a theme
Wrap Up
Additional homework
Summary
The Algebraic Eigenvalue Problem
Eigenvalues and Eigenvectors
Opening Remarks
Relating diagonalization to eigenvalues and eigenvectors
Overview
What you will learn
Basics
Singular matrices and the eigenvalue problem
The characteristic polynomial
More properties of eigenvalues and vectors
The Schur and Spectral Decompositions
Diagonalizing a matrix
Jordan Canonical Form
The Power Method and related approaches
The Power Method
The Power Method: Convergence
The Inverse Power Method
The Rayleigh Quotient Iteration
Discussion
Enrichments
How to compute the eigenvalues of a 2 × 2 matrix
Wrap Up
Additional homework
Summary
Practical Solution of the Hermitian Eigenvalue Problem
Opening Remarks
Subspace iteration with a Hermitian matrix
Overview
What you will learn
From Power Method to a simple QR algorithm
A simple QR algorithm
A simple shifted QR algorithm
Deflating the problem
Cost of a simple QR algorithm
A Practical Hermitian QR Algorithm
Reduction to tridiagonal form
Givens' rotations
Simple tridiagonal QR algorithm
The implicit Q theorem
The Francis implicit QR Step
A complete algorithm
Enrichments
QR algorithm among the most important algorithms of the 20th century
Who was John Francis
Casting the reduction to tridiagonal form in terms of matrix-matrix multiplication
Optimizing the tridiagonal QR algorithm
The Method of Multiple Relatively Robust Representations (MRRR)
Wrap Up
Additional homework
Summary
Computing the SVD
Opening Remarks
Linking the Singular Value Decomposition to the Spectral Decomposition
Overview
What you will learn
Practical Computation of the Singular Value Decomposition
Computing the SVD from the Spectral Decomposition
A strategy for computing the SVD
Reduction to bidiagonal form
Implicitly shifted bidiagonal QR algorithm
Jacobi's Method
Jacobi rotation
Jacobi's method for computing the Spectral Decomposition
Jacobi's method for computing the Singular Value Decomposition
Enrichments
Principal Component Analysis
Casting the reduction to bidiagonal form in terms of matrix-matrix multiplication
Optimizing the bidiagonal QR algorithm
Wrap Up
Additional homework
Summary
Attaining High Performance
Opening Remarks
Simple Implementation of matrix-matrix multiplication
Overview
What you will learn
Linear Algebra Building Blocks
A simple model of the computer
Opportunities for optimization
Basics of optimizing matrix-matrix multiplication
Optimizing matrix-matrix multiplication, the real story
BLAS and BLIS
Casting Computation in Terms of Matrix-Matrix Multiplication
Blocked Cholesky factorization
Blocked LU factorization
Other high-performance dense linear algebra algorithms
Libraries for higher level dense linear algebra functionality
Sparse algorithms
Enrichments
BLIS and beyond
Optimizing matrix-matrix multiplication - We've got a MOOC for that!
Deriving blocked algorithms - We've got a MOOC for that too!
Parallel high-performance algorithms
Wrap Up
Additional homework
Summary
Are you ready?
Notation
Householder notation
Knowledge from Numerical Analysis
Cost of basic linear algebra operations
Computation with scalars
Vector-vector operations
Matrix-vector operations
Matrix-matrix operations
Summary
Catastrophic cancellation
GNU Free Documentation License
References
Answers and Solutions to Homeworks
Index
