Sparse Matrix Algorithms

**Mon/Wed 9:00-11:00, Trailer 932**

- Schedule, reading assignments, and slides
- Email group (all students should join)
- Office hours
- References
- Final project ideas
- Homework 1 (due October 7)
- Homework 2 (due October 14) [rmat.m]
- Homework 3 (due October 28)
- Homework 4 (due November 4)

Sparse matrices are a basic tool of computational science and engineering. They show up in applications ranging from models of the physical world to web search and information retrieval. Using them efficiently involves techniques from linear algebra, graph algorithms, and computer architecture.

Sparse matrix algorithms are fascinating (to me at least :) because they combine two languages that are often quite different, those of numerical computation and of graph theory. One result is that nobody knows it all -- there is always something new to be learned by trying to speak the language you're not expert in.

Most of the course will concern methods for solving large, sparse systems of linear equations. We will first study direct methods, which are based on Gaussian elimination and use tools from graph theory and discrete data structures. We will then study iterative methods, which treat the matrix as a black-box operator and use eigenvalues and eigenvectors to analyze convergence. Finally, we will study modern preconditioned methods, which combine the discrete structure of direct methods, the numerical structure of iterative methods, and the specifics of the problem domain.

The prerequisites are some knowledge of linear algebra (Gaussian elimination, eigenvalues and eigenvectors) and analysis of algorithms. I expect students with a variety of backgrounds; if you have an application from a scientific or engineering field that involves solving systems of linear equations, I encourage you to talk to me about the course.

Students will do homework assignments and a term project. The term project can be an application to a real computational science problem, an algorithm implementation experiment, or a theoretical or survey paper.

**Approximate course outline:**

- The basics:
  - Graphs and matrices
  - Data structures for sparse matrix manipulation
  - Linear solvers and their complexity
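As a concrete taste of the data-structures topic, here is a minimal sketch (in Python, not course code) of compressed sparse row (CSR) storage, one of the standard sparse formats: the matrix lives in three flat arrays (row pointers, column indices, values), and a matrix-vector product touches only the stored nonzeros. The function name and example matrix here are mine, chosen for illustration.

```python
def csr_matvec(rowptr, colind, vals, x):
    """Compute y = A*x for a matrix stored in CSR form.

    rowptr has n+1 entries; row i's nonzeros occupy positions
    rowptr[i] .. rowptr[i+1]-1 of colind and vals.
    """
    n = len(rowptr) - 1
    y = [0.0] * n
    for i in range(n):
        for k in range(rowptr[i], rowptr[i + 1]):
            y[i] += vals[k] * x[colind[k]]
    return y

# 3x3 example:   [2 0 1]
#                [0 3 0]
#                [4 0 5]
rowptr = [0, 2, 3, 5]
colind = [0, 2, 1, 0, 2]
vals = [2.0, 1.0, 3.0, 4.0, 5.0]
print(csr_matvec(rowptr, colind, vals, [1.0, 1.0, 1.0]))  # [3.0, 3.0, 9.0]
```

Note that the work is proportional to the number of nonzeros, not to n^2, which is the whole point of sparse data structures.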

- Sparse Gaussian elimination:
  - Cholesky factorization
  - Elimination trees, symbolic factorization, structure prediction
  - Orderings for low fill
  - Symmetric indefinite and nonsymmetric systems
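The connection between elimination and graphs can be previewed with the classic "elimination game": eliminating a vertex of the graph of A makes its remaining neighbors into a clique, and every edge added this way is fill. The sketch below (my own illustration, not course code) computes the fill edges produced by a given elimination order.

```python
def fill_edges(n, edges, order):
    """Return the set of fill edges created by eliminating
    vertices 0..n-1 of an undirected graph in the given order."""
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    fill = set()
    eliminated = set()
    for v in order:
        nbrs = [u for u in adj[v] if u not in eliminated]
        # Eliminating v connects all its remaining neighbors;
        # any edge not already present is fill.
        for i in range(len(nbrs)):
            for j in range(i + 1, len(nbrs)):
                a, b = nbrs[i], nbrs[j]
                if b not in adj[a]:
                    adj[a].add(b)
                    adj[b].add(a)
                    fill.add((min(a, b), max(a, b)))
        eliminated.add(v)
    return fill

# Path graph 0-1-2: eliminating the middle vertex first creates fill,
# eliminating the endpoints first creates none.
print(fill_edges(3, [(0, 1), (1, 2)], [1, 0, 2]))  # {(0, 2)}
print(fill_edges(3, [(0, 1), (1, 2)], [0, 2, 1]))  # set()
```

This is exactly why fill-reducing orderings matter: the sparsity of the Cholesky factor depends on the order in which we eliminate.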

- Krylov-subspace iterations:
  - Conjugate gradients
  - Convergence analysis of CG
  - GMRES, BiCGSTAB, and friends
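To see what "treating the matrix as a black-box operator" means, here is a bare-bones sketch of unpreconditioned conjugate gradients for a symmetric positive definite system Ax = b, written in plain Python. The only access to A is through a matvec function; everything else is vector arithmetic. This is my illustrative sketch, not a reference implementation.

```python
def cg(matvec, b, tol=1e-10, maxit=100):
    """Solve Ax = b for SPD A, given only a function computing A*x."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                  # residual b - A*x, with x = 0
    p = r[:]                  # initial search direction
    rr = sum(ri * ri for ri in r)
    for _ in range(maxit):
        Ap = matvec(p)
        alpha = rr / sum(pi * Api for pi, Api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * Api for ri, Api in zip(r, Ap)]
        rr_new = sum(ri * ri for ri in r)
        if rr_new ** 0.5 < tol:
            break
        beta = rr_new / rr
        p = [ri + beta * pi for ri, pi in zip(r, p)]
        rr = rr_new
    return x

# 2x2 SPD example: [[4,1],[1,3]] x = [1,2]; exact solution is [1/11, 7/11].
A = [[4.0, 1.0], [1.0, 3.0]]
x = cg(lambda v: [sum(a * vi for a, vi in zip(row, v)) for row in A], [1.0, 2.0])
```

In exact arithmetic CG converges in at most n iterations; in practice its convergence rate depends on the eigenvalue distribution of A, which is what the convergence analysis in this part of the course makes precise.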

- General-purpose preconditioners:
  - Incomplete factorization
  - Sparse approximate inverses
  - Support theory

- Hierarchical preconditioning:
  - Multigrid
  - Domain decomposition
  - Fast multipole

- High-performance considerations:
  - Cliques and chordal graphs
  - Cache behavior, supernodal and multifrontal algorithms
  - Parallel matrix-vector multiplication

- Possible topics:
  - Block triangular form and Dulmage-Mendelsohn decomposition
  - Parallel sparse matrix methods
  - Preconditioning symmetric indefinite and nonsymmetric problems
  - Least squares problems
  - Sparse eigenvalue and singular value problems

**Texts:**

These three books are all excellent. Though they will be on reserve at the library, I recommend that you get your own copies. All three are available at significant discounts to SIAM members, and any UCSB student can join SIAM for free.

- T. A. Davis, *Direct Methods for Sparse Linear Systems*. SIAM, 2006.
- Y. Saad, *Iterative Methods for Sparse Linear Systems*. SIAM, second edition, 2003.
- W. Briggs, V. Henson, and S. McCormick, *A Multigrid Tutorial*. SIAM, second edition, 2000.