Numerical Algebra, methods for large matrices



Staff

Axel Ruhe, lectures and course leader, room D4527

News:

Reporting of the special assignments is now scheduled in D4523.
I have reserved the room and a VGA projector. Send me an email to reserve a free time slot!

Dec 15, 13:30
  14:00  Amaury Marescaux, Emmanuel Olivi, task 10: Jacobi is better than QR, high relative precision
  14:30  Hrvoje Gotovac, task 12: Iterative methods for nonsymmetric linear systems
  15:00  Dag Lindbo, MSc thesis (exjobb): Finite element computations for a conservative level set method applied to two-phase Stokes flow

Jan 17, 13:15
  13:15  Indunil Sikurajapathi, task 13: Laplace matrix for graph partitioning
  13:45  Mikael Fallgren, task 2b: Lanczos and conjugate gradients for linear systems
  14:15  Niklas Jansson, task 1: Recursive BLAS for fastest parallel linear systems
  14:45  Rainer Schulz, task 2: Preconditioned iterations for large linear systems
  15:30  Johannes Rydh, task 8: Lanczos with Cullum's device
  16:00  Louise Funke, task 11: Optical character recognition with SVD








Schedule

We will discuss some of the material in the lectures but the main work in the course will be to study the literature and perform assignments. Here is a list of the different subjects and days for meetings.


F1: Wednesday Nov 1, 13:15-15 in 1537
    Text D 2.2, 2.2.1 -- Linear systems: perturbations, relative perturbations
    Text D 2.4.2, 2.4.3 -- Rounding errors in Gaussian elimination, condition estimation

F2: Wednesday Nov 8, 13:15-15 in 1537
    Text D 6.6.1, L 4.2.1 -- Krylov subspaces: Arnoldi algorithm, eigenvalues and linear systems
    Text D 7.2, D 5.2, L 4.2.2, E 4.4 -- Symmetric matrices: Lanczos algorithm, Ritz approximations, perturbation theory, Courant-Fischer minimax

F3: Wednesday Nov 15, 13:15-15 in 1537
    Text D 7.3-4, E 4.4.2-4 -- Lanczos algorithm: convergence and orthogonality
    Text D 6.6.2-3, L 5.1-2 -- Krylov subspaces, linear systems: conjugate gradient algorithm

F4: Wednesday Nov 22, 13:15-15 in 1537
    Text D 6.6.4-5, L 5.3 -- CG: convergence and preconditioning
    Text D 6.6.6 -- Linear systems: further developments, GMRES, QMR

F5: Wednesday Nov 29, 13:15-15 in 1537
    Text E 4.4.3, E 4.5 -- Eigenvalues: spectral transformation, implicit restart
    Computing the SVD: bidiagonalization, bidiagonal SVD

F6: Wednesday Dec 6, 13:15-15 in 1537
    Text D 5.3.3 -- Large tridiagonal matrices: divide and conquer, relatively robust representations
    Large SVD: hubs and authorities on the web
    The largest matrix eigenvalue problem: the Google matrix
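
As a taste of that last topic: the PageRank vector is the dominant eigenvector of the Google matrix, and the power method finds it. A small Python sketch on a made-up three-page web (the link matrix and the damping factor 0.85 are illustrative choices, not course data):

```python
import numpy as np

def pagerank(L, alpha=0.85, tol=1e-10, maxiter=1000):
    """Power iteration on the Google matrix G = alpha*P + (1-alpha)/n * e e^T.

    L[i, j] = 1 if page j links to page i (a made-up toy link matrix)."""
    n = L.shape[0]
    colsum = L.sum(axis=0)
    colsum[colsum == 0] = 1.0           # crude fix for dangling pages
    P = L / colsum                      # column-stochastic link matrix
    x = np.ones(n) / n
    for _ in range(maxiter):
        x_new = alpha * (P @ x) + (1 - alpha) / n
        if np.abs(x_new - x).sum() < tol:
            return x_new / x_new.sum()
        x = x_new
    return x / x.sum()

# Toy web: page 0 -> 1, page 1 -> 2, page 2 -> 0 and 2 -> 1
L = np.array([[0., 0., 1.],
              [1., 0., 1.],
              [0., 1., 0.]])
r = pagerank(L)
```

The iteration converges geometrically with rate at most alpha, independently of the web's size; that is why the damping term is there.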





Under the heading Text, I give some sections in the textbooks (D refers to Demmel, E to the Eigentemplates book). The texts do not cover everything I intend to discuss, and there is more to find in the textbooks than I will have time to cover. You are advised to have the Demmel book available. I will distribute relevant sections of Eigentemplates at the lectures.

Material distributed in class

Reading instructions and review questions, same procedure as last year!  Rev2Q.pdf

Examination

The examination will have two parts:
  1. a take-home exam with a set of review questions, Exam2007.html
  2. a special assignment, done in groups, with different tasks chosen by the groups.

Special assignment 

Take one of the tasks given below! They are not listed in any systematic order, so look at them all before you decide. You may also work on a project of your own choosing; in that case talk to me, and we will find an appropriate formulation.

You may work in groups of two or individually. I will help you with copies of some of the necessary papers. You may work at any time.
Write a short report readable by the other participants in the course. Describe what is done in simple enough terms, and report on the experiments you have done, if any.
Your work is to be presented orally in front of the group at an appropriate time towards the end of the course.


Pointers to material needed for the projects are given below. Also use the homepage referred to in the Demmel book; I found it at
http://www.cs.berkeley.edu/~demmel/ma221/Matlab/
Test matrices are available at Matrix Market and from the University of Florida collection at http://www.cise.ufl.edu/research/sparse/matrices/

1. Recursive BLAS for fastest parallel linear systems
   A group from Umeå has worked together with IBM veteran Fred Gustavson on a recursive subdivision of large matrices. See their paper in SIAM Review.
   Assigned: Juha Korkiakangas, Niclas Jansson
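
The recursive idea can be sketched in a few lines of Python; this shows only the 2x2 block recursion, none of the tuned kernels, data layouts or parallelism of the Gustavson paper:

```python
import numpy as np

def recmm(A, B, cutoff=64):
    """Recursive matrix multiply: split each operand into 2x2 blocks and
    recurse; below the cutoff fall back to the ordinary product.
    Illustrates the recursion, not a tuned BLAS."""
    n, m = A.shape
    _, p = B.shape
    if min(n, m, p) <= cutoff:
        return A @ B
    i, j, k = n // 2, m // 2, p // 2
    C = np.empty((n, p))
    C[:i, :k] = recmm(A[:i, :j], B[:j, :k], cutoff) + recmm(A[:i, j:], B[j:, :k], cutoff)
    C[:i, k:] = recmm(A[:i, :j], B[:j, k:], cutoff) + recmm(A[:i, j:], B[j:, k:], cutoff)
    C[i:, :k] = recmm(A[i:, :j], B[:j, :k], cutoff) + recmm(A[i:, j:], B[j:, :k], cutoff)
    C[i:, k:] = recmm(A[i:, :j], B[:j, k:], cutoff) + recmm(A[i:, j:], B[j:, k:], cutoff)
    return C
```

The point of the recursion is that every level reuses cache-sized blocks, without the programmer choosing block sizes for a particular machine.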
2. Preconditioned iterations for large linear systems
   Assigned: Rainer Schulz, Matthias Aechtner
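
To give the flavour of this task, a minimal preconditioned conjugate gradient sketch in Python (the course labs use Matlab), with a Jacobi (diagonal) preconditioner on a made-up SPD test matrix:

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, maxiter=500):
    """Preconditioned conjugate gradients; M_inv is a vector applying the
    inverse preconditioner, here 1/diag(A), i.e. Jacobi."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Hypothetical SPD test matrix: 1D Laplacian plus a variable diagonal
n = 100
A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
     + np.diag(np.linspace(1.0, 10.0, n)))
b = np.ones(n)
x = pcg(A, b, 1.0 / np.diag(A))
```

For serious experiments, replace the diagonal preconditioner by incomplete Cholesky and compare iteration counts.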

2b. Lanczos and conjugate gradients for linear systems
   Assigned: Mikael Fallgren
3. Multigrid for linear systems
   Demmel Q 6.16.
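
A minimal two-grid cycle for the 1D model problem gives the flavour of the task; a Python sketch, where the smoother, the grid transfer operators and the cycle counts are standard textbook choices, not prescribed by the assignment:

```python
import numpy as np

def two_grid_poisson(f, cycles=20):
    """Two-grid solver for A u = f with A = tridiag(-1, 2, -1) on n interior
    points (n odd): damped-Jacobi smoothing, full-weighting restriction,
    linear interpolation, exact coarse solve with the Galerkin operator."""
    n = f.size
    A = 2*np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    nc = (n - 1) // 2
    R = np.zeros((nc, n))
    for i in range(nc):
        R[i, 2*i:2*i+3] = [0.25, 0.5, 0.25]    # full-weighting restriction
    P = 2 * R.T                                 # linear interpolation
    Ac = R @ A @ P                              # Galerkin coarse operator
    u = np.zeros(n)
    for _ in range(cycles):
        for _ in range(3):                      # pre-smoothing, damped Jacobi
            u += (2/3) * (f - A @ u) / 2.0
        r = f - A @ u
        u += P @ np.linalg.solve(Ac, R @ r)     # coarse-grid correction
        for _ in range(3):                      # post-smoothing
            u += (2/3) * (f - A @ u) / 2.0
    return u

f = np.ones(31)
u = two_grid_poisson(f)
```

The real assignment recurses on the coarse problem instead of solving it exactly, giving the V-cycle; the two-grid version above already shows the smoothing / coarse-correction interplay.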
4. Regularization for image processing, the L-curve
   Look at the works of Per Christian Hansen, Copenhagen, especially his Matlab toolbox regtools and its use for smooth regression by Michael Wendland.
   Assigned: Linus Lind, Peter Weibull
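
The L-curve itself is easy to generate once the SVD is available. A Python sketch on a synthetic ill-posed problem (Hansen's regtools does this properly in Matlab; the test matrix, noise level and lambda grid here are made up):

```python
import numpy as np

# Tikhonov regularization: for each lambda solve
#   min ||A x - b||^2 + lambda^2 ||x||^2
# via the SVD, and record the two norms that form the L-curve.
rng = np.random.default_rng(0)
n = 50
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 10.0 ** np.linspace(0, -8, n)              # rapidly decaying singular values
A = U @ np.diag(s) @ V.T
x_true = V[:, 0] + V[:, 1]
b = A @ x_true + 1e-6 * rng.standard_normal(n) # noisy right-hand side

curve = []
for lam in 10.0 ** np.linspace(-10, 0, 21):
    phi = s / (s**2 + lam**2)                  # Tikhonov filter factors
    x = V @ (phi * (U.T @ b))
    curve.append((np.linalg.norm(A @ x - b), np.linalg.norm(x)))
res = [c[0] for c in curve]
sol = [c[1] for c in curve]
```

Plotting log(res) against log(sol) gives the L shape; the corner balances the residual against the solution norm and suggests the regularization parameter.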
5. Information retrieval with SVD
   Study latent semantic indexing (LSI) and the Krylov space variant developed with Katarina Blom. Matlab routines are available.
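
A toy LSI computation in Python (the term-document counts below are invented for illustration; the real assignment uses the Matlab routines mentioned above):

```python
import numpy as np

# Rank-k SVD of a tiny term-document matrix, then cosine scores between a
# query and the documents projected into the k-dimensional concept space.
A = np.array([[1., 0., 1., 0.],    # term "matrix"     (made-up counts)
              [1., 1., 0., 0.],    # term "eigenvalue"
              [0., 1., 0., 1.],    # term "graph"
              [0., 0., 1., 1.]])   # term "network"
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
Dk = np.diag(s[:k]) @ Vt[:k, :]        # documents in the concept space
q = np.array([1., 1., 0., 0.])         # query: "matrix eigenvalue"
qk = U[:, :k].T @ q                    # query in the same space
scores = (qk @ Dk) / (np.linalg.norm(qk) * np.linalg.norm(Dk, axis=0))
```

The point of truncating to rank k is that documents sharing no terms with the query can still score well if they use co-occurring vocabulary.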
6. Sensitivity of eigenvalues, pseudospectrum
   The pseudospectrum is used by L. N. Trefethen, and by people in France under the name spectral portraits. Do Demmel Q 4.14, eigscat.m.
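
An eigscat-style experiment is easy to reproduce in Python: collect (and then plot) the eigenvalues of many randomly perturbed copies of a nonnormal matrix; the size of the cloud indicates the eigenvalue sensitivity. The test matrix and perturbation size below are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
n, eps, trials = 12, 1e-8, 200
A = np.triu(np.ones((n, n)))            # one Jordan block, eigenvalue 1
cloud = []
for _ in range(trials):
    E = rng.standard_normal((n, n))
    E *= eps / np.linalg.norm(E, 2)     # perturbation of 2-norm exactly eps
    cloud.append(np.linalg.eigvals(A + E))
cloud = np.concatenate(cloud)
spread = np.abs(cloud - 1.0).max()      # eigenvalues of A itself are all 1
```

For a Jordan block of size n, perturbations of size eps move the eigenvalue by roughly eps**(1/n), so the cloud is dramatically larger than eps; that is exactly what the pseudospectrum quantifies.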
7. Compare lanso and eigs
   Lanczos with selective orthogonalization, lanso, and implicitly restarted Arnoldi, eigs. Etemplates 4.4.4.
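
Matlab's eigs calls the ARPACK code that is also behind SciPy's eigsh, so a Python stand-in for one half of the comparison might look as follows (the 2D Laplacian test matrix is my choice, not part of the task):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

def laplacian_2d(m):
    """Standard 5-point Laplacian on an m-by-m grid, sparse and symmetric."""
    T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(m, m))
    I = sp.identity(m)
    return (sp.kron(I, T) + sp.kron(T, I)).tocsr()

A = laplacian_2d(20)                            # 400 x 400
# largest eigenvalues by implicitly restarted Lanczos (ARPACK)
vals = eigsh(A, k=4, which='LA', return_eigenvectors=False)
ref = np.linalg.eigvalsh(A.toarray())[-4:]      # dense reference solve
err = np.abs(np.sort(vals) - ref).max()
```

For the assignment proper, count matrix-vector products and reorthogonalization work in each code rather than just checking accuracy.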
8. Lanczos with Cullum's device
   Etemplates 4.4.4. The Lanczos algorithm is much faster with no reorthogonalization, but it may need many more steps, and spurious Ritz values will occur, see D 7.4! Jane Cullum gave a simple test to weed out such values, based on the first component of the eigenvector s of the tridiagonal matrix.
   Assigned: Johannes Rydh
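
A sketch of the device in Python: run plain Lanczos with no reorthogonalization and discard Ritz values whose tridiagonal eigenvector has a negligible first component. The threshold and the diagonal test matrix are illustrative choices, not Cullum's exact recipe:

```python
import numpy as np
from scipy.linalg import eigh_tridiagonal

def lanczos_cullum(A, k, tol=1e-8, seed=0):
    """Plain Lanczos, no reorthogonalization; Ritz values whose tridiagonal
    eigenvector has first component below tol are flagged as spurious."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    q_old = np.zeros(n)
    alphas, betas = [], []
    beta = 0.0
    for _ in range(k):
        w = A @ q - beta * q_old
        alpha = q @ w
        w -= alpha * q
        beta = np.linalg.norm(w)
        alphas.append(alpha)
        betas.append(beta)
        q_old, q = q, w / beta
    theta, S = eigh_tridiagonal(alphas, betas[:-1])
    good = np.abs(S[0, :]) > tol        # Cullum's test on first components
    return theta[good], theta[~good]

A = np.diag(np.linspace(1.0, 200.0, 200))   # known spectrum 1, 2, ..., 200
accepted, spurious = lanczos_cullum(A, k=150)
```

After many steps without reorthogonalization, converged eigenvalues reappear as duplicates; the extra copies carry almost no weight in the starting vector, which is what the first-component test detects.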
9. Jacobi-Davidson for eigenvalues
   Etemplates 4.7. This algorithm was proposed by Henk van der Vorst.
10. Jacobi is better than QR, high relative precision
   Demmel 5.3.5 and 5.4.3. Read the original article: J. Demmel and K. Veselic, Jacobi's method is more accurate than QR, SIAM J. Matrix Anal. Appl., 13 (1992), pp. 1204-1245.
   Assigned: Amaury Marescaux, Emmanuel Olivi
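
The classical cyclic Jacobi iteration that the paper analyzes can be sketched in Python; this is the plain textbook version, without the stopping criteria and scalings needed for the guaranteed high relative accuracy the article is about:

```python
import numpy as np

def jacobi_eig(A, sweeps=10, tol=1e-14):
    """Cyclic two-sided Jacobi for a symmetric matrix: rotate away each
    off-diagonal pair in turn until the matrix is numerically diagonal."""
    A = A.astype(float).copy()
    n = A.shape[0]
    for _ in range(sweeps):
        off = np.sqrt(max(np.sum(A**2) - np.sum(np.diag(A)**2), 0.0))
        if off < tol * np.linalg.norm(A):
            break
        for p in range(n - 1):
            for q in range(p + 1, n):
                if A[p, q] == 0.0:
                    continue
                tau = (A[q, q] - A[p, p]) / (2.0 * A[p, q])
                t = np.sign(tau) / (abs(tau) + np.sqrt(1 + tau**2)) if tau != 0 else 1.0
                c = 1.0 / np.sqrt(1 + t**2)
                s = t * c
                J = np.eye(n)                   # Givens rotation in plane (p, q)
                J[p, p] = J[q, q] = c
                J[p, q] = s
                J[q, p] = -s
                A = J.T @ A @ J                 # zeroes out A[p, q]
    return np.sort(np.diag(A))

rng = np.random.default_rng(3)
B = rng.standard_normal((8, 8))
A = B + B.T
w = jacobi_eig(A)
```

Each rotation works only on rows and columns p and q, which is why small entries are never contaminated by large ones; that locality is the source of the relative accuracy.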
11. Optical character recognition with SVD
   Study the report by Lars Eldén: Numerical linear algebra in data mining, Acta Numerica 2006, pp. 327-384. Look at the digit experiment used in lab 3. Compare the algorithm with 10 matrices proposed there with another one based on the SVD of the entire matrix, computing distances between test digits and known digits in the subspace of leading singular directions. Also see if blind signal separation is possible.
   Assigned: Louise Funke
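
A sketch of the subspace idea in Python, with synthetic stand-ins for the digit images (class templates plus noise; the real experiment uses the lab 3 data and one training matrix per digit):

```python
import numpy as np

rng = np.random.default_rng(2)
d, n_train, k = 64, 100, 5
templates = rng.standard_normal((2, d))          # two made-up "digit" classes
train = [templates[c] + 0.3 * rng.standard_normal((n_train, d)) for c in (0, 1)]

# leading k left singular vectors of each class's training matrix
bases = [np.linalg.svd(X.T, full_matrices=False)[0][:, :k] for X in train]

def classify(x):
    """Pick the class whose singular subspace leaves the smallest residual."""
    res = [np.linalg.norm(x - U @ (U.T @ x)) for U in bases]
    return int(np.argmin(res))
```

A test image is compared not to individual training images but to the whole subspace spanned by each class's dominant singular directions, which averages out noise and small distortions.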
12. Iterative methods for nonsymmetric linear systems
   Two variants are in common use: GMRES, based on Arnoldi, and QMR, based on nonsymmetric Lanczos. GMRES needs to be restarted after relatively few steps, since the entire Krylov basis must be stored; this leads to slow convergence, since a restarted iteration behaves like a steepest descent algorithm. QMR can be run for larger Krylov spaces, but needs products with A^T and may lose biorthogonality. Most people run GMRES, but I like QMR better. Do some experiments that may help in a decision!
   Assigned: Hrvoje Gotovac
13. Laplace matrix for graph partitioning
   Alex Pothen uses the second eigenvector of the Laplace matrix of a graph for partitioning: A. Pothen, H. D. Simon, and K.-P. Liou, Partitioning sparse matrices with eigenvectors of graphs, SIAM J. Matrix Anal. Appl., 11:430-452, 1990. Also study the works by Karypis.
   Assigned: Indunil Sikurajapathi
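
Spectral bisection in Python on a graph where the right answer is obvious; the example graph (two 5-cliques joined by one edge) is mine, not from the paper:

```python
import numpy as np

# Adjacency matrix of two 5-cliques connected by a single bridge edge
n = 10
Adj = np.zeros((n, n))
Adj[:5, :5] = 1
Adj[5:, 5:] = 1
np.fill_diagonal(Adj, 0)
Adj[4, 5] = Adj[5, 4] = 1                  # the bridge

Lap = np.diag(Adj.sum(axis=1)) - Adj       # graph Laplacian L = D - A
w, V = np.linalg.eigh(Lap)
fiedler = V[:, 1]                          # eigenvector of 2nd smallest eigenvalue
part = fiedler > 0                         # split by sign
```

The second eigenvector (the Fiedler vector) is roughly constant on each tightly connected piece, with opposite signs, so the sign pattern recovers the cut through the bridge.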
14. Richardson iteration with Leja shifts
   For very large matrices, it takes too much work to keep the bases of the Krylov spaces orthogonal. An algorithm based on a predetermined sequence of polynomials may then be better. Richardson iteration uses shifted matrices, with shifts at the zeros of polynomials. Leja points are shifts determined successively during the computation. Compare an algorithm used by Sonia Gupta to Lanczos on a few cases!
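
A sketch of the iteration in Python, with shifts at the Chebyshev zeros on a given eigenvalue interval taken in Leja order; this is an illustrative variant under my own assumptions (known spectral interval, symmetric positive definite test matrix), not the exact algorithm of the assignment:

```python
import numpy as np

def leja_order(pts):
    """Greedy Leja ordering: each next point maximizes the product of
    distances to the points already chosen, which keeps the partial
    products of the Richardson polynomial numerically tame."""
    pts = list(pts)
    out = [pts.pop(int(np.argmax(np.abs(pts))))]
    while pts:
        prods = [np.prod([abs(p - q) for q in out]) for p in pts]
        out.append(pts.pop(int(np.argmax(prods))))
    return np.array(out)

def richardson_chebyshev(A, b, a_min, a_max, m=32):
    """Richardson iteration x <- x + (b - A x)/theta_k, with shifts theta_k
    at the Chebyshev zeros on [a_min, a_max] (the eigenvalue interval),
    applied in Leja order."""
    k = np.arange(m)
    zeros = (0.5 * (a_max + a_min)
             + 0.5 * (a_max - a_min) * np.cos((2*k + 1) * np.pi / (2*m)))
    x = np.zeros_like(b)
    for theta in leja_order(zeros):
        x += (b - A @ x) / theta
    return x

A = np.diag(np.linspace(1.0, 10.0, 100))   # spectrum known to lie in [1, 10]
b = np.ones(100)
x = richardson_chebyshev(A, b, 1.0, 10.0, m=32)
```

After m steps the error is multiplied by a scaled Chebyshev polynomial, so convergence matches Chebyshev acceleration; the Leja ordering only controls intermediate growth, and true Leja points would be generated adaptively during the run.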