Special assignment 2008

Take one of the tasks given below! They are not given in any systematic order, so look at them all before you decide. You may also work on a project of your own choosing; in that case, talk to me and we can find an appropriate formulation.

You may work in groups of two or individually. I will help you with copies of some of the necessary papers. You may start at any time.
Write up a short report readable by the other participants in the course. Describe what is done in simple enough terms, and report on any experiments you have done.
Your work is to be presented orally in front of the group at an appropriate time towards the end of the course.


Pointers to material needed for the projects are given below. Also use the homepage referred to in the Demmel book; I found it at
http://www.cs.berkeley.edu/~demmel/ma221/Matlab/
Test matrices are available from the Matrix Market (http://math.nist.gov/MatrixMarket/) and from the University of Florida sparse matrix collection at http://www.cise.ufl.edu/research/sparse/matrices/

Nr / Task and comments / People

1.

2.  Preconditioned iterations for large linear systems
    A preconditioner M is chosen to give M^-1 A a better-conditioned spectrum than A. Run tests on some relevant examples and see how the eigenvalue distribution affects the convergence of pcg.
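
    A minimal Matlab sketch of such a test, assuming a recent Matlab (ichol replaced the older cholinc); the Poisson test matrix and tolerances are arbitrary choices:

      A = gallery('poisson', 20);        % symmetric positive definite test matrix
      b = ones(size(A,1), 1);
      tol = 1e-8;  maxit = 200;
      [x0, fl0, rr0, it0] = pcg(A, b, tol, maxit);         % no preconditioner
      L = ichol(A);                                        % M = L*L'
      [x1, fl1, rr1, it1] = pcg(A, b, tol, maxit, L, L');  % preconditioned
      fprintf('iterations: %d (plain) vs %d (ichol)\n', it0, it1);
      % to see why, compare the spectra eig(full(A)) and eig(full(L\A/L'))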


3.  Iterative methods for nonsymmetric linear systems
    Two variants are in common use: GMRES, based on Arnoldi, and QMR, based on nonsymmetric Lanczos. GMRES must be restarted after relatively few steps, since the entire Krylov basis has to be stored; restarting slows convergence, making it behave like a steepest descent algorithm. QMR can be run for larger Krylov spaces, but needs products with A^T and may lose biorthogonality. Most people run GMRES, but I like QMR better. Do some experiments that may help in a decision!
    People: Maria Nordström
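
    A minimal Matlab sketch of one such experiment; the grcar test matrix, restart length and tolerances are arbitrary choices:

      A = sparse(gallery('grcar', 500));   % nonsymmetric test matrix
      b = ones(500, 1);
      tol = 1e-8;
      [xg, flg, rrg, itg] = gmres(A, b, 20, tol, 25);   % restart length 20
      [xq, flq, rrq, itq] = qmr(A, b, tol, 500);
      % gmres reports its iteration count as [outer inner]
      fprintf('gmres(20): %d outer, %d inner; qmr: %d iterations\n', ...
              itg(1), itg(2), itq);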

4.  Multigrid for linear systems
    Demmel Q 6.16. The really powerful iterative methods make use of knowledge about the underlying computational problem. In multigrid, several nested discretisations are used: the solution on a coarse grid gives a prediction for the solution on a fine grid by interpolation, and a smoothing iteration improves the fine-grid solution.
    People: Catherine Herold, Melanie Rochoux
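
    To get a feel for the idea, here is a toy two-grid cycle for the 1D Poisson equation (my own sketch, not Demmel's code): damped Jacobi smoothing, linear interpolation, full weighting, exact coarse solve.

      n = 2^7 - 1;  h = 1/(n+1);
      A = gallery('tridiag', n, -1, 2, -1) / h^2;
      f = ones(n, 1);  u = zeros(n, 1);
      w = 2/3;  D = 2/h^2;                  % damped Jacobi parameters
      nc = (n-1)/2;                         % coarse grid size
      P = zeros(n, nc);                     % prolongation by interpolation
      for j = 1:nc
          P(2*j-1:2*j+1, j) = [0.5; 1; 0.5];
      end
      R = P' / 2;                           % full-weighting restriction
      Ac = R * A * P;                       % Galerkin coarse-grid operator
      for cycle = 1:10
          for s = 1:3, u = u + w*(f - A*u)/D; end    % pre-smoothing
          u = u + P * (Ac \ (R * (f - A*u)));        % coarse-grid correction
          for s = 1:3, u = u + w*(f - A*u)/D; end    % post-smoothing
          fprintf('cycle %d: residual %.2e\n', cycle, norm(f - A*u));
      end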

5.  Regularization for image processing, the L-curve
    Look at the works of Per Christian Hansen, Copenhagen, especially his Matlab toolbox regtools and its use for smooth regression by Michael Wendland.
    People: Chen Nan, Yuanjun Zhang
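
    A bare-hands sketch of the L-curve for Tikhonov regularization (regtools has polished versions of all of this); the smoothing-kernel test problem and noise level are arbitrary choices:

      n = 64;
      [t, s] = meshgrid(linspace(0, 1, n));
      A = exp(-(t - s).^2 / 0.01) / n;        % ill-posed smoothing operator
      xtrue = sin(pi * linspace(0, 1, n))';
      b = A * xtrue + 1e-4 * randn(n, 1);     % noisy right-hand side
      [U, S, V] = svd(A);  sv = diag(S);  beta = U' * b;
      lam = logspace(-8, 0, 50);
      rho = zeros(size(lam));  eta = zeros(size(lam));
      for k = 1:numel(lam)
          f = sv.^2 ./ (sv.^2 + lam(k)^2);    % Tikhonov filter factors
          x = V * (f .* beta ./ sv);
          rho(k) = norm(A*x - b);  eta(k) = norm(x);
      end
      loglog(rho, eta, '-o');                 % the L-curve
      xlabel('residual norm');  ylabel('solution norm');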

6.  Laplace matrix for graph partitioning
    Alex Pothen uses the second eigenvector of the Laplace matrix of a graph for partitioning. See A. Pothen, H. D. Simon, and K.-P. Liou, Partitioning sparse matrices with eigenvectors of graphs, SIAM J. Matrix Anal. Appl., 11:430-452, 1990. Also study the works of Karypis.
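
    A minimal sketch of the spectral bisection step; borrowing the sparsity pattern of a Poisson matrix as the example graph is an arbitrary choice:

      B = gallery('poisson', 10);
      Adj = double(B ~= 0);  Adj = Adj - diag(diag(Adj));  % adjacency matrix
      Lap = diag(sum(Adj, 2)) - Adj;            % graph Laplacian L = D - A
      [V, D] = eigs(Lap, 2, 'sa');              % two smallest eigenvalues
      fiedler = V(:, 2);                        % second (Fiedler) eigenvector
      p1 = find(fiedler >= median(fiedler));    % split at the median
      p2 = find(fiedler <  median(fiedler));
      fprintf('edges cut: %d\n', full(sum(sum(Adj(p1, p2)))));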

7.  Sensitivity of eigenvalues, pseudospectrum
    The pseudospectrum is used by L. N. Trefethen, and by people in France under the name spectral portraits. Do Demmel Q 4.14 (eigscat.m).
    People: Juha Korkiakangas
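
    A crude pseudospectrum sketch: contour the smallest singular value of zI - A over a grid in the complex plane (Trefethen's EigTool does this properly); the grcar matrix and the grid are arbitrary choices:

      A = gallery('grcar', 32);  I = eye(size(A, 1));
      [x, y] = meshgrid(linspace(-2, 4, 80), linspace(-4, 4, 80));
      sigmin = zeros(size(x));
      for k = 1:numel(x)
          sigmin(k) = min(svd((x(k) + 1i*y(k))*I - A));
      end
      contour(x, y, log10(sigmin), -8:-1);  colorbar;
      hold on;  plot(eig(A), 'k.');  hold off;   % eigenvalues for reference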

8.  Compare lanso and eigs
    Lanczos with selective orthogonalization, lanso, versus implicitly restarted Arnoldi, eigs. See Etemplates 4.4.4.
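
    The eigs half of the comparison might start like this (lanso has to be obtained separately; the test matrix is an arbitrary choice):

      A = delsq(numgrid('S', 40));           % sparse symmetric test matrix
      opts.tol = 1e-10;
      [V, D, flag] = eigs(A, 6, 'sm', opts); % 6 smallest-magnitude eigenvalues
      disp(diag(D));
      fprintf('loss of orthogonality: %.2e\n', norm(V'*V - eye(6)));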

9.  Page rank and the Google matrix
    The primary reason for the success of the Google search engine is its weighting of web pages by Page rank. The rank of a page is the corresponding element of the Perron eigenvector of the adjacency matrix of the entire world wide web. This gigantic eigenvalue problem is solved once a month for the current configuration. See a recent talk by Lars Eldén from Linköping.
    People: Fatih Ertinaz, Mustafa Kavasoglu
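
    A minimal power-iteration sketch on a hand-made six-page web (the link graph, damping factor 0.85 and iteration count are arbitrary choices):

      n = 6;                                 % toy web, no dangling pages
      Adj = sparse([1 1 2 3 3 4 5 6 6], [2 3 3 1 4 5 4 4 5], 1, n, n);
      out = full(sum(Adj, 2));               % out-degrees
      P = spdiags(1./out, 0, n, n) * Adj;    % row-stochastic link matrix
      alpha = 0.85;  v = ones(n, 1)/n;       % damping, teleportation vector
      r = v;
      for k = 1:100                          % power iteration, G never formed
          r = alpha * (P' * r) + (1 - alpha) * v;
      end
      [rr, order] = sort(r, 'descend');
      disp([order, rr]);                     % pages ordered by Page rank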

10. Jacobi-Davidson for eigenvalues
    Etemplates 4.7. This algorithm was proposed by Henk van der Vorst.
    People: Yuan Changming
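
    A heavily stripped-down sketch of the idea for one extreme eigenvalue of a symmetric matrix; real codes solve the correction equation only approximately, while this toy version solves it exactly through a bordered system (the test matrix is an arbitrary choice):

      A = delsq(numgrid('S', 20));  n = size(A, 1);
      V = randn(n, 1);  V = V / norm(V);
      for it = 1:25
          H = V' * A * V;                    % Rayleigh-Ritz on search space
          [S, T] = eig(H);
          [theta, j] = max(diag(T));         % target the largest eigenvalue
          u = V * S(:, j);
          res = A*u - theta*u;
          if norm(res) < 1e-10, break, end
          % correction equation: (I-uu')(A - theta*I)(I-uu') t = -res, t'u = 0
          t = [A - theta*speye(n), u; u', 0] \ [-res; 0];
          t = t(1:n);
          t = t - V * (V' * t);              % keep the search space orthonormal
          if norm(t) > 1e-14, V = [V, t/norm(t)]; end
      end
      fprintf('theta = %.8f after %d steps\n', theta, it);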

11. Optical character recognition with SVD
    Study the report by Lars Eldén: Numerical linear algebra in data mining, Acta Numerica 2006, pp. 327-384. Look at the digit experiment used in lab 3. Compare the algorithm with 10 matrices proposed there with another one based on the SVD of the entire matrix, computing distances between test digits and known digits in the subspace of leading singular directions. Also see if blind signal separation is possible.
    People: Mamasz, Jakub Olczak
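
    A sketch of the per-class SVD classifier; the variables train{d} (one vectorized training image per column for digit d-1) and the test image vector "test" are assumed to exist, e.g. from the lab 3 data:

      k = 10;                                % singular vectors kept per class
      U = cell(10, 1);
      for d = 1:10
          [Ud, Sd, Vd] = svd(train{d}, 'econ');   % hypothetical training data
          U{d} = Ud(:, 1:k);
      end
      res = zeros(10, 1);
      for d = 1:10                           % residual outside each subspace
          res(d) = norm(test - U{d} * (U{d}' * test));
      end
      [rmin, best] = min(res);
      fprintf('classified as digit %d\n', best - 1);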

12. Geometry and eigenvalues
    Curves and surfaces are described by systems of nonlinear equations in the coordinates. Intersections, where a curve meets a surface, can be computed via a nonlinear eigenvalue problem. Prepare a demonstration of how this comes about. Demmel Q 4.16.
    People: Mattias Frånberg, Peter Andersson
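
    A one-variable warm-up for such a demonstration: intersecting the unit circle x^2 + y^2 = 1 with the parabola y = x^2 gives x^4 + x^2 - 1 = 0, and the roots appear as eigenvalues of the companion matrix (this is also how Matlab's roots works):

      p = [1 0 1 0 -1];             % coefficients of x^4 + x^2 - 1
      C = compan(p);                % companion matrix
      x = eig(C);                   % roots as eigenvalues
      x = x(abs(imag(x)) < 1e-12);  % keep the real intersections
      disp([x, x.^2]);              % the two intersection points (x, y)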

13. Multiple Relatively Robust Representations for tridiagonal eigenvalues
    This promises to be the ultimate method for computing eigenvalues and orthogonal eigenvectors of a symmetric tridiagonal matrix. See the works by Dhillon.