Nr | Task and comments | People |
1 | | |
2 | Preconditioned iterations for large linear systems. A preconditioner M is chosen to give M^-1 A a better conditioned spectrum than A. Run tests on some relevant examples and see how the eigenvalue distribution affects the convergence of pcg. | |
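As a starting point, a minimal sketch of such a test in Python with SciPy (standing in for Matlab's pcg), on a synthetic SPD tridiagonal matrix with a badly scaled diagonal; the matrix, sizes, and the simple Jacobi preconditioner are all assumptions for illustration:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Synthetic SPD test matrix: tridiagonal with a badly scaled diagonal,
# so the unpreconditioned spectrum spans four orders of magnitude.
n = 200
main = np.linspace(1.0, 1.0e4, n)
A = sp.diags([-np.ones(n - 1), main, -np.ones(n - 1)], [-1, 0, 1], format="csr")
b = np.ones(n)

def cg_iters(M=None):
    """Run CG and count iterations via the callback."""
    count = [0]
    x, info = spla.cg(A, b, M=M, maxiter=5000,
                      callback=lambda xk: count.__setitem__(0, count[0] + 1))
    assert info == 0, "CG did not converge"
    return x, count[0]

# Jacobi preconditioner: applying M^-1 is just division by the diagonal of A.
d = A.diagonal()
M = spla.LinearOperator((n, n), matvec=lambda v: v / d)

x_plain, k_plain = cg_iters()
x_prec, k_prec = cg_iters(M)
print("iterations without/with preconditioning:", k_plain, k_prec)
```

Because M^-1 A has a tightly clustered spectrum here, the preconditioned run needs far fewer iterations; plotting the eigenvalues of A and of M^-1 A side by side makes the connection to the convergence history visible.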
3 | Iterative methods for nonsymmetric linear systems. There are two variants in common use: GMRES, based on Arnoldi, and QMR, based on nonsymmetric Lanczos. GMRES must be restarted after relatively few steps, since the entire Krylov basis needs to be stored; this slows convergence, because the restarted method behaves like a steepest descent algorithm. QMR can be run with larger Krylov spaces, but it needs products with A^T and may lose biorthogonality. Most people run GMRES, but I like QMR better. Do some experiments that may help in a decision! | Maria Nordström |
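One possible experiment, sketched with SciPy's gmres and qmr on an assumed nonsymmetric convection-diffusion-like tridiagonal matrix (the matrix, size, and restart lengths are illustrative choices, not part of the task statement):

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Nonsymmetric test matrix: 1-D convection-diffusion-like discretisation.
n = 100
A = sp.diags([-1.2, 2.2, -0.8], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

def gmres_inner_iters(restart):
    """Total inner Arnoldi steps; callback_type='pr_norm' fires once per step."""
    count = [0]
    x, info = spla.gmres(A, b, restart=restart, maxiter=2000,
                         callback=lambda rk: count.__setitem__(0, count[0] + 1),
                         callback_type="pr_norm")
    assert info == 0
    return count[0]

def qmr_iters():
    count = [0]
    x, info = spla.qmr(A, b, maxiter=5000,
                       callback=lambda xk: count.__setitem__(0, count[0] + 1))
    assert info == 0
    return count[0]

k_short = gmres_inner_iters(5)   # heavily restarted: short recurrences, slow
k_full = gmres_inner_iters(n)    # effectively unrestarted within n steps
k_qmr = qmr_iters()
print("GMRES(5):", k_short, "GMRES(n):", k_full, "QMR:", k_qmr)
```

Comparing total matrix-vector products (QMR needs two per step, one with A and one with A^T) against storage gives the trade-off the task asks about.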
4 | Multigrid for linear systems. Demmel Q 6.16. The really powerful iterative methods make use of knowledge about the underlying computational problem. In multigrid, several nested discretisations are used. The solution on a coarse grid gives a prediction for the solution on a fine grid by interpolation, and a smoothing iteration improves the fine grid solution. | Catherine Herold, Melanie Rochoux |
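The coarse-grid correction and smoothing steps described above can be sketched as a two-grid cycle for the 1-D Poisson problem; the weighted-Jacobi smoother, linear interpolation, and problem sizes are standard textbook choices assumed here, not prescribed by the task:

```python
import numpy as np

def poisson(n):
    """Dense 1-D Poisson matrix, stencil [-1, 2, -1]."""
    return (2.0 * np.eye(n) - np.diag(np.ones(n - 1), 1)
            - np.diag(np.ones(n - 1), -1))

def smooth(A, x, b, omega=2.0 / 3.0, sweeps=3):
    """Weighted Jacobi: damps the high-frequency error components."""
    d = np.diag(A)
    for _ in range(sweeps):
        x = x + omega * (b - A @ x) / d
    return x

def prolongation(nc):
    """Linear interpolation from nc coarse points to 2*nc + 1 fine points."""
    P = np.zeros((2 * nc + 1, nc))
    for j in range(nc):
        P[2 * j, j] += 0.5
        P[2 * j + 1, j] = 1.0
        P[2 * j + 2, j] += 0.5
    return P

def two_grid_cycle(A, Ac, P, x, b):
    x = smooth(A, x, b)                    # pre-smoothing
    rc = 0.5 * (P.T @ (b - A @ x))         # restrict the residual (full weighting)
    x = x + P @ np.linalg.solve(Ac, rc)    # coarse-grid correction, interpolated up
    return smooth(A, x, b)                 # post-smoothing

n, nc = 63, 31
A, P = poisson(n), prolongation(nc)
Ac = 0.5 * P.T @ A @ P                     # Galerkin coarse-grid operator
b = np.random.default_rng(0).standard_normal(n)
x = np.zeros(n)
res = [np.linalg.norm(b)]
for _ in range(10):
    x = two_grid_cycle(A, Ac, P, x, b)
    res.append(np.linalg.norm(b - A @ x))
print("residual reduction:", res[-1] / res[0])
```

Replacing the exact coarse solve by a recursive call on Ac turns this into a true multigrid V-cycle.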
5 | Regularization for image processing, the L-curve. Look at the works of Per Christian Hansen, Copenhagen, especially his Matlab toolbox regtools, and its use for smooth regression by Michael Wendland. | Chen Nan, Yuanjun Zhang |
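To get a feel for the L-curve before turning to regtools, one can trace it by hand via the SVD; the Gaussian blurring operator, noise level, and lambda range below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64
t = np.linspace(0.0, 1.0, n)
# Discretised Gaussian blurring operator: smooth kernel, hence severely
# ill conditioned -- a small stand-in for an image deblurring problem.
A = np.exp(-((t[:, None] - t[None, :]) ** 2) / 0.01) / n
x_true = np.sin(np.pi * t)
b = A @ x_true + 1.0e-4 * rng.standard_normal(n)

U, s, Vt = np.linalg.svd(A)
beta = U.T @ b

def tikhonov(lam):
    """Filtered SVD solution of min ||Ax - b||^2 + lam^2 ||x||^2."""
    # Filter factors f_i = s_i^2 / (s_i^2 + lam^2), applied as f_i (u_i^T b)/s_i.
    coef = s * beta / (s**2 + lam**2)
    x = Vt.T @ coef
    return np.linalg.norm(x), np.linalg.norm(A @ x - b)

lams = np.logspace(-6, 0, 25)
pts = np.array([tikhonov(l) for l in lams])
# The L-curve plots log residual norm against log solution norm; its corner
# marks the trade-off between fitting the data and damping the noise.
print(pts[0], pts[-1])
```

As lambda grows, the solution norm shrinks and the residual grows monotonically; the sharp corner between the two regimes is what Hansen's L-curve criterion locates.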
6 | Laplace matrix for graph partitioning. Alex Pothen uses the second eigenvector of the Laplace matrix of a graph for partitioning. See A. Pothen, H. D. Simon, and K.-P. Liou, Partitioning sparse matrices with eigenvectors of graphs, SIAM J. Matrix Anal. Appl., 11:430-452, 1990. Also study works by Karypis. | |
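A minimal sketch of the idea, on an assumed toy graph of two cliques joined by a bridge edge: the sign pattern of the second Laplacian eigenvector (the Fiedler vector) recovers the natural two-way partition.

```python
import numpy as np

# Two 5-node cliques (nodes 0-4 and 5-9) joined by one bridge edge.
n = 10
Adj = np.zeros((n, n))
for clique in (range(0, 5), range(5, 10)):
    for i in clique:
        for j in clique:
            if i != j:
                Adj[i, j] = 1.0
Adj[4, 5] = Adj[5, 4] = 1.0            # the bridge

L = np.diag(Adj.sum(axis=1)) - Adj     # graph Laplacian: degree minus adjacency
w, V = np.linalg.eigh(L)
fiedler = V[:, 1]                      # eigenvector of the second-smallest eigenvalue
part = fiedler > 0                     # partition by sign
print(part)
```

The corresponding second-smallest eigenvalue (the algebraic connectivity) is small precisely because the graph has a good cut; this is the observation Pothen, Simon, and Liou exploit for sparse matrix orderings.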
7 | Sensitivity of eigenvalues, pseudospectrum. The pseudospectrum is used by L. N. Trefethen, and by people in France under the name spectral portraits. Do Demmel Q 4.14, eigscat.m. | Juha Korkiakangas |
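The core computation behind a pseudospectrum plot is the smallest singular value of zI - A over a grid of points z. A minimal sketch on an assumed Jordan-like test matrix shows why nonnormality matters: a point well away from the spectrum can still lie deep inside the epsilon-pseudospectrum.

```python
import numpy as np

n = 20
# Jordan-like matrix: the only eigenvalue is 0.5, but A is highly nonnormal.
A = 0.5 * np.eye(n) + np.diag(np.ones(n - 1), 1)

def smin(z):
    """Smallest singular value of zI - A, i.e. 1 / ||(zI - A)^{-1}||."""
    return np.linalg.svd(z * np.eye(n) - A, compute_uv=False)[-1]

z = 0.8          # distance 0.3 from the spectrum
print("distance to spectrum:", abs(z - 0.5), " smin(zI - A):", smin(z))
# A full spectral portrait evaluates smin on a grid in the complex plane
# and draws contour lines of its log10.
```

For a normal matrix smin(zI - A) equals the distance from z to the spectrum; here it is smaller by many orders of magnitude, which is exactly the eigenvalue sensitivity eigscat.m visualises.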
8 | Compare lanso and eigs: Lanczos with selective orthogonalization, lanso, versus implicitly restarted Arnoldi, eigs. Etemplates 4.4.4. | |
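For the eigs side of the comparison, SciPy's eigsh calls the same ARPACK implicitly restarted Lanczos/Arnoldi code as Matlab's eigs; a small check on the 1-D Poisson matrix, whose eigenvalues are known in closed form, is a reasonable warm-up (matrix and k are illustrative choices):

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 200
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")

# eigsh wraps ARPACK's implicitly restarted Lanczos -- what Matlab's eigs calls.
vals = spla.eigsh(A, k=4, which="LM", return_eigenvectors=False)

# Known eigenvalues of tridiag(-1, 2, -1): 2 - 2 cos(k pi / (n + 1)).
exact = 2.0 - 2.0 * np.cos(np.arange(1, n + 1) * np.pi / (n + 1))
print(np.sort(vals), np.sort(exact)[-4:])
```

An experiment along the lines of Etemplates 4.4.4 would then compare iteration counts and orthogonality loss against a selective-orthogonalization Lanczos such as lanso on the same matrices.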
9 | Page rank and the Google matrix. The primary reason for the success of the Google search engine is its weighting of web pages by page rank. The page rank of a page is the corresponding element of the Perron eigenvector of the adjacency matrix of the entire World Wide Web. This gigantic eigenvalue problem is solved once a month for the current configuration. See a recent talk by Lars Eldén from Linköping. | Fatih Ertinaz, Mustafa Kavasoglu |
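On a toy web graph (the four-page link structure below is a made-up example), the Perron eigenvector of the damped Google matrix can be found by plain power iteration, which is also how the real problem is attacked at scale:

```python
import numpy as np

# Tiny web: page j links to the pages in links[j].
links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
n, d = 4, 0.85                     # d is the usual damping factor

# Column-stochastic link matrix: column j spreads page j's vote over its links.
P = np.zeros((n, n))
for j, outs in links.items():
    for i in outs:
        P[i, j] = 1.0 / len(outs)

# Google matrix: damped link structure plus uniform teleportation.
G = d * P + (1.0 - d) / n * np.ones((n, n))

r = np.full(n, 1.0 / n)
for _ in range(100):               # power iteration toward the Perron eigenvector
    r = G @ r
print("page ranks:", r)
```

Since G is column stochastic, the iteration preserves the sum of r and converges to the stationary vector; page 3, which nothing links to, ends up with the lowest rank.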
10 | Jacobi-Davidson for eigenvalues. Etemplates 4.7. This algorithm was proposed by Henk van der Vorst. | Yuan Changming |
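A bare-bones sketch of the Jacobi-Davidson iteration for the largest eigenvalue of a small symmetric matrix; the test matrix is synthetic, and the correction equation is solved here by a dense least-squares solve rather than the inexact inner iterations a real implementation (Etemplates 4.7) would use:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 30
B = rng.standard_normal((n, n))
A = np.diag(np.arange(1.0, n + 1)) + 0.1 * (B + B.T)   # symmetric test matrix

v = rng.standard_normal(n)
V = (v / np.linalg.norm(v)).reshape(n, 1)              # search subspace basis
I = np.eye(n)
for _ in range(n):
    H = V.T @ A @ V
    w, S = np.linalg.eigh(H)
    theta, y = w[-1], S[:, -1]           # target the largest Ritz pair
    u = V @ y
    r = A @ u - theta * u
    if np.linalg.norm(r) < 1e-10:
        break
    # Jacobi-Davidson correction equation, projected orthogonal to u:
    #   (I - u u^T)(A - theta I)(I - u u^T) t = -r,   t orthogonal to u.
    Pu = I - np.outer(u, u)
    t = np.linalg.lstsq(Pu @ (A - theta * I) @ Pu, -r, rcond=None)[0]
    t = t - V @ (V.T @ t)                # orthogonalise against the current basis
    nt = np.linalg.norm(t)
    if nt < 1e-14:
        break
    V = np.hstack([V, (t / nt).reshape(n, 1)])
print("Ritz value:", theta)
```

The projection in the correction equation is what distinguishes Jacobi-Davidson from shift-and-invert: it keeps the expansion direction orthogonal to the current Ritz vector.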
11 | Optical character recognition with SVD. Study the report by Lars Eldén: Numerical linear algebra in data mining, Acta Numerica 2006, pp. 327-384. Look at the digit experiment used in lab 3. Compare the algorithm with 10 matrices proposed there with another one based on the SVD of the entire matrix, computing distances between test digits and known digits in the subspace of leading singular directions. Also see if blind signal separation is possible. | Mamasz Jakub Olczak |
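The classification principle behind the 10-matrix algorithm can be sketched with synthetic data standing in for the lab 3 digits (the two "classes" below are noisy prototypes, an assumption made only so the example is self-contained): one SVD basis per class, and a test vector is assigned to the class whose subspace leaves the smallest residual.

```python
import numpy as np

rng = np.random.default_rng(3)
m, k = 64, 5                       # "pixel" count and subspace dimension

# Synthetic stand-ins for two digit classes: noisy copies of two prototypes.
protos = [np.sin(np.linspace(0, 3, m)), np.cos(np.linspace(0, 5, m))]
train = [np.column_stack([p + 0.1 * rng.standard_normal(m) for _ in range(40)])
         for p in protos]

# One orthonormal basis of leading left singular vectors per class.
bases = [np.linalg.svd(T, full_matrices=False)[0][:, :k] for T in train]

def classify(x):
    """Smallest residual after projection onto each class subspace wins."""
    res = [np.linalg.norm(x - U @ (U.T @ x)) for U in bases]
    return int(np.argmin(res))

tests = [(p + 0.1 * rng.standard_normal(m), c)
         for c, p in enumerate(protos) for _ in range(20)]
acc = np.mean([classify(x) == c for x, c in tests])
print("accuracy:", acc)
```

The alternative the task mentions, one SVD of the entire training matrix, replaces the per-class bases by a single subspace of leading singular directions in which distances between test and known digits are compared.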
12 | Geometry and eigenvalues. Curves and surfaces are described by systems of nonlinear equations in the coordinates. Intersections where a curve meets a surface can be computed via a nonlinear eigenvalue problem. Prepare a demonstration of how this comes about. Demmel Q 4.16. | Mattias Frånberg, Peter Andersson |
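A small worked instance of the mechanism: intersecting two plane curves by eliminating a variable and reading the roots of the resulting polynomial off a companion matrix eigenvalue problem (the parabola/circle pair is an illustrative choice, not Demmel's example):

```python
import numpy as np

# Intersect the parabola y = x^2 with the unit circle x^2 + y^2 = 1.
# Eliminating x gives y^2 + y - 1 = 0; its roots are the eigenvalues of the
# companion matrix below (np.roots works the same way internally).
a = [1.0, 1.0, -1.0]                      # coefficients of y^2 + y - 1
C = np.array([[0.0, -a[2] / a[0]],
              [1.0, -a[1] / a[0]]])       # companion matrix of the polynomial
ys = np.linalg.eigvals(C).real
y = ys[ys > 0][0]                         # only y >= 0 gives a real x on y = x^2
x = np.sqrt(y)
print("intersection point:", x, y)
```

For a curve meeting a surface the same elimination leads to larger resultant matrices, and the intersection parameters appear as eigenvalues of a (possibly polynomial) eigenvalue problem.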
13 | Multiple Relatively Robust Representations for tridiagonal eigenvalues. This is what promises to be the ultimate method to compute eigenvalues and orthogonal eigenvectors of a symmetric tridiagonal matrix. See works by Dhillon. | |