I have made up a few Matlab functions that will be useful when generating test cases and visualizing results:
I have prepared two simple Matlab codes: lanfu.m, which performs full reorthogonalization, and lanso.m, which does selective orthogonalization as it has been developed during the last 25 years by Parlett and coworkers. As of now the routines are just coded as M-files; this makes it easier to look at different variables while running, but you have to have certain variables set when you start, and you must give the matrix the name A. Note that the code also works when A is full, but make sure it is represented as a sparse matrix when you run the tests. You will also need the short function normc.m.
The purpose is to compare the three different ways of doing reorthogonalization: full, selective, and none. You can easily make a no-reorthogonalization code by putting a comment symbol in front of a few lines of lanfu. Test the orthogonality of the basis vectors v, as well as that of the computed eigenvectors xv. Test also how well the residuals are predicted by the estimated bounds bndv, given by the last elements of the eigenvectors of the tridiagonal matrix. I have an M-file lanreport.m which does some of these tests. Note that you need to focus on the interesting parts of the spectrum by using the axis command!
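To make the comparison concrete, here is a hypothetical driver sketch. It assumes lanfu is run as a script that reads A and jmax from the workspace and leaves behind the basis v, the Ritz vectors xv, and the bounds bndv, as described above; the name ev for the Ritz values is my assumption, so check lanfu.m for the actual name.

```matlab
% Hypothetical driver; A, jmax, v, xv, bndv follow the handout text,
% while ev (the Ritz values) is an assumed variable name.
A = delsq(numgrid('S', 32));  % a sparse test matrix, named A as required
jmax = 100;                   % number of Lanczos steps
lanfu                         % run the M-file as a script

% Orthogonality of the Lanczos basis and of the computed eigenvectors;
% with full reorthogonalization both should be of the order of eps:
disp(norm(v'*v - eye(size(v, 2))))
disp(norm(xv'*xv - eye(size(xv, 2))))

% Compare the true residuals with the estimated bounds bndv:
res = sqrt(sum((A*xv - xv*diag(ev)).^2));
semilogy(1:length(res), res, 'o', 1:length(bndv), bndv(:)', 'x')
```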
1. As a first set of test matrices use the finite difference Laplacian generated by the Matlab commands numgrid and delsq. If you run Lanczos directly on this one, it takes rather many steps before eigenvalues start to converge; I used jmax=100 or 300 in my tests. This is the case for which selective orthogonalization was developed.
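The test matrices can be generated in a couple of lines; the region letter 'S' (a square membrane) and the grid size 32 are just example choices, and other region letters accepted by numgrid work as well:

```matlab
G = numgrid('S', 32);  % number the interior points of a 32-by-32 square grid
A = delsq(G);          % sparse five-point finite difference Laplacian
jmax = 100;            % as suggested above; try 300 as well
```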
If you use a membrane with symmetries, some of the eigenvalues are double; this will cause moderate trouble for this algorithm. Plot the eigenvectors for those eigenvalues using the special command v2g!
How does the variant with no reorthogonalization behave? Do you get good eigenvectors if you disregard that some of them are duplicates? Try Jane Cullum's device of reporting only those vectors that have a nontrivial component in the first row of S, the eigenvector matrix of the tridiagonal T.
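Cullum's device can be sketched as follows, assuming the tridiagonal T from the run is available in the workspace; the tolerance used here is a guess and should be tuned:

```matlab
% Hypothetical sketch of Cullum's device. T is the jmax-by-jmax Lanczos
% tridiagonal; Ritz values whose tridiagonal eigenvector has a negligible
% first component are regarded as spurious duplicates and discarded.
[S, D] = eig(full(T));
keep = abs(S(1, :)).' > sqrt(eps);  % guessed threshold; tune it
theta = diag(D);
reported = theta(keep);             % report only these Ritz values
```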
2 a. If we run Lanczos on a shifted and inverted operator, the convergence will be much faster. In those cases selective orthogonalization reorthogonalizes much too often, and duplicates rush in if you use no reorthogonalization, so full reorthogonalization is the preferred alternative.
Test this on the same Laplacian membrane matrix with shift zero, i.e. take the inverse. By all means avoid forming an explicit inverse, it will be full. Instead, replace the line where A multiplies, r=A*v(:,j);, by an operation with the inverses of the L and U factors, r=U\(L\v(:,j));. Note that the parentheses are important here: Matlab works from left to right, and without parentheses it would start forming the full matrix U\L and go on forever. (Long ago I used APL, A Programming Language, developed by Iverson at IBM, which works from right to left. This is more natural in this case, and the language still has some devoted enthusiasts.)
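As a sketch, assuming the same Laplacian as in task 1 and normc.m on the path, the replacement looks like this; the factorization is computed once, outside the Lanczos loop:

```matlab
A = delsq(numgrid('S', 32));      % the same sparse Laplacian as in task 1
[L, U] = lu(A);                   % sparse LU factors, computed once
v1 = normc(rand(size(A, 1), 1));  % a normalized start vector (normc.m)
% Inside the loop, the line  r = A*v(:,j);  becomes two triangular solves:
r = U\(L\v1);
% A Ritz value theta of inv(A) corresponds to the eigenvalue 1/theta of A.
```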
2 b. If you are bored by just running finite difference matrices, try your hand at getting the singular values of the Medline matrix, used by Katarina Blom in her works on information retrieval. Here you are interested in quite many of the largest singular values of X. Make a similar change in the code as in the inverted case, now inserting r=X'*(X*v(:,j));. This case behaves very similarly to the shifted and inverted one.
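The corresponding sketch for the singular value case follows; a random sparse stand-in is used here in place of the Medline matrix only to make the fragment self-contained, and its dimensions are arbitrary:

```matlab
% X stands for the Medline term-document matrix; this random sparse
% stand-in (arbitrary dimensions) only makes the fragment runnable.
X = sprand(500, 200, 0.02);
v = normc(rand(size(X, 2), 1));  % note: Lanczos vectors have length size(X,2)
r = X'*(X*v);                    % replaces r = A*v(:,j); never form X'*X,
                                 % which could be dense
% Ritz values theta of X'*X approximate squared singular values of X,
% so report sqrt(theta) as the singular value estimates.
```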
Do task 1 and either 2 a or 2 b!