Reading tips: lecture and lab notes
What fundamental problems are there with multi-layer perceptrons and backpropagation (BP)?
Is there any technique that guarantees convergence while also avoiding the limitations of linear separability?
How does an RBF network work?
What are the advantages of using local basis functions?
What is the advantage of using many basis functions?
How is the output layer of an RBF network trained?
How is the input layer of an RBF network trained? More specifically, how are the placement and, possibly, the shape of the basis functions determined?
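As a concrete illustration of the questions above, here is a minimal RBF-network sketch in numpy. It is an assumption-laden toy, not the course's reference implementation: the "input layer" is trained trivially by placing one Gaussian basis function on each input pattern (a common choice for small problems), the width sigma is picked by hand, and the "output layer" is trained by linear least squares, which has a guaranteed (closed-form) optimum because the output is linear in the weights. The XOR data shows that a problem that is not linearly separable in input space becomes solvable after the basis-function expansion.

```python
import numpy as np

def rbf_design_matrix(X, centers, sigma):
    # Gaussian basis functions: phi_j(x) = exp(-||x - c_j||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

# XOR: not linearly separable in input space
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])

# "Input layer training": one basis function per input pattern;
# the width sigma is chosen by hand (an assumption for this toy example).
centers = X.copy()
sigma = 0.5

# "Output layer training": linear least squares, a convex problem with a
# closed-form solution -- convergence is guaranteed, unlike BP in an MLP.
Phi = rbf_design_matrix(X, centers, sigma)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

pred = (Phi @ w > 0.5).astype(int)
print(pred)  # reproduces XOR: [0 1 1 0]
```

With distinct centers the Gaussian design matrix is positive definite, so the least-squares fit is exact here; on larger problems one would use fewer centers than data points (e.g. chosen by k-means) and accept an approximate fit.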
Examples of applications:
NETtalk -- translation of text to phonemes.
Cell classification -- identification of cancer cells in microscope images.
Identification of persons.
ALVINN -- steering a car from camera images.
Washing machine -- a combination of fuzzy logic and ANNs.
Time series analysis -- attempts to predict stock market variables.
Credit card evaluation -- assessing whether a bank customer should be granted a credit card.