Neural Java
Neural Networks Tutorial with Java Applets
Introduction
Neural Java is a series of exercises and demos. Each exercise consists of a short introduction, a
small demonstration program written in Java (Java Applet), and a series of questions which are
intended as an invitation to play with the programs and explore the possibilities of different
algorithms.
The aim of the applets is to illustrate the dynamics of different artificial neural networks.
Emphasis has been put on visualization and interactive interfaces. The Java applets are neither intended nor suitable for large-scale applications; users interested in application programs should use other simulators.
The list below covers standard neural network algorithms such as backpropagation, Kohonen maps, and the Hopfield model. It also includes some more biologically motivated models, with visualizations of the Hodgkin-Huxley and integrate-and-fire models.
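To give a flavour of what these models compute, here is a minimal Java sketch of a leaky integrate-and-fire neuron driven by a constant input current (Euler integration of the membrane equation). It is not taken from the applets; all class names, parameter names, and values are illustrative assumptions.

    // Minimal leaky integrate-and-fire sketch (illustrative; parameter values are assumptions).
    public class LeakyIntegrateAndFire {
        public static void main(String[] args) {
            final double tauM = 10.0;       // membrane time constant [ms]
            final double r = 1.0;           // membrane resistance (arbitrary units)
            final double vRest = 0.0;       // resting potential [mV]
            final double vThreshold = 1.0;  // firing threshold [mV]
            final double dt = 0.1;          // Euler time step [ms]
            final double current = 1.5;     // constant input current

            double v = vRest;
            for (double t = 0.0; t < 100.0; t += dt) {
                // Euler step of: tauM * dv/dt = -(v - vRest) + r * current
                v += dt / tauM * (-(v - vRest) + r * current);
                if (v >= vThreshold) {      // threshold crossing: emit a spike
                    System.out.printf("spike at t = %.1f ms%n", t);
                    v = vRest;              // reset the membrane potential
                }
            }
        }
    }

A fixed-step Euler update is used here only for brevity; it is enough to show the integrate, fire, and reset cycle that the spiking-neuron applets visualize in much more detail.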
Additional material
The following are available for download:
Spiking Neuron Models (W. Gerstner and W. Kistler, Cambridge University Press 2002)
Supervised Learning for Neural Networks: a tutorial with JAVA exercises (W. Gerstner).
See also
Some Competitive Learning Methods (Bernd Fritzke)
Exercises
If a download icon appears to the right of a link, you can download that applet and run it on your own machine. You can also download the applet's source code, provided you first agree to the GNU General Public License. If so, follow the instructions here to download and install the applets.
Single Neurons
1. Artificial Neuron.
2. McCulloch-Pitts Neuron (see the sketch after this list).
3. Spiking Neuron. (Requires Swing).
4. Hodgkin-Huxley Model.
5. Axons and Action Potential Propagation.
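As a reminder of the idea behind the first two exercises, the following is a minimal Java sketch of a McCulloch-Pitts style threshold unit. The class and method names are made up for this illustration and do not come from the applets.

    // Illustrative McCulloch-Pitts style threshold unit (not part of the applets).
    public class ThresholdUnit {
        // Fires (returns 1) if the weighted sum of the binary inputs reaches the threshold.
        static int fire(double[] weights, int[] inputs, double threshold) {
            double sum = 0.0;
            for (int i = 0; i < inputs.length; i++) {
                sum += weights[i] * inputs[i];
            }
            return sum >= threshold ? 1 : 0;
        }

        public static void main(String[] args) {
            // With weights {1, 1} and threshold 2 the unit computes a logical AND.
            double[] w = {1.0, 1.0};
            System.out.println(fire(w, new int[]{1, 1}, 2.0)); // prints 1
            System.out.println(fire(w, new int[]{1, 0}, 2.0)); // prints 0
        }
    }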
Supervised Learning
Single-layer networks (simple perceptrons)
1. Perceptron Learning Rule.
2. Adaline, Perceptron and Backpropagation.
Multi-layer networks
1. Multi-layer Perceptron (with neuron outputs in {0;1}).
2. Multi-layer Perceptron (with neuron outputs in {-1;1}).
3. Multi-layer Perceptron and C language.
4. Generalization in Multi-layer Perceptrons (with neuron outputs in {0;1}).
5. Generalization in Multi-layer Perceptrons (with neuron outputs in {-1;1}).
6. Optical Character Recognition with Multi-layer Perceptron.
7. Prediction with Multi-layer Perceptron.
Support Vector Machine
1. Support Vector Machine with a polynomial kernel.
Density Estimation and Interpolation
1. Radial Basis Function Network.
2. Gaussian Mixture Model / EM.
3. Mixture Model, using unlabeled data.
Unsupervised Learning
1. Principal Component Analysis.
2. PCA for Character Recognition.
3. Competitive Learning Methods.
Reinforcement Learning
1. Blackjack and Reinforcement Learning.
Network Dynamics
1. Hopfield Network.
2. Pseudoinverse Network.
3. Network of spiking neurons. (Requires Swing).
4. Retina Simulation. (Runs very slowly with some Netscape versions).
Miniproject
Miniproject for Postgraduate Training
Useful links
URL: http://lcn.epfl.ch/tutorial/english/
Last updated: 06-October-2000 by Sébastien Baehni