Outline
• Neural networks
– Multi-layer neural networks - continued
• Back-propagation
– Applications
Simple Perceptrons
• Simple perceptrons
– One-layer feed-forward network
• There is an input layer and an output layer, and no hidden layers
– The computation can be described by
  $O_i = g(h_i) = g\left(\sum_k w_{ik}\,\xi_k\right)$
• Thresholds are omitted because they can always be treated as connections to an input terminal that is permanently set to $-1$
5/29/2016
Visual Perception Modeling
2
A Simple Learning Algorithm
• There is a learning algorithm for a simple
perceptron network
– Given a training pattern $\xi^\mu$, the desired output is $\zeta_i^\mu$
– The learning algorithm, i.e., the procedure for changing the weights, is
  $w_{ik}^{\text{new}} = w_{ik}^{\text{old}} + \Delta w_{ik}$
  $\Delta w_{ik} = \eta\,(\zeta_i^\mu - O_i^\mu)\,\xi_k^\mu$
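The update rule above can be sketched in a few lines. The NAND task, learning rate, and step activation below are illustrative assumptions; the threshold is folded in as a permanent $-1$ input terminal, as on the previous slide.

```python
import numpy as np

# Illustrative setup (not from the slides): learn NAND with a simple
# perceptron; the threshold is a permanent -1 input terminal.
X = np.array([[0, 0, -1], [0, 1, -1], [1, 0, -1], [1, 1, -1]], dtype=float)
targets = np.array([1.0, 1.0, 1.0, 0.0])   # zeta: desired outputs

def g(h):
    """Step activation: 1 if h >= 0, else 0."""
    return 1.0 if h >= 0 else 0.0

eta = 0.5            # learning rate (assumed value)
w = np.zeros(3)      # weights w_ik for the single output unit

for epoch in range(20):
    for xi, zeta in zip(X, targets):
        O = g(w @ xi)                  # O = g(sum_k w_k xi_k)
        w += eta * (zeta - O) * xi     # delta w_k = eta (zeta - O) xi_k

outputs = [g(w @ xi) for xi in X]
print(outputs)   # [1.0, 1.0, 1.0, 0.0] -- NAND is linearly separable
```

Each update nudges the weights only when the output is wrong, so the loop stops changing anything once every pattern is classified correctly.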
Hebbian Learning Rules
• Hebbian learning
– The strength of a synaptic connection should be
adjusted if its level of activity changes
– An active synapse which repeatedly triggers the
activation of its postsynaptic neuron will grow in
strength
  $\Delta w = \eta\,x\,y$
• Anti-Hebbian learning rule
  $\Delta w = -\eta\,x\,y$
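As a sketch (the activity values and learning rate are illustrative assumptions), the two rules differ only in the sign of the update $\Delta w = \pm\eta\,x\,y$:

```python
import numpy as np

eta = 0.1   # learning rate (assumed value)

def hebbian_step(w, x, y):
    """Hebbian rule: coincident pre/post-synaptic activity strengthens
    the weights, delta w = eta * x * y."""
    return w + eta * x * y

def anti_hebbian_step(w, x, y):
    """Anti-Hebbian rule: the same update with the opposite sign."""
    return w - eta * x * y

x = np.array([1.0, 0.0, 1.0])   # pre-synaptic activities
w = np.array([0.2, 0.1, 0.3])
y = w @ x                        # post-synaptic activity of a linear unit
print(hebbian_step(w, x, y))      # weights on active inputs grow ...
print(anti_hebbian_step(w, x, y)) # ... or shrink; inactive ones are unchanged
```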
Simple Perceptrons – cont.
• Linear separability
– For simple perceptrons, the condition for correct operation is that a plane divides the inputs with positive targets from those with negative targets
– This means the decision boundary is a plane where
  $\mathbf{w} \cdot \boldsymbol{\xi} > 0$ if $\boldsymbol{\xi}$ is a positive example
  $\mathbf{w} \cdot \boldsymbol{\xi} < 0$ if $\boldsymbol{\xi}$ is a negative example
• The plane itself is $\mathbf{w} \cdot \boldsymbol{\xi} = 0$
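This condition can be checked directly. The sketch below (data and weights are illustrative assumptions) verifies that a hand-picked plane separates AND, again treating the threshold as a permanent $-1$ input; no such plane exists for XOR, which is what motivates hidden layers later.

```python
import numpy as np

def separates(w, positives, negatives):
    """True if the plane w . xi = 0 puts every positive example on the
    > 0 side and every negative example on the < 0 side."""
    return (all(w @ xi > 0 for xi in positives) and
            all(w @ xi < 0 for xi in negatives))

# AND, with the threshold folded in as a permanent -1 input terminal
pos = [np.array([1.0, 1.0, -1.0])]
neg = [np.array([0.0, 0.0, -1.0]),
       np.array([0.0, 1.0, -1.0]),
       np.array([1.0, 0.0, -1.0])]

w = np.array([1.0, 1.0, 1.5])   # hand-picked weights (an assumption)
print(separates(w, pos, neg))   # True: AND is linearly separable
```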
Simple Perceptrons – cont.
• Linear units
  $O_i = g(h_i) = \sum_k w_{ik}\,\xi_k$
• Gradient descent learning
  $E[\mathbf{w}] = \frac{1}{2}\sum_{i,\mu}\left(\zeta_i^\mu - O_i^\mu\right)^2 = \frac{1}{2}\sum_{i,\mu}\left(\zeta_i^\mu - \sum_k w_{ik}\,\xi_k^\mu\right)^2$
  $\Delta w_{ik} = -\eta\,\frac{\partial E}{\partial w_{ik}} = \eta\sum_\mu\left(\zeta_i^\mu - O_i^\mu\right)\xi_k^\mu$
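A minimal batch gradient-descent sketch for a single linear output unit (the data, learning rate, and epoch count are illustrative assumptions): each step applies $\Delta w_k = \eta\sum_\mu(\zeta^\mu - O^\mu)\,\xi_k^\mu$ and should recover the weights that generated the targets.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: targets zeta generated from a known weight
# vector, which gradient descent should recover.
true_w = np.array([2.0, -1.0, 0.5])
Xi = rng.normal(size=(50, 3))       # one row per pattern mu
Zeta = Xi @ true_w                  # zeta^mu for a single linear unit

eta = 0.01                          # learning rate (assumed value)
w = np.zeros(3)
for epoch in range(500):
    O = Xi @ w                      # O^mu = sum_k w_k xi_k^mu
    w += eta * (Zeta - O) @ Xi      # delta w_k = eta sum_mu (zeta - O) xi_k

print(np.round(w, 3))               # close to true_w
```

For linear units E is a quadratic bowl, so with a small enough η this descent converges to the unique minimum.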
Multi-layer Perceptrons
• The limitations of simple perceptrons do not apply to feed-forward networks with hidden layers between the input and output layers and nonlinear activation functions
• The problem is how to train such a network efficiently
Multi-layer Perceptrons – cont.
• Back-propagation
– Extension of the gradient descent learning rule
  $E[\mathbf{w}] = \frac{1}{2}\sum_{i,\mu}\left(\zeta_i^\mu - O_i^\mu\right)^2 = \frac{1}{2}\sum_{i,\mu}\left[\zeta_i^\mu - g\left(\sum_j w_{ij}\,g\left(\sum_k w_{jk}\,\xi_k^\mu\right)\right)\right]^2$
– The hidden-to-output connections
  $\Delta w_{ij} = -\eta\,\frac{\partial E}{\partial w_{ij}} = \eta\sum_\mu g'(h_i^\mu)\left(\zeta_i^\mu - O_i^\mu\right)V_j^\mu$
Multi-layer Perceptrons – cont.
• Back-propagation - continued
– The input-to-hidden connections
  $\Delta w_{jk} = -\eta\,\frac{\partial E}{\partial w_{jk}} = \eta\sum_\mu g'(h_j^\mu)\,\xi_k^\mu \sum_i w_{ij}\,\delta_i^\mu$
  where
  $\delta_i^\mu = g'(h_i^\mu)\left(\zeta_i^\mu - O_i^\mu\right)$
Activation Function
• Activation function
– For back-propagation, the activation function
must be differentiable
– We also want it to saturate at both extremes
– Sigmoid function
  $g(h) = \frac{1}{1 + \exp(-2\beta h)}$
  $g'(h) = 2\beta\,g\,(1 - g)$
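A small sketch of this sigmoid and the identity $g' = 2\beta\,g(1-g)$, checked against a finite difference ($\beta = 1$ and the test point are assumed values):

```python
import numpy as np

beta = 1.0   # steepness parameter (assumed value)

def g(h):
    """Sigmoid activation g(h) = 1 / (1 + exp(-2 beta h))."""
    return 1.0 / (1.0 + np.exp(-2.0 * beta * h))

def g_prime(h):
    """Derivative via the identity g'(h) = 2 beta g (1 - g)."""
    gh = g(h)
    return 2.0 * beta * gh * (1.0 - gh)

# Numerical check of the identity at an arbitrary point
h = 0.3
fd = (g(h + 1e-6) - g(h - 1e-6)) / 2e-6   # central finite difference
print(g_prime(h), fd)                      # both are approximately 0.4576
```

Expressing the derivative through $g$ itself is what makes back-propagation cheap: the forward-pass activations already contain everything needed for $g'$.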
Back Propagation Algorithm
1. Initialize the weights to small random values
2. Choose a pattern $\xi^\mu$ and apply it to the input layer
3. Propagate the signal forward through the network
4. Compute the deltas (errors) for the output layer
5. Compute the deltas (errors) for the preceding layers by propagating the errors backwards
6. Update all the connections according to the update rules
7. Go back to step 2 and repeat for the next pattern
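The seven steps can be sketched end-to-end. The network size (2-4-1), task (XOR), learning rate, and epoch count below are illustrative assumptions, and separate bias terms stand in for the $-1$ input terminal; the activation is the sigmoid from the previous slides with $\beta = 1$.

```python
import numpy as np

rng = np.random.default_rng(1)
beta = 1.0

def g(h):
    return 1.0 / (1.0 + np.exp(-2.0 * beta * h))   # sigmoid

def g_prime(gh):
    return 2.0 * beta * gh * (1.0 - gh)            # g' computed from g itself

# XOR -- a task a simple perceptron cannot solve (illustrative data)
Xi = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Zeta = np.array([0., 1., 1., 0.])

# Step 1: initialize the weights to small random values (2-4-1 network)
W1 = rng.normal(scale=0.5, size=(4, 2)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=4);      b2 = 0.0
eta = 0.5   # learning rate (assumed value)

def forward(xi):
    V = g(W1 @ xi + b1)        # step 3: forward pass, hidden activities V_j
    O = g(W2 @ V + b2)         # ... and the output O
    return V, O

def total_error():
    return 0.5 * sum((z - forward(xi)[1]) ** 2 for xi, z in zip(Xi, Zeta))

E_before = total_error()
for epoch in range(4000):
    for xi, zeta in zip(Xi, Zeta):                 # step 2: choose a pattern
        V, O = forward(xi)
        delta_o = g_prime(O) * (zeta - O)          # step 4: output delta
        delta_h = g_prime(V) * W2 * delta_o        # step 5: propagate back
        W2 += eta * delta_o * V;  b2 += eta * delta_o         # step 6: update
        W1 += eta * np.outer(delta_h, xi); b1 += eta * delta_h
E_after = total_error()                            # step 7 is the loop itself
print(E_before, "->", E_after)                     # the error drops
```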
Using Neural Networks
• Design phase
– Choose the neural network architecture
• Training phase
– Use available examples to train the neural
network
• That is, to use the back-propagation algorithm to learn
the connection weights
• Test phase
– For a new sample, feed the feature through the
neural network and you go the result
Other Neural Network Models
• Hopfield model – associative memory model
– You can store many patterns in one neural
network
– Then the network will recall the “correct” pattern
based on the input
• There are many other kinds of neural
networks
– They are generally designed for a more specific
problem
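A minimal sketch of the idea (pattern values, network size, and the synchronous update scheme are illustrative assumptions): patterns are stored with a Hebbian outer-product rule, and a corrupted input is driven back to the nearest stored pattern.

```python
import numpy as np

# Store two +/-1 patterns with the outer-product (Hebbian) rule
patterns = np.array([[1, 1, 1, -1, -1, -1],
                     [1, -1, 1, -1, 1, -1]], dtype=float)
N = patterns.shape[1]
W = sum(np.outer(p, p) for p in patterns) / N
np.fill_diagonal(W, 0.0)            # no self-connections

def recall(state, steps=5):
    """Synchronous updates s <- sign(W s) until (hopefully) a fixed point."""
    s = state.astype(float)
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1.0, -1.0)
    return s

noisy = patterns[0].copy()
noisy[0] = -noisy[0]                # corrupt one bit of the first pattern
print(recall(noisy))                # recalls the first stored pattern
```

Each stored pattern becomes an attractor of the dynamics, which is what lets the network complete a noisy or partial input.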
Applications
• Application examples
– NETtalk
– Navigation of a car
– Image compression
– Recognizing hand-written ZIP codes
– Speech recognition
– Face recognition
Back Propagation Program
• Programs
– Backprop.c – main program
– Propagation.c – contains procedures for BP
– Para-util.h and type-def.h – contain data
structure definitions
– Located at
~liux/public_html/courses/research/programs/neural-networks
• Parameter files
– Control parameter file – network-3-3-1.par
– Training data file – network-3-3-1-training.par