Neural Networks in Electrical Engineering

Prof. Howard Silver
School of Computer Sciences and Engineering
Fairleigh Dickinson University
Teaneck, New Jersey
Biological Neuron
(Figure: dendrites receive signals across synaptic gaps from the axons of other neurons; the soma integrates them, and the axon passes the output, again across a synaptic gap, to the dendrites of other neurons.)
Example of a Supervised Learning Algorithm - Perceptron
Steps in applying the perceptron:
- Initialize the weights and bias to 0.
- Set the learning rate alpha (0 < α <= 1) and the threshold θ.
- For each input pattern, compute each output
      yin = b + x1*w1 + x2*w2 + x3*w3 + ...
  and set
      y = -1 for yin < -θ
      y =  0 for -θ <= yin <= θ
      y = +1 for yin > θ
- If the jth output yj is not equal to the target tj, set
      wij(new) = wij(old) + α * xi * tj
      bj(new) = bj(old) + α * tj
  (otherwise wij and bj are unchanged).
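The steps above can be sketched in Python (a minimal illustration, not Prof. Silver's MATLAB code; the `train_perceptron` helper and the bipolar-AND training data are my own example):

```python
import numpy as np

def activation(y_in, theta):
    """Tri-level output: -1 below -theta, 0 in the dead zone, +1 above theta."""
    if y_in > theta:
        return 1
    if y_in < -theta:
        return -1
    return 0

def train_perceptron(patterns, targets, alpha=1.0, theta=0.1, max_epochs=100):
    w = np.zeros(len(patterns[0]))   # weights initialized to 0
    b = 0.0                          # bias initialized to 0
    for _ in range(max_epochs):
        changed = False
        for x, t in zip(patterns, targets):
            x = np.asarray(x, dtype=float)
            y = activation(b + x @ w, theta)
            if y != t:               # update only when the output is wrong
                w += alpha * x * t
                b += alpha * t
                changed = True
        if not changed:              # converged: every pattern is correct
            break
    return w, b

# AND function with binary inputs and bipolar targets
X = [(1, 1), (1, 0), (0, 1), (0, 0)]
T = [1, -1, -1, -1]
w, b = train_perceptron(X, T)   # converges to w = [2, 3], b = -4
```

With α = 1 and θ = 0.1 this reproduces the classic result w = (2, 3), b = -4 for the AND function.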
Perceptron Applied to Character Recognition
Character ("A"):
..#..
.#.#.
#...#
#####
#...#
Inputs (binary):
00100
01010
10001
11111
10001
Neural net inputs x1 to x25
'0010001010100011111110001'
Binary target output string (t1 to t26)
'10000000000000000000000000'
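The encoding above can be sketched as follows (the helper names are illustrative; the grid and the two strings come from the slide):

```python
def grid_to_bits(rows):
    """Flatten a list of '.'/'#' strings row by row into a 0/1 string."""
    return ''.join('1' if c == '#' else '0' for row in rows for c in row)

def one_hot_target(letter):
    """26-character target string t1..t26 with a single 1 at the letter's position."""
    i = ord(letter.upper()) - ord('A')
    return ''.join('1' if j == i else '0' for j in range(26))

A = ["..#..",
     ".#.#.",
     "#...#",
     "#####",
     "#...#"]

print(grid_to_bits(A))       # '0010001010100011111110001'
print(one_hot_target('A'))   # '10000000000000000000000000'
```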
EDU» a_zlearn
Character training set:
..#..
.#.#.
#...#
#####
#...#
#####
#...#
####.
#...#
#####
#####
#....
#....
#....
#####
###..
#..#.
#...#
#..#.
###..
#####
#....
#####
#....
#####
#####
#....
####.
#....
#....
#####
#....
#..##
#...#
#####
#...#
#...#
#####
#...#
#...#
#####
..#..
..#..
..#..
#####
....#
....#
....#
#...#
#####
#...#
#..#.
###..
#..#.
#...#
#....
#....
#....
#....
#####
#...#
##.##
#.#.#
#...#
#...#
#...#
##..#
#.#.#
#..##
#...#
#####
#...#
#...#
#...#
#####
#####
#...#
#####
#....
#....
#####
#...#
#.#.#
#..##
#####
#####
#...#
#####
#..#.
#...#
#####
#....
#####
....#
#####
#####
..#..
..#..
..#..
..#..
#...#
#...#
#...#
#...#
#####
#...#
#...#
#...#
.#.#.
..#..
#...#
#...#
#...#
#.#.#
.#.#.
#...#
.#.#.
..#..
.#.#.
#...#
#...#
.#.#.
..#..
..#..
..#..
#####
...#.
..#..
.#...
#####
Enter number of training epochs (m) 5
Enter learning rate (alpha) 1
Enter threshold value (theta) 0.1
Final outputs after training (one row per input character):
   ABCDEFGHIJKLMNOPQRSTUVWXYZ
A  10000000000000000000000000
B  01000000000000000000100000
C  00000000000000000000000000
D  00010000000000000000000000
E  00001000000000000000000000
F  00100100000000000000000000
G  000000?0000000100000000000
H  00000001000000000000000000
I  00000000100000000000000000
J  00000000010000000000000000
K  00000000001000000000000000
L  00000010000100000000000000
M  00000000000010000000000000
N  00000000000001000000000000
O  00?00000000000000000000000
P  00000000000000010000000000
Q  00000000000000001000000000
R  00000000000000000100000000
S  00000000000000000010000000
T  00000000000000000001000000
U  00000000000000000000000000
V  00000000000000000000010000
W  00000000000000100000001000
X  00000000000000000000000100
Y  00000000000000000000000010
Z  00000000000000?00000000001
Enter a test pattern (1s and 0s in quotes)
'1010001010100011111110001'
Character entered:
#.#..
.#.#.
#...#
#####
#...#
Resulting outputs:
ABCDEFGHIJKLMNOPQRSTUVWXYZ
10000000000000000000000000
Sorted outputs before activation (strongest first):
AWRODVYUSLKIHJGFECZXTMNPQB
Enter a test pattern (1s and 0s in quotes)
'1010101010100011111110001'
Character entered:
#.#.#
.#.#.
#...#
#####
#...#
Resulting outputs:
ABCDEFGHIJKLMNOPQRSTUVWXYZ
?0000000000000000000000000
Sorted outputs before activation (strongest first):
AWRVSODYUKGIHXJECMLFZTNQBP
Enter a test pattern (1s and 0s in quotes)
'0010001010111111000110001'
Character entered:
..#..
.#.#.
#####
#...#
#...#
Resulting outputs:
ABCDEFGHIJKLMNOPQRSTUVWXYZ
00000?0100101?010010000010
Sorted outputs before activation (strongest first):
YHSPMKNFEWUARODVLJZXBGTQIC
Signal Classification with Perceptron
(Plots: example signals with noise at SN = 5.)
NNET11H.M
EDU» nnet11h
Enter number of training epochs (m) 20
Enter learning rate (alpha) 1
Enter threshold value (theta) 0.2
Final outputs after training:
100
011
000
:
EDU» nnet11h
Enter number of training epochs (m) 30
Enter learning rate (alpha) 1
Enter threshold value (theta) 0.2
Final outputs after training:
101
010
000
:
EDU» nnet11h
Enter number of training epochs (m) 40
Enter learning rate (alpha) 1
Enter threshold value (theta) 0.2
Final outputs after training:
100
010
001
Enter signal to noise ratio 100
Classification of signals embedded in noise
100
010
001
:
Enter signal to noise ratio 10
Classification of signals embedded in noise
100
010
001
:
Enter signal to noise ratio 5
Classification of signals embedded in noise
100
010
000
:
Enter signal to noise ratio 2
Classification of signals embedded in noise
100
010
000
:
Enter signal to noise ratio 1
Classification of signals embedded in noise
100
010
010
:
Enter signal to noise ratio 1
Classification of signals embedded in noise
100
011
001
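The experiment above can be sketched as follows (an assumed setup, since NNET11H.M itself is not reproduced here): three prototype signals, zero-mean Gaussian noise scaled to a chosen signal-to-noise ratio SN, and a linear classification checked at several SN values. Matched-filter weights stand in for the trained perceptron weights:

```python
import numpy as np

rng = np.random.default_rng(0)

def add_noise(signal, SN):
    """Add zero-mean Gaussian noise scaled so RMS(signal)/RMS(noise) ~= SN."""
    noise = rng.standard_normal(signal.size)
    noise *= np.sqrt(np.mean(signal**2)) / (SN * np.sqrt(np.mean(noise**2)))
    return signal + noise

n = 64
t = np.arange(n)
# three orthogonal prototype signals (sinusoids at 1, 2, 3 cycles per window)
signals = np.stack([np.sin(2*np.pi*k*t/n) for k in (1, 2, 3)])

# One linear unit per class; the prototype itself serves as the weight vector
# (a matched filter), which a trained perceptron layer approximates.
def classify(x):
    return int(np.argmax(signals @ x))

for SN in (100, 10, 1):
    results = [classify(add_noise(s, SN)) for s in signals]
    print(SN, results)   # at high SN this recovers classes [0, 1, 2]
```

As in the runs above, classification is reliable at high SN and degrades as the noise power approaches the signal power.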
Classification of Three Sinusoids of Different Frequency
>> nnet11i
Enter frequency separation in pct. (del) 100
Number of samples per cycle (xtot) 64
Enter number of training epochs (m) 100
Enter signal to noise ratio (SN) - zero to exit 1
Final outputs after training:
Classification of signals embedded in noise
100
010
001
100
010
001
(Plots: the clean signals and the noisy signals.)
>> nnet11i
Enter frequency separation in pct. (del) 10
Number of samples per cycle (xtot) 64
Enter number of training epochs (m) 100
Enter signal to noise ratio (SN) - zero to exit 10
Final outputs after training:
Classification of signals embedded in noise
100
010
001
100
010
001
>> nnet11i
Enter frequency separation in pct. (del) 5
Number of samples per cycle (xtot) 64
Enter number of training epochs (m) 500
Enter signal to noise ratio (SN) - zero to exit 10
Final outputs after training:
Classification of signals embedded in noise
100
010
001
0?0
010
001
>> nnet11i
Enter frequency separation in pct. (del) 1
Number of samples per cycle (xtot) 1000
Enter number of training epochs (m) 10000
Enter signal to noise ratio (SN) - zero to exit 100
Final outputs after training:
Classification of signals embedded in noise
100
0?0
001
100
010
001
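The runs above show that shrinking the frequency separation (del) demands far more training epochs. A short sketch of why (the signal construction is assumed, since NNET11I.M is not shown): as del shrinks, the three sinusoids become nearly collinear, so the margin separating the classes collapses:

```python
import numpy as np

def three_sinusoids(delta_pct, xtot=64):
    """Three sinusoids whose frequencies differ by delta_pct percent."""
    t = np.arange(xtot) / xtot
    return np.stack([np.sin(2*np.pi * (1 + k*delta_pct/100) * t)
                     for k in (0, 1, 2)])

def correlation(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

wide = three_sinusoids(100)   # del = 100%: frequencies 1x, 2x, 3x
narrow = three_sinusoids(1)   # del = 1%: frequencies 1x, 1.01x, 1.02x
print(correlation(wide[0], wide[1]))     # near 0: easy to separate
print(correlation(narrow[0], narrow[1])) # near 1: hard to separate
```

Nearly parallel input vectors leave only a tiny component for the weight updates to exploit, which is why the del = 1% run needed 10000 epochs and 1000 samples per cycle.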
Unsupervised Learning
Kohonen Learning and Self Organizing Maps
Linear array:

          #            (R=0)
        * # *          (R=1)
      * * # * *        (R=2)
    * * * # * * *      (R=3)

Rectangular grid:

    #                  (R=0)

    * * *
    * # *              (R=1)
    * * *

    * * * * *
    * * * * *
    * * # * *          (R=2)
    * * * * *
    * * * * *

(The # marks the winning unit; the *s mark the units in its neighborhood of radius R.)
Kohonen Learning Steps
Initialize the weights (e.g. random values).
Set the neighborhood radius (R) and a learning rate (α).
Repeat the steps below until convergence or a maximum number of epochs is reached.
For each input pattern X = [x1 x2 x3 ...]:
Compute a "distance"
    D(j) = (w1j - x1)^2 + (w2j - x2)^2 + (w3j - x3)^2 + ...
for each cluster unit (i.e. all j), and find jmin, the value of j corresponding to the minimum D(j).
If unit j is in the neighborhood of jmin, set
    wij(new) = wij(old) + α [xi - wij(old)]
for all i.
Decrease α (linearly or geometrically) and reduce R (at a specified rate) if R > 0.
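The steps above can be sketched in Python (illustrative only, not NNET19.M; for brevity R is held at 1 rather than shrunk):

```python
import numpy as np

rng = np.random.default_rng(1)

def kohonen(X, n_units=10, alpha=0.5, R=1, epochs=100):
    """1-D Kohonen layer: units 0..n_units-1 in a line, trained on rows of X."""
    w = rng.random((n_units, X.shape[1]))          # random initial weights
    for _ in range(epochs):
        for x in X:
            d = np.sum((w - x)**2, axis=1)         # D(j) = sum_i (w_ij - x_i)^2
            jmin = int(np.argmin(d))               # winning unit
            lo, hi = max(0, jmin - R), min(n_units, jmin + R + 1)
            w[lo:hi] += alpha * (x - w[lo:hi])     # update winner's neighborhood
        alpha *= 0.95                              # decrease the learning rate
    return w

# 2-D points drawn from two blobs; after training, nearby units end up
# representing nearby inputs, and the two blobs win different units.
X = np.vstack([rng.normal(0.2, 0.05, (20, 2)),
               rng.normal(0.8, 0.05, (20, 2))])
w = kohonen(X)
```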
Self Organizing Maps for Alphabetic Character Set
NNET19.M
Example 4.5 in Fausett
Character training set:
..##...
...#...
...#...
..#.#..
..#.#..
.#####.
.#...#.
.#...#.
###.###
A1
...#...
...#...
...#...
..#.#..
..#.#..
.#...#.
.#####.
.#...#.
.#...#.
A2
...#...
...#...
..#.#..
..#.#..
.#...#.
.#####.
#.....#
#.....#
##...##
A3
######.
.#....#
.#....#
.#....#
.#####.
.#....#
.#....#
.#....#
######.
B1
######.
#.....#
#.....#
#.....#
######.
#.....#
#.....#
#.....#
######.
B2
######.
.#....#
.#....#
.#####.
.#....#
.#....#
.#....#
.#....#
######.
B3
..####.
.#....#
#......
#......
#......
#......
#......
.#....#
..####.
C1
..###..
.#...#.
#.....#
#......
#......
#......
#.....#
.#...#.
..###..
C2
..###.#
.#...##
#.....#
#......
#......
#......
#.....#
.#...#.
..###..
C3
#####..
.#...#.
.#....#
.#....#
.#....#
.#....#
.#....#
.#...#.
#####..
D1
#####..
#....#.
#.....#
#.....#
#.....#
#.....#
#.....#
#....#.
#####..
D2
#####..
.#...#.
.#....#
.#....#
.#....#
.#....#
.#....#
.#...#.
#####..
D3
#######
.#....#
.#.....
.#.#...
.###...
.#.#...
.#.....
.#....#
#######
E1
#######
#......
#......
#......
#####..
#......
#......
#......
#######
E2
#######
.#....#
.#..#..
.####..
.#..#..
.#.....
.#.....
.#....#
#######
E3
...####
.....#.
.....#.
.....#.
.....#.
.....#.
.#...#.
.#...#.
..###..
J1
.....#.
.....#.
.....#.
.....#.
.....#.
.....#.
.#...#.
.#...#.
..###..
J2
....###
.....#.
.....#.
.....#.
.....#.
.....#.
.....#.
.#...#.
..###..
J3
###..##
.#..#..
.#.#...
.##....
.##....
.#.#...
.#..#..
.#...#.
###..##
K1
#....#.
#...#..
#..#...
#.#....
##.....
#.#....
#..#...
#...#..
#....#.
K2
###..##
.#...#.
.#..#..
.#.#...
.##....
.#.#...
.#..#..
.#...#.
###..##
K3
The units associated with each pattern
(as shown in Fausett for Examples 4.5 and 4.6):

(For initial R = 0)
Unit   Patterns
2      B1, B3, D1, D3, E1, E3, K1, K3
11     A1, A2, A3
14     C1, C2, C3, J1, J2, J3
25     B2, D2, E2, K2
(For initial R = 1)
Unit   Patterns
2      C1
4      C2, C3
6      J1, J2, J3
8      D1, D3
9      B1, B3
10     E1
11     E3
12     K1, K3
14     K2
16     D2
17     B2, E2
19     A1
20     A2
21     A3
NNET19A.M
The same character patterns are organized onto a two-dimensional rectangular array of cluster units.
A sample output of this program after 100 epochs:
Row:    3 1 5 1 1 3 1 3 5 5 5 5 3 4 4 1 5 1 1 3 1
Column: 2 4 1 5 3 4 2 2 3 5 4 2 4 2 1 4 5 5 3 5 2
Patterns mapped onto the 5×5 grid:

         Column
         1       2       3       4       5
Row 1    A3      K1,K3   E1,E3   B1,B3   D1,D3
Row 2    A1,A2   K2      E2      B2      D2
Row 5    C1      J1,J2   J3      C2,C3
The Traveling Salesman Problem
Two input neurons - one each for the x and y coordinates of the cities.
Distance function:
    D(j) = (w1j - x1)^2 + (w2j - x2)^2
Example - 4 cities on the vertices of a square
The coordinates of the cities:
- City A: (-1, -1)
- City B: (-1, 1)
- City C: (1, -1)
- City D: (1, 1)
NNET19C.M
Weight matrices from three different runs:
W = [ 1 1 -1 -1
1 -1 -1 1]
W = [ 1 -1 -1 1
1 1 -1 -1]
W = [-1 -1  1  1
      1 -1 -1  1]
100 “Randomly” Located Cities
0.1 < α < 0.25, 100 epochs, Initial R = 2
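The idea can be sketched as a Kohonen ring (details assumed, since NNET19C.M is not reproduced here): cluster units arranged in a ring are trained on city coordinates, and after convergence each weight vector sits on a city, with adjacent units giving the tour order:

```python
import numpy as np

rng = np.random.default_rng(3)
cities = np.array([[-1.0, -1.0],   # A
                   [-1.0,  1.0],   # B
                   [ 1.0, -1.0],   # C
                   [ 1.0,  1.0]])  # D

n_units = 4
w = rng.normal(0, 0.1, (n_units, 2))   # small random initial weights
alpha, R = 0.5, 1

for epoch in range(200):
    for x in cities[rng.permutation(len(cities))]:
        d = np.sum((w - x)**2, axis=1)     # squared distance to each unit
        jmin = int(np.argmin(d))
        for k in range(-R, R + 1):         # neighborhood wraps around the ring
            j = (jmin + k) % n_units
            w[j] += alpha * (x - w[j])
    alpha *= 0.97                          # decrease the learning rate
    if epoch == 100:
        R = 0                              # shrink the neighborhood partway in

print(np.round(w, 2))   # rows approach the city coordinates in tour order
```

Because the updates keep ring-adjacent units close together in the plane, the converged weight matrix lists the cities in a valid tour order, as in the three runs shown above.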