Computational Intelligence: Tutorial
Computational Intelligence COM907M1
Tutorial 5
1. Explain the (µ+λ) Evolution Strategies.
2. Explain the mechanism of rank-based selection in Evolutionary Algorithms.
3. What are the advantages of value coding? Explain with an example.
4. The weights of the following feedforward neural network, shown in Figure 5, are to be trained. The problem with the standard backpropagation algorithm is that it may get stuck in local minima. An alternative is to use an evolutionary algorithm (EA). Show how an EA can be applied to this neural network (a sketch follows the figure below).
Figure 5: Feedforward NN (nodes 1-5, weights w1-w6, biases b1-b3).
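A minimal sketch of how an EA could be applied here, assuming the nine parameters of Figure 5 (w1-w6, b1-b3) are flattened into a real-valued chromosome and evolved with a simple (µ+λ)-style loop. The weight-to-connection mapping, activation function, fitness data and mutation settings below are illustrative assumptions, not given in the question.

```python
import math
import random

# Hypothetical training set (XOR-like), purely for illustration.
DATA = [((0.0, 0.0), 0.0), ((0.0, 1.0), 1.0), ((1.0, 0.0), 1.0), ((1.0, 1.0), 0.0)]

def forward(ch, x1, x2):
    """Forward pass through the assumed 2-2-1 topology of Figure 5.
    Chromosome layout (assumed): [w1, w2, w3, w4, w5, w6, b1, b2, b3]."""
    w1, w2, w3, w4, w5, w6, b1, b2, b3 = ch
    h3 = math.tanh(w1 * x1 + w3 * x2 + b1)    # hidden node 3
    h4 = math.tanh(w2 * x1 + w4 * x2 + b2)    # hidden node 4
    return math.tanh(w5 * h3 + w6 * h4 + b3)  # output node 5

def fitness(ch):
    """Negative sum of squared errors over the data set (higher is better)."""
    return -sum((forward(ch, x1, x2) - t) ** 2 for (x1, x2), t in DATA)

def evolve(mu=10, lam=40, generations=200, sigma=0.3):
    """(mu + lambda)-style evolution of the nine real-valued genes."""
    pop = [[random.uniform(-1.0, 1.0) for _ in range(9)] for _ in range(mu)]
    for _ in range(generations):
        offspring = [[g + random.gauss(0.0, sigma) for g in random.choice(pop)]
                     for _ in range(lam)]
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:mu]
    return pop[0]

best = evolve()
print("best fitness:", fitness(best))
```

The same idea scales to any topology: encode all weights and biases in one chromosome and use the (negative) training error as fitness, which sidesteps the gradient-based local-minima problem of backpropagation.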
5. Explain the problem of competing conventions in an evolutionary neural system.
Tutorial 6
1. A hybrid neuro-fuzzy system with two inputs (x1, x2) and a single output (y) is shown in Figure 4, where fi = ai·x1 + bi·x2 + ci, i = 1, 2, 3. The membership functions for A1, A2, B1 and B2 are bell-shaped functions defined as

   µA1(x1) = 1 / [1 + ((x1 − m1)/σ1)^2],   µA2(x1) = 1 / [1 + ((x1 − m2)/σ2)^2],
   µB1(x2) = 1 / [1 + ((x2 − m3)/σ3)^2],   µB2(x2) = 1 / [1 + ((x2 − m4)/σ4)^2],

where wi = min(µAj(x1), µBj(x2)) and w̄i = wi / Σi wi.

Figure 4: Hybrid neuro-fuzzy system (inputs x1, x2 feed membership nodes A1, A2, B1, B2; rule nodes r1, r2, r3 compute the firing strengths wi; the N nodes normalise them to w̄i; the weighted consequents w̄i·fi are summed to give the output y).
(a) Calculate the output y for x1 = 2 and x2 = 3. Assume a1 = a2 = a3 = 1, b1 = b2 = b3 = 2, c1 = c2 = c3 = 2, m1 = 2, m2 = 3, m3 = 3, m4 = 4, and σ1 = σ2 = σ3 = σ4 = 2.
2. The rule-base of a two-input, single-output FLC is given below.

Rule-base for the Mamdani-type fuzzy system:

            X2
   X1       B1      B2
   A1       C1      C2
   A2       C2      C3

Develop a zero-order Sugeno-type neuro-fuzzy system which is equivalent to the Mamdani-type fuzzy system described above.
3. Show the matrix chromosome representation of the neural network below.

Figure 6: Feedforward NN (nodes 1-5, weights w1-w6, biases b1-b3).
4. Show an example of a cooperative combination of NN and FS, e.g. tuning scaling parameters.
5. Explain how a GA can be applied to optimise the rule-base of an FLC.
Tutorial 7
1. What are the features of 3rd generation neural networks?
- The third generation of neural networks raises the level of biological realism by using individual spikes.
- Instead of rate coding, these neurons use pulse coding: mechanisms where neurons receive and send out individual pulses.
- Recent discoveries in the field of neurology have shown that neurons in the cortex perform analogue computations at incredible speed.
- Thorpe et al. demonstrated that humans analyse and classify visual input (e.g. facial recognition) in under 100 ms.
- It takes at least 10 synaptic steps from the retina to the temporal lobe; this leaves about 10 ms of processing time per neuron.
2. Explain the Integrate and Fire model of a spiking neuron.
The Integrate and Fire (IF) model is the simplest of the spiking neuron models: the neuron and its cell membrane are represented by a single capacitor and a threshold device, as shown in the circuit diagram in the figure below.
When a current is injected into the capacitor, it charges until the threshold value is reached, at which point the capacitor is allowed to discharge and a spike is released.
The model exhibits two of the main characteristics of a neuron: it integrates inputs over time and subsequently produces a spike when the threshold is reached.
Fig. IF Model
The most basic version of an IF neuron is the Perfect Integrate and Fire (PIF) model, in which the behaviour of the cell is described by the charging of the membrane:

   I(t) = Cm · dVm/dt

where Cm is the capacitance of the cell membrane and I(t) is the stimulus current.
The PIF model does not include the refractory period exhibited by a real neuron.
The refractory period can easily be modelled by clamping the membrane of the cell to the resting potential for a fixed duration after firing has taken place.
By introducing this factor the current/frequency (I/f) relationship is modified from

   f = I(t) / (Cm·Vth)

to

   f = I(t) / (Cm·Vth + tref·I(t))

where tref is the refractory period.
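A minimal simulation sketch of the PIF neuron with the refractory clamp described above; the capacitance, threshold, time step and input current values are arbitrary illustrative assumptions.

```python
# Sketch: Perfect Integrate-and-Fire (PIF) neuron with a refractory clamp.
# Between spikes: I(t) = Cm * dVm/dt  =>  Vm += I * dt / Cm.

def simulate_pif(I=1.5e-9, Cm=1e-9, Vth=1e-2, t_ref=2e-3, dt=1e-4, T=1.0):
    """Return the spike times (in seconds) of a PIF neuron driven by constant current I."""
    Vm, spikes, clamp_until, t = 0.0, [], 0.0, 0.0
    while t < T:
        if t >= clamp_until:             # not refractory: integrate the input current
            Vm += I * dt / Cm
            if Vm >= Vth:                # threshold reached: emit a spike and reset
                spikes.append(t)
                Vm = 0.0
                clamp_until = t + t_ref  # clamp to resting potential for t_ref
        t += dt
    return spikes

spikes = simulate_pif()
simulated_rate = len(spikes) / 1.0                        # spikes per second over T = 1 s
analytical_rate = 1.5e-9 / (1e-9 * 1e-2 + 2e-3 * 1.5e-9)  # f = I / (Cm*Vth + tref*I)
print(simulated_rate, analytical_rate)                    # both close to ~115 Hz
```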
3. Discuss the different learning mechanisms of SNN.
- Synaptic plasticity refers to the ability of synaptic connections to change their strength, which is thought to be the basic mechanism underlying learning and memory in biological neural networks.
- Three types of learning can be applied to SNN. They are:
- Unsupervised: A general phenomenological model describing various forms of spike-based synaptic plasticity has been proposed. This model can be expressed as:

   d/dt [wji(t)] = a0 + a1·S̄i(t) + a2·S̄j(t) + a3·S̄i(t)·Sj(t) + a4·Si(t)·S̄j(t);   (1)

where wji(t) is the efficacy of the synaptic coupling from neuron i to neuron j; Si(t) and Sj(t) are the pre- and postsynaptic spike trains, respectively; each spike train is defined as a sum of Dirac impulses at the firing times tf, that is S(t) = Σf δ(t − tf); the terms S̄i(t) and S̄j(t) are the low-pass filtered versions of Si(t) and Sj(t), respectively; a0, ..., a4 are constant coefficients that control the rate of change in the synaptic efficacy.
- Supervised: In supervised learning (in the ReSuMe algorithm), synaptic weights are modified according to the following equation:

   d/dt [wji(t)] = a [Sd(t)·S̄i(t) − Sj(t)·S̄i(t)] = a [Sd(t) − Sj(t)]·S̄i(t);   (2)

where a is the learning rate, Sd(t) is the target (reference) spike train, Sj(t) is the output spike train and S̄i(t) is the low-pass filtered input spike train.
- Reinforcement: Reinforcement learning can be expressed by the following general formula:

   d/dt [wji(t)] = cji(t)·d(t);   (3)

where wji is, again, the weight of a synapse from neuron i to neuron j, cji(t) is an eligibility trace of this synapse which collects weight changes proposed by STDP, and d(t) = h(t) − h0(t) corresponds to the concentration of the neuromodulatory signal h(t) around its mean value h0(t).
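A discrete-time sketch of the ReSuMe-style rule in equation (2), assuming the spike trains are 0/1 arrays sampled every millisecond and that S̄i is obtained with a simple exponential low-pass filter; the time constant and learning rate are illustrative assumptions.

```python
import numpy as np

def low_pass(spikes, dt=1e-3, tau=20e-3):
    """Exponentially filtered spike train (discrete approximation of S-bar)."""
    filtered = np.zeros_like(spikes, dtype=float)
    acc = 0.0
    for k, s in enumerate(spikes):
        acc = acc * np.exp(-dt / tau) + s
        filtered[k] = acc
    return filtered

def resume_update(w, S_in, S_out, S_target, lr=1e-3, dt=1e-3):
    """One pass of the rule dw/dt = a [Sd(t) - Sj(t)] * S_bar_i(t), integrated over the episode."""
    S_bar_i = low_pass(S_in, dt)
    dw = lr * np.sum((S_target - S_out) * S_bar_i) * dt
    return w + dw

# Toy usage: random 0/1 spike trains over 100 ms at 1 ms resolution (illustrative).
rng = np.random.default_rng(0)
S_in = rng.integers(0, 2, 100)
S_out = rng.integers(0, 2, 100)
S_target = rng.integers(0, 2, 100)
print(resume_update(0.5, S_in, S_out, S_target))
```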
Tutorial 8
1. Give a definition of life from the perspective of computational intelligence.
Life is the general condition that distinguishes organisms from inorganic objects and dead organisms, being manifested by growth through metabolism, a means of reproduction, and internal regulation or adaptation in response to the environment.
2. What are the features of life that distinguish it from non-living objects?
Traditionally, life has been identified with material organizations that exhibit certain lists of properties, such as:
- Metabolism,
- Adaptability,
- Self-maintenance (autonomy),
- Self-repair,
- Growth,
- Replicability,
- Irritability (reactability),
- Evolution, etc.
3. Explain Von Neumann's self-reproduction scheme.
- The self-reproducing system contains the set of automata (A + B + C) and a description of (A + B + C);
- The description is fed to B, which copies it three times (assuming destruction of the original);
- One of these copies is then fed to A, which produces another automaton (A + B + C);
- The second copy is then handed to the new automaton, which together with this description is also able to self-reproduce;
- The third copy is kept so that the self-reproducing capability may be maintained (it is also assumed that A destroys utilized descriptions).
- Notice that the description, or program, is used in two different ways: it is both translated and copied.
- In the first role, it controls the construction of an automaton by causing a sequence of activities (active role of the description).
- In the second role, it is simply copied (passive role of the description).
Figure: Conceptual diagram of Von Neumann's self-reproduction scheme (from The Garden in the Machine), showing the instruction/description D, the controller C, the duplicator B (producing copies D1 and D2), and the fabricator A, which turns raw materials into the output automaton.
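A toy sketch that mirrors the two roles of the description (copied by B, translated by A, coordinated by C); the class and method names are purely illustrative and not part of Von Neumann's formal construction.

```python
# Toy model of Von Neumann's scheme: the description is both copied (by B)
# and translated into a new automaton (by A), under the control of C.

class Automaton:
    """Stand-in for the aggregate (A + B + C) plus its stored description."""

    def __init__(self, description=None):
        self.description = description       # passive role: data to be copied

    def construct(self, description):
        """A: translate a description into a new automaton (active role).
        A is assumed to destroy the description it uses up."""
        return Automaton()

    def copy(self, description, n=3):
        """B: copy the description n times (the original is assumed destroyed)."""
        return [str(description) for _ in range(n)]

    def reproduce(self):
        """C: coordinate copying and construction to yield a self-reproducing child."""
        d1, d2, d3 = self.copy(self.description)
        child = self.construct(d1)    # one copy is translated into a new automaton
        child.description = d2        # a second copy is handed to the new automaton
        self.description = d3         # the third copy is retained by the parent
        return child

parent = Automaton("description of (A + B + C)")
child = parent.reproduce()
print(child.description)  # the child carries its own description, so it can also reproduce
```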