9.9. What kinds of self-learning do we gain thanks to competition?

(Translation by Weronika Łabaj, weronika.labaj@googlemail.com)
In the application Example 10c you can activate the Soft competition parameter (in the parameters window, Fig. 9.37, left), which enables a "softer" form of competition. I recommend not using it from the very start (just leave the default setting). We will experiment with soft competition later, once you have learned both the advantages and the disadvantages of "hard" competition.
Thanks to the "hard" competition in the application Example 10c you could avoid cliques of neurons with exactly the same preferences, like those that formed during experiments with the applications Example 10a and Example 10b. With some luck (especially when working with networks composed of many more neurons than there are classes of recognized objects) you could also avoid holes and "dead fields" in the representations of input objects formed by particular neurons; in other words, there were no objects that none of the neurons recognized. When competition is used, there is a high probability that no neuron in the network will recognize multiple classes of objects and, at the same time, no class of objects will go unrecognized by every neuron. Just like in the old proverb: "Every Jack has his Jill¹".
When learning with competition is applied, neurons other than the winner do not change their location, so they remain ready to learn and accept other patterns. Therefore, if objects from a completely new class suddenly appear during the learning process, there will still be some free neurons ready to learn objects from this new class and to improve at identifying them. You can easily see this on your screen, because the application Example 10c has an option that lets you demand that a new class of objects appear: during the simulation, click the new pattern button. By clicking the new pattern button a few times you can lead your network to recognize many more classes of objects than the 4 types of Martians (do not ask me, however, what those other classes represent; my imagination has its limits!). Thanks to learning with competition, each of those many classes of input objects gets a private "guardian": the neuron which from that moment on identifies with that class (Fig. 9.41). Usually there will also be some free neurons left, able to learn new patterns if any appear in the future.
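The mechanism described above can be sketched in a few lines of code. This is a minimal illustration, not the code of Example 10c: I assume Euclidean distance for choosing the winner and a simple "move the winner toward the pattern" update, and the class prototypes and neuron positions are made up for the demonstration.

```python
import numpy as np

def wta_learning_step(weights, pattern, rate=0.2):
    """One self-learning step with hard (winner-takes-all) competition.

    Only the neuron whose weight vector lies closest to the input
    pattern moves toward it; all other neurons stay put, so they
    remain free to specialize in other classes later.
    """
    distances = np.linalg.norm(weights - pattern, axis=1)
    winner = int(np.argmin(distances))
    weights[winner] += rate * (pattern - weights[winner])
    return winner

# Four class prototypes (stand-ins for the four "Martian" classes)
# and eight neurons placed deterministically near the origin.
prototypes = np.array([[1.0, 1.0], [-1.0, 1.0], [-1.0, -1.0], [1.0, -1.0]])
weights = 0.3 * np.array([[1.0, 1.0], [-1.0, 1.0], [-1.0, -1.0], [1.0, -1.0],
                          [1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])

rng = np.random.default_rng(0)
for step in range(200):
    c = step % 4                                  # cycle through the classes
    pattern = prototypes[c] + rng.normal(0.0, 0.05, 2)
    wta_learning_step(weights, pattern)

# Each class should now have a private "guardian" neuron nearby,
# while the four axis-aligned neurons have never won and are still
# free for classes that may appear in the future.
for p in prototypes:
    print(round(float(np.min(np.linalg.norm(weights - p, axis=1))), 3))
```

After the loop, every prototype has one neuron sitting next to it, and the neurons that never won remain exactly at their starting positions, ready for new patterns.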
¹ You know that if not for the WTA competition (which stems from the monogamy required in our country, a rule that lacks any biological justification), the intellectual cream of the crop (that is, computer scientists) would attract all the girls. Thanks to the competition, some other guys can also get lucky...
Fig. 9.41. The opportunity to teach recognition of multiple patterns in a network with competition
Unfortunately, the pure self-learning process combined with "hard" competition may lead to some abnormalities. When you start the self-learning process with a low number of neurons, then for each class of objects (four in my sample application) there will be one neuron that identifies with and recognizes that class. If new classes of objects are introduced later (for example, when you click the new pattern button), it might happen that the "winner" is a neuron that has already specialized in recognizing some other class! That in turn might lead to a situation in which some class of input objects that had gained a strong representation among the neurons suddenly loses it²! It is a well-known situation from our daily lives: new information replaces the old.
Fig. 9.42. Replacing previously learned skills by new information
² Continuing our matrimonial analogy, it could be said that we are watching the birth of a betrayal and a divorce.
Figure 9.42 shows an example of this effect, in this case a very strong one: after the new objects were shown, all the previously trained neurons that recognized some objects "re-learned" and started recognizing the new objects. This is not a typical situation. Usually there are more neurons than recognized classes of objects and the "kidnapping effect" is rather rare; it happens to one or two classes out of a dozen (Fig. 9.43). That explains why, when using our own neural networks (that is, our brains), only some details, and not very many, disappear from our memory, replaced by other, more intense memories. Unfortunately, the things that "get lost" are usually the most important ones, and they get lost right when they are most needed!
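The kidnapping effect can be reproduced in miniature. The following sketch is an illustration under the same assumptions as before (distance-based winner, winner-moves-toward-pattern update); the class positions are invented for the example. With no free neurons left, the pattern of a brand-new class is won by an already specialized neuron, which then abandons its old class.

```python
import numpy as np

def winner(weights, pattern):
    # Hard competition: the neuron closest to the pattern wins.
    return int(np.argmin(np.linalg.norm(weights - pattern, axis=1)))

# Two neurons, both already specialized: one guardian per known class.
weights = np.array([[1.0, 0.0],    # guardian of class A
                    [0.0, 1.0]])   # guardian of class B

# A pattern from a brand-new class appears. There are no free
# neurons, so the nearest specialized neuron wins...
new_class = np.array([0.9, 0.6])
w = winner(weights, new_class)     # neuron 0, the guardian of class A

# ...and repeated presentations "kidnap" it: it drifts to the new
# class, and class A loses its representation.
for _ in range(20):
    weights[w] += 0.5 * (new_class - weights[w])

print(w, weights[w])
```

After the loop, neuron 0 sits at the new class's position and is no longer anywhere near its old class prototype at (1, 0): a betrayal and a divorce, as footnote 2 puts it.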
Fig. 9.43. In a network with „hard” competition very few previously learned patterns are lost
During experiments with the application Example 10c you will surely notice a further "kidnapping-related" phenomenon. New objects might "kidnap" neurons that were already in a stable "relationship" with some class of objects; this can happen even when there are still many free neurons, if those free neurons are more distant. In the sample application the phenomenon gets more intense when one class of objects is shown first, and new classes are shown only after neurons have started identifying with this group. Those new objects then sometimes "steal" neurons from established classes. If all objects are shown at the same time, a competition emerges that shows itself in a neuron being "attracted" first to one group and then to another (you can see this in the application Example 10c when you start the self-learning process with a very low number of neurons; see Fig. 9.44).
Fig. 9.44. „Attracting” a neuron between objects of classes 1. and 4.
In such a case it is possible that the network will learn nothing, even in a very long self-learning process. That probably also sounds familiar; think of school, when a few exams were scheduled for one day (for example mathematics, physics, geography and history): you were probably not able to prepare well for any of those subjects!
The solution to this problem might be to "soften" the competition. You can do this by going back to the parameters window (click the Back button); the window is shown in Figure 9.37, left. Find and activate the Soft competition parameter (check the checkbox). From this moment on the application will apply a softer, limited competition: the winning neuron will be chosen only if the value of its output signal is exceptionally high³. As a result, neurons will be evenly divided between the classes of recognized objects, and there will be no kidnapping of neurons that already belong to some other class.
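Softened competition amounts to adding one condition to the learning step. The sketch below is again only an illustration: modeling the neuron's output as exp(-distance), and the threshold value 0.6, are my own assumptions, not the settings used by Example 10c.

```python
import numpy as np

def soft_wta_step(weights, pattern, rate=0.2, threshold=0.6):
    """Self-learning step with softened (limited) competition.

    The closest neuron learns only if its output, modeled here as
    exp(-distance) so that a nearby pattern gives a value close to 1,
    is exceptionally high. Otherwise no neuron wins and the pattern
    is left without a guardian.
    """
    distances = np.linalg.norm(weights - pattern, axis=1)
    win = int(np.argmin(distances))
    output = np.exp(-distances[win])
    if output < threshold:
        return None                   # no winner: the pattern is omitted
    weights[win] += rate * (pattern - weights[win])
    return win

weights = np.array([[0.0, 0.0]])      # a single neuron near the origin

near = np.array([0.2, 0.1])           # close pattern: the neuron wins and learns
far = np.array([2.0, 2.0])            # distant pattern: nobody wins

print(soft_wta_step(weights, near))   # a winner is found (index 0)
print(soft_wta_step(weights, far))    # None: the class is omitted
```

The same condition that prevents kidnapping (a distant, already specialized neuron no longer wins) is also what produces the omitted classes described next: a pattern too far from every neuron finds no guardian at all.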
Unfortunately, instead of those problems you will see another worrying phenomenon: under some circumstances (especially in networks with a low number of neurons) there might be no winner at all among the neurons when a particular picture is shown⁴ (Fig. 9.45).
³ One can say that softer competition is like requiring partners to love each other a lot before getting married. A weak and short fascination does not lead to any long-lived consequences (such as marriage).
⁴ There is an analogy for this situation too: surely you know some old maids and bachelors. Their existence is exactly the effect of combining competition (monogamy) with its soft version (a strong affection is needed to get married).
Fig. 9.45. Omission of some classes in a network with softened competition
Use the application Example 10c with soft competition to examine and analyze this phenomenon carefully. It will be easy, because with this option the application uses special markers that show the locations of the patterns of the omitted classes (that is, classes for which there is no winner among the neurons).
You will discover that without strong competition such omissions happen very often, especially in networks with a small number of neurons. In such cases, for most classes, patterns will be created in the network (completely voluntarily!) that later allow automatic recognition of signals belonging to those classes; but for the "unlucky" class that no neuron wants to detect, there will be no specialized detector.
And then what? Nothing!
Have you ever met a person who has a flair for history, geography and languages, but for whom mathematics is simply beyond reach?