An Algorithm for Bootstrapping Communications
Jun Wang
03/20/03
Amorphous Computing

Amorphous computing is the development of organizational principles and programming languages for obtaining coherent behavior from the cooperation of myriads of unreliable parts that are interconnected in unknown, irregular, and time-varying ways.
Aim of amorphous computing: structuring systems so that we get acceptable answers, with high probability, even in the face of unreliability.
Motivation

“If I were designing the human brain, how would I have the parts learn to communicate?”

A kind of amorphous computing.
A possible solution

A, B: two agents
Communication lines: a bundle of wires with an arbitrary and unknown permutation
Feature lines: connections with the outside world (see the sketch below)
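A minimal sketch of this setup, assuming a simple list-based wire bundle; the names Agent, N_WIRES, and transmit are illustrative, not from the paper:

import random

N_WIRES = 100  # size of the communication bundle (toy value)

class Agent:
    """One end of the bundle, plus named feature lines to the outside world."""
    def __init__(self, feature_names):
        self.feature_lines = {name: None for name in feature_names}
        self.comm_lines = ["x"] * N_WIRES  # "x" = undriven

# The bundle applies an arbitrary, unknown permutation: wire i on A's
# side arrives as wire perm[i] on B's side. Neither agent knows perm.
perm = list(range(N_WIRES))
random.shuffle(perm)

def transmit(sender, receiver):
    """Deliver the sender's wire values to the receiver through the permutation."""
    for i, value in enumerate(sender.comm_lines):
        receiver.comm_lines[perm[i]] = value

A = Agent(["bob", "mary", "push"])
B = Agent(["bob", "mary", "push"])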
Work mechanism

Communication lines have four states: 1, -1, 0, x
Feature lines are named by things or actions and driven by roles
Typical feature lines: bob, mary, push …
Typical roles: subject, object, verb … (see the sketch below)
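A hypothetical encoding of these conventions; treating the fourth state "x" as undriven/unknown is an assumption:

from enum import Enum

class WireState(Enum):
    POS = 1    # driven positive
    NEG = -1   # driven negative
    ZERO = 0   # driven to zero
    X = "x"    # undriven / unknown

# Feature lines are named by things or actions and driven by roles, so a
# scene is naturally a mapping from feature-line name to role:
scene = {"bob": "subject", "push": "verb", "mary": "object"}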

Performance Evaluation

Method

Training cycles – data on the feature lines is sent to both agents.
Test cycles – one agent receives no input; how well does its output match the values on the other agent’s feature lines? (see the sketch below)
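A sketch of this protocol as a loop; the train_step and test_step helpers (and their signatures) are invented stand-ins for one communication cycle:

import random

def evaluate(agent_a, agent_b, scenes, n_train=1000, n_test=100):
    # Training cycles: the same scene appears on both agents' feature lines.
    for _ in range(n_train):
        train_step(agent_a, agent_b, random.choice(scenes))

    # Test cycles: only agent A sees the scene; score how often agent B's
    # feature-line output matches what A's feature lines carried.
    matches = 0
    for _ in range(n_test):
        scene = random.choice(scenes)
        matches += test_step(agent_a, agent_b, scene) == scene
    return matches / n_test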
Encoding View

An Example (1)–(3): [figures]

A symbol is expressed by a subset of the communication wires
Symbol mapping: (xs, xc, xu, xn)
Inflection mapping: (role, value) (see the sketch below)
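One plausible reading of these mappings as data structures; the roles assigned below to xs, xc, xu, and xn are guesses, since the slides do not expand the abbreviations:

from dataclasses import dataclass, field

@dataclass
class SymbolMapping:
    xs: set = field(default_factory=set)  # wires this agent selected itself
    xc: set = field(default_factory=set)  # wires believed certain (agreed on)
    xu: set = field(default_factory=set)  # wires still uncertain
    xn: set = field(default_factory=set)  # wires newly heard from the other agent

# Inflection mapping: (role, value) pairs, i.e. which drive value signals
# which role (toy values):
inflections = {"subject": 1, "object": -1, "verb": 0}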

Algorithm Analysis

Four phases: Talk-in, Talk-out, Listen-in, Listen-out

Basic idea: each agent transmits its mapping information to the other by driving the communication lines, and modifies its own mapping information according to the other’s. After some cycles, the two agents converge on the same symbol mapping (see the sketch below).
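A high-level sketch of one such cycle; the four method names mirror the phases analyzed next and are placeholders, not the paper's API:

def run_cycle(agent_a, agent_b, scene):
    # Both agents absorb the scene and drive their end of the bundle.
    for agent in (agent_a, agent_b):
        agent.talk_in(scene)   # extend mappings with any new symbols/roles
        agent.talk_out(scene)  # drive the communication lines
    # Wire values cross the bundle here (through the unknown permutation);
    # each agent then nudges its own mappings toward what it heard.
    for agent in (agent_a, agent_b):
        agent.listen_in()
        agent.listen_out()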
Formal Automaton Description (1)


Talk-in: adding new elements to the two mapping relations; updating the communication-line states
Talk-out: driving the communication lines (see the sketch below)
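Continuing the earlier toy setup (N_WIRES, Agent), a simplified version of these two steps; the random wire selection, the NWPS constant, and the symbols/inflections dictionaries on the agent are assumptions:

import random

NWPS = 5  # wires claimed per symbol (toy value)

def talk_in(agent, scene):
    """Add new elements to the two mapping relations."""
    for symbol, role in scene.items():
        if symbol not in agent.symbols:
            # Unseen symbol: claim a fresh random subset of wires for it.
            agent.symbols[symbol] = set(random.sample(range(N_WIRES), NWPS))
        if role not in agent.inflections:
            agent.inflections[role] = random.choice([1, -1, 0])

def talk_out(agent, scene):
    """Drive the communication lines: each symbol's wires carry its role's value."""
    agent.comm_lines = ["x"] * N_WIRES  # undriven by default
    for symbol, role in scene.items():
        for wire in agent.symbols[symbol]:
            agent.comm_lines[wire] = agent.inflections[role]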
Formal Automaton Description (2)

Listen-in:
(1) Symbol-mapping adjustment: the two agents listen to each other in every cycle and modify the “certain” subsets, so that they eventually reach the same symbol mapping
(2) Preparing the output for listen-out
(3) Inflection-mapping adjustment

Listen-out: output the (symbol, role) pairs to the feature lines according to the current understanding (see the sketch below)
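A simplified sketch of the listen phase in the same toy setup; it collapses the three adjustment steps into one intersection-based pruning rule, a coarse stand-in for the slides' "certain"-subset updates:

def listen_in(agent, heard):
    """Shrink each symbol's wire set toward wires the other agent also drives."""
    active = {i for i, v in enumerate(heard) if v != "x"}
    for symbol, wires in agent.symbols.items():
        kept = wires & active
        if kept:  # keep only shared wires, but never shrink to empty
            agent.symbols[symbol] = kept

def listen_out(agent, heard):
    """Output (symbol, role) pairs to the feature lines per the current mapping."""
    for symbol, wires in agent.symbols.items():
        values = {heard[w] for w in wires if heard[w] != "x"}
        if len(values) == 1:  # all of this symbol's wires agree on one value
            value = values.pop()
            for role, v in agent.inflections.items():
                if v == value:
                    agent.feature_lines[symbol] = role
                    break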
Results

Nw = 10000; Nwps = 100
Training cycles = 1000
Thematic frames: 50 nouns, 20 verbs
Results: 200 cycles to reach a shared vocabulary; 500 cycles for inflection stability
Algorithm Features

Structural parsimony
Robustness
Shallow computation
Performance Degradation

Dissimilar features: the algorithm can still reach a shared vocabulary despite imposed handicaps, such as some vocabulary that is not shared between the two agents.